From dd5fa5516daade891827b5e534bc268cd334765c Mon Sep 17 00:00:00 2001 From: "Shahriar P. Shuvo" Date: Wed, 20 Dec 2023 23:06:46 +0600 Subject: [PATCH] Releasing - POC3 (#1509) (#1510) * Last change. * Latest changes. * Update on review. * call quality guide. * Update prerequisites * Filter out the docs without an example. * Updates on review. * Update API ref. * Update API ref. * Update docs to match the examples supplied so far. * Add the custom audio, but not published as PR is not accepted yet. * Last update. * POC3 iOS get-started changes. * POC3 iOS call-quality changes. * Updated doc structure * Update * Updates * call quality updates * Updates * POC3 iOS authentication changes. * POC3 macOS get-started, call-quality, authentication changes. * Start merging all the PRs for POC3 * Start merging all the PRs for POC3 * Merge the other projects. * Updates * Merge the other projects. * iOS macOS updates to match the latest code. * Updates * Update to GetStarted. * Authentication workflow. * Call quality updates. * call quality doc updates * Review updates * Update * doc structure * Updates * reversed changes committed to the wrong branch * unity poc3 * update get started for vsdk poc3 web * update vsdk secure auth docs * get started * Authentication guide. * update call quality doc * Call quality * Windows POC3 Code Doc * Updated for call quality * Get started. * Call quality. * Call quality. 
* implement Secure Authentication Server POC3 To DOC * cloud proxy + additional updates * media encryption unity * android media player updates (#411) * Update shared/video-sdk/get-started/get-started-sdk/project-implementation/unity.mdx * Update shared/video-sdk/get-started/get-started-sdk/project-implementation/unity.mdx * cloud-proxy updates * Review updates * review updates * Updates * review update * update cloud proxy doc * media-player * audioAndVoiceEffects * Update project-implement/unity.mdx * product-workflow * updated logic in swift join channel to show each of the three methods, all included in the sample project * removed all spaces after triple ticks, which shouldn't be there * added small description comment to callback handlers, added one for voice calling. unified ios and macos examples in get started root * removed android and flutter separated implementations. added shared files for each code snippets * removed separate ios and macos implementation, bundled them with android and flutter. * removed older swift implementations * android and flutter using poc3 mdx file for ensure-channel-quality * Geofencing unity * audio and voice effects updates * updates * Product workflow updates * updated ensure channel quality to include ios and macos. 
* call quality updates * added cloud proxy and encrypt media streams poc3 templates * added geofencing poc3 example for ios and macos * added reactjs to the geofencing example, but it will currently not render in the poc3 specific document due to platform wrappers * Api ref updates + additional updates * virtual background * updates * resolved build issues * POC3 docs in new doc structure * updated ios to have testing hardware options * update platform added incorrectly fixed * updated cloud proxy for ios and macos * removed duplicated lines * update ios screensharing code to use the replaykit extension * update encrypt media, play media and custom audio video to use poc3.mdx for at least ios * updated virtual background documentation, including test your implementation section * moved platform specific code to enable-audio-publishing code asset file * added device compatibility check for virtual background * Raw Video Audio Processing iOS POC3 new structure * Fix auto-numbering, remove empty blocks from other platforms, remove platform specific code from poc3, remove unused variables, added api ref links * added examples of modifying the pixelbuffer and audio buffer * added api references for each ios and macos code blocks in cloud-proxy, encrypt-media-streams and ensure-channel-quality * multi-channel live streaming * updates * added spatial audio for ios and macos poc3 style-y * added api reference links for custom-video-and-audio, geofencing and play-media (ios/macos) * 622 changes. * added code blocks for joining and leaving secondary channel, added code for monitoring relay state, changed delegates to not be delegate2, updated extra delegate methods, fixed api reference urls, added secondConnection and destUid to variables, added media relay code block. * updated all ios links to use engine api ref constant if required. 
added api ref links to product-workflow (#687) * refactor web get started * added screensharing example code in poc3 for macos * update macos screenshare api reference links to open in new tab * update web cloud proxy doc (#434) * Release notes - first draft * Link to beta docs * Added a bug fix * update get started for vsdk poc3 web * update vsdk secure auth docs * update call quality doc * Get started. * Call quality. * Call quality. * fix * revert unrelated changes * revert unrelated changes * Removed the fixed issue * Updated RN * update cloud proxy doc * refactor web get started --------- * New poc3 structure for unity (#705) * updates * new structure for unity * added screensharing example code in poc3 for macos * update macos screenshare api reference links to open in new tab * update web cloud proxy doc (#434) * Release notes - first draft * Link to beta docs * Added a bug fix * update get started for vsdk poc3 web * update vsdk secure auth docs * update call quality doc * Get started. * Call quality. * Call quality. * fix * revert unrelated changes * revert unrelated changes * Removed the fixed issue * Updated RN * update cloud proxy doc * refactor web get started --------- * updates * new structure for unity * updates * Update assets/code/video-sdk/audio-voice-effects/apply-voice-effects.mdx --------- * added display media section to play-media story * Update shared/video-sdk/develop/play-media/project-implementation/poc3.mdx * Update shared/video-sdk/develop/play-media/project-implementation/poc3.mdx * fix geofencing and spatial audio layouts * updates * align ios, macos, and unity in product-workflow doc * get started * updates * updates * updates * updates * updated terminal to sh on code block * added POST request example for all, and removed GET for iOS/macOS (#732) * added api ref links to get started guide for ios macos. 
added permission handler code for ios/macos (#730) * update * add auth workflow poc3 * add cloud proxy * add more apps * android updates for quickstart, authentication and call quality (#736) * get-started and authentication updates * authentication updates * Update * updates * call quality updates * Update * Poc3 android code insertion (#745) * get-started and authentication updates * authentication updates * Update * updates * call quality updates * Update * product workflow updates * cloud proxy updates * media encryption updates * updates * Geofencing updates * media player updates * multiple channels I * Multiple channels update II * Custom audio and video * Raw video and audio * Raw video and audio * Audio and voice effects * Virtual background * AI noise suppression * get-started update * Update * get-started numbering fix * geofencing + cloud proxy (#735) * geofencing + cloud proxy * review updates * code indentation --------- * update play-media to include macos (#749) * Audio and voice effects + product workflow for reactjs (#746) * audio and voice effects * product workflow * svg updates * updates * audio and voice effects * product workflow * svg updates * updates * audio voice effects update * updates --------- * updates (#741) * Update * audio-voice-effects iOS + macOS (#753) * added voice effect code blocks * added api ref links to audio-and-voice-effects code blocks * added ains code for ios and macos (#754) * multichannel streaming and custom media (#755) * multichannel streaming and custom media * removed .DS_Store files --------- * Call quality reactjs poc3 (#756) * call quality * review updates * updates * updates * fixed the index file --------- * add call quality doc * Authentication workflow + ai denoiser for reactjs (#761) * Authentication workflow * ai-denoiser * denoiser * authentication updates --------- * milestone39 review updates and improvements (#762) * updates * review updates * review updates * Added android spatial audio code * 
Updates * updates * spatial audio + media encryption (#763) * add product workflow doc * add spatial audio docs * add denoiser docs * add custom av docs * updates * updated macos for raw video and custom audio-video (#767) * custom-audio-video macos updates * Voice calling updates/milestone39 rewrite video sdk docs (#764) * get-started updates * get-started setup and prereqs update * voice-calling updates * Update * Call quality max suggestions (#760) * updates * updates * Update --------- * Multiple Channels Updates macOS (#768) * updated macos for raw video and custom audio-video * updated multiple channels docs for macos and ios * Project Test Sections (#759) * updated project-test section for some documents to use the reference apps, covering android/ios/macos * test section improvements * Call-quality updates * stream-media and multiple-channels updates * product-workflow updates * updates * updates * virtual background updates * Updates --------- * Update * poc3.mdx header updates (#772) * Update * update * Update * Updates after final checking (#769) * updates * updates * updates * updates --------- * Docs organization changes (#771) * Rearrange docs * Rearrange docs * Review updates/milestone39-poc3 (#779) * review-updates * update * review updates * Move api ref web (#775) * refactor get started references * refactor references for auth workflow * refactor cloud proxy references * refactor play media references * refactor encrypt references * refactor call quality refs * refactor product workflow refs * refactor custom audio video refs * refactor geofencing * virtual background in poc3 * geofencing update * review updates * updates --------- * Release notes for ReactJS (#778) * Release notes * Release notes * Multi-channel updates + Custom media source + Raw audio and video (#774) * updates * updates * updates * updates * updates * Updates * updates --------- * update * Update * Live stream over multiple channels doc (#780) * refactor get started 
references * refactor references for auth workflow * refactor cloud proxy references * refactor play media references * refactor encrypt references * refactor call quality refs * refactor product workflow refs * refactor custom audio video refs * refactor geofencing * virtual background in poc3 * geofencing update * add live streaming over multiple channels * review updates --------- * Encryption Handler iOS (#783) * add ios event handler to encryption, and remove duplicated code in some poc3 files * remove ds store * removed repeated step --------- * updates for voice calling (#787) * updates for voice calling unity * added back voice-calling custom-audio doc --------- * Add error codes doc/milestone39 (#788) * Updates * Update * Updates * error-codes update * updates * fix ios declared variables and probe test in call quality doc (#794) * get-started update * non-poc3 platforms update * updates * review updates * Added web demo links (#801) * Voice changes for poc3 web (#804) * voice-sdk-web-poc3 * review updates --------- * update token example code link to latest release --------- Co-authored-by: billy-the-fish Co-authored-by: Dasun Nirmitha Co-authored-by: saudsami Co-authored-by: Saud <65331551+saudsami@users.noreply.github.com> Co-authored-by: Hussain Khalid <72780625+hussain-khalid@users.noreply.github.com> Co-authored-by: Kishan Dhakan Co-authored-by: pankajg123 Co-authored-by: Max Cobb Co-authored-by: Max Cobb <5754073+maxxfrazer@users.noreply.github.com> Co-authored-by: Kishan Dhakan <42718091+Kishan-Dhakan@users.noreply.github.com> Co-authored-by: atovpeko Co-authored-by: atovpeko <114177030+atovpeko@users.noreply.github.com> --- .gitignore | 1 + .../{reference => overview}/pricing.mdx | 2 +- .../{reference => overview}/release-notes.mdx | 2 +- .../supported-platforms.mdx | 4 +- .../{reference => overview}/pricing.mdx | 0 .../{reference => overview}/release-notes.mdx | 2 +- .../supported-platforms.mdx | 2 +- .../ai-noise-suppression/configure-engine.mdx 
| 18 + .../configure-extension.mdx | 41 ++ .../ai-noise-suppression/enable-denoiser.mdx | 66 ++ .../ai-noise-suppression/import-library.mdx | 18 + .../ai-noise-suppression/import-plugin.mdx | 25 + .../set-noise-reduction-mode.mdx | 34 + .../set-reduction-level.mdx | 34 + .../ai-noise-suppression/setup-logging.mdx | 21 + .../apply-voice-effects.mdx | 129 ++++ .../{swift => }/configure-buttons.mdx | 4 +- .../audio-voice-effects/configure-engine.mdx | 69 ++ .../audio-voice-effects/create-ui.mdx | 51 ++ .../audio-voice-effects/event-handler.mdx | 98 +++ .../audio-voice-effects/import-library.mdx | 26 + .../audio-voice-effects/pause-play-resume.mdx | 105 +++ .../audio-voice-effects/preload-effect.mdx | 47 ++ .../audio-voice-effects/set-audio-profile.mdx | 41 ++ .../audio-voice-effects/set-audio-route.mdx | 87 +++ .../audio-voice-effects/set-variables.mdx | 29 + .../audio-voice-effects/stop-start-mixing.mdx | 145 ++++ .../swift/apply-voice-effects.mdx | 86 --- .../audio-voice-effects/swift/create-ui.mdx | 15 - .../swift/pause-play-resume.mdx | 56 -- .../swift/set-audio-route.mdx | 18 - .../swift/stop-start-mixing.mdx | 40 -- .../{swift => }/update-ui.mdx | 0 .../authentication-workflow/add-variables.mdx | 44 ++ .../authentication-workflow/event-handler.mdx | 173 +++++ .../authentication-workflow/fetch-token.mdx | 182 +++++ .../import-library.mdx | 38 ++ .../authentication-workflow/join-channel.mdx | 220 ++++++ .../authentication-workflow/renew-token.mdx | 48 ++ .../{swift => }/specify-channel.mdx | 4 +- .../swift/add-variables.mdx | 42 -- .../swift/fetch-token.mdx | 86 --- .../cloud-proxy/configure-engine.mdx | 18 + .../cloud-proxy/connection-failed.mdx | 63 ++ .../video-sdk/cloud-proxy/event-handler.mdx | 103 +++ .../video-sdk/cloud-proxy/import-library.mdx | 28 + .../video-sdk/cloud-proxy/set-cloud-proxy.mdx | 70 ++ .../video-sdk/cloud-proxy/set-variables.mdx | 27 + .../configure-engine-audio.mdx | 20 + .../configure-engine.mdx | 10 + .../create-custom-audio-track.mdx 
| 37 + .../create-custom-video-track.mdx | 37 + .../destroy-custom-track-audio.mdx | 24 + .../destroy-custom-track-video.mdx | 16 + .../enable-audio-publishing.mdx | 97 +++ .../enable-video-publishing.mdx | 105 +++ .../import-library-audio.mdx | 25 + .../custom-video-and-audio/import-library.mdx | 37 + .../push-audio-frames.mdx | 125 ++++ .../push-video-frames.mdx | 89 +++ .../read-audio-input.mdx | 74 ++ .../render-custom-video.mdx | 85 +++ .../set-variables-audio.mdx | 40 ++ .../custom-video-and-audio/set-variables.mdx | 39 ++ .../enable-encryption.mdx | 149 ++++ .../enable-end-to-end-encryption.mdx | 174 +++++ .../encrypt-media-streams/event-handler.mdx | 56 ++ .../encrypt-media-streams/import-library.mdx | 25 + .../encrypt-media-streams/set-variables.mdx | 18 + .../ensure-channel-quality/event-handler.mdx | 404 +++++++++++ .../implement-call-quality-view.mdx | 52 ++ .../implement-declarations.mdx | 121 ++++ .../{swift => }/implement-labels.mdx | 0 .../{swift => }/implement-network-status.mdx | 21 +- .../ensure-channel-quality/import-library.mdx | 57 ++ .../ensure-channel-quality/probe-test.mdx | 125 ++++ .../set-audio-video-profile.mdx | 28 + .../ensure-channel-quality/set-latency.mdx | 65 ++ .../ensure-channel-quality/setup-engine.mdx | 367 ++++++++++ .../ensure-channel-quality/show-stats.mdx | 45 ++ .../swift/implement-declarations.mdx | 13 - .../ensure-channel-quality/switch-quality.mdx | 118 ++++ .../ensure-channel-quality/test-hardware.mdx | 407 +++++++++++ .../video-sdk/geofencing/combine-geofence.mdx | 34 + .../video-sdk/geofencing/set-geofence.mdx | 94 +++ .../get-started-sdk/create-engine.mdx | 219 ++++++ .../get-started-sdk/declare-variables.mdx | 127 ++++ .../video-sdk/get-started-sdk/destroy.mdx | 67 ++ .../get-started-sdk/handle-events.mdx | 347 ++++++++++ .../get-started-sdk/import-library.mdx | 51 ++ .../get-started-sdk/join-channel.mdx | 365 ++++++++++ .../get-started-sdk/leave-channel.mdx | 121 ++++ .../video-sdk/get-started-sdk/local-video.mdx 
| 102 +++ .../get-started-sdk/remote-video.mdx | 121 ++++ .../get-started-sdk/request-permissions.mdx | 102 +++ .../get-started-sdk/set-user-role.mdx | 68 ++ .../setup-audio-video-tracks.mdx | 9 + .../get-started-sdk/swift/create-ui.mdx | 278 -------- .../get-started-sdk/swift/join-and-leave.mdx | 239 ------- .../get-started-sdk/swift/role-action.mdx | 14 - .../get-started-sdk/swift/show-message.mdx | 26 - .../swift/view-did-disappear.mdx | 19 - .../import-library.mdx | 43 ++ .../join-a-second-channel.mdx | 236 +++++++ .../leave-second-channel.mdx | 35 + .../monitor-channel-media-relay-state.mdx | 111 +++ .../receive-callbacks-from-second-channel.mdx | 94 +++ .../set-variables.mdx | 72 ++ .../start-stop-channel-media-relay.mdx | 272 ++++++++ .../swift/configure-buttons.mdx | 4 +- .../swift/create-ui.mdx | 4 +- .../swift/mc-configure-buttons.mdx | 4 +- .../swift/mc-create-ui.mdx | 4 +- .../swift/mc-join-second-channel.mdx | 4 +- .../swift/mc-second-channel-delegate.mdx | 4 +- .../swift/monitor-relay-state.mdx | 4 +- .../video-sdk/play-media/configure-engine.mdx | 21 + .../play-media/destroy-media-player.mdx | 52 ++ .../video-sdk/play-media/display-media.mdx | 68 ++ .../video-sdk/play-media/event-handler.mdx | 191 ++++++ .../video-sdk/play-media/import-library.mdx | 24 + .../play-media/play-pause-resume.mdx | 110 +++ .../video-sdk/play-media/set-variables.mdx | 39 ++ .../video-sdk/play-media/start-streaming.mdx | 102 +++ .../play-media/swift/configure-buttons.mdx | 4 +- .../video-sdk/play-media/swift/create-ui.mdx | 4 +- .../swift/open-play-pause-media.mdx | 4 +- .../update-channel-publish-options.mdx | 79 +++ .../product-workflow/import-library.mdx | 52 ++ .../product-workflow/ios-extension.mdx | 80 +++ .../product-workflow/macos-screencapture.mdx | 88 +++ .../product-workflow/media-device-changed.mdx | 86 +++ .../microphone-camera-change.mdx | 71 ++ .../product-workflow/mute-local-video.mdx | 21 + .../product-workflow/mute-remote-user.mdx | 77 +++ 
.../override-broadcast-started.mdx | 49 ++ .../product-workflow/preview-screen-track.mdx | 40 ++ .../product-workflow/publish-screen-track.mdx | 40 ++ .../product-workflow/screen-sharer-target.mdx | 76 +++ .../product-workflow/setup-engine.mdx | 20 + .../product-workflow/setup-volume.mdx | 153 +++++ .../product-workflow/start-sharing.mdx | 167 +++++ .../product-workflow/stop-sharing.mdx | 41 ++ .../raw-video-audio/configure-engine.mdx | 77 +++ .../raw-video-audio/import-library.mdx | 31 + .../raw-video-audio/modify-audio-video.mdx | 176 +++++ .../register-video-audio-frame-observers.mdx | 106 +++ .../set-audio-frame-observer.mdx | 168 +++++ .../raw-video-audio/set-variables.mdx | 101 +++ .../set-video-frame-observer.mdx | 104 +++ .../swift/register-frame-observers.mdx | 4 +- .../swift/unregister-frame-observers.mdx | 4 +- ...unregister-video-audio-frame-observers.mdx | 57 ++ .../spatial-audio/import-library.mdx | 35 + .../video-sdk/spatial-audio/play-media.mdx | 49 ++ .../spatial-audio/remove-spatial.mdx | 37 + .../video-sdk/spatial-audio/set-variables.mdx | 58 ++ .../video-sdk/spatial-audio/setup-local.mdx | 98 +++ .../video-sdk/spatial-audio/setup-remote.mdx | 147 ++++ .../video-sdk/spatial-audio/setup-spatial.mdx | 151 +++++ .../virtual-background/blur-background.mdx | 75 +++ .../virtual-background/color-background.mdx | 101 +++ .../virtual-background/configure-engine.mdx | 26 + .../device-compatibility.mdx | 48 ++ .../virtual-background/image-background.mdx | 66 ++ .../virtual-background/import-library.mdx | 61 ++ .../virtual-background/reset-background.mdx | 60 ++ .../set-virtual-background.mdx | 115 ++++ .../get-started-sdk/swift/create-ui.mdx | 4 +- .../get-started-sdk/swift/show-message.mdx | 4 +- .../swift/view-did-disappear.mdx | 4 +- .../images/chat/chat-call-logic-android.svg | 2 +- .../images/chat/chat-call-logic-flutter.svg | 2 +- assets/images/chat/chat-call-logic-unity.svg | 2 +- .../images/chat/chat-call-logic-windows.svg | 2 +- 
.../extensions-marketplace/geofencing.svg | 1 + .../extensions-marketplace/ncs-worflow.svg | 1 + .../ils-call-logic-android.svg | 491 +------------- .../ils-call-logic-flutter.svg | 489 +------------- .../ils-call-logic-ios.svg | 485 +------------- .../ils-call-logic-template.svg | 485 +------------- .../ils-call-logic-unity.svg | 495 +------------- .../ils-call-logic-web.svg | 485 +------------- .../live-streaming-over-multiple-channels.svg | 1 + assets/images/iot/iot-channel-quality.svg | 2 +- assets/images/iot/iot-get-started.svg | 2 +- assets/images/iot/iot-licensing.svg | 2 +- assets/images/iot/iot-multi-channel.svg | 2 +- .../ncs-cloud-recording-workflow.svg | 2 +- .../ncs-media-pull.svg | 2 +- .../ncs-media-push.svg | 2 +- assets/images/others/authentication-logic.svg | 451 +------------ .../others/documentation_way_of_working.svg | 511 +------------- .../images/others/media-stream-encryption.svg | 463 +------------ assets/images/others/play-media.svg | 469 +------------ assets/images/shared/ncs-worflow.svg | 1 + assets/images/video-calling/geofencing.svg | 445 +----------- .../video-calling/process-raw-video-audio.svg | 479 +------------ .../video-call-logic-android.svg | 471 +------------ .../video-call-logic-flutter.svg | 465 +------------ .../video-calling/video-call-logic-ios.svg | 469 +------------ .../video-call-logic-template.svg | 469 +------------ .../video-calling/video-call-logic-unity.svg | 473 +------------ .../video-calling/video-call-logic-web.svg | 465 +------------ .../video-calling/video_call_workflow.svg | 1 + .../video_call_workflow_run_end.svg | 1 + .../audio-and-voice-effects-web.puml | 41 ++ .../video-sdk/audio-and-voice-effects-web.svg | 1 + .../video-sdk/audio-and-voice-effects.svg | 497 +------------- .../images/video-sdk/authentication-logic.svg | 451 +------------ assets/images/video-sdk/cloud-proxy.svg | 2 +- .../video-sdk/custom-source-video-audio.svg | 471 +------------ .../video-sdk/ensure-channel-quality.svg | 481 
+------------ .../video-sdk/ils-call-logic-android.svg | 491 +------------- .../video-sdk/ils-call-logic-electron.svg | 483 +------------ .../video-sdk/ils-call-logic-flutter.svg | 487 +------------- .../images/video-sdk/ils-call-logic-ios.svg | 485 +------------- .../video-sdk/ils-call-logic-template.svg | 485 +------------- .../images/video-sdk/ils-call-logic-unity.svg | 493 +------------- .../video-sdk/ils-call-logic-unreal.svg | 2 +- .../images/video-sdk/ils-call-logic-web.svg | 485 +------------- .../video-sdk/integrated-token-generation.svg | 449 +------------ .../video-sdk/media-stream-encryption.svg | 457 +------------ assets/images/video-sdk/play-drm-music.svg | 463 +------------ .../images/video-sdk/product-workflow-web.svg | 483 +------------ assets/images/video-sdk/product-workflow.svg | 1 + assets/images/video-sdk/spatial-audio-web.svg | 2 +- assets/images/video-sdk/spatial-audio.svg | 479 +------------ .../video-sdk/video-call-logic-android.svg | 2 +- .../video-sdk/video-call-logic-electron.svg | 2 +- .../video-sdk/video-call-logic-flutter.svg | 2 +- .../images/video-sdk/video-call-logic-ios.svg | 2 +- .../video-sdk/video-call-logic-reactjs.puml | 35 + .../video-sdk/video-call-logic-reactjs.svg | 1 + .../video-sdk/video-call-logic-template.svg | 469 +------------ .../video-sdk/video-call-logic-unity.svg | 2 +- .../video-sdk/video-call-logic-unreal.svg | 2 +- .../images/video-sdk/video-call-logic-web.svg | 2 +- .../images/video-sdk/video_call_workflow.svg | 1 + .../video-sdk/video_call_workflow_run_end.svg | 1 + .../images/voice-sdk/authentication-logic.svg | 451 +------------ .../images/voice-sdk/ensure-voice-quality.svg | 495 +------------- assets/images/voice-sdk/geofencing.svg | 445 +----------- .../voice-sdk/integrated-token-generation.svg | 449 +------------ assets/images/voice-sdk/process-raw-audio.svg | 1 + .../voice-sdk/product-workflow-voice-web.svg | 477 +------------ .../voice-sdk/product-workflow-voice.svg | 1 + 
.../voice-sdk/voice-call-logic-electron.svg | 455 +------------ .../voice-sdk/voice-call-logic-flutter.svg | 455 +------------ .../voice-sdk/voice-call-logic-unity.svg | 471 +------------ .../get-started/get-started-sdk.mdx | 2 +- .../{reference => overview}/pricing.mdx | 2 +- .../{reference => overview}/release-notes.mdx | 2 +- .../supported-platforms.mdx | 0 broadcast-streaming/reference/error-codes.mdx | 13 + .../reference/known-issues.mdx | 15 - cloud-recording/develop/individual-mode.md | 14 +- .../develop/integration-best-practices.md | 2 +- .../develop/recording-video-profile.md | 2 +- cloud-recording/develop/screen-capture.md | 2 +- cloud-recording/develop/webpage-mode.md | 2 +- .../pricing-webpage-recording.md | 6 +- .../{reference => overview}/pricing.md | 6 +- .../{reference => overview}/release-notes.mdx | 2 +- .../supported-platforms.mdx | 2 +- .../develop/integrate/banuba.mdx | 4 +- .../{reference => overview}/release-notes.mdx | 2 +- .../supported-platforms.mdx | 2 +- .../develop/authentication-workflow.mdx | 22 +- .../{reference => overview}/release-notes.mdx | 2 +- .../supported-platforms.md | 2 +- .../get-started/get-started-sdk.mdx | 2 +- .../{reference => overview}/pricing.mdx | 2 +- .../{reference => overview}/release-notes.mdx | 2 +- .../supported-platforms.mdx | 0 .../reference/error-codes.mdx | 13 + .../reference/known-issues.mdx | 15 - .../develop/enable-whiteboard.md | 2 +- .../{reference => overview}/pricing.md | 2 +- .../release-notes-uikit.mdx | 2 +- .../{reference => overview}/release-notes.mdx | 2 +- .../supported-platforms.mdx | 2 +- iot/{reference => overview}/pricing.mdx | 2 +- iot/{reference => overview}/release-notes.mdx | 2 +- .../supported-platforms.mdx | 0 .../develop/integration-best-practices.mdx | 2 +- .../{reference => overview}/pricing.mdx | 2 +- .../{reference => overview}/release-notes.mdx | 0 .../{reference => overview}/pricing.mdx | 2 +- .../{reference => overview}/billing.md | 2 +- .../{reference => 
overview}/release-notes.mdx | 4 +- on-premise-recording/reference/sunset.md | 2 +- .../reference/video-profile.md | 2 +- .../{reference => overview}/pricing.mdx | 2 +- .../{reference => overview}/release-notes.mdx | 2 +- shared/agora-analytics/_alarm.mdx | 8 +- shared/agora-analytics/_api.mdx | 8 +- shared/agora-analytics/_call-search.mdx | 6 +- shared/agora-analytics/_data-insight-plus.mdx | 2 +- shared/agora-analytics/_data-insight.mdx | 2 +- shared/agora-analytics/_embedded.mdx | 6 +- shared/agora-analytics/_monitor.mdx | 2 +- shared/agora-analytics/_pricing.mdx | 2 +- .../messages/_translate-messages.mdx | 2 +- shared/chat-sdk/develop/_authentication.mdx | 2 +- .../chat-sdk/develop/_content-moderation.mdx | 2 +- .../project-implementation/web.mdx | 10 +- shared/chat-sdk/get-started/_enable.mdx | 2 +- shared/chat-sdk/hide/_token-server-new.mdx | 2 +- .../chat-sdk/overview/_product-overview.mdx | 2 +- .../chat-sdk/reference/_http-status-codes.mdx | 4 +- shared/chat-sdk/reference/_pricing.mdx | 4 +- .../develop/_media-stream-encryption.mdx | 4 +- .../_agora-console-restapi.mdx | 26 +- .../manage-agora-account/_get-appid-token.mdx | 33 +- shared/common/no-uikit.mdx | 2 +- shared/common/prerequities-get-started.mdx | 77 +++ shared/common/prerequities.mdx | 25 +- shared/common/project-setup/android.mdx | 16 + .../project-setup/electron.mdx | 4 +- shared/common/project-setup/flutter.mdx | 33 + .../project-setup/index.mdx | 2 + shared/common/project-setup/ios.mdx | 17 + shared/common/project-setup/macos.mdx | 15 + shared/common/project-setup/react-js.mdx | 22 + .../project-setup/react-native.mdx | 0 shared/common/project-setup/swift.mdx | 0 shared/common/project-setup/unity.mdx | 27 + shared/common/project-setup/unreal.mdx | 3 + shared/common/project-setup/web.mdx | 26 + shared/common/project-setup/windows.mdx | 28 + shared/common/project-test/android.mdx | 28 + shared/common/project-test/clone-project.mdx | 28 + shared/common/project-test/electron.mdx | 22 + 
shared/common/project-test/flutter.mdx | 28 + .../project-test/generate-temp-rtc-token.mdx | 1 + shared/common/project-test/index.mdx | 22 + .../project-test}/ios.mdx | 0 shared/common/project-test/load-web-demo.mdx | 1 + .../project-test}/macos.mdx | 1 - .../common/project-test/open-config-file.mdx | 36 + shared/common/project-test/react-js.mdx | 41 ++ shared/common/project-test/react-native.mdx | 61 ++ .../common/project-test/rtc-first-steps.mdx | 26 + .../common/project-test/run-reference-app.mdx | 39 ++ shared/common/project-test/set-app-id.mdx | 1 + .../project-test/set-authentication-rtc.mdx | 11 + shared/common/project-test/swift.mdx | 24 + shared/common/project-test/unity.mdx | 25 + shared/common/project-test/windows.mdx | 32 + .../_develop-an-audio-filter.mdx | 4 +- .../_use-an-extension.mdx | 8 +- .../ai-noise-suppression.mdx | 33 +- .../project-implementation/index.mdx | 10 +- .../project-implementation/poc3.mdx | 44 ++ .../project-implementation/web.mdx | 2 +- .../project-test/poc3.mdx | 22 + .../project-test/react-js.mdx | 23 + .../ai-noise-suppression/reference/index.mdx | 1 + .../ai-noise-suppression/reference/ios.mdx | 2 - .../ai-noise-suppression/reference/macos.mdx | 2 - .../ai-noise-suppression/reference/web.mdx | 5 + .../common/_prerequities.mdx | 2 +- .../common/project-test/poc3.mdx | 14 + .../project-implementation/cpp.mdx | 4 +- .../develop-an-audio-filter/_web.mdx | 2 +- .../project-implementation/cpp.mdx | 4 +- .../drm-play/project-implementation/swift.mdx | 8 +- .../faceunity/project-implementation/ios.mdx | 2 +- .../image-enhancement.mdx | 2 +- .../reference/_ains.mdx | 2 +- .../project-implementation/android.mdx | 4 +- .../project-implementation/electron.mdx | 2 +- .../project-implementation/flutter.mdx | 6 +- .../project-implementation/react-native.mdx | 2 +- .../project-implementation/unity.mdx | 6 +- .../project-setup/android.mdx | 2 +- .../use-an-extension/project-setup/web.mdx | 2 +- .../virtual-background.mdx | 7 +- 
.../project-implementation/index.mdx | 14 +- .../project-implementation/poc3.mdx | 77 +++ .../project-implementation/swift.mdx | 4 +- .../project-implementation/unity.mdx | 92 +-- .../virtual-background/project-test/index.mdx | 10 +- .../virtual-background/project-test/ios.mdx | 8 - .../virtual-background/project-test/macos.mdx | 8 - .../virtual-background/project-test/poc3.mdx | 14 + .../project-test/react-js.mdx | 28 + .../virtual-background/project-test/swift.mdx | 27 - .../virtual-background/project-test/unity.mdx | 30 +- .../virtual-background/reference/android.mdx | 9 - .../virtual-background/reference/swift.mdx | 18 - .../virtual-background/reference/unity.mdx | 10 - .../virtual-background/reference/web.mdx | 2 + .../embed-custom-plugin/ios.mdx | 2 +- .../release-notes/android.mdx | 2 +- .../flexible-classroom/release-notes/ios.mdx | 2 +- .../get-started-uikit/android.mdx | 8 +- .../get-started-uikit/ios.mdx | 6 +- .../get-started-uikit/web.mdx | 14 +- .../get-started/android.mdx | 2 +- .../get-started/ios.mdx | 2 +- .../get-started/web.mdx | 4 +- .../present-files/android.mdx | 6 +- .../present-files/ios.mdx | 4 +- .../present-files/web.mdx | 12 +- shared/interactive-whiteboard/qps-pricing.mdx | 2 +- .../iot/develop/_media-stream-encryption.mdx | 4 +- .../project-implementation/android.mdx | 2 +- .../project-implementation/android.mdx | 24 +- .../get-started-sdk/project-setup/android.mdx | 2 +- shared/media-push/develop/_restful-api.mdx | 2 +- shared/media-push/reference/_pricing.mdx | 4 +- .../_notification_center_service.mdx | 2 +- .../geofencing/sample-code/android.mdx | 2 +- .../signaling/geofencing/sample-code/cpp.mdx | 2 +- .../signaling/geofencing/sample-code/java.mdx | 2 +- .../signaling/geofencing/sample-code/objc.mdx | 2 +- .../signaling/geofencing/sample-code/web.mdx | 2 +- shared/signaling/get-started-sdk/android.mdx | 12 +- shared/signaling/get-started-sdk/ios.mdx | 8 +- shared/signaling/get-started-sdk/macos.mdx | 8 +- 
 shared/signaling/get-started-sdk/web.mdx | 16 +-
 shared/signaling/limitations/web.mdx | 2 +-
 shared/signaling/release-notes/android.mdx | 2 +-
 shared/signaling/release-notes/web.mdx | 2 +-
 .../run-the-sample-project/android.mdx | 2 +-
 .../run-the-sample-project/windows.mdx | 2 +-
 shared/variables/global.js | 14 +-
 shared/variables/platform.js | 6 +-
 shared/video-sdk/_authentication-workflow.mdx | 153 +++--
 shared/video-sdk/_billing.mdx | 2 +-
 shared/video-sdk/_get-started-uikit.mdx | 10 +-
 .../project-implementation-uikit/android.mdx | 6 +-
 .../project-implementation-uikit/swift.mdx | 4 +-
 .../project-implementation-uikit/web.mdx | 2 +-
 .../project-implementation/android.mdx | 170 +----
 .../project-implementation/cpp.mdx | 330 ++++-----
 .../project-implementation/csharp.mdx | 142 ++--
 .../project-implementation/electron.mdx | 16 +-
 .../project-implementation/flutter.mdx | 218 ++----
 .../project-implementation/index.mdx | 12 +-
 .../project-implementation/poc3.mdx | 38 ++
 .../project-implementation/react-js.mdx | 62 ++
 .../project-implementation/react-native.mdx | 2 +-
 .../project-implementation/swift.mdx | 142 ++--
 .../project-implementation/unity.mdx | 18 +-
 .../project-implementation/web.mdx | 80 ---
 .../project-test/android.mdx | 52 +-
 .../project-test/cpp.mdx | 17 +-
 .../project-test/electron.mdx | 2 +-
 .../project-test/flutter.mdx | 32 +-
 .../project-test/index.mdx | 31 +-
 .../project-test/poc3.mdx | 38 ++
 .../project-test/react-js.mdx | 21 +
 .../project-test/swift.mdx | 31 +-
 .../project-test/unity.mdx | 35 +-
 .../project-test/windows.mdx | 3 +-
 .../reference/android.mdx | 9 +-
 .../reference/index.mdx | 2 +
 .../authentication-workflow/reference/ios.mdx | 3 -
 .../reference/macos.mdx | 3 -
 .../reference/react-js.mdx | 5 +
 .../reference/unity.mdx | 4 -
 .../authentication-workflow/reference/web.mdx | 2 -
 .../develop/_audio-and-voice-effects.mdx | 23 +-
 shared/video-sdk/develop/_cloud-proxy.mdx | 32 +-
 .../develop/_custom-video-and-audio.mdx | 30 +-
 .../develop/_ensure-channel-quality.mdx | 23 +-
 shared/video-sdk/develop/_geofencing.mdx | 25 +-
 .../develop/_integrate-token-generation.mdx | 15 -
 ..._live-streaming-over-multiple-channels.mdx | 17 +-
 .../develop/_media-stream-encryption.mdx | 40 +-
 shared/video-sdk/develop/_migration-guide.mdx | 7 +
 shared/video-sdk/develop/_play-media.mdx | 24 +-
 .../video-sdk/develop/_product-workflow.mdx | 29 +-
 shared/video-sdk/develop/_spatial-audio.mdx | 25 +-
 .../develop/_stream-raw-audio-and-video.mdx | 35 +-
 .../project-implementation/flutter.mdx | 18 +-
 .../project-implementation/index.mdx | 13 +-
 .../project-implementation/poc3.mdx | 60 ++
 .../project-implementation/react-js.mdx | 78 +++
 .../project-implementation/react-native.mdx | 20 +-
 .../project-implementation/swift.mdx | 14 +-
 .../project-implementation/unity.mdx | 357 ++++------
 .../project-test/android.mdx | 34 +-
 .../project-test/index.mdx | 25 +-
 .../project-test/poc3.mdx | 19 +
 .../project-test/react-js.mdx | 26 +
 .../project-test/swift.mdx | 36 +-
 .../project-test/unity.mdx | 34 +-
 .../reference/android.mdx | 45 --
 .../reference/index.mdx | 2 +
 .../audio-and-voice-effects/reference/ios.mdx | 66 --
 .../reference/macos.mdx | 63 --
 .../reference/react-js.mdx | 4 +
 .../reference/unity.mdx | 37 -
 .../audio-and-voice-effects/reference/web.mdx | 33 +-
 .../project-implementation/android.mdx | 8 +-
 .../project-implementation/electron.mdx | 6 +-
 .../project-implementation/flutter.mdx | 6 +-
 .../project-implementation/index.mdx | 14 +-
 .../project-implementation/poc3.mdx | 45 ++
 .../project-implementation/react-js.mdx | 29 +
 .../project-implementation/react-native.mdx | 6 +-
 .../project-implementation/unity.mdx | 68 +-
 .../project-implementation/web.mdx | 44 --
 .../cloud-proxy/project-setup/unity.mdx | 3 +
 .../cloud-proxy/project-test/index.mdx | 20 +-
 .../develop/cloud-proxy/project-test/ios.mdx | 24 -
 .../cloud-proxy/project-test/macos.mdx | 20 -
 .../develop/cloud-proxy/project-test/poc3.mdx | 13 +
 .../cloud-proxy/project-test/react-js.mdx | 26 +
 .../cloud-proxy/project-test/unity.mdx | 27 +-
 .../develop/cloud-proxy/project-test/web.mdx | 37 +-
 .../develop/cloud-proxy/reference/android.mdx | 13 -
 .../develop/cloud-proxy/reference/index.mdx | 2 +
 .../develop/cloud-proxy/reference/ios.mdx | 9 -
 .../develop/cloud-proxy/reference/macos.mdx | 9 -
 .../cloud-proxy/reference/react-js.mdx | 3 +
 .../develop/cloud-proxy/reference/unity.mdx | 13 -
 .../develop/cloud-proxy/reference/web.mdx | 9 +-
 .../project-implementation/index.mdx | 16 +-
 .../project-implementation/poc3.mdx | 181 +++++
 .../project-implementation/react-js.mdx | 70 ++
 .../project-implementation/unity.mdx | 2 +-
 .../project-implementation/web.mdx | 53 --
 .../project-implementation/windows.mdx | 2 +-
 .../project-test/android.mdx | 17 +-
 .../project-test/index.mdx | 26 +-
 .../project-test/poc3.mdx | 20 +
 .../project-test/react-js.mdx | 21 +
 .../project-test/swift.mdx | 22 +-
 .../project-test/unity.mdx | 26 +-
 .../project-test/web.mdx | 25 +-
 .../reference/android.mdx | 12 -
 .../reference/index.mdx | 2 +
 .../custom-video-and-audio/reference/ios.mdx | 13 -
 .../reference/macos.mdx | 14 -
 .../reference/react-js.mdx | 4 +
 .../reference/unity.mdx | 10 -
 .../custom-video-and-audio/reference/web.mdx | 10 +-
 .../project-implementation/android.mdx | 8 +-
 .../project-implementation/electron.mdx | 4 +-
 .../project-implementation/flutter.mdx | 8 +-
 .../project-implementation/index.mdx | 14 +-
 .../project-implementation/poc3.mdx | 42 ++
 .../project-implementation/react-js.mdx | 63 ++
 .../project-implementation/swift.mdx | 6 +-
 .../project-implementation/unity.mdx | 71 +-
 .../project-implementation/web.mdx | 72 --
 .../project-implementation/windows.mdx | 2 +-
 .../project-test/android.mdx | 16 +-
 .../project-test/electron.mdx | 2 +-
 .../project-test/index.mdx | 49 +-
 .../project-test/poc3.mdx | 59 ++
 .../project-test/react-js.mdx | 26 +
 .../project-test/swift.mdx | 8 +-
 .../project-test/unity.mdx | 31 +-
 .../project-test/web.mdx | 2 +-
 .../reference/android.mdx | 12 -
 .../encrypt-media-streams/reference/index.mdx | 2 +
 .../encrypt-media-streams/reference/ios.mdx | 4 -
 .../encrypt-media-streams/reference/macos.mdx | 5 -
 .../reference/react-js.mdx | 3 +
 .../encrypt-media-streams/reference/unity.mdx | 4 -
 .../reference/unreal.mdx | 7 +-
 .../encrypt-media-streams/reference/web.mdx | 12 -
 .../project-implementation/android.mdx | 461 ++++++-------
 .../project-implementation/electron.mdx | 4 +-
 .../project-implementation/flutter.mdx | 405 +++++------
 .../project-implementation/index.mdx | 14 +-
 .../project-implementation/poc3.mdx | 104 +++
 .../project-implementation/react-js.mdx | 125 ++++
 .../project-implementation/react-native.mdx | 4 +-
 .../project-implementation/swift.mdx | 245 ++-----
 .../project-implementation/unity.mdx | 469 ++++++++++---
 .../project-implementation/web.mdx | 252 -------
 .../project-implementation/windows.mdx | 634 +++++++++---------
 .../project-setup/index.mdx | 2 -
 .../project-setup/ios.mdx | 3 -
 .../project-test/android.mdx | 47 +-
 .../project-test/flutter.mdx | 61 +-
 .../project-test/index.mdx | 28 +-
 .../project-test/poc3.mdx | 20 +
 .../project-test/react-js.mdx | 45 ++
 .../project-test/swift.mdx | 51 +-
 .../project-test/unity.mdx | 67 +-
 .../project-test/web.mdx | 28 +-
 .../project-test/windows.mdx | 62 +-
 .../reference/android.mdx | 31 -
 .../reference/index.mdx | 2 +
 .../reference/macos.mdx | 2 +-
 .../reference/react-js.mdx | 5 +
 .../reference/swift.mdx | 49 --
 .../reference/unity.mdx | 47 --
 .../ensure-channel-quality/reference/web.mdx | 21 +-
 .../project-implementation/electron.mdx | 2 +-
 .../project-implementation/flutter.mdx | 2 +-
 .../project-implementation/index.mdx | 14 +-
 .../project-implementation/poc3.mdx | 24 +
 .../{web.mdx => react-js.mdx} | 26 +-
 .../project-implementation/unity.mdx | 39 +-
 .../geofencing/project-test/android.mdx | 10 +-
 .../geofencing/project-test/flutter.mdx | 2 +-
 .../develop/geofencing/project-test/index.mdx | 23 +-
 .../develop/geofencing/project-test/poc3.mdx | 11 +
 .../geofencing/project-test/react-js.mdx | 31 +
 .../develop/geofencing/project-test/unity.mdx | 41 +-
 .../develop/geofencing/project-test/web.mdx | 15 +-
 .../develop/geofencing/reference/android.mdx | 20 -
 .../develop/geofencing/reference/index.mdx | 2 +
 .../develop/geofencing/reference/react-js.mdx | 3 +
 .../develop/geofencing/reference/swift.mdx | 47 +-
 .../develop/geofencing/reference/unity.mdx | 11 -
 .../develop/geofencing/reference/web.mdx | 12 +-
 .../project-implementation/cpp.mdx | 4 +-
 .../project-implementation/csharp.mdx | 4 +-
 .../project-implementation/index.mdx | 2 +
 .../project-implementation/java.mdx | 10 +-
 .../project-implementation/nodejs.mdx | 6 +-
 .../project-implementation/python.mdx | 8 +-
 .../project-implementation/web.mdx | 6 +-
 .../project-test/index.mdx | 2 +
 .../project-test/python.mdx | 2 +-
 .../project-implementation/android.mdx | 38 +-
 .../project-implementation/electron.mdx | 24 +-
 .../project-implementation/flutter.mdx | 24 +-
 .../project-implementation/index.mdx | 12 +-
 .../project-implementation/poc3.mdx | 40 ++
 .../project-implementation/react-native.mdx | 24 +-
 .../project-implementation/swift.mdx | 16 +-
 .../project-implementation/unity.mdx | 401 +++++------
 .../project-implementation/web.mdx | 249 -------
 .../project-implementation/windows.mdx | 20 +-
 .../project-test/android.mdx | 61 +-
 .../project-test/index.mdx | 20 +-
 .../project-test/ios.mdx | 60 +-
 .../project-test/macos.mdx | 8 +-
 .../project-test/poc3.mdx | 20 +
 .../project-test/react-js.mdx | 89 +++
 .../project-test/swift.mdx | 56 --
 .../project-test/unity.mdx | 60 +-
 .../project-test/web.mdx | 91 ++-
 .../reference/android.mdx | 14 -
 .../reference/ios.mdx | 11 -
 .../reference/macos.mdx | 11 -
 .../reference/unity.mdx | 14 -
 .../reference/web.mdx | 13 -
 .../video-sdk/develop/migration-guide/web.mdx | 20 +-
 .../project-implementation/android.mdx | 353 ++++------
 .../project-implementation/electron.mdx | 16 +-
 .../project-implementation/flutter.mdx | 16 +-
 .../project-implementation/index.mdx | 14 +-
 .../project-implementation/poc3.mdx | 76 +++
 .../project-implementation/react-js.mdx | 59 ++
 .../project-implementation/react-native.mdx | 20 +-
 .../project-implementation/swift.mdx | 12 +-
 .../project-implementation/unity.mdx | 215 +++--
 .../play-media/project-implementation/web.mdx | 48 --
 .../project-implementation/windows.mdx | 28 +-
 .../play-media/project-test/android.mdx | 40 +-
 .../develop/play-media/project-test/index.mdx | 26 +-
 .../develop/play-media/project-test/poc3.mdx | 20 +
 .../play-media/project-test/react-js.mdx | 23 +
 .../develop/play-media/project-test/swift.mdx | 38 +-
 .../develop/play-media/project-test/unity.mdx | 40 +-
 .../develop/play-media/project-test/web.mdx | 26 +-
 .../develop/play-media/reference/android.mdx | 9 -
 .../develop/play-media/reference/index.mdx | 2 +
 .../develop/play-media/reference/react-js.mdx | 3 +
 .../develop/play-media/reference/swift.mdx | 19 -
 .../develop/play-media/reference/unity.mdx | 8 -
 .../develop/play-media/reference/web.mdx | 6 +-
 .../project-implementation/android.mdx | 4 +-
 .../project-implementation/electron.mdx | 2 +-
 .../project-implementation/flutter.mdx | 2 +-
 .../project-implementation/index.mdx | 14 +-
 .../project-implementation/macos.mdx | 2 +-
 .../project-implementation/poc3.mdx | 104 +++
 .../project-implementation/react-native.mdx | 2 +-
 .../project-implementation/swift.mdx | 2 +-
 .../project-implementation/unity.mdx | 276 ++++----
 .../project-implementation/web.mdx | 203 ------
 .../project-setup/android.mdx | 16 +-
 .../project-setup/flutter.mdx | 2 +-
 .../product-workflow/project-test/android.mdx | 51 +-
 .../product-workflow/project-test/index.mdx | 20 +-
 .../product-workflow/project-test/ios.mdx | 52 +-
 .../product-workflow/project-test/poc3.mdx | 20 +
 .../project-test/react-js.mdx | 35 +
 .../product-workflow/project-test/swift.mdx | 68 +-
 .../product-workflow/project-test/unity.mdx | 56 +-
 .../product-workflow/project-test/web.mdx | 53 +-
 .../product-workflow/project-test/windows.mdx | 4 +-
 .../product-workflow/reference/android.mdx | 22 -
 .../product-workflow/reference/react-js.mdx | 3 +
 .../product-workflow/reference/swift.mdx | 29 -
 .../product-workflow/reference/unity.mdx | 22 +-
 .../product-workflow/reference/web.mdx | 23 +-
 .../project-implementation/index.mdx | 12 +-
 .../project-implementation/poc3.mdx | 73 ++
 .../project-implementation/swift.mdx | 112 ----
 .../project-implementation/web.mdx | 178 -----
 .../spatial-audio/project-test/android.mdx | 23 +-
 .../spatial-audio/project-test/index.mdx | 19 +-
 .../spatial-audio/project-test/poc3.mdx | 20 +
 .../spatial-audio/project-test/react-js.mdx | 25 +
 .../spatial-audio/project-test/swift.mdx | 26 +-
 .../spatial-audio/project-test/unity.mdx | 25 +-
 .../spatial-audio/project-test/web.mdx | 35 +-
 .../spatial-audio/reference/android.mdx | 14 -
 .../develop/spatial-audio/reference/ios.mdx | 14 -
 .../develop/spatial-audio/reference/macos.mdx | 14 -
 .../develop/spatial-audio/reference/unity.mdx | 28 -
 .../develop/spatial-audio/reference/web.mdx | 4 +-
 .../project-implementation/android.mdx | 16 +-
 .../project-implementation/electron.mdx | 10 +-
 .../project-implementation/flutter.mdx | 8 +-
 .../project-implementation/index.mdx | 10 +-
 .../project-implementation/ios.mdx | 16 +-
 .../project-implementation/poc3.mdx | 52 ++
 .../project-implementation/react-native.mdx | 12 +-
 .../project-implementation/swift.mdx | 6 +-
 .../project-implementation/unity.mdx | 16 +-
 .../project-test/android.mdx | 30 +-
 .../project-test/electron.mdx | 4 -
 .../project-test/flutter.mdx | 4 -
 .../project-test/index.mdx | 22 +-
 .../project-test/ios.mdx | 67 +-
 .../project-test/poc3.mdx | 16 +
 .../project-test/react-native.mdx | 14 +-
 .../project-test/swift.mdx | 47 +-
 .../project-test/unity.mdx | 33 +-
 .../project-test/unreal.mdx | 8 +-
 .../project-test/windows.mdx | 5 +-
 .../reference/android.mdx | 12 -
 .../reference/ios.mdx | 13 -
 .../reference/macos.mdx | 14 -
 .../reference/unity.mdx | 15 -
 .../get-started-sdk/index.mdx} | 57 +-
 .../project-implementation/android.mdx | 578 ----------------
 .../project-implementation/csharp.mdx | 44 +-
 .../project-implementation/electron.mdx | 8 +-
 .../project-implementation/flutter.mdx | 612 -----------------
 .../project-implementation/index.mdx | 18 +-
 .../project-implementation/ios.mdx | 9 -
 .../project-implementation/macos.mdx | 8 -
 .../project-implementation/poc3.mdx | 191 ++++++
 .../project-implementation/react-js.mdx | 92 +++
 .../project-implementation/react-native.mdx | 2 +-
 .../project-implementation/swift.mdx | 221 ------
 .../project-implementation/unity.mdx | 608 ++++++-----------
 .../project-implementation/web.mdx | 131 ----
 .../project-implementation/windows.mdx | 500 ++++++--------
 .../get-started-sdk/project-setup/android.mdx | 49 --
 .../get-started-sdk/project-setup/flutter.mdx | 47 --
 .../get-started-sdk/project-setup/ios.mdx | 7 -
 .../project-setup/react-js.mdx | 5 +
 .../get-started-sdk/project-setup/swift.mdx | 56 --
 .../get-started-sdk/project-setup/unity.mdx | 21 -
 .../get-started-sdk/project-setup/windows.mdx | 59 --
 .../get-started-sdk/project-test/android.mdx | 38 +-
 .../get-started-sdk/project-test/flutter.mdx | 39 +-
 .../get-started-sdk/project-test/index.mdx | 32 +-
 .../get-started-sdk/project-test/poc3.mdx | 46 ++
 .../get-started-sdk/project-test/react-js.mdx | 35 +
 .../get-started-sdk/project-test/swift.mdx | 23 +-
 .../get-started-sdk/project-test/unity.mdx | 37 +-
 .../get-started-sdk/project-test/windows.mdx | 39 +-
 .../get-started-sdk/reference/android.mdx | 10 -
 .../get-started-sdk/reference/index.mdx | 2 +
 .../get-started-sdk/reference/ios.mdx | 12 -
 .../get-started-sdk/reference/macos.mdx | 13 +-
 .../get-started-sdk/reference/react-js.mdx | 5 +
 .../get-started-sdk/reference/unity.mdx | 18 -
 .../get-started-sdk/reference/web.mdx | 12 +-
 .../project-setup/android.mdx | 2 +-
 .../project-setup/flutter.mdx | 2 +-
 .../get-started-uikit/project-setup/web.mdx | 2 +-
 .../project-test/flutter.mdx | 2 +-
 shared/video-sdk/reference/_error-codes.mdx | 80 +++
 shared/video-sdk/reference/_known-issues.mdx | 44 --
 shared/video-sdk/reference/_release-notes.mdx | 2 +
 .../reference/api-reference/index.mdx | 3 +
 .../react-js/components-en.react.mdx | 262 ++++++++
 .../react-js/data-types-en.react.mdx | 49 ++
 .../api-reference/react-js/hooks-en.react.mdx | 611 +++++++++++++++++
 .../api-reference/react-js/index.mdx | 19 +
 .../manual-install/electron.mdx | 2 +-
 .../manual-install/flutter.mdx | 2 +-
 .../manual-install/index copy.mdx | 19 -
 .../manual-install/index.mdx | 2 +
 .../manual-install/ios.mdx | 2 +-
 .../manual-install/macos.mdx | 2 +-
 .../manual-install/react-js.mdx | 14 +
 .../manual-install/web.mdx | 2 +-
 .../reference/plugin-descriptions.mdx | 2 +-
 .../reference/known-issues/android.mdx | 11 +-
 .../reference/known-issues/flutter.mdx | 19 +-
 .../video-sdk/reference/known-issues/ios.mdx | 10 +-
 .../reference/known-issues/react-native.mdx | 19 +-
 .../reference/known-issues/unity.mdx | 18 +-
 .../video-sdk/reference/known-issues/web.mdx | 521 +++++++-------
 .../reference/release-notes/android.mdx | 15 +-
 .../reference/release-notes/flutter.mdx | 22 +-
 .../video-sdk/reference/release-notes/ios.mdx | 3 +-
 .../reference/release-notes/react-js.mdx | 65 ++
 .../reference/release-notes/react-native.mdx | 22 +-
 .../reference/release-notes/unity.mdx | 22 +-
 .../video-sdk/reference/release-notes/web.mdx | 8 +-
 .../understand/_product-overview.mdx | 2 +-
 .../index.mdx} | 4 +-
 .../project-implementation/android.mdx | 24 +-
 .../project-implementation/electron.mdx | 16 +-
 .../project-implementation/react-native.mdx | 2 +-
 .../project-implementation/unity.mdx | 16 +-
 .../project-implementation/web.mdx | 22 +-
 .../project-test/electron.mdx | 2 +-
 .../project-test/web.mdx | 2 +-
 shared/voice-sdk/develop/_custom-audio.mdx | 4 +-
 .../voice-sdk/develop/_stream-raw-audio.mdx | 18 +-
 .../project-implementation/index.mdx | 1 -
 .../project-implementation/web.mdx | 2 +-
 .../custom-audio-deprecated/reference/web.mdx | 4 -
 .../index.mdx} | 9 -
 .../project-implementation/index.mdx | 12 +-
 .../project-implementation/poc3.mdx | 67 ++
 .../project-test/.web.mdx | 28 +
 .../project-test/android.mdx | 43 +-
 .../project-test/index.mdx | 20 +-
 .../project-test/poc3.mdx | 20 +
 .../project-test/react-js.mdx | 45 ++
 .../project-test/swift.mdx | 31 +-
 .../project-test/unity.mdx | 51 +-
 .../project-test/web.mdx | 35 +-
 .../project-test/windows.mdx | 59 ++
 .../reference/android.mdx | 24 -
 .../reference/macos.mdx | 1 -
 .../reference/swift.mdx | 12 -
 .../reference/unity.mdx | 19 -
 .../ensure-channel-quality/reference/web.mdx | 10 -
 .../project-implementation/flutter.mdx | 2 +-
 .../geofencing/project-test/flutter.mdx | 2 +-
 .../index.mdx} | 9 +-
 .../project-implementation/index.mdx | 14 +-
 .../project-implementation/poc3.mdx | 32 +
 .../project-implementation/unity.mdx | 2 +-
 .../project-implementation/web.mdx | 102 ---
 .../product-workflow/project-test/windows.mdx | 2 +-
 .../product-workflow/reference/android.mdx | 15 -
 .../product-workflow/reference/swift.mdx | 36 -
 .../product-workflow/reference/unity.mdx | 16 -
 .../product-workflow/reference/unreal.mdx | 16 -
 .../product-workflow/reference/web.mdx | 6 -
 .../project-implementation/electron.mdx | 8 +-
 .../project-implementation/flutter.mdx | 6 +-
 .../project-implementation/index.mdx | 10 +-
 .../project-implementation/poc3.mdx | 47 ++
 .../project-implementation/react-native.mdx | 10 +-
 .../project-implementation/unity.mdx | 6 +-
 .../stream-raw-audio/reference/android.mdx | 7 -
 .../stream-raw-audio/reference/ios.mdx | 6 -
 .../stream-raw-audio/reference/macos.mdx | 7 -
 .../stream-raw-audio/reference/unity.mdx | 8 -
 .../index.mdx} | 30 +-
 .../project-implementation/android.mdx | 22 +-
 .../project-implementation/electron.mdx | 2 +-
 .../project-implementation/flutter.mdx | 14 +-
 .../project-implementation/index.mdx | 13 +-
 .../project-implementation/poc3.mdx | 106 +++
 .../project-implementation/swift.mdx | 12 +-
 .../project-implementation/unity.mdx | 26 +-
 .../project-implementation/web.mdx | 2 +-
 .../project-implementation/windows.mdx | 2 +-
 .../project-setup/electron.mdx | 4 +-
 .../get-started-sdk/project-setup/flutter.mdx | 2 +-
 .../get-started-sdk/project-setup/windows.mdx | 2 +-
 .../get-started-sdk/project-test/index.mdx | 25 +-
 .../get-started-sdk/reference/android.mdx | 6 -
 .../get-started-sdk/reference/ios.mdx | 6 -
 .../get-started-sdk/reference/macos.mdx | 6 -
 .../get-started-sdk/reference/unity.mdx | 12 -
 .../get-started-sdk/reference/web.mdx | 1 -
 shared/voice-sdk/reference/_known-issues.mdx | 42 --
 .../reference/known-issues/flutter.mdx | 10 +-
 .../voice-sdk/reference/known-issues/ios.mdx | 10 +-
 .../reference/known-issues/react-native.mdx | 10 +-
 .../reference/known-issues/unity.mdx | 10 +-
 .../reference/release-notes/android.mdx | 3 +-
 .../reference/release-notes/flutter.mdx | 15 +
 .../reference/release-notes/index.mdx | 2 +
 .../voice-sdk/reference/release-notes/ios.mdx | 3 +-
 .../reference/release-notes/react-js.mdx | 65 ++
 .../reference/release-notes/react-native.mdx | 15 +
 .../reference/release-notes/unity.mdx | 15 +
 signaling/develop/authentication-workflow.mdx | 24 +-
 signaling/develop/best-practice.mdx | 2 +-
 signaling/{reference => overview}/pricing.mdx | 2 +-
 .../{reference => overview}/release-notes.mdx | 2 +-
 .../supported-platforms.mdx | 0
 signaling/reference/downloads.mdx | 8 +-
 video-calling/develop/cloud-proxy.mdx | 4 +-
 video-calling/get-started/get-started-sdk.mdx | 2 +-
 .../{reference => overview}/pricing.mdx | 2 +-
 .../{reference => overview}/release-notes.mdx | 2 +-
 .../supported-platforms.mdx | 0
 video-calling/reference/error-codes.mdx | 13 +
 video-calling/reference/known-issues.mdx | 14 -
 voice-calling/develop/cloud-proxy.mdx | 6 +-
 .../develop/ensure-channel-quality.mdx | 2 +-
 .../develop/media-stream-encryption.mdx | 2 +-
 voice-calling/develop/product-workflow.mdx | 2 +-
 .../get-started/authentication-workflow.mdx | 6 +-
 voice-calling/get-started/get-started-sdk.mdx | 6 +-
 .../{reference => overview}/pricing.mdx | 2 +-
 .../{reference => overview}/release-notes.mdx | 2 +-
 .../supported-platforms.mdx | 2 +-
 voice-calling/reference/error-codes.mdx | 13 +
 voice-calling/reference/known-issues.mdx | 15 -
 920 files changed, 22300 insertions(+), 32621 deletions(-)
 create mode 100644 .gitignore
 rename agora-analytics/{reference => overview}/pricing.mdx (94%)
 rename agora-analytics/{reference => overview}/release-notes.mdx (93%)
 rename agora-analytics/{reference => overview}/supported-platforms.mdx (78%)
 rename agora-chat/{reference => overview}/pricing.mdx (100%)
 rename agora-chat/{reference => overview}/release-notes.mdx (93%)
 rename agora-chat/{reference => overview}/supported-platforms.mdx (93%)
 create mode 100644 assets/code/video-sdk/ai-noise-suppression/configure-engine.mdx
 create mode 100644 assets/code/video-sdk/ai-noise-suppression/configure-extension.mdx
 create mode 100644 assets/code/video-sdk/ai-noise-suppression/enable-denoiser.mdx
 create mode 100644 assets/code/video-sdk/ai-noise-suppression/import-library.mdx
 create mode 100644 assets/code/video-sdk/ai-noise-suppression/import-plugin.mdx
 create mode 100644 assets/code/video-sdk/ai-noise-suppression/set-noise-reduction-mode.mdx
 create mode 100644 assets/code/video-sdk/ai-noise-suppression/set-reduction-level.mdx
 create mode 100644 assets/code/video-sdk/ai-noise-suppression/setup-logging.mdx
 create mode 100644 assets/code/video-sdk/audio-voice-effects/apply-voice-effects.mdx
 rename assets/code/video-sdk/audio-voice-effects/{swift => }/configure-buttons.mdx (99%)
 create mode 100644 assets/code/video-sdk/audio-voice-effects/configure-engine.mdx
 create mode 100644 assets/code/video-sdk/audio-voice-effects/create-ui.mdx
 create mode 100644 assets/code/video-sdk/audio-voice-effects/event-handler.mdx
 create mode 100644 assets/code/video-sdk/audio-voice-effects/import-library.mdx
 create mode 100644 assets/code/video-sdk/audio-voice-effects/pause-play-resume.mdx
 create mode 100644 assets/code/video-sdk/audio-voice-effects/preload-effect.mdx
 create mode 100644 assets/code/video-sdk/audio-voice-effects/set-audio-profile.mdx
 create mode 100644 assets/code/video-sdk/audio-voice-effects/set-audio-route.mdx
 create mode 100644 assets/code/video-sdk/audio-voice-effects/set-variables.mdx
 create mode 100644 assets/code/video-sdk/audio-voice-effects/stop-start-mixing.mdx
 delete mode 100644 assets/code/video-sdk/audio-voice-effects/swift/apply-voice-effects.mdx
 delete mode 100644 assets/code/video-sdk/audio-voice-effects/swift/create-ui.mdx
 delete mode 100644 assets/code/video-sdk/audio-voice-effects/swift/pause-play-resume.mdx
 delete mode 100644 assets/code/video-sdk/audio-voice-effects/swift/set-audio-route.mdx
 delete mode 100644 assets/code/video-sdk/audio-voice-effects/swift/stop-start-mixing.mdx
 rename assets/code/video-sdk/audio-voice-effects/{swift => }/update-ui.mdx (100%)
 create mode 100644 assets/code/video-sdk/authentication-workflow/add-variables.mdx
 create mode 100644 assets/code/video-sdk/authentication-workflow/event-handler.mdx
 create mode 100644 assets/code/video-sdk/authentication-workflow/fetch-token.mdx
 create mode 100644 assets/code/video-sdk/authentication-workflow/import-library.mdx
 create mode 100644 assets/code/video-sdk/authentication-workflow/join-channel.mdx
 create mode 100644 assets/code/video-sdk/authentication-workflow/renew-token.mdx
 rename assets/code/video-sdk/authentication-workflow/{swift => }/specify-channel.mdx (95%)
 delete mode 100644 assets/code/video-sdk/authentication-workflow/swift/add-variables.mdx
 delete mode 100644 assets/code/video-sdk/authentication-workflow/swift/fetch-token.mdx
 create mode 100644 assets/code/video-sdk/cloud-proxy/configure-engine.mdx
 create mode 100644 assets/code/video-sdk/cloud-proxy/connection-failed.mdx
 create mode 100644 assets/code/video-sdk/cloud-proxy/event-handler.mdx
 create mode 100644 assets/code/video-sdk/cloud-proxy/import-library.mdx
 create mode 100644 assets/code/video-sdk/cloud-proxy/set-cloud-proxy.mdx
 create mode 100644 assets/code/video-sdk/cloud-proxy/set-variables.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/configure-engine-audio.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/configure-engine.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/create-custom-audio-track.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/create-custom-video-track.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/destroy-custom-track-audio.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/destroy-custom-track-video.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/enable-audio-publishing.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/enable-video-publishing.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/import-library-audio.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/import-library.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/push-audio-frames.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/push-video-frames.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/read-audio-input.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/render-custom-video.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/set-variables-audio.mdx
 create mode 100644 assets/code/video-sdk/custom-video-and-audio/set-variables.mdx
 create mode 100644 assets/code/video-sdk/encrypt-media-streams/enable-encryption.mdx
 create mode 100644 assets/code/video-sdk/encrypt-media-streams/enable-end-to-end-encryption.mdx
 create mode 100644 assets/code/video-sdk/encrypt-media-streams/event-handler.mdx
 create mode 100644 assets/code/video-sdk/encrypt-media-streams/import-library.mdx
 create mode 100644 assets/code/video-sdk/encrypt-media-streams/set-variables.mdx
 create mode 100644 assets/code/video-sdk/ensure-channel-quality/event-handler.mdx
 create mode 100644 assets/code/video-sdk/ensure-channel-quality/implement-call-quality-view.mdx
 create mode 100644 assets/code/video-sdk/ensure-channel-quality/implement-declarations.mdx
 rename assets/code/video-sdk/ensure-channel-quality/{swift => }/implement-labels.mdx (100%)
 rename assets/code/video-sdk/ensure-channel-quality/{swift => }/implement-network-status.mdx (62%)
 create mode 100644 assets/code/video-sdk/ensure-channel-quality/import-library.mdx
 create mode 100644 assets/code/video-sdk/ensure-channel-quality/probe-test.mdx
 create mode 100644 assets/code/video-sdk/ensure-channel-quality/set-audio-video-profile.mdx
 create mode 100644 assets/code/video-sdk/ensure-channel-quality/set-latency.mdx
 create mode 100644 assets/code/video-sdk/ensure-channel-quality/setup-engine.mdx
 create mode 100644 assets/code/video-sdk/ensure-channel-quality/show-stats.mdx
 delete mode 100644 assets/code/video-sdk/ensure-channel-quality/swift/implement-declarations.mdx
 create mode 100644 assets/code/video-sdk/ensure-channel-quality/switch-quality.mdx
 create mode 100644 assets/code/video-sdk/ensure-channel-quality/test-hardware.mdx
 create mode 100644 assets/code/video-sdk/geofencing/combine-geofence.mdx
 create mode 100644 assets/code/video-sdk/geofencing/set-geofence.mdx
 create mode 100644 assets/code/video-sdk/get-started-sdk/create-engine.mdx
 create mode 100644 assets/code/video-sdk/get-started-sdk/declare-variables.mdx
 create mode 100644 assets/code/video-sdk/get-started-sdk/destroy.mdx
 create mode 100644 assets/code/video-sdk/get-started-sdk/handle-events.mdx
 create mode 100644 assets/code/video-sdk/get-started-sdk/import-library.mdx
 create mode 100644 assets/code/video-sdk/get-started-sdk/join-channel.mdx
 create mode 100644 assets/code/video-sdk/get-started-sdk/leave-channel.mdx
 create mode 100644 assets/code/video-sdk/get-started-sdk/local-video.mdx
 create mode 100644 assets/code/video-sdk/get-started-sdk/remote-video.mdx
 create mode 100644 assets/code/video-sdk/get-started-sdk/request-permissions.mdx
 create mode 100644 assets/code/video-sdk/get-started-sdk/set-user-role.mdx
 create mode 100644 assets/code/video-sdk/get-started-sdk/setup-audio-video-tracks.mdx
 delete mode 100644 assets/code/video-sdk/get-started-sdk/swift/create-ui.mdx
 delete mode 100644 assets/code/video-sdk/get-started-sdk/swift/join-and-leave.mdx
 delete mode 100644 assets/code/video-sdk/get-started-sdk/swift/role-action.mdx
 delete mode 100644 assets/code/video-sdk/get-started-sdk/swift/show-message.mdx
 delete mode 100644 assets/code/video-sdk/get-started-sdk/swift/view-did-disappear.mdx
 create mode 100644 assets/code/video-sdk/live-streaming-multiple-channels/import-library.mdx
 create mode 100644 assets/code/video-sdk/live-streaming-multiple-channels/join-a-second-channel.mdx
 create mode 100644 assets/code/video-sdk/live-streaming-multiple-channels/leave-second-channel.mdx
 create mode 100644 assets/code/video-sdk/live-streaming-multiple-channels/monitor-channel-media-relay-state.mdx
 create mode 100644 assets/code/video-sdk/live-streaming-multiple-channels/receive-callbacks-from-second-channel.mdx
 create mode 100644 assets/code/video-sdk/live-streaming-multiple-channels/set-variables.mdx
 create mode 100644 assets/code/video-sdk/live-streaming-multiple-channels/start-stop-channel-media-relay.mdx
 create mode 100644 assets/code/video-sdk/play-media/configure-engine.mdx
 create mode 100644 assets/code/video-sdk/play-media/destroy-media-player.mdx
 create mode 100644 assets/code/video-sdk/play-media/display-media.mdx
 create mode 100644 assets/code/video-sdk/play-media/event-handler.mdx
 create mode 100644 assets/code/video-sdk/play-media/import-library.mdx
 create mode 100644 assets/code/video-sdk/play-media/play-pause-resume.mdx
 create mode 100644 assets/code/video-sdk/play-media/set-variables.mdx
 create mode 100644 assets/code/video-sdk/play-media/start-streaming.mdx
 create mode 100644 assets/code/video-sdk/play-media/update-channel-publish-options.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/import-library.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/ios-extension.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/macos-screencapture.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/media-device-changed.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/microphone-camera-change.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/mute-local-video.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/mute-remote-user.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/override-broadcast-started.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/preview-screen-track.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/publish-screen-track.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/screen-sharer-target.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/setup-engine.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/setup-volume.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/start-sharing.mdx
 create mode 100644 assets/code/video-sdk/product-workflow/stop-sharing.mdx
 create mode 100644 assets/code/video-sdk/raw-video-audio/configure-engine.mdx
 create mode 100644 assets/code/video-sdk/raw-video-audio/import-library.mdx
 create mode 100644 assets/code/video-sdk/raw-video-audio/modify-audio-video.mdx
 create mode 100644 assets/code/video-sdk/raw-video-audio/register-video-audio-frame-observers.mdx
 create mode 100644 assets/code/video-sdk/raw-video-audio/set-audio-frame-observer.mdx
 create mode 100644 assets/code/video-sdk/raw-video-audio/set-variables.mdx
 create mode 100644 assets/code/video-sdk/raw-video-audio/set-video-frame-observer.mdx
 create mode 100644 assets/code/video-sdk/raw-video-audio/unregister-video-audio-frame-observers.mdx
 create mode 100644 assets/code/video-sdk/spatial-audio/import-library.mdx
 create mode 100644 assets/code/video-sdk/spatial-audio/play-media.mdx
 create mode 100644 assets/code/video-sdk/spatial-audio/remove-spatial.mdx
 create mode 100644 assets/code/video-sdk/spatial-audio/set-variables.mdx
 create mode 100644 assets/code/video-sdk/spatial-audio/setup-local.mdx
 create mode 100644 assets/code/video-sdk/spatial-audio/setup-remote.mdx
 create mode 100644 assets/code/video-sdk/spatial-audio/setup-spatial.mdx
 create mode 100644 assets/code/video-sdk/virtual-background/blur-background.mdx
 create mode 100644 assets/code/video-sdk/virtual-background/color-background.mdx
 create mode 100644 assets/code/video-sdk/virtual-background/configure-engine.mdx
 create mode 100644 assets/code/video-sdk/virtual-background/device-compatibility.mdx
 create mode 100644 assets/code/video-sdk/virtual-background/image-background.mdx
 create mode 100644 assets/code/video-sdk/virtual-background/import-library.mdx
 create mode 100644 assets/code/video-sdk/virtual-background/reset-background.mdx
 create mode 100644 assets/code/video-sdk/virtual-background/set-virtual-background.mdx
 create mode 100644 assets/images/extensions-marketplace/geofencing.svg
 create mode 100644 assets/images/extensions-marketplace/ncs-worflow.svg
 create mode 100644 assets/images/interactive-live-streaming/live-streaming-over-multiple-channels.svg
 create mode 100644 assets/images/shared/ncs-worflow.svg
 create mode 100644 assets/images/video-calling/video_call_workflow.svg
 create mode 100644 assets/images/video-calling/video_call_workflow_run_end.svg
 create mode 100644 assets/images/video-sdk/audio-and-voice-effects-web.puml
 create mode 100644 assets/images/video-sdk/audio-and-voice-effects-web.svg
 create mode 100644 assets/images/video-sdk/product-workflow.svg
create mode 100644 assets/images/video-sdk/video-call-logic-reactjs.puml create mode 100644 assets/images/video-sdk/video-call-logic-reactjs.svg create mode 100644 assets/images/video-sdk/video_call_workflow.svg create mode 100644 assets/images/video-sdk/video_call_workflow_run_end.svg create mode 100644 assets/images/voice-sdk/process-raw-audio.svg create mode 100644 assets/images/voice-sdk/product-workflow-voice.svg rename broadcast-streaming/{reference => overview}/pricing.mdx (94%) rename broadcast-streaming/{reference => overview}/release-notes.mdx (94%) rename broadcast-streaming/{reference => overview}/supported-platforms.mdx (100%) create mode 100644 broadcast-streaming/reference/error-codes.mdx delete mode 100644 broadcast-streaming/reference/known-issues.mdx rename cloud-recording/{reference => overview}/pricing-webpage-recording.md (94%) rename cloud-recording/{reference => overview}/pricing.md (99%) rename cloud-recording/{reference => overview}/release-notes.mdx (93%) rename cloud-recording/{reference => overview}/supported-platforms.mdx (93%) rename extensions-marketplace/{reference => overview}/release-notes.mdx (94%) rename extensions-marketplace/{reference => overview}/supported-platforms.mdx (99%) rename flexible-classroom/{reference => overview}/release-notes.mdx (94%) rename flexible-classroom/{reference => overview}/supported-platforms.md (99%) rename interactive-live-streaming/{reference => overview}/pricing.mdx (95%) rename interactive-live-streaming/{reference => overview}/release-notes.mdx (94%) rename interactive-live-streaming/{reference => overview}/supported-platforms.mdx (100%) create mode 100644 interactive-live-streaming/reference/error-codes.mdx delete mode 100644 interactive-live-streaming/reference/known-issues.mdx rename interactive-whiteboard/{reference => overview}/pricing.md (99%) rename interactive-whiteboard/{reference => overview}/release-notes-uikit.mdx (94%) rename interactive-whiteboard/{reference => 
overview}/release-notes.mdx (94%) rename interactive-whiteboard/{reference => overview}/supported-platforms.mdx (93%) rename iot/{reference => overview}/pricing.mdx (94%) rename iot/{reference => overview}/release-notes.mdx (93%) rename iot/{reference => overview}/supported-platforms.mdx (100%) rename media-pull/{reference => overview}/pricing.mdx (94%) rename media-pull/{reference => overview}/release-notes.mdx (100%) rename media-push/{reference => overview}/pricing.mdx (94%) rename on-premise-recording/{reference => overview}/billing.md (99%) rename on-premise-recording/{reference => overview}/release-notes.mdx (99%) rename server-gateway/{reference => overview}/pricing.mdx (93%) rename server-gateway/{reference => overview}/release-notes.mdx (94%) create mode 100644 shared/common/prerequities-get-started.mdx create mode 100644 shared/common/project-setup/android.mdx rename shared/{video-sdk/get-started/get-started-sdk => common}/project-setup/electron.mdx (95%) create mode 100644 shared/common/project-setup/flutter.mdx rename shared/{video-sdk/get-started/get-started-sdk => common}/project-setup/index.mdx (90%) create mode 100644 shared/common/project-setup/ios.mdx create mode 100644 shared/common/project-setup/macos.mdx create mode 100644 shared/common/project-setup/react-js.mdx rename shared/{video-sdk/get-started/get-started-sdk => common}/project-setup/react-native.mdx (100%) create mode 100644 shared/common/project-setup/swift.mdx create mode 100644 shared/common/project-setup/unity.mdx create mode 100644 shared/common/project-setup/unreal.mdx create mode 100644 shared/common/project-setup/web.mdx create mode 100644 shared/common/project-setup/windows.mdx create mode 100644 shared/common/project-test/android.mdx create mode 100644 shared/common/project-test/clone-project.mdx create mode 100644 shared/common/project-test/electron.mdx create mode 100644 shared/common/project-test/flutter.mdx create mode 100644 
shared/common/project-test/generate-temp-rtc-token.mdx rename shared/{video-sdk/develop/spatial-audio/project-implementation => common/project-test}/ios.mdx (100%) create mode 100644 shared/common/project-test/load-web-demo.mdx rename shared/{video-sdk/get-started/get-started-sdk/project-setup => common/project-test}/macos.mdx (99%) create mode 100644 shared/common/project-test/open-config-file.mdx create mode 100644 shared/common/project-test/react-js.mdx create mode 100644 shared/common/project-test/react-native.mdx create mode 100644 shared/common/project-test/rtc-first-steps.mdx create mode 100644 shared/common/project-test/run-reference-app.mdx create mode 100644 shared/common/project-test/set-app-id.mdx create mode 100644 shared/common/project-test/set-authentication-rtc.mdx create mode 100644 shared/common/project-test/swift.mdx create mode 100644 shared/common/project-test/unity.mdx create mode 100644 shared/common/project-test/windows.mdx create mode 100644 shared/extensions-marketplace/ai-noise-suppression/project-implementation/poc3.mdx create mode 100644 shared/extensions-marketplace/ai-noise-suppression/project-test/poc3.mdx create mode 100644 shared/extensions-marketplace/ai-noise-suppression/project-test/react-js.mdx create mode 100644 shared/extensions-marketplace/common/project-test/poc3.mdx create mode 100644 shared/extensions-marketplace/virtual-background/project-implementation/poc3.mdx delete mode 100644 shared/extensions-marketplace/virtual-background/project-test/ios.mdx delete mode 100644 shared/extensions-marketplace/virtual-background/project-test/macos.mdx create mode 100644 shared/extensions-marketplace/virtual-background/project-test/poc3.mdx create mode 100644 shared/extensions-marketplace/virtual-background/project-test/react-js.mdx delete mode 100644 shared/extensions-marketplace/virtual-background/project-test/swift.mdx create mode 100644 shared/video-sdk/authentication-workflow/project-implementation/poc3.mdx create mode 100644 
shared/video-sdk/authentication-workflow/project-implementation/react-js.mdx delete mode 100644 shared/video-sdk/authentication-workflow/project-implementation/web.mdx create mode 100644 shared/video-sdk/authentication-workflow/project-test/poc3.mdx create mode 100644 shared/video-sdk/authentication-workflow/project-test/react-js.mdx create mode 100644 shared/video-sdk/authentication-workflow/reference/react-js.mdx create mode 100644 shared/video-sdk/develop/audio-and-voice-effects/project-implementation/poc3.mdx create mode 100644 shared/video-sdk/develop/audio-and-voice-effects/project-implementation/react-js.mdx create mode 100644 shared/video-sdk/develop/audio-and-voice-effects/project-test/poc3.mdx create mode 100644 shared/video-sdk/develop/audio-and-voice-effects/project-test/react-js.mdx create mode 100644 shared/video-sdk/develop/audio-and-voice-effects/reference/react-js.mdx create mode 100644 shared/video-sdk/develop/cloud-proxy/project-implementation/poc3.mdx create mode 100644 shared/video-sdk/develop/cloud-proxy/project-implementation/react-js.mdx delete mode 100644 shared/video-sdk/develop/cloud-proxy/project-implementation/web.mdx create mode 100644 shared/video-sdk/develop/cloud-proxy/project-setup/unity.mdx delete mode 100644 shared/video-sdk/develop/cloud-proxy/project-test/ios.mdx delete mode 100644 shared/video-sdk/develop/cloud-proxy/project-test/macos.mdx create mode 100644 shared/video-sdk/develop/cloud-proxy/project-test/poc3.mdx create mode 100644 shared/video-sdk/develop/cloud-proxy/project-test/react-js.mdx create mode 100644 shared/video-sdk/develop/cloud-proxy/reference/react-js.mdx create mode 100644 shared/video-sdk/develop/custom-video-and-audio/project-implementation/poc3.mdx create mode 100644 shared/video-sdk/develop/custom-video-and-audio/project-implementation/react-js.mdx delete mode 100644 shared/video-sdk/develop/custom-video-and-audio/project-implementation/web.mdx create mode 100644 
shared/video-sdk/develop/custom-video-and-audio/project-test/poc3.mdx create mode 100644 shared/video-sdk/develop/custom-video-and-audio/project-test/react-js.mdx create mode 100644 shared/video-sdk/develop/custom-video-and-audio/reference/react-js.mdx create mode 100644 shared/video-sdk/develop/encrypt-media-streams/project-implementation/poc3.mdx create mode 100644 shared/video-sdk/develop/encrypt-media-streams/project-implementation/react-js.mdx delete mode 100644 shared/video-sdk/develop/encrypt-media-streams/project-implementation/web.mdx create mode 100644 shared/video-sdk/develop/encrypt-media-streams/project-test/poc3.mdx create mode 100644 shared/video-sdk/develop/encrypt-media-streams/project-test/react-js.mdx create mode 100644 shared/video-sdk/develop/encrypt-media-streams/reference/react-js.mdx create mode 100644 shared/video-sdk/develop/ensure-channel-quality/project-implementation/poc3.mdx create mode 100644 shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx delete mode 100644 shared/video-sdk/develop/ensure-channel-quality/project-implementation/web.mdx delete mode 100644 shared/video-sdk/develop/ensure-channel-quality/project-setup/ios.mdx create mode 100644 shared/video-sdk/develop/ensure-channel-quality/project-test/poc3.mdx create mode 100644 shared/video-sdk/develop/ensure-channel-quality/project-test/react-js.mdx create mode 100644 shared/video-sdk/develop/ensure-channel-quality/reference/react-js.mdx create mode 100644 shared/video-sdk/develop/geofencing/project-implementation/poc3.mdx rename shared/video-sdk/develop/geofencing/project-implementation/{web.mdx => react-js.mdx} (51%) create mode 100644 shared/video-sdk/develop/geofencing/project-test/poc3.mdx create mode 100644 shared/video-sdk/develop/geofencing/project-test/react-js.mdx create mode 100644 shared/video-sdk/develop/geofencing/reference/react-js.mdx create mode 100644 
shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/poc3.mdx delete mode 100644 shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/web.mdx create mode 100644 shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/poc3.mdx create mode 100644 shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/react-js.mdx delete mode 100644 shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/swift.mdx create mode 100644 shared/video-sdk/develop/play-media/project-implementation/poc3.mdx create mode 100644 shared/video-sdk/develop/play-media/project-implementation/react-js.mdx delete mode 100644 shared/video-sdk/develop/play-media/project-implementation/web.mdx create mode 100644 shared/video-sdk/develop/play-media/project-test/poc3.mdx create mode 100644 shared/video-sdk/develop/play-media/project-test/react-js.mdx create mode 100644 shared/video-sdk/develop/play-media/reference/react-js.mdx create mode 100644 shared/video-sdk/develop/product-workflow/project-implementation/poc3.mdx delete mode 100644 shared/video-sdk/develop/product-workflow/project-implementation/web.mdx create mode 100644 shared/video-sdk/develop/product-workflow/project-test/poc3.mdx create mode 100644 shared/video-sdk/develop/product-workflow/project-test/react-js.mdx create mode 100644 shared/video-sdk/develop/product-workflow/reference/react-js.mdx create mode 100644 shared/video-sdk/develop/spatial-audio/project-implementation/poc3.mdx delete mode 100644 shared/video-sdk/develop/spatial-audio/project-implementation/swift.mdx delete mode 100644 shared/video-sdk/develop/spatial-audio/project-implementation/web.mdx create mode 100644 shared/video-sdk/develop/spatial-audio/project-test/poc3.mdx create mode 100644 shared/video-sdk/develop/spatial-audio/project-test/react-js.mdx create mode 100644 
shared/video-sdk/develop/stream-raw-audio-and-video/project-implementation/poc3.mdx create mode 100644 shared/video-sdk/develop/stream-raw-audio-and-video/project-test/poc3.mdx rename shared/video-sdk/{_get-started-sdk.mdx => get-started/get-started-sdk/index.mdx} (70%) delete mode 100644 shared/video-sdk/get-started/get-started-sdk/project-implementation/android.mdx delete mode 100644 shared/video-sdk/get-started/get-started-sdk/project-implementation/flutter.mdx delete mode 100644 shared/video-sdk/get-started/get-started-sdk/project-implementation/ios.mdx delete mode 100644 shared/video-sdk/get-started/get-started-sdk/project-implementation/macos.mdx create mode 100644 shared/video-sdk/get-started/get-started-sdk/project-implementation/poc3.mdx create mode 100644 shared/video-sdk/get-started/get-started-sdk/project-implementation/react-js.mdx delete mode 100644 shared/video-sdk/get-started/get-started-sdk/project-implementation/swift.mdx delete mode 100644 shared/video-sdk/get-started/get-started-sdk/project-implementation/web.mdx delete mode 100644 shared/video-sdk/get-started/get-started-sdk/project-setup/android.mdx delete mode 100644 shared/video-sdk/get-started/get-started-sdk/project-setup/flutter.mdx delete mode 100644 shared/video-sdk/get-started/get-started-sdk/project-setup/ios.mdx create mode 100644 shared/video-sdk/get-started/get-started-sdk/project-setup/react-js.mdx delete mode 100644 shared/video-sdk/get-started/get-started-sdk/project-setup/swift.mdx delete mode 100644 shared/video-sdk/get-started/get-started-sdk/project-setup/unity.mdx delete mode 100644 shared/video-sdk/get-started/get-started-sdk/project-setup/windows.mdx create mode 100644 shared/video-sdk/get-started/get-started-sdk/project-test/poc3.mdx create mode 100644 shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx create mode 100644 shared/video-sdk/get-started/get-started-sdk/reference/react-js.mdx create mode 100644 shared/video-sdk/reference/_error-codes.mdx 
delete mode 100644 shared/video-sdk/reference/_known-issues.mdx create mode 100644 shared/video-sdk/reference/api-reference/index.mdx create mode 100644 shared/video-sdk/reference/api-reference/react-js/components-en.react.mdx create mode 100644 shared/video-sdk/reference/api-reference/react-js/data-types-en.react.mdx create mode 100644 shared/video-sdk/reference/api-reference/react-js/hooks-en.react.mdx create mode 100644 shared/video-sdk/reference/api-reference/react-js/index.mdx delete mode 100644 shared/video-sdk/reference/app-size-optimization/manual-install/index copy.mdx create mode 100644 shared/video-sdk/reference/app-size-optimization/manual-install/react-js.mdx create mode 100644 shared/video-sdk/reference/release-notes/react-js.mdx rename shared/voice-sdk/{_authentication-workflow.mdx => authentication-workflow/index.mdx} (98%) rename shared/voice-sdk/develop/{_ensure-channel-quality.mdx => ensure-channel-quality/index.mdx} (93%) create mode 100644 shared/voice-sdk/develop/ensure-channel-quality/project-implementation/poc3.mdx create mode 100644 shared/voice-sdk/develop/ensure-channel-quality/project-test/.web.mdx create mode 100644 shared/voice-sdk/develop/ensure-channel-quality/project-test/poc3.mdx create mode 100644 shared/voice-sdk/develop/ensure-channel-quality/project-test/react-js.mdx create mode 100644 shared/voice-sdk/develop/ensure-channel-quality/project-test/windows.mdx rename shared/voice-sdk/develop/{_product-workflow.mdx => product-workflow/index.mdx} (89%) create mode 100644 shared/voice-sdk/develop/product-workflow/project-implementation/poc3.mdx delete mode 100644 shared/voice-sdk/develop/product-workflow/project-implementation/web.mdx create mode 100644 shared/voice-sdk/develop/stream-raw-audio/project-implementation/poc3.mdx rename shared/voice-sdk/get-started/{_get-started-sdk.mdx => get-started-sdk/index.mdx} (77%) create mode 100644 shared/voice-sdk/get-started/get-started-sdk/project-implementation/poc3.mdx delete mode 100644 
shared/voice-sdk/reference/_known-issues.mdx create mode 100644 shared/voice-sdk/reference/release-notes/react-js.mdx rename signaling/{reference => overview}/pricing.mdx (99%) rename signaling/{reference => overview}/release-notes.mdx (97%) rename signaling/{reference => overview}/supported-platforms.mdx (100%) rename video-calling/{reference => overview}/pricing.mdx (94%) rename video-calling/{reference => overview}/release-notes.mdx (93%) rename video-calling/{reference => overview}/supported-platforms.mdx (100%) create mode 100644 video-calling/reference/error-codes.mdx delete mode 100644 video-calling/reference/known-issues.mdx rename voice-calling/{reference => overview}/pricing.mdx (94%) rename voice-calling/{reference => overview}/release-notes.mdx (93%) rename voice-calling/{reference => overview}/supported-platforms.mdx (93%) create mode 100644 voice-calling/reference/error-codes.mdx delete mode 100644 voice-calling/reference/known-issues.mdx diff --git a/.gitignore b/.gitignore new file mode 100644 index 000000000..496ee2ca6 --- /dev/null +++ b/.gitignore @@ -0,0 +1 @@ +.DS_Store \ No newline at end of file diff --git a/agora-analytics/reference/pricing.mdx b/agora-analytics/overview/pricing.mdx similarity index 94% rename from agora-analytics/reference/pricing.mdx rename to agora-analytics/overview/pricing.mdx index fdf3655ac..eb7fa58d0 100644 --- a/agora-analytics/reference/pricing.mdx +++ b/agora-analytics/overview/pricing.mdx @@ -1,6 +1,6 @@ --- title: 'Pricing' -sidebar_position: 1 +sidebar_position: 3 platform_selector: false description: > Provides you with information on billing, fee deductions, free-of-charge policy, and any suspension to your account based on the account type. 
diff --git a/agora-analytics/reference/release-notes.mdx b/agora-analytics/overview/release-notes.mdx similarity index 93% rename from agora-analytics/reference/release-notes.mdx rename to agora-analytics/overview/release-notes.mdx index 23f7ab745..24f479199 100644 --- a/agora-analytics/reference/release-notes.mdx +++ b/agora-analytics/overview/release-notes.mdx @@ -1,6 +1,6 @@ --- title: 'Release notes' -sidebar_position: 2.5 +sidebar_position: 4 type: docs platform_selector: false description: > diff --git a/agora-analytics/reference/supported-platforms.mdx b/agora-analytics/overview/supported-platforms.mdx similarity index 78% rename from agora-analytics/reference/supported-platforms.mdx rename to agora-analytics/overview/supported-platforms.mdx index 8aaa981a3..625f0fe41 100644 --- a/agora-analytics/reference/supported-platforms.mdx +++ b/agora-analytics/overview/supported-platforms.mdx @@ -1,10 +1,10 @@ --- title: 'Supported platforms' -sidebar_position: 6 +sidebar_position: 5 type: docs platform_selector: false description: > - A list of terms used in Agora documentation. + A list of platforms supported by Agora Analytics. --- import SupportedPlatform from '@docs/shared/common/_supported-platforms.mdx'; diff --git a/agora-chat/reference/pricing.mdx b/agora-chat/overview/pricing.mdx similarity index 100% rename from agora-chat/reference/pricing.mdx rename to agora-chat/overview/pricing.mdx diff --git a/agora-chat/reference/release-notes.mdx b/agora-chat/overview/release-notes.mdx similarity index 93% rename from agora-chat/reference/release-notes.mdx rename to agora-chat/overview/release-notes.mdx index 5c3af1c7b..c05ffe6c0 100644 --- a/agora-chat/reference/release-notes.mdx +++ b/agora-chat/overview/release-notes.mdx @@ -1,6 +1,6 @@ --- title: 'Release notes' -sidebar_position: 1 +sidebar_position: 4 type: docs description: > Provides release notes of Agora Chat. 
diff --git a/agora-chat/reference/supported-platforms.mdx b/agora-chat/overview/supported-platforms.mdx similarity index 93% rename from agora-chat/reference/supported-platforms.mdx rename to agora-chat/overview/supported-platforms.mdx index 76197536b..bb3522782 100644 --- a/agora-chat/reference/supported-platforms.mdx +++ b/agora-chat/overview/supported-platforms.mdx @@ -1,6 +1,6 @@ --- title: 'Supported platforms' -sidebar_position: 2 +sidebar_position: 5 type: docs description: > Lists the platform that Chat supports. diff --git a/assets/code/video-sdk/ai-noise-suppression/configure-engine.mdx b/assets/code/video-sdk/ai-noise-suppression/configure-engine.mdx new file mode 100644 index 000000000..91ce71e72 --- /dev/null +++ b/assets/code/video-sdk/ai-noise-suppression/configure-engine.mdx @@ -0,0 +1,18 @@ + + ```typescript + export function AINoiseReduction() { + const agoraEngine = useRTCClient(AgoraRTC.createClient({ codec: "vp8", mode: "rtc" })); + + return ( +
+

AI Noise Suppression

+ + + + + +
+ ); + } + ``` +
\ No newline at end of file diff --git a/assets/code/video-sdk/ai-noise-suppression/configure-extension.mdx b/assets/code/video-sdk/ai-noise-suppression/configure-extension.mdx new file mode 100644 index 000000000..d5a504394 --- /dev/null +++ b/assets/code/video-sdk/ai-noise-suppression/configure-extension.mdx @@ -0,0 +1,41 @@ + + ```typescript + const extension = useRef(new AIDenoiserExtension({assetsPath:'./node_modules/agora-extension-ai-denoiser/external'})); + const processor = useRef(); + + useEffect(() => { + const initializeAIDenoiserProcessor = async () => { + AgoraRTC.registerExtensions([extension.current]); + if (!extension.current.checkCompatibility()) { + console.error("Does not support AI Denoiser!"); + return; + } + + if (agoraContext.localMicrophoneTrack) + { + console.log("Initializing an ai noise processor..."); + try { + processor.current = extension.current.createProcessor(); + agoraContext.localMicrophoneTrack.pipe(processor.current).pipe(agoraContext.localMicrophoneTrack.processorDestination); + await processor.current.enable(); + } catch (error) { + console.error("Error applying noise reduction:", error); + } + } + }; + void initializeAIDenoiserProcessor(); + + return () => { + const disableAIDenoiser = async () => { + processor.current?.unpipe(); + agoraContext.localMicrophoneTrack.unpipe(); + await processor.current?.disable(); + }; + void disableAIDenoiser(); + }; + }, [agoraContext.localMicrophoneTrack]); + ``` + * pipe + * unpipe + + \ No newline at end of file diff --git a/assets/code/video-sdk/ai-noise-suppression/enable-denoiser.mdx b/assets/code/video-sdk/ai-noise-suppression/enable-denoiser.mdx new file mode 100644 index 000000000..9ec46f0ae --- /dev/null +++ b/assets/code/video-sdk/ai-noise-suppression/enable-denoiser.mdx @@ -0,0 +1,66 @@ + + ```javascript + // Create an AIDenoiserExtension instance, and pass in the host URL of the Wasm files + const denoiser = new AIDenoiserExtension({ assetsPath: 
"/node_modules/agora-extension-ai-denoiser/external/" }); + // Check compatibility + if (!denoiser.checkCompatibility()) { + // The extension might not be supported in the current browser. You can stop executing further code logic + console.error("Does not support AI Denoiser!"); + } + // Register the extension + AgoraRTC.registerExtensions([denoiser]); + // (Optional) Listen for the callback reporting that the Wasm files fail to load + denoiser.onloaderror = (e) => { + // If the Wasm files fail to load, you can disable the plugin, for example: + // openDenoiserButton.enabled = false; + console.log(e); + }; + + // Create a processor + const processor = denoiser.createProcessor(); + + // Inject the extension to the audio processing pipeline + channelParameters.localAudioTrack + .pipe(processor) + .pipe(channelParameters.localAudioTrack.processorDestination); + + await processor.enable(); + ``` + - [createProcessor](#createprocessor) + - [enable](#enable) + + + ```kotlin + override fun setupAgoraEngine(): Boolean { + val result = super.setupAgoraEngine() + + // Enable AI noise suppression + val mode = 2 + // Choose a noise suppression mode from the following: + // 0: (Default) Balanced noise reduction mode + // 1: Aggressive mode + // 2: Aggressive mode with low latency + agoraEngine!!.setAINSMode(true, mode) + + return result + } + ``` + - setAINSMode + + + + ```swift + func setNoiseSuppression(_ enable: Bool, mode: AUDIO_AINS_MODE) -> Int32 { + self.agoraEngine.setAINSMode(enable, mode: mode) + } + ``` + + - setAINSMode(_:mode:) + - AUDIO_AINS_MODE + + + - setAINSMode(_:mode:) + - AUDIO_AINS_MODE + + + \ No newline at end of file diff --git a/assets/code/video-sdk/ai-noise-suppression/import-library.mdx b/assets/code/video-sdk/ai-noise-suppression/import-library.mdx new file mode 100644 index 000000000..835062d3c --- /dev/null +++ b/assets/code/video-sdk/ai-noise-suppression/import-library.mdx @@ -0,0 +1,18 @@ + +```javascript +import AgoraManager from 
"../agora_manager/agora_manager.js"; +import AgoraRTC from "agora-rtc-sdk-ng"; +import { AIDenoiserExtension } from "agora-extension-ai-denoiser"; +``` + + +```typescript +import AgoraRTC from "agora-rtc-sdk-ng"; +import { useRTCClient, AgoraRTCProvider } from "agora-rtc-react"; +import { useEffect, useRef, useState } from "react"; +import AuthenticationWorkflowManager from "../authentication-workflow/authenticationWorkflowManager"; +import {AIDenoiserExtension, AIDenoiserProcessorLevel, AIDenoiserProcessorMode, IAIDenoiserProcessor} from "agora-extension-ai-denoiser"; +import { useConnectionState } from 'agora-rtc-react'; +import { useAgoraContext } from "../agora-manager/agoraManager"; +``` + diff --git a/assets/code/video-sdk/ai-noise-suppression/import-plugin.mdx b/assets/code/video-sdk/ai-noise-suppression/import-plugin.mdx new file mode 100644 index 000000000..a6385d0ec --- /dev/null +++ b/assets/code/video-sdk/ai-noise-suppression/import-plugin.mdx @@ -0,0 +1,25 @@ + + To enable , you must import the plugin to your target. + + * **Swift Package Manager** + + Add the product "AINS" to your app target. This is part of the AgoraRtcEngine Swift Package. 
+ + * **CocoaPods** + + Include "AINS" in the subspecs in your Podfile: + + ```rb + target 'Your App' do + pod 'AgoraRtcEngine_iOS', '~> 4.2', :subspecs => ['RtcBasic', 'AINS'] + end + ``` + + + ```rb + target 'Your App' do + pod 'AgoraRtcEngine_macOS', '~> 4.2', :subspecs => ['RtcBasic', 'AINS'] + end + ``` + + \ No newline at end of file diff --git a/assets/code/video-sdk/ai-noise-suppression/set-noise-reduction-mode.mdx b/assets/code/video-sdk/ai-noise-suppression/set-noise-reduction-mode.mdx new file mode 100644 index 000000000..5be7929b3 --- /dev/null +++ b/assets/code/video-sdk/ai-noise-suppression/set-noise-reduction-mode.mdx @@ -0,0 +1,34 @@ + +```typescript +const changeNoiseReductionMode = (selectedOption: string) => { + if (!processor.current) { + console.error("AI noise reduction processor not initialized"); + return; + } + if(selectedOption === "STATIONARY_NS") + { + processor.current.setMode(AIDenoiserProcessorMode.STATIONARY_NS) + .then(() => + { + console.log("Mode set to:", selectedOption); + }) + .catch((error) => + { + console.log(error); + }); + } + else + { + processor.current.setMode(AIDenoiserProcessorMode.NSNG) + .then(() => + { + console.log("Mode set to:", selectedOption); + }) + .catch((error) => + { + console.log(error); + }); + } +} +``` + \ No newline at end of file diff --git a/assets/code/video-sdk/ai-noise-suppression/set-reduction-level.mdx b/assets/code/video-sdk/ai-noise-suppression/set-reduction-level.mdx new file mode 100644 index 000000000..45720af9d --- /dev/null +++ b/assets/code/video-sdk/ai-noise-suppression/set-reduction-level.mdx @@ -0,0 +1,34 @@ + +```typescript +const changeNoiseReductionLevel = (selectedOption: string) => { + if (!processor.current) { + console.error("AI noise reduction processor not initialized"); + return; + } + if(selectedOption === "aggressive") + { + processor.current.setLevel(AIDenoiserProcessorLevel.AGGRESSIVE) + .then(() => + { + console.log("Level set to:", selectedOption); + }) + 
.catch((error) => + { + console.log(error); + }); + } + else + { + processor.current.setLevel(AIDenoiserProcessorLevel.SOFT) + .then(() => + { + console.log("Level set to:", selectedOption); + }) + .catch((error) => + { + console.log(error); + }); + } +} +``` + \ No newline at end of file diff --git a/assets/code/video-sdk/ai-noise-suppression/setup-logging.mdx b/assets/code/video-sdk/ai-noise-suppression/setup-logging.mdx new file mode 100644 index 000000000..0b95a8006 --- /dev/null +++ b/assets/code/video-sdk/ai-noise-suppression/setup-logging.mdx @@ -0,0 +1,21 @@ + + ```javascript + // Setup logging + processor.ondump = (blob, name) => { + // Dump the audio data to a local folder in PCM format + const objectURL = URL.createObjectURL(blob); + const tag = document.createElement("a"); + tag.download = name; + tag.href = objectURL; + tag.click(); + setTimeout(() => {URL.revokeObjectURL(objectURL);}, 0); + } + + processor.ondumpend = () => { + console.log("dump ended!!"); + } + + processor.dump(); + ``` + - [ondump](#ondump) + diff --git a/assets/code/video-sdk/audio-voice-effects/apply-voice-effects.mdx b/assets/code/video-sdk/audio-voice-effects/apply-voice-effects.mdx new file mode 100644 index 000000000..63d29c1a9 --- /dev/null +++ b/assets/code/video-sdk/audio-voice-effects/apply-voice-effects.mdx @@ -0,0 +1,129 @@ + + ```kotlin + fun applyVoiceBeautifierPreset(beautifier: Int) { + // Use a preset value from Constants. For example, Constants.CHAT_BEAUTIFIER_MAGNETIC + agoraEngine?.setVoiceBeautifierPreset(beautifier); + } + + fun applyAudioEffectPreset(preset: Int) { + // Use a preset value from Constants. For example, Constants.VOICE_CHANGER_EFFECT_HULK + agoraEngine?.setAudioEffectPreset(preset) + } + + fun applyVoiceConversionPreset(preset: Int) { + // Use a preset value from Constants. 
For example, Constants.VOICE_CHANGER_CARTOON + agoraEngine?.setVoiceConversionPreset(preset) + } + + fun applyLocalVoiceFormant(preset: Double) { + // The value range is [-1.0, 1.0]. The default value is 0.0. + agoraEngine?.setLocalVoiceFormant(preset) + } + + fun setVoiceEqualization(bandFrequency: AUDIO_EQUALIZATION_BAND_FREQUENCY, bandGain: Int) { + // Set local voice equalization. + // The first parameter sets the band frequency. The value ranges between 0 and 9. + // Each value represents the center frequency of the band: 31, 62, 125, 250, 500, 1k, 2k, 4k, 8k, and 16k Hz. + // The second parameter sets the gain of each band. The value ranges between -15 and 15 dB. + // The default value is 0. + agoraEngine?.setLocalVoiceEqualization(bandFrequency, bandGain) + } + + fun setVoicePitch(value: Double) { + // The value range is [0.5, 2.0]. The default value is 1.0. + agoraEngine?.setLocalVoicePitch(value) + } + + ``` + - setLocalVoiceEqualization + + - setAudioEffectPreset + + - setAudioEffectParameters + + - setVoiceBeautifierPreset + + - setVoiceConversionPreset + + - setVoiceBeautifierParameters + + - setLocalVoiceReverb + + - setLocalVoiceReverbPreset + + - setLocalVoicePitch + + + ```swift + func applyVoiceBeautifierPreset(beautifier: AgoraVoiceBeautifierPreset) { + // Use a preset value from Constants. For example, Constants.CHAT_BEAUTIFIER_MAGNETIC + agoraEngine.setVoiceBeautifierPreset(beautifier) + } + + func applyAudioEffectPreset(preset: AgoraAudioEffectPreset) { + // Use a preset value from Constants. For example, Constants.VOICE_CHANGER_EFFECT_HULK + agoraEngine.setAudioEffectPreset(preset) + } + + func applyVoiceConversionPreset(preset: AgoraVoiceConversionPreset) { + // Use a preset value from Constants. For example, Constants.VOICE_CHANGER_CARTOON + agoraEngine.setVoiceConversionPreset(preset) + } + + func applyLocalVoiceFormant(preset: Double) { + // The value range is [-1.0, 1.0]. 
The default value is 0.0. + agoraEngine.setLocalVoiceFormant(preset) + } + + func setVoiceEqualization(bandFrequency: AgoraAudioEqualizationBandFrequency, bandGain: Int) { + // Set local voice equalization. + // The first parameter sets the band frequency. Ranges from 0 to 9. + // Each value represents the center frequency of the band: + // 31, 62, 125, 250, 500, 1k, 2k, 4k, 8k, and 16k Hz. + // The second parameter sets the gain of each band. Ranges from -15 to 15 dB. + // The default value is 0. + agoraEngine.setLocalVoiceEqualizationOf(bandFrequency, withGain: bandGain) + } + + func setVoicePitch(value: Double) { + // The value range is [0.5, 2.0]. The default value is 1.0. + agoraEngine.setLocalVoicePitch(value) + } + ``` + + + - setVoiceBeautifierPreset(_:) + - setAudioEffectPreset(_:) + - setVoiceConversionPreset(_:) + - setLocalVoiceFormant(_:) + - setLocalVoiceEqualizationOf(_:withGain:) + - setLocalVoicePitch(_:) + + + - setVoiceBeautifierPreset(_:) + - setAudioEffectPreset(_:) + - setVoiceConversionPreset(_:) + - setLocalVoiceFormant(_:) + - setLocalVoiceEqualizationOf(_:withGain:) + - setLocalVoicePitch(_:) + + + + ```csharp + // Method to apply voice effects + public void ApplyVoiceEffect(VOICE_BEAUTIFIER_PRESET effect) + { + agoraEngine.SetVoiceBeautifierPreset(effect); + } + ``` + + - SetVoiceBeautifierPreset + - SetAudioEffectPreset + + + - SetVoiceBeautifierPreset + - SetAudioEffectPreset + + \ No newline at end of file diff --git a/assets/code/video-sdk/audio-voice-effects/swift/configure-buttons.mdx b/assets/code/video-sdk/audio-voice-effects/configure-buttons.mdx similarity index 99% rename from assets/code/video-sdk/audio-voice-effects/swift/configure-buttons.mdx rename to assets/code/video-sdk/audio-voice-effects/configure-buttons.mdx index 399526d00..24d5b9034 100644 --- a/assets/code/video-sdk/audio-voice-effects/swift/configure-buttons.mdx +++
b/assets/code/video-sdk/audio-voice-effects/configure-buttons.mdx @@ -1,5 +1,5 @@ -``` swift +```swift // Button to start audio mixing audioMixingBtn = NSButton() audioMixingBtn.frame = CGRect(x: 230, y:240, width:80, height:20) @@ -27,7 +27,7 @@ self.view.addSubview(applyVoiceEffectBtn) ``` -``` swift +```swift // Button to start audio mixing audioMixingBtn = UIButton(type: .system) audioMixingBtn.frame = CGRect(x: 60, y:500, width:250, height:50) diff --git a/assets/code/video-sdk/audio-voice-effects/configure-engine.mdx b/assets/code/video-sdk/audio-voice-effects/configure-engine.mdx new file mode 100644 index 000000000..2c1149fbd --- /dev/null +++ b/assets/code/video-sdk/audio-voice-effects/configure-engine.mdx @@ -0,0 +1,69 @@ + +```csharp + // Method to set up the Agora engine + public override void SetupAgoraEngine() + { + base.SetupAgoraEngine(); + + // Pre-load sound effects to improve performance + agoraEngine.PreloadEffect(soundEffectId, configData.soundEffectFileURL); + + // Specify the audio scenario and audio profile + agoraEngine.SetAudioProfile(AUDIO_PROFILE_TYPE.AUDIO_PROFILE_DEFAULT, AUDIO_SCENARIO_TYPE.AUDIO_SCENARIO_CHATROOM); + + // Initialize event handling for Agora + agoraEngine.InitEventHandler(new AudioVoiceEffectEventHandler(this)); + +#if (UNITY_ANDROID || UNITY_IOS) + agoraEngine.SetDefaultAudioRouteToSpeakerphone(!enableSpeakerPhone); // Disables the default audio route. + agoraEngine.SetEnableSpeakerphone(enableSpeakerPhone); // Enables or disables the speakerphone temporarily. +#endif + } +``` +The `SetDefaultAudioRouteToSpeakerphone` and `SetEnableSpeakerphone` methods apply to Android and iOS only. + +For more details, see the following: + - PreloadEffect + - SetAudioProfile + - SetDefaultAudioRouteToSpeakerphone + - SetEnableSpeakerphone + + + + ```typescript + function AudioAndVoiceEffects() { + const agoraEngine = useRTCClient(AgoraRTC.createClient({ codec: "vp8", mode: config.selectedProduct })); + + return ( +
+

Audio and voice effects

+ + + + + +
+ ); + } + ``` + - useRTCClient + - AgoraRTCProvider +
+ + ```swift + override func setupEngine() -> AgoraRtcEngineKit { + let eng = super.setupEngine() + eng.setAudioProfile(.musicHighQualityStereo) + eng.setAudioScenario(.gameStreaming) + return eng + } + ``` + + - setAudioProfile(_:) + - setAudioScenario(_:) + + + - setAudioProfile(_:) + - setAudioScenario(_:) + + \ No newline at end of file diff --git a/assets/code/video-sdk/audio-voice-effects/create-ui.mdx b/assets/code/video-sdk/audio-voice-effects/create-ui.mdx new file mode 100644 index 000000000..057c1356a --- /dev/null +++ b/assets/code/video-sdk/audio-voice-effects/create-ui.mdx @@ -0,0 +1,51 @@ + +```swift +var audioMixingBtn: NSButton! +var playAudioEffectBtn: NSButton! +var applyVoiceEffectBtn: NSButton! +``` + + +```swift +var audioMixingBtn: UIButton! +var playAudioEffectBtn: UIButton! +var applyVoiceEffectBtn: UIButton! +var speakerphoneSwitch: UISwitch! +``` + + +```typescript + return ( +
+
+ + +

+ +

+ {showDropdown && ( +
+ + +
+ )} + {isAudioMixing && audioFileTrack && } +
+
+ ); +``` +
\ No newline at end of file diff --git a/assets/code/video-sdk/audio-voice-effects/event-handler.mdx b/assets/code/video-sdk/audio-voice-effects/event-handler.mdx new file mode 100644 index 000000000..8c7de4faa --- /dev/null +++ b/assets/code/video-sdk/audio-voice-effects/event-handler.mdx @@ -0,0 +1,98 @@ + + ```kotlin + override val iRtcEngineEventHandler: IRtcEngineEventHandler + get() = object : IRtcEngineEventHandler() { + + // Occurs when the audio effect playback finishes. + override fun onAudioEffectFinished(soundId: Int) { + super.onAudioEffectFinished(soundId) + sendMessage("Audio effect finished") + audioEffectManager!!.stopEffect(soundId) + // Notify the UI + val eventArgs: Map = mapOf("soundId" to soundId) + mListener?.onEngineEvent("onAudioEffectFinished", eventArgs) + } + } + ``` + - onAudioEffectFinished + + + +```csharp +// Event handler class to handle the events raised by Agora's RtcEngine instance +internal class AudioVoiceEffectEventHandler : UserEventHandler +{ + private AudioVoiceEffectsManager audioVoiceEffectsManager; + + internal AudioVoiceEffectEventHandler(AudioVoiceEffectsManager videoSample) : base(videoSample) + { + audioVoiceEffectsManager = videoSample; + } + + // Occurs when the audio effect playback finishes + public override void OnAudioEffectFinished(int soundId) + { + // Handle the event, stop the audio effect, and reset its status + Debug.Log("Audio effect finished"); + audioVoiceEffectsManager.isEffectFinished = true; + audioVoiceEffectsManager.StopAudioMixing(); + } + + // Occurs when you start audio mixing, with different states + public override void OnAudioMixingStateChanged(AUDIO_MIXING_STATE_TYPE state, AUDIO_MIXING_REASON_TYPE reason) + { + audioVoiceEffectsManager.audioMixingState = state; + // Handle audio mixing state changes, such as failure, pause, play, or stop + if (state == AUDIO_MIXING_STATE_TYPE.AUDIO_MIXING_STATE_FAILED) + { + Debug.Log("Audio mixing failed: " + reason); + } + else if (state == 
AUDIO_MIXING_STATE_TYPE.AUDIO_MIXING_STATE_PAUSED) + { + Debug.Log("Audio mixing paused : " + reason); + } + else if (state == AUDIO_MIXING_STATE_TYPE.AUDIO_MIXING_STATE_PLAYING) + { + Debug.Log("Audio mixing started: " + reason); + } + else if (state == AUDIO_MIXING_STATE_TYPE.AUDIO_MIXING_STATE_STOPPED) + { + Debug.Log("Audio mixing stopped: " + reason); + } + } + + // Occurs when the audio route changes + public override void OnAudioRoutingChanged(int routing) + { + if (routing != (int)AudioRoute.ROUTE_DEFAULT) + { + Debug.Log("Audio route changed"); + } + } +} +``` + + - OnAudioEffectFinished + - OnAudioMixingStateChanged + - OnAudioRoutingChanged + + + - OnAudioEffectFinished + - OnAudioMixingStateChanged + - OnAudioRoutingChanged + + + + ```swift + func rtcEngineDidAudioEffectFinish(_ engine: AgoraRtcEngineKit, soundId: Int32) { + // Occurs when the audio effect playback finishes. + } + ``` + + + - rtcEngineDidAudioEffectFinish(_:soundId:) + + + - rtcEngineDidAudioEffectFinish(_:soundId:) + + diff --git a/assets/code/video-sdk/audio-voice-effects/import-library.mdx b/assets/code/video-sdk/audio-voice-effects/import-library.mdx new file mode 100644 index 000000000..a29a7ed3c --- /dev/null +++ b/assets/code/video-sdk/audio-voice-effects/import-library.mdx @@ -0,0 +1,26 @@ + + ```kotlin + import io.agora.rtc2.Constants + import io.agora.rtc2.Constants.AUDIO_EQUALIZATION_BAND_FREQUENCY + import io.agora.rtc2.IAudioEffectManager + import io.agora.rtc2.IRtcEngineEventHandler + ``` + + +```csharp +using Agora.Rtc; +``` + + + ```typescript + import { AgoraRTCProvider, useRTCClient, usePublish, useConnectionState } from "agora-rtc-react"; + import AgoraRTC, {IBufferSourceAudioTrack} from "agora-rtc-sdk-ng"; + import AuthenticationWorkflowManager from "../authentication-workflow/authenticationWorkflowManager"; + import config from "../agora-manager/config"; + ``` + + + ```swift + import AgoraRtcKit + ``` + \ No newline at end of file diff --git 
a/assets/code/video-sdk/audio-voice-effects/pause-play-resume.mdx b/assets/code/video-sdk/audio-voice-effects/pause-play-resume.mdx new file mode 100644 index 000000000..6d3dbcca6 --- /dev/null +++ b/assets/code/video-sdk/audio-voice-effects/pause-play-resume.mdx @@ -0,0 +1,105 @@ + + ```kotlin + fun playEffect(soundEffectId: Int, soundEffectFilePath: String) { + audioEffectManager!!.playEffect( + soundEffectId, // The ID of the sound effect file. + soundEffectFilePath, // The path of the sound effect file. + 0, 1.0, // The pitch of the audio effect. 1 represents the original pitch. + 0.0, 100.0, // The volume of the audio effect. 100 represents the original volume. + true, // Whether to publish the audio effect to remote users. + 0 // The playback starting position of the audio effect file in ms. + ) + } + + fun pauseEffect(soundEffectId: Int) { + audioEffectManager!!.pauseEffect(soundEffectId) + } + + fun resumeEffect(soundEffectId: Int) { + audioEffectManager!!.resumeEffect(soundEffectId) + } + ``` + - playEffect + - pauseEffect + - resumeEffect + + + ```csharp + // Method to play the sound effect + public void PlaySoundEffect() + { + agoraEngine.PlayEffect( + soundEffectId, // The ID of the sound effect file. + configData.soundEffectFileURL, // The path of the sound effect file. + 0, // The number of sound effect loops. -1 means an infinite loop. 0 means once. + 1, // The pitch of the audio effect. 1 represents the original pitch. + 0.0, // The spatial position of the audio effect. 0.0 represents that the audio effect plays in the front. + 100, // The volume of the audio effect. 100 represents the original volume. + true,// Whether to publish the audio effect to remote users. + 0 // The playback starting position of the audio effect file in ms. 
+ ); + } + + // Pause the sound effect + public void PauseSoundEffect() + { + agoraEngine.PauseEffect(soundEffectId); + } + + // Resume the sound effect + public void ResumeSoundEffect() + { + agoraEngine.ResumeEffect(soundEffectId); + } + // Method to get the current voice effect state + public bool GetSoundEffectState() + { + return isEffectFinished; + } + ``` + + - PlayEffect + - PauseEffect + - ResumeEffect + + + - PlayEffect + - PauseEffect + - ResumeEffect + + + + ```swift + func playEffect(soundEffectId: Int32, effectFilePath: String) { + agoraEngine.playEffect( + soundEffectId, // The ID of the sound effect file. + filePath: effectFilePath, // The path of the sound effect file. + loopCount: 0, + pitch: 1.0, // The pitch of the audio effect. 1 = original pitch. + pan: 0.0, // The spatial position of the audio effect (-1 to 1) + gain: 100, // The volume of the audio effect. 100 = original volume. + publish: true, // Whether to publish the audio effect to remote users. + startPos: 0 // The playback starting position (in ms). 
+ ) + } + + func pauseEffect(soundEffectId: Int32) { + agoraEngine.pauseEffect(soundEffectId) + } + + func resumeEffect(soundEffectId: Int32) { + agoraEngine.resumeEffect(soundEffectId) + } + ``` + + + - playEffect(_:filePath:loopCount:pitch:pan:gain:publish:startPos:) + - pauseEffect(_:) + - resumeEffect(_:) + + + - playEffect(_:filePath:loopCount:pitch:pan:gain:publish:startPos:) + - pauseEffect(_:) + - resumeEffect(_:) + + \ No newline at end of file diff --git a/assets/code/video-sdk/audio-voice-effects/preload-effect.mdx b/assets/code/video-sdk/audio-voice-effects/preload-effect.mdx new file mode 100644 index 000000000..d9af2c9db --- /dev/null +++ b/assets/code/video-sdk/audio-voice-effects/preload-effect.mdx @@ -0,0 +1,47 @@ + + ```kotlin + if (audioEffectManager == null) { + // Set up the audio effects manager + audioEffectManager = agoraEngine?.audioEffectManager + // Pre-load sound effects to improve performance + audioEffectManager?.preloadEffect(soundEffectId, soundEffectFilePath) + } + ``` + - IAudioEffectManager + - preloadEffect + + + ```csharp + public override void SetupAgoraEngine() + { + base.SetupAgoraEngine(); + + // Pre-load sound effects to improve performance + agoraEngine.PreloadEffect(soundEffectId, configData.soundEffectFileURL); + + // Initialize event handling for Agora + agoraEngine.InitEventHandler(new AudioVoiceEffectEventHandler(this)); + } + ``` + + - PreloadEffect + + + - PreloadEffect + + + + ```swift + func preloadEffect(soundEffectId: Int32, effectFilePath: String) { + // Pre-load sound effects to improve performance + agoraEngine.preloadEffect(soundEffectId, filePath: effectFilePath) + } + ``` + + + - preloadEffect(_:filePath:) + + + - preloadEffect(_:filePath:) + + diff --git a/assets/code/video-sdk/audio-voice-effects/set-audio-profile.mdx b/assets/code/video-sdk/audio-voice-effects/set-audio-profile.mdx new file mode 100644 index 000000000..d1cccb0cb --- /dev/null +++ 
b/assets/code/video-sdk/audio-voice-effects/set-audio-profile.mdx @@ -0,0 +1,41 @@ + + ```kotlin + override fun setupAgoraEngine(): Boolean { + val result = super.setupAgoraEngine() + + // Set the audio scenario and audio profile + agoraEngine?.setAudioProfile(Constants.AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO); + agoraEngine?.setAudioScenario(Constants.AUDIO_SCENARIO_GAME_STREAMING); + return result + } + ``` + - setAudioProfile + - setAudioScenario + + + ```csharp + // Specify the audio scenario and audio profile + agoraEngine.SetAudioProfile(AUDIO_PROFILE_TYPE.AUDIO_PROFILE_DEFAULT, AUDIO_SCENARIO_TYPE.AUDIO_SCENARIO_CHATROOM); + ``` + + - SetAudioProfile + + + - SetAudioProfile + + + + ```swift + agoraEngine.setAudioProfile(.musicHighQualityStereo) + agoraEngine.setAudioScenario(.gameStreaming) + ``` + + + - setAudioProfile(_:) + - setAudioScenario(_:) + + + - setAudioProfile(_:) + - setAudioScenario(_:) + + diff --git a/assets/code/video-sdk/audio-voice-effects/set-audio-route.mdx b/assets/code/video-sdk/audio-voice-effects/set-audio-route.mdx new file mode 100644 index 000000000..d5309f7b9 --- /dev/null +++ b/assets/code/video-sdk/audio-voice-effects/set-audio-route.mdx @@ -0,0 +1,87 @@ + + ```kotlin + fun setAudioRoute(enableSpeakerPhone: Boolean) { + // Disable the default audio route + agoraEngine?.setDefaultAudioRoutetoSpeakerphone(false) + // Enable or disable the speakerphone temporarily + agoraEngine?.setEnableSpeakerphone(enableSpeakerPhone) + } + ``` + - setDefaultAudioRoutetoSpeakerphone + - setEnableSpeakerphone + + + ```swift + func setAudioRoute(enableSpeakerPhone: Bool) { + // Disable the default audio route + agoraEngine.setDefaultAudioRouteToSpeakerphone(false) + // Enable or disable the speakerphone temporarily + agoraEngine.setEnableSpeakerphone(enableSpeakerPhone) + } + ``` + + + - setDefaultAudioRouteToSpeakerphone(_:) + - setEnableSpeakerphone(_:) + + + - setDefaultAudioRouteToSpeakerphone(_:) + - setEnableSpeakerphone(_:) + + + + 
```csharp + #if (UNITY_ANDROID || UNITY_IOS) + agoraEngine.SetDefaultAudioRouteToSpeakerphone(!enableSpeakerPhone); // Disables the default audio route. + agoraEngine.SetEnableSpeakerphone(enableSpeakerPhone); // Enables or disables the speakerphone temporarily. + #endif + ``` + The `SetDefaultAudioRouteToSpeakerphone` and `SetEnableSpeakerphone` methods apply to Android and iOS only. + + - SetDefaultAudioRouteToSpeakerphone + - SetEnableSpeakerphone + + + - SetDefaultAudioRouteToSpeakerphone + - SetEnableSpeakerphone + + + + ```typescript + // Fetch the available audio playback devices when the component mounts + useEffect(() => { + navigator.mediaDevices?.enumerateDevices?.().then((devices) => { + try { + const playbackDevices = devices.filter((device) => device.kind === "audiooutput"); + setPlaybackDevices(playbackDevices); + setShowDropdown(playbackDevices.length > 0); + } catch (error) { + console.error("Error enumerating playback devices:", error); + } + }) + .catch((error) => { + console.error(error); + }); + }, []); + + // Event handler for changing the audio playback device + const handleAudioRouteChange = () => { + if (audioFileTrack) { + const deviceID = playoutDeviceRef.current?.value; + if (deviceID) { + console.log("The selected device id is: " + deviceID); + try { + audioFileTrack.setPlaybackDevice(deviceID) + .then(() => {console.log("Audio route changed")}) + .catch((error) => {console.error(error);}); + } catch (error) { + console.error("Error setting playback device:", error); + } + } + } + }; + ``` + + - setPlaybackDevice + + diff --git a/assets/code/video-sdk/audio-voice-effects/set-variables.mdx b/assets/code/video-sdk/audio-voice-effects/set-variables.mdx new file mode 100644 index 000000000..2d186262b --- /dev/null +++ b/assets/code/video-sdk/audio-voice-effects/set-variables.mdx @@ -0,0 +1,29 @@ + + ```kotlin + private var audioEffectManager: IAudioEffectManager?
= null + ``` + + + ```csharp + // Internal fields for managing audio and voice effects + internal int soundEffectId = 1; // Unique identifier for the sound effect file + internal AUDIO_MIXING_STATE_TYPE audioMixingState; + internal bool isEffectFinished = false; + internal bool enableSpeakerPhone = false; + ``` + + + ```typescript + const [isAudioMixing, setAudioMixing] = useState(false); + const [audioFileTrack, setAudioFileTrack] = useState(null); + const [showDropdown, setShowDropdown] = useState(false); + const [playbackDevices, setPlaybackDevices] = useState([]); + const playoutDeviceRef = useRef(null); + const connectionState = useConnectionState(); + ``` + + + ```swift + var audioEffectId: Int32 = .random(in: 1000...10_000) + ``` + diff --git a/assets/code/video-sdk/audio-voice-effects/stop-start-mixing.mdx b/assets/code/video-sdk/audio-voice-effects/stop-start-mixing.mdx new file mode 100644 index 000000000..4b2027af0 --- /dev/null +++ b/assets/code/video-sdk/audio-voice-effects/stop-start-mixing.mdx @@ -0,0 +1,145 @@ + + ```kotlin + fun startMixing(audioFilePath: String, loopBack: Boolean, cycle: Int, startPos: Int) { + agoraEngine?.startAudioMixing(audioFilePath, loopBack, cycle, startPos) + } + + fun stopMixing() { + agoraEngine?.stopAudioMixing() + } + ``` + - startAudioMixing + - stopAudioMixing + + + ```swift + func startMixing( + audioFilePath: String, loopBack: Bool, + cycle: Int, startPos: Int + ) { + agoraEngine.startAudioMixing( + audioFilePath, loopback: loopBack, + cycle: cycle, startPos: startPos + ) + } + + func stopMixing() { + agoraEngine.stopAudioMixing() + } + ``` + + + - startAudioMixing(_:loopback:cycle:startPos:) + - stopAudioMixing() + + + - startAudioMixing(_:loopback:cycle:startPos:) + - stopAudioMixing() + + + + + ```csharp + + // Method to start audio mixing + public void StartAudioMixing() + { + agoraEngine.StartAudioMixing(configData.audioFileURL, false, 1); + } + + // Method to pause audio mixing. 
+ public void PauseAudioMixing() + { + agoraEngine.PauseAudioMixing(); + } + + // Method to resume audio mixing + public void ResumeAudioMixing() + { + agoraEngine.ResumeAudioMixing(); + } + + // Method to stop audio mixing + public void StopAudioMixing() + { + agoraEngine.StopAudioMixing(); + } + + // Return the audio mixing state + public AUDIO_MIXING_STATE_TYPE GetAudioMixingState() + { + return audioMixingState; + } + ``` + + - StartAudioMixing + - PauseAudioMixing + - ResumeAudioMixing + - StopAudioMixing + + + - StartAudioMixing + - PauseAudioMixing + - ResumeAudioMixing + - StopAudioMixing + + + + ```typescript + // Event handler for selecting an audio file + const handleFileChange = (event: React.ChangeEvent) => { + if (event.target.files && event.target.files.length > 0) { + const selectedFile = event.target.files[0]; + try + { + AgoraRTC.createBufferSourceAudioTrack({ source: selectedFile }) + .then((track) => {setAudioFileTrack(track)}) + .catch((error) => {console.error(error);}) + } catch (error) { + console.error("Error creating buffer source audio track:", error); + } + } + }; + + const AudioMixing: React.FC<{ track: IBufferSourceAudioTrack }> = ({ track }) => { + usePublish([track]); + const agoraEngine = useRTCClient(); + + useEffect(() => { + track.startProcessAudioBuffer(); + track.play(); // to play the track for the local user + agoraEngine.publish(track) + .then(() => { + console.log("Audio mixing track published"); + }) + .catch((error) => { + console.error(error); + }); + return () => { + track.stopProcessAudioBuffer(); + track.stop(); + agoraEngine.unpublish(track) + .then(() => { + console.log("Audio mixing track unpublished"); + }) + .catch((error) => { + console.error(error); + }); + }; + }, [track]); + return
Audio mixing is in progress
; + }; + ``` + - createBufferSourceAudioTrack + + - startProcessAudioBuffer + + - stopProcessAudioBuffer + + - stop + + - publish + + - unpublish + +
\ No newline at end of file diff --git a/assets/code/video-sdk/audio-voice-effects/swift/apply-voice-effects.mdx b/assets/code/video-sdk/audio-voice-effects/swift/apply-voice-effects.mdx deleted file mode 100644 index 1936c62d0..000000000 --- a/assets/code/video-sdk/audio-voice-effects/swift/apply-voice-effects.mdx +++ /dev/null @@ -1,86 +0,0 @@ - -```swift -@objc func applyVoiceEffectBtnClicked() { - voiceEffectIndex += 1 - // Turn off any previous effects - agoraEngine.setVoiceBeautifierPreset(AgoraVoiceBeautifierPreset.presetOff) - agoraEngine.setAudioEffectPreset(AgoraAudioEffectPreset.off) - agoraEngine.setVoiceConversionPreset(AgoraVoiceConversionPreset.off) - agoraEngine.setLocalVoiceFormant(0.0) - - if (voiceEffectIndex == 1) { - agoraEngine.setVoiceBeautifierPreset(AgoraVoiceBeautifierPreset.presetChatBeautifierMagnetic) - applyVoiceEffectBtn.title = "Chat Beautifier" - } else if (voiceEffectIndex == 2) { - agoraEngine.setVoiceBeautifierPreset(AgoraVoiceBeautifierPreset.presetSingingBeautifier) - applyVoiceEffectBtn.title = "Singing Beautifier" - } else if (voiceEffectIndex == 3) { - // Modify the timbre using the formantRatio - // Range is [-1.0, 1.0], [giant, child] default value is 0. - agoraEngine.setLocalVoiceFormant(0.6) - applyVoiceEffectBtn.title = "Voice effect: Adjust Formant" - } else if (voiceEffectIndex == 4) { - agoraEngine.setAudioEffectPreset(AgoraAudioEffectPreset.voiceChangerEffectHulk) - applyVoiceEffectBtn.title = "Hulk" - } else if (voiceEffectIndex == 5) { - agoraEngine.setVoiceConversionPreset(AgoraVoiceConversionPreset.changerBass) - applyVoiceEffectBtn.title = "Voice Changer" - } else if (voiceEffectIndex == 6) { - // Sets the local voice equalization. - // The first parameter sets the band frequency. The value ranges between 0 and 9. - // Each value represents the center frequency of the band: - // 31, 62, 125, 250, 500, 1k, 2k, 4k, 8k, and 16k Hz. - // The second parameter sets the gain of each band between -15 and 15 dB. 
- // The default value is 0. - agoraEngine.setLocalVoiceEqualizationOf(AgoraAudioEqualizationBandFrequency.band500, withGain: 3) - agoraEngine.setLocalVoicePitch(0.5) - applyVoiceEffectBtn.title = "Voice Equalization" - } else if (voiceEffectIndex > 6) { // Remove all effects - voiceEffectIndex = 0 - agoraEngine.setLocalVoicePitch(1.0) - agoraEngine.setLocalVoiceEqualizationOf(AgoraAudioEqualizationBandFrequency.band500, withGain: 0) - applyVoiceEffectBtn.title = "Voice effect" - } -} -``` - - -```swift -@objc func applyVoiceEffectBtnClicked() { - voiceEffectIndex += 1 - // Turn off any previous effects - agoraEngine.setVoiceBeautifierPreset(AgoraVoiceBeautifierPreset.presetOff) - agoraEngine.setAudioEffectPreset(AgoraAudioEffectPreset.off) - agoraEngine.setVoiceConversionPreset(AgoraVoiceConversionPreset.off) - - if (voiceEffectIndex == 1) { - agoraEngine.setVoiceBeautifierPreset(AgoraVoiceBeautifierPreset.presetChatBeautifierMagnetic) - applyVoiceEffectBtn.setTitle("Voice effect: Chat Beautifier", for: .normal) - } else if (voiceEffectIndex == 2) { - agoraEngine.setVoiceBeautifierPreset(AgoraVoiceBeautifierPreset.presetSingingBeautifier) - applyVoiceEffectBtn.setTitle("Voice effect: Singing Beautifier", for: .normal) - } else if (voiceEffectIndex == 3) { - agoraEngine.setAudioEffectPreset(AgoraAudioEffectPreset.voiceChangerEffectHulk) - applyVoiceEffectBtn.setTitle("Audio effect: Hulk", for: .normal) - } else if (voiceEffectIndex == 4) { - agoraEngine.setVoiceConversionPreset(AgoraVoiceConversionPreset.changerBass) - applyVoiceEffectBtn.setTitle("Audio effect: Voice Changer", for: .normal) - } else if (voiceEffectIndex == 5) { - // Sets the local voice equalization. - // The first parameter sets the band frequency. The value ranges between 0 and 9. - // Each value represents the center frequency of the band: - // 31, 62, 125, 250, 500, 1k, 2k, 4k, 8k, and 16k Hz. - // The second parameter sets the gain of each band between -15 and 15 dB. 
- // The default value is 0. - agoraEngine.setLocalVoiceEqualizationOf(AgoraAudioEqualizationBandFrequency.band500, withGain: 3) - agoraEngine.setLocalVoicePitch(0.5) - applyVoiceEffectBtn.setTitle("Audio effect: Voice Equalization", for: .normal) - } else if (voiceEffectIndex > 5) { // Remove all effects - voiceEffectIndex = 0 - agoraEngine.setLocalVoicePitch(1.0) - agoraEngine.setLocalVoiceEqualizationOf(AgoraAudioEqualizationBandFrequency.band500, withGain: 0) - applyVoiceEffectBtn.setTitle("Apply voice effect", for: .normal) - } -} -``` - \ No newline at end of file diff --git a/assets/code/video-sdk/audio-voice-effects/swift/create-ui.mdx b/assets/code/video-sdk/audio-voice-effects/swift/create-ui.mdx deleted file mode 100644 index 889d022c0..000000000 --- a/assets/code/video-sdk/audio-voice-effects/swift/create-ui.mdx +++ /dev/null @@ -1,15 +0,0 @@ - -```swift -var audioMixingBtn: NSButton! -var playAudioEffectBtn: NSButton! -var applyVoiceEffectBtn: NSButton! -``` - - -```swift -var audioMixingBtn: UIButton! -var playAudioEffectBtn: UIButton! -var applyVoiceEffectBtn: UIButton! -var speakerphoneSwitch: UISwitch! -``` - \ No newline at end of file diff --git a/assets/code/video-sdk/audio-voice-effects/swift/pause-play-resume.mdx b/assets/code/video-sdk/audio-voice-effects/swift/pause-play-resume.mdx deleted file mode 100644 index 48bff3e88..000000000 --- a/assets/code/video-sdk/audio-voice-effects/swift/pause-play-resume.mdx +++ /dev/null @@ -1,56 +0,0 @@ - -```swift -@objc func playAudioEffectBtnClicked() { - if (soundEffectStatus == 0) { // Stopped - agoraEngine.playEffect( - soundEffectId, // The ID of the sound effect file. - filePath: soundEffectFilePath, // The path of the sound effect file. - loopCount: 0, // The number of sound effect loops. -1 means an infinite loop. 0 means once. - pitch: 1, // The pitch of the audio effect. 1 represents the original pitch. - pan: 0.0, // The spatial position of the audio effect. 
0.0 represents that the audio effect plays in the front. - gain: 100, // The volume of the audio effect. 100 represents the original volume. - publish: true,// Whether to publish the audio effect to remote users. - startPos: 0 // The playback starting position of the audio effect file in ms. - ); - playAudioEffectBtn.title = "Pause" - soundEffectStatus = 1 - } else if (soundEffectStatus == 1) { // Playing - agoraEngine.pauseEffect(soundEffectId) - soundEffectStatus = 2 - playAudioEffectBtn.title = "Resume" - } else if (soundEffectStatus == 2) { // Paused - agoraEngine.resumeEffect(soundEffectId) - soundEffectStatus = 1 - playAudioEffectBtn.title = "Pause" - } -} -``` - - -```swift -@objc func playAudioEffectBtnClicked() { - if (soundEffectStatus == 0) { // Stopped - agoraEngine.playEffect( - soundEffectId, // The ID of the sound effect file. - filePath: soundEffectFilePath, // The path of the sound effect file. - loopCount: 0, // The number of sound effect loops. -1 means an infinite loop. 0 means once. - pitch: 1, // The pitch of the audio effect. 1 represents the original pitch. - pan: 0.0, // The spatial position of the audio effect. 0.0 represents that the audio effect plays in the front. - gain: 100, // The volume of the audio effect. 100 represents the original volume. - publish: true,// Whether to publish the audio effect to remote users. - startPos: 0 // The playback starting position of the audio effect file in ms. 
- ); - playAudioEffectBtn.setTitle("Pause sound effect", for: .normal) - soundEffectStatus = 1 - } else if (soundEffectStatus == 1) { // Playing - agoraEngine.pauseEffect(soundEffectId) - soundEffectStatus = 2 - playAudioEffectBtn.setTitle("Resume sound effect", for: .normal) - } else if (soundEffectStatus == 2) { // Paused - agoraEngine.resumeEffect(soundEffectId) - soundEffectStatus = 1 - playAudioEffectBtn.setTitle("Pause sound effect", for: .normal) - } -} -``` - \ No newline at end of file diff --git a/assets/code/video-sdk/audio-voice-effects/swift/set-audio-route.mdx b/assets/code/video-sdk/audio-voice-effects/swift/set-audio-route.mdx deleted file mode 100644 index a2477e020..000000000 --- a/assets/code/video-sdk/audio-voice-effects/swift/set-audio-route.mdx +++ /dev/null @@ -1,18 +0,0 @@ - - ```swift - @objc func speakerphoneSwitchValueChanged(sender: NSSwitch) { - agoraEngine.setDefaultAudioRouteToSpeakerphone(false) // Disables the default audio route. - print("Changing speakerphone enabled state to \(sender.isEnabled)") - agoraEngine.setEnableSpeakerphone(sender.isOn) // Enables or disables the speakerphone temporarily. - } - ``` - - - ```swift - @objc func speakerphoneSwitchValueChanged(sender: UISwitch) { - agoraEngine.setDefaultAudioRouteToSpeakerphone(false) // Disables the default audio route. - print("Changing speakerphone enabled state to \(sender.isOn)") - agoraEngine.setEnableSpeakerphone(sender.isOn) // Enables or disables the speakerphone temporarily. 
-} - ``` - \ No newline at end of file diff --git a/assets/code/video-sdk/audio-voice-effects/swift/stop-start-mixing.mdx b/assets/code/video-sdk/audio-voice-effects/swift/stop-start-mixing.mdx deleted file mode 100644 index 5150a6eff..000000000 --- a/assets/code/video-sdk/audio-voice-effects/swift/stop-start-mixing.mdx +++ /dev/null @@ -1,40 +0,0 @@ - -```swift -@objc func audioMixingBtnClicked() { - audioPlaying = !audioPlaying - - if (audioPlaying) { - audioMixingBtn.title = "Stop" - let result = agoraEngine.startAudioMixing(audioFilePath, loopback: false, cycle: -1, startPos: 0) - if (result == 0) { - showMessage(title: "Audio Mixing", text: "Audio playing") - } else { - showMessage(title: "Audio Mixing", text: "Failed to play audio") - } - } else { - agoraEngine.stopAudioMixing() - audioMixingBtn.title = "Play" - } -} -``` - - -```swift -@objc func audioMixingBtnClicked() { - audioPlaying = !audioPlaying - - if (audioPlaying) { - audioMixingBtn.setTitle("Stop playing audio", for: .normal) - let result = agoraEngine.startAudioMixing(audioFilePath, loopback: false, replace: false, cycle: -1, startPos: 0) - if (result == 0) { - showMessage(title: "Audio Mixing", text: "Audio playing") - } else { - showMessage(title: "Audio Mixing", text: "Failed to play audio") - } - } else { - agoraEngine.stopAudioMixing() - audioMixingBtn.setTitle("Play Audio", for: .normal) - } -} -``` - \ No newline at end of file diff --git a/assets/code/video-sdk/audio-voice-effects/swift/update-ui.mdx b/assets/code/video-sdk/audio-voice-effects/update-ui.mdx similarity index 100% rename from assets/code/video-sdk/audio-voice-effects/swift/update-ui.mdx rename to assets/code/video-sdk/audio-voice-effects/update-ui.mdx diff --git a/assets/code/video-sdk/authentication-workflow/add-variables.mdx b/assets/code/video-sdk/authentication-workflow/add-variables.mdx new file mode 100644 index 000000000..974f6f29c --- /dev/null +++ b/assets/code/video-sdk/authentication-workflow/add-variables.mdx @@ 
-0,0 +1,44 @@ + + + ```swift + // Update with the temporary token generated in Agora Console. + var token = "" + + // Update with the channel name you used to generate the token in Agora Console. + var channelName = "" + + // Store the name of the channel to join + var channelTextField: NSTextField! + + // Add the base URL to your token server. For example, https://agora-token-service-production-92ff.up.railway.app + var serverUrl = "" + // The ID of the app user + var userID = 0 + // Token expiry time, in seconds. + var tokenExpireTime = 40 + ``` + + + ```swift + // Update with the temporary token generated in Agora Console. + var token = "" + + // Update with the channel name you used to generate the token in Agora Console. + var channelName = "" + + // Store the name of the channel to join + var channelTextField: UITextField! + + // Add the base URL to your token server. For example, https://agora-token-service-production-92ff.up.railway.app + var serverUrl = "" + // The ID of the app user + var userID = 0 + // Token expiry time, in seconds. + var tokenExpireTime = 40 + ``` + + + ```javascript + let role = "publisher"; // set the role to "publisher" or "subscriber" as appropriate + ``` + \ No newline at end of file diff --git a/assets/code/video-sdk/authentication-workflow/event-handler.mdx b/assets/code/video-sdk/authentication-workflow/event-handler.mdx new file mode 100644 index 000000000..f5f76cd38 --- /dev/null +++ b/assets/code/video-sdk/authentication-workflow/event-handler.mdx @@ -0,0 +1,173 @@ + + ```kotlin + // The event handler for Agora engine events + override val iRtcEngineEventHandler: IRtcEngineEventHandler + get() = object : IRtcEngineEventHandler() { + // Listen for the event that the token is about to expire + override fun onTokenPrivilegeWillExpire(token: String) { + sendMessage("Token is about to expire") + // Get a new token + fetchToken(channelName, object : TokenCallback { + override fun onTokenReceived(rtcToken: String?)
{ + // Use the token to renew + agoraEngine!!.renewToken(rtcToken) + sendMessage("Token renewed") + } + + override fun onError(errorMessage: String) { + // Handle the error + sendMessage("Error: $errorMessage") + } + }) + super.onTokenPrivilegeWillExpire(token) + } + } + ``` + + - renewToken + - onTokenPrivilegeWillExpire + + + + ```swift + func rtcEngine( + _ engine: AgoraRtcEngineKit, tokenPrivilegeWillExpire token: String + ) { + Task { + if let token = try? await fetchToken( + from: DocsAppConfig.shared.tokenUrl, + channel: DocsAppConfig.shared.channel, + role: .broadcaster + ) { self.agoraEngine.renewToken(token) } + } + } + ``` + + + - renewToken(_:) + - rtcEngine(\_:tokenPrivilegeWillExpire:) + + + - renewToken(_:) + - rtcEngine(\_:tokenPrivilegeWillExpire:) + + + + + ```csharp + internal class AuthenticationWorkflowEventHandler : UserEventHandler + { + private AuthenticationWorkflowManager authenticationWorkflowManager; + + internal AuthenticationWorkflowEventHandler(AuthenticationWorkflowManager refAuthenticationWorkflow) : base(refAuthenticationWorkflow) + { + authenticationWorkflowManager = + refAuthenticationWorkflow; + } + + public override async void OnTokenPrivilegeWillExpire(RtcConnection connection, string token) + { + Debug.Log("Token is about to expire"); + // Retrieve a fresh token from the token server. + await authenticationWorkflowManager.FetchToken(); + authenticationWorkflowManager.RenewToken(); + } + + public override async void OnClientRoleChanged(RtcConnection connection, CLIENT_ROLE_TYPE oldRole, CLIENT_ROLE_TYPE newRole, ClientRoleOptions newRoleOptions) + { + // Retrieve a fresh token from the token server for the new role.
+ Debug.Log("Role is set to " + newRole.ToString()); + await authenticationWorkflowManager.FetchToken(); + authenticationWorkflowManager.RenewToken(); + } + } + ``` + + - OnTokenPrivilegeWillExpire + - OnClientRoleChanged + + + + ```js + // The following code is solely related to UI implementation and not Agora-specific code + window.onload = async () => { + // Set the project selector + setupProjectSelector(); + + const handleVSDKEvents = (eventName, ...args) => { + switch (eventName) { + case "user-published": + if (args[1] == "video") { + // Retrieve the remote video track. + channelParameters.remoteVideoTrack = args[0].videoTrack; + // Retrieve the remote audio track. + channelParameters.remoteAudioTrack = args[0].audioTrack; + // Save the remote user id for reuse. + channelParameters.remoteUid = args[0].uid.toString(); + // Specify the ID of the DIV container. You can use the uid of the remote user. + remotePlayerContainer.id = args[0].uid.toString(); + channelParameters.remoteUid = args[0].uid.toString(); + remotePlayerContainer.textContent = + "Remote user " + args[0].uid.toString(); + // Append the remote container to the page body. + document.body.append(remotePlayerContainer); + // Play the remote video track. + channelParameters.remoteVideoTrack.play(remotePlayerContainer); + } + // Subscribe and play the remote audio track If the remote user publishes the audio track only. + if (args[1] == "audio") { + // Get the RemoteAudioTrack object in the AgoraRTCRemoteUser object. + channelParameters.remoteAudioTrack = args[0].audioTrack; + // Play the remote audio track. No need to pass any DOM element. 
+ channelParameters.remoteAudioTrack.play(); + } + } + }; + ``` + + + ```js + // The following code is solely related to UI implementation and not Agora-specific code + window.onload = async () => { + // Set the project selector + setupProjectSelector(); + + const handleVSDKEvents = (eventName, ...args) => { + switch (eventName) { + case "user-published": + // Subscribe and play the remote audio track If the remote user publishes the audio track only. + if (args[1] == "audio") { + // Get the RemoteAudioTrack object in the AgoraRTCRemoteUser object. + channelParameters.remoteAudioTrack = args[0].audioTrack; + // Play the remote audio track. No need to pass any DOM element. + channelParameters.remoteAudioTrack.play(); + } + } + }; + ``` + + + + ```typescript + const useTokenWillExpire = () => { + const agoraEngine = useRTCClient(); + useClientEvent(agoraEngine, "token-privilege-will-expire", () => { + if (config.serverUrl !== "") { + fetchRTCToken(config.channelName) + .then((token) => { + console.log("RTC token fetched from server: ", token); + if (token) return agoraEngine.renewToken(token); + }) + .catch((error) => { + console.error(error); + }); + } else { + console.log("Please make sure you specified the token server URL in the configuration file"); + } + }); + }; + ``` + - useRTCClient + - useClientEvent + - renewToken + diff --git a/assets/code/video-sdk/authentication-workflow/fetch-token.mdx b/assets/code/video-sdk/authentication-workflow/fetch-token.mdx new file mode 100644 index 000000000..fa3ebf199 --- /dev/null +++ b/assets/code/video-sdk/authentication-workflow/fetch-token.mdx @@ -0,0 +1,182 @@ + + ```kotlin + fun fetchToken(channelName: String, uid: Int, callback: TokenCallback) { + val tokenRole = if (isBroadcaster) 1 else 2 + // Prepare the Url + val urlLString = "$serverUrl/rtc/$channelName/$tokenRole/uid/$uid/?expiry=$tokenExpiryTime" + + val client = OkHttpClient() + + // Create a request + val request: Request = Builder() + .url(urlLString) + 
.header("Content-Type", "application/json; charset=UTF-8") + .get() + .build() + + // Send the async http request + val call = client.newCall(request) + call.enqueue(object : Callback { + // Receive the response in a callback + @Throws(IOException::class) + override fun onResponse(call: Call, response: Response) { + if (response.isSuccessful) { + try { + // Extract rtcToken from the response + val responseBody = response.body!!.string() + val jsonObject = JSONObject(responseBody) + val rtcToken = jsonObject.getString("rtcToken") + // Return the token + callback.onTokenReceived(rtcToken) + } catch (e: JSONException) { + e.printStackTrace() + callback.onError("Invalid token response") + } + } else { + callback.onError("Token request failed") + } + } + + override fun onFailure(call: Call, e: IOException) { + callback.onError("IOException: $e") + } + }) + } + ``` + + + + ```swift + func fetchToken( + from tokenUrl: String, channel: String, + role: AgoraClientRole, userId: UInt = 0 + ) async throws -> String { + guard let url = URL(string: "\(tokenUrl)/getToken") else { + throw URLError(.badURL) + } + + var request = URLRequest(url: url) + request.httpMethod = "POST" + request.addValue("application/json", forHTTPHeaderField: "Content-Type") + + var userData = [ + "tokenType": "rtc", + "uid": String(userId), + "role": role == .broadcaster ? "publisher" : "subscriber", + "channel": channel + ] + + let requestData = try JSONEncoder().encode(userData) + request.httpBody = requestData + + let (data, _) = try await URLSession.shared.data(for: request) + let tokenResponse = try JSONDecoder().decode(TokenResponse.self, from: data) + + return tokenResponse.token + } + + /// A Codable struct representing the token server response. + struct TokenResponse: Codable { + /// Value of the RTC Token. 
+ public let token: String + } + ``` + + + + ```csharp + public async Task FetchToken() + { + if (userRole == "Host") + { + role = 1; + } + else if (userRole == "Audience") + { + role = 2; + } + + string url = string.Format("{0}/rtc/{1}/{2}/uid/{3}/?expiry={4}", configData.tokenUrl, configData.channelName, role, configData.uid, configData.tokenExpiryTime); + + UnityWebRequest request = UnityWebRequest.Get(url); + + var operation = request.SendWebRequest(); + + while (!operation.isDone) + { + await Task.Yield(); + } + + if (request.isNetworkError || request.isHttpError) + { + Debug.Log(request.error); + return; + } + + TokenStruct tokenInfo = JsonUtility.FromJson<TokenStruct>(request.downloadHandler.text); + Debug.Log("Retrieved token: " + tokenInfo.rtcToken); + _token = tokenInfo.rtcToken; + _channelName = configData.channelName; + } + ``` + + +```javascript + // Get the config + const config = agoraManager.config; + + // Fetches the RTC token for stream channels + async function fetchToken(uid, channelName) { + if (config.serverUrl !== "") { + try { + const res = await fetch( + config.proxyUrl + + config.serverUrl + + "/rtc/" + + channelName + + "/" + + role + + "/uid/" + + uid + + "/?expiry=" + + config.tokenExpiryTime, + { + headers: { + "X-Requested-With": "XMLHttpRequest", + }, + } + ); + const data = await res.text(); + const json = JSON.parse(data); + console.log("Video SDK token fetched from server: ", json.rtcToken); + return json.rtcToken; + } catch (err) { + console.log(err); + } + } else { + return config.token; + } + } +``` + + + ```typescript + async function fetchRTCToken(channelName: string) { + if (config.serverUrl !== "") { + try { + const response = await fetch( + `${config.proxyUrl}${config.serverUrl}/rtc/${channelName}/publisher/uid/${config.uid}/?expiry=${config.tokenExpiryTime}` + ); + const data = await response.json() as { rtcToken: string }; + console.log("RTC token fetched from server: ", data.rtcToken); + return data.rtcToken; + } catch (error)
{ + console.error(error); + throw error; + } + } else { + return config.rtcToken; + } + } + ``` + diff --git a/assets/code/video-sdk/authentication-workflow/import-library.mdx b/assets/code/video-sdk/authentication-workflow/import-library.mdx new file mode 100644 index 000000000..6b8340b2d --- /dev/null +++ b/assets/code/video-sdk/authentication-workflow/import-library.mdx @@ -0,0 +1,38 @@ + + ```kotlin + import io.agora.rtc2.* + import io.agora.agora_manager.AgoraManager + import okhttp3.* + import okhttp3.Request.* + import android.content.Context + import java.io.IOException + import java.net.MalformedURLException + import java.net.URL + ``` + + + ```swift + import SwiftUI + import AgoraRtcKit + ``` + + + ```csharp + using UnityEngine.Networking; + using Agora.Rtc; + using System.Threading.Tasks; + ``` + + + ```javascript + import AgoraManager from "../agora_manager/agora_manager.js"; + import AgoraRTC from "agora-rtc-sdk-ng"; + ``` + + + ```typescript + import { AgoraManager } from "../agora-manager/agoraManager"; + import config from "../agora-manager/config"; + import { useClientEvent, useRTCClient } from "agora-rtc-react"; + ``` + diff --git a/assets/code/video-sdk/authentication-workflow/join-channel.mdx b/assets/code/video-sdk/authentication-workflow/join-channel.mdx new file mode 100644 index 000000000..4ed4cab46 --- /dev/null +++ b/assets/code/video-sdk/authentication-workflow/join-channel.mdx @@ -0,0 +1,220 @@ + + + ```kotlin + open fun joinChannelWithToken(channelName: String): Int { + if (agoraEngine == null) setupAgoraEngine() + return if (isValidURL(serverUrl)) { // A valid server url is available + // Fetch a token from the server for channelName + fetchToken(channelName, object : TokenCallback { + override fun onTokenReceived(rtcToken: String?) 
{ + // Handle the received rtcToken + joinChannel(channelName, rtcToken) + } + + override fun onError(errorMessage: String) { + // Handle the error + sendMessage("Error: $errorMessage") + } + }) + 0 + } else { // use the token from the config.json file + val token = config!!.optString("rtcToken") + joinChannel(channelName, token) + } + } + ``` + + - joinChannel + + + + + ```swift + // This method is specifically used by the sample app. If there is a tokenURL, it will attempt to retrieve a token from there. + internal func joinChannel(_ channel: String, uid: UInt? = nil) async -> Int32 { + let userId = uid ?? DocsAppConfig.shared.uid + var token = DocsAppConfig.shared.rtcToken + if !DocsAppConfig.shared.tokenUrl.isEmpty { + do { + token = try await self.fetchToken( + from: DocsAppConfig.shared.tokenUrl, channel: channel, + role: self.role, userId: userId + ) + } catch { + print("token server fetch failed: \(error.localizedDescription)") + } + } + return self.joinChannel(channel, token: token, uid: userId, info: nil) + } + + // Joins a channel, starting the connection to a session. + open func joinChannel( + _ channel: String, token: String? = nil, uid: UInt = 0, info: String? = nil + ) async -> Int32 { + if await !AgoraManager.checkForPermissions() { + DispatchQueue.main.async { + self.label = """ + Camera and microphone permissions were not granted. + Check your security settings and try again. + """ + } + return -3 + } + + return self.agoraEngine.joinChannel( + byToken: token, channelId: channel, + info: info, uid: uid + ) + } + ``` + - joinChannel(byToken:channelId:info:uid:joinSuccess:) + + + + + ```swift + // This method is specifically used by the sample app. If there is a tokenURL, it will attempt to retrieve a token from there. 
+ open func joinChannel(_ channel: String) async -> Int32 { + if let rtcToken = DocsAppConfig.shared.rtcToken, !rtcToken.isEmpty { + return self.joinChannel( + channel, token: DocsAppConfig.shared.rtcToken, + uid: DocsAppConfig.shared.uid, info: nil + ) + } + var token: String? + if !DocsAppConfig.shared.tokenUrl.isEmpty { + do { + token = try await self.fetchToken( + from: DocsAppConfig.shared.tokenUrl, channel: channel, role: self.role + ) + } catch { + print("token server fetch failed: \(error.localizedDescription)") + } + } + return self.joinChannel(channel, token: token, uid: DocsAppConfig.shared.uid, info: nil) + } + // Joins a channel, starting the connection to a session. + open func joinChannel( + _ channel: String, token: String? = nil, uid: UInt = 0, info: String? = nil + ) -> Int32 { + self.agoraEngine.joinChannel(byToken: token, channelId: channel, info: info, uid: uid) + } + ``` + - joinChannel(byToken:channelId:info:uid:joinSuccess:) + + + + + ```csharp + public override async void Join() + { + if (configData.tokenUrl == "") + { + Debug.Log("Specify a valid token server URL inside `config.json` if you wish to fetch token from the server"); + } + else + { + await FetchToken(); + } + + // Join the channel. + base.Join(); + } + ``` + + + + +```javascript +const joinWithToken = async (localPlayerContainer, channelParameters) => { + const token = await fetchToken(config.uid, config.channelName); + await agoraManager + .getAgoraEngine() + .join(config.appId, config.channelName, token, config.uid); + // Create a local audio track from the audio sampled by a microphone. + channelParameters.localAudioTrack = + await AgoraRTC.createMicrophoneAudioTrack(); + // Create a local video track from the video captured by a camera. + channelParameters.localVideoTrack = await AgoraRTC.createCameraVideoTrack(); + // Append the local video container to the page body. 
+ document.body.append(localPlayerContainer); + // Publish the local audio and video tracks in the channel. + await agoraManager + .getAgoraEngine() + .publish([ + channelParameters.localAudioTrack, + channelParameters.localVideoTrack, + ]); + // Play the local video track. + channelParameters.localVideoTrack.play(localPlayerContainer); + }; +``` + + +```javascript +const joinWithToken = async (localPlayerContainer, channelParameters) => { + const token = await fetchToken(config.uid, config.channelName); + await agoraManager + .getAgoraEngine() + .join(config.appId, config.channelName, token, config.uid); + // Create a local audio track from the audio sampled by a microphone. + channelParameters.localAudioTrack = + await AgoraRTC.createMicrophoneAudioTrack(); + // Publish the local audio and video tracks in the channel. + await agoraManager + .getAgoraEngine() + .publish([ + channelParameters.localAudioTrack, + ]); + }; +``` + + + + ```typescript + function AuthenticationWorkflowManager(props: { children?: React.ReactNode }) { + const [channelName, setChannelName] = useState(""); + const [joined, setJoined] = useState(false); + + useTokenWillExpire(); + + const fetchTokenFunction = async () => { + if (config.serverUrl !== "" && channelName !== "") { + try { + const token = await fetchRTCToken(channelName) as string; + config.rtcToken = token; + config.channelName = channelName; + setJoined(true) + } catch (error) { + console.error(error); + } + } else { + console.log("Please make sure you specified the token server URL in the configuration file"); + } + }; + + return ( +
+ <div> + {!joined ? ( + <> + <input value={channelName} + onChange={(e) => setChannelName(e.target.value)} + placeholder="Channel name" /> + <button onClick={fetchTokenFunction}>Join</button> + {props.children} + </> + ) : ( + <> + <button onClick={() => setJoined(false)}>Leave</button> + {props.children} + </> + )} + </div>
+ ); + } + ``` +
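Every platform above follows the same join flow: build the token-endpoint URL, fetch it, extract `rtcToken` from the JSON response, and fall back to a configured token when no server URL is set. The two shared pieces can be sketched language-neutrally in plain JavaScript — the helper names here are illustrative (not part of any SDK), while the `/rtc/...` route and the `rtcToken` field follow the token-server examples above:

```javascript
// Build the token-server endpoint used by the samples above:
//   <serverUrl>/rtc/<channel>/<role>/uid/<uid>/?expiry=<seconds>
function buildTokenUrl(serverUrl, channel, role, uid, expiry) {
  return `${serverUrl}/rtc/${encodeURIComponent(channel)}/${role}/uid/${uid}/?expiry=${expiry}`;
}

// Extract the token from a token-server JSON response,
// throwing on a malformed or empty body.
function parseTokenResponse(body) {
  const json = JSON.parse(body);
  if (typeof json.rtcToken !== "string" || json.rtcToken === "") {
    throw new Error("Invalid token response");
  }
  return json.rtcToken;
}
```

The per-platform snippets inline this logic with their native HTTP clients (OkHttp, URLSession, UnityWebRequest, `fetch`), but the URL shape and response contract are identical.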
diff --git a/assets/code/video-sdk/authentication-workflow/renew-token.mdx b/assets/code/video-sdk/authentication-workflow/renew-token.mdx new file mode 100644 index 000000000..1d516345b --- /dev/null +++ b/assets/code/video-sdk/authentication-workflow/renew-token.mdx @@ -0,0 +1,48 @@ + + ```kotlin + agoraEngine!!.renewToken(rtcToken) + ``` + + + + ```swift + self.agoraEngine.renewToken(token) + ``` + + + - renewToken(_:) + + + - renewToken(_:) + + + + + ```csharp + public void RenewToken() + { + if(_token == "") + { + Debug.Log("Token was not retrieved"); + return; + } + + // Update RTC Engine with new token + agoraEngine.RenewToken(_token); + } + ``` + - RenewToken + + + ```javascript + // Renew tokens + agoraManager + .getAgoraEngine() + .on("token-privilege-will-expire", async function () { + options.token = await fetchToken(config.uid, config.channelName); + await agoraManager.getAgoraEngine().renewToken(options.token); + }); + ``` + - renewToken + - token-privilege-will-expire + diff --git a/assets/code/video-sdk/authentication-workflow/swift/specify-channel.mdx b/assets/code/video-sdk/authentication-workflow/specify-channel.mdx similarity index 95% rename from assets/code/video-sdk/authentication-workflow/swift/specify-channel.mdx rename to assets/code/video-sdk/authentication-workflow/specify-channel.mdx index bc84d3da3..ac54a1e40 100644 --- a/assets/code/video-sdk/authentication-workflow/swift/specify-channel.mdx +++ b/assets/code/video-sdk/authentication-workflow/specify-channel.mdx @@ -1,6 +1,6 @@ -``` swift +```swift channelTextField = NSTextField(frame: CGRect(x: 30, y: 30, width: 300, height: 40)) channelTextField.stringValue = "Enter channel name" self.view.addSubview(channelTextField) @@ -10,7 +10,7 @@ self.view.addSubview(channelTextField) -``` swift +```swift channelTextField = UITextField(frame: CGRect(x: 300, y: 20, width: 300, height: 30)) channelTextField.placeholder = "Enter channel name" self.view.addSubview(channelTextField) diff --git 
a/assets/code/video-sdk/authentication-workflow/swift/add-variables.mdx b/assets/code/video-sdk/authentication-workflow/swift/add-variables.mdx deleted file mode 100644 index ea5705c01..000000000 --- a/assets/code/video-sdk/authentication-workflow/swift/add-variables.mdx +++ /dev/null @@ -1,42 +0,0 @@ - - -``` swift -// Update with the temporary token generated in Agora Console. -var token = "" - -// Update with the channel name you used to generate the token in Agora Console. -var channelName = "" - -// Store the name of the channel to join -var channelTextField: NSTextField! - -// Add the base URL to your token server. For example, https://agora-token-service-production-92ff.up.railway.app" -var serverUrl = "" -// The ID of the app user -var userID = 0 -// Expire time in Seconds. -var tokenExpireTime = 40 -``` - - - - -``` swift -// Update with the temporary token generated in Agora Console. -var token = "" - -// Update with the channel name you used to generate the token in Agora Console. -var channelName = "" - -// Store the name of the channel to join -var channelTextField: UITextField! - -// Add the base URL to your token server. For example, https://agora-token-service-production-92ff.up.railway.app" -var serverUrl = "" -// The ID of the app user -var userID = 0 -// Expire time in Seconds. -var tokenExpireTime = 40 -``` - - \ No newline at end of file diff --git a/assets/code/video-sdk/authentication-workflow/swift/fetch-token.mdx b/assets/code/video-sdk/authentication-workflow/swift/fetch-token.mdx deleted file mode 100644 index 9215b11f8..000000000 --- a/assets/code/video-sdk/authentication-workflow/swift/fetch-token.mdx +++ /dev/null @@ -1,86 +0,0 @@ - - - ``` swift - func fetchToken() -> Bool { - // Construct the endpoint URL - channelName = channelTextField.stringValue ?? 
"" - if channelName.isEmpty { - showMessage(title: "Channel", text: "Please set a channel to join") - return false - } - guard let tokenServerURL = URL(string: "\(serverUrl)/rtc/\(channelName)/\(userRole.rawValue)/uid/\(userID)/?expiry=\(tokenExpireTime)") else { - return false - } - /// Semaphore waits for the request to complete, before returning the token. - let semaphore = DispatchSemaphore(value: 0) - var request = URLRequest(url: tokenServerURL, timeoutInterval: 10) - request.httpMethod = "GET" - - // Construct the GET request - let task = URLSession.shared.dataTask(with: request) { data, response, err in - defer { - // Signal that the request has completed - semaphore.signal() - } - guard let data = data else { - // No data, no token - return - } - let responseJSON = try? JSONSerialization.jsonObject(with: data, options: []) - if let responseDict = responseJSON as? [String: Any], let tokenToReturn = responseDict["rtcToken"] as? String { - // Key "rtcToken" found in response, assigning to tokenToReturn - self.token = tokenToReturn - } - } - - task.resume() - - // Waiting for signal found inside the GET request handler - semaphore.wait() - return true - } - ``` - - - - ``` swift - func fetchToken() -> Bool { - // Construct the endpoint URL - channelName = channelTextField.text ?? "" - if channelName.isEmpty { - showMessage(title: "Channel", text: "Please set a channel to join") - return false - } - guard let tokenServerURL = URL(string: "\(serverUrl)/rtc/\(channelName)/\(userRole.rawValue)/uid/\(userID)/?expiry=\(tokenExpireTime)") else { - return false - } - /// Semaphore waits for the request to complete, before returning the token. 
- let semaphore = DispatchSemaphore(value: 0) - var request = URLRequest(url: tokenServerURL, timeoutInterval: 10) - request.httpMethod = "GET" - - // Construct the GET request - let task = URLSession.shared.dataTask(with: request) { data, response, err in - defer { - // Signal that the request has completed - semaphore.signal() - } - guard let data = data else { - // No data, no token - return - } - let responseJSON = try? JSONSerialization.jsonObject(with: data, options: []) - if let responseDict = responseJSON as? [String: Any], let tokenToReturn = responseDict["rtcToken"] as? String { - // Key "rtcToken" found in response, assigning to tokenToReturn - self.token = tokenToReturn - } - } - - task.resume() - - // Waiting for signal found inside the GET request handler - semaphore.wait() - return true - } - ``` - \ No newline at end of file diff --git a/assets/code/video-sdk/cloud-proxy/configure-engine.mdx b/assets/code/video-sdk/cloud-proxy/configure-engine.mdx new file mode 100644 index 000000000..ccf790f89 --- /dev/null +++ b/assets/code/video-sdk/cloud-proxy/configure-engine.mdx @@ -0,0 +1,18 @@ + + ```typescript + export function CloudProxy() { + const agoraEngine = useRTCClient(AgoraRTC.createClient({ codec: "vp8", mode: config.selectedProduct })); + + return ( +
+ <div>
+ <h1>Connect through restricted networks with Cloud Proxy</h1>
+ <AgoraRTCProvider client={agoraEngine}>
+ <AuthenticationWorkflowManager />
+ </AgoraRTCProvider>
+ </div>
+ ); + } + ``` + - useRTCClient + +
\ No newline at end of file diff --git a/assets/code/video-sdk/cloud-proxy/connection-failed.mdx b/assets/code/video-sdk/cloud-proxy/connection-failed.mdx new file mode 100644 index 000000000..6948c4cf3 --- /dev/null +++ b/assets/code/video-sdk/cloud-proxy/connection-failed.mdx @@ -0,0 +1,63 @@ + Use this event to see when an attempt to connect directly fails. + + + ```kotlin + override fun onConnectionStateChanged(state: Int, reason: Int) { + if (state == CONNECTION_STATE_FAILED + && reason == CONNECTION_CHANGED_JOIN_FAILED) { + directConnectionFailed = true + sendMessage("Join failed, reason: $reason") + } else if (reason == CONNECTION_CHANGED_SETTING_PROXY_SERVER) { + sendMessage("Proxy server setting changed") + } + } + ``` + - onConnectionStateChanged + + + + ```csharp + // Event handler class to handle the events raised by Agora's RtcEngine instance + internal class CloudProxyEventHandler : UserEventHandler + { + private CloudProxyManager cloudProxy; + internal CloudProxyEventHandler(CloudProxyManager videoSample) : base(videoSample) + { + cloudProxy = videoSample; + } + public override void OnConnectionStateChanged(RtcConnection connection, CONNECTION_STATE_TYPE state, CONNECTION_CHANGED_REASON_TYPE reason) + { + if (state == CONNECTION_STATE_TYPE.CONNECTION_STATE_FAILED && reason == CONNECTION_CHANGED_REASON_TYPE.CONNECTION_CHANGED_JOIN_FAILED) + { + cloudProxy.directConnectionFailed = true; + Debug.Log("Join failed, reason: " + reason); + } + else if (reason == CONNECTION_CHANGED_REASON_TYPE.CONNECTION_CHANGED_SETTING_PROXY_SERVER) + { + Debug.Log("Proxy server setting changed"); + } + } + } + ``` + - OnConnectionStateChanged + + + ```swift + func rtcEngine( + _ engine: AgoraRtcEngineKit, + connectionChangedTo state: AgoraConnectionState, + reason: AgoraConnectionChangedReason + ) { + if state == .failed, reason == .reasonJoinFailed { + // connection failed, try connect with proxy + } + } + ``` + + + - rtcEngine(_:connectionChangedTo:reason:) + + + -
rtcEngine(_:connectionChangedTo:reason:) + + diff --git a/assets/code/video-sdk/cloud-proxy/event-handler.mdx b/assets/code/video-sdk/cloud-proxy/event-handler.mdx new file mode 100644 index 000000000..4bdfd6c0f --- /dev/null +++ b/assets/code/video-sdk/cloud-proxy/event-handler.mdx @@ -0,0 +1,103 @@ + + ```kotlin + override fun onProxyConnected(channel: String?, uid: Int, proxyType: Int, + localProxyIp: String?, elapsed: Int) { + // Connected to proxyType + } + ``` + - onProxyConnected + + + ```csharp + public override void OnProxyConnected(string channel, uint uid, PROXY_TYPE proxyType, string localProxyIp, int elapsed) + { + Debug.Log("Cloud proxy service enabled"); + } + ``` + - OnProxyConnected + + + + ```swift + func rtcEngine( + _ engine: AgoraRtcEngineKit, didProxyConnected channel: String, + withUid uid: UInt, proxyType: AgoraProxyType, + localProxyIp: String, elapsed: Int + ) { + // proxy type changed to `proxyType` + self.proxyState = proxyType + } + ``` + + + - rtcEngine(_:didProxyConnected:withUid:proxyType:localProxyIp:elapsed:) + + + - rtcEngine(_:didProxyConnected:withUid:proxyType:localProxyIp:elapsed:) + + + + + ```javascript + const handleVSDKEvents = (eventName, ...args) => { + switch (eventName) { + case "user-published": + if (args[1] == "video") { + // Retrieve the remote video track. + channelParameters.remoteVideoTrack = args[0].videoTrack; + // Retrieve the remote audio track. + channelParameters.remoteAudioTrack = args[0].audioTrack; + // Save the remote user id for reuse. + channelParameters.remoteUid = args[0].uid.toString(); + // Specify the ID of the DIV container. You can use the uid of the remote user. + remotePlayerContainer.id = args[0].uid.toString(); + channelParameters.remoteUid = args[0].uid.toString(); + remotePlayerContainer.textContent = + "Remote user " + args[0].uid.toString(); + // Append the remote container to the page body. + document.body.append(remotePlayerContainer); + // Play the remote video track. 
+ channelParameters.remoteVideoTrack.play(remotePlayerContainer); + } + // Subscribe and play the remote audio track If the remote user publishes the audio track only. + if (args[1] == "audio") { + // Get the RemoteAudioTrack object in the AgoraRTCRemoteUser object. + channelParameters.remoteAudioTrack = args[0].audioTrack; + // Play the remote audio track. No need to pass any DOM element. + channelParameters.remoteAudioTrack.play(); + } + } + }; + ``` + + + ```javascript + const handleVSDKEvents = (eventName, ...args) => { + switch (eventName) { + case "user-published": + // Subscribe and play the remote audio track If the remote user publishes the audio track only. + if (args[1] == "audio") { + // Get the RemoteAudioTrack object in the AgoraRTCRemoteUser object. + channelParameters.remoteAudioTrack = args[0].audioTrack; + // Play the remote audio track. No need to pass any DOM element. + channelParameters.remoteAudioTrack.play(); + } + } + }; + ``` + + + + ```typescript + useClientEvent(agoraEngine, "is-using-cloud-proxy", (isUsingProxy) => { + // Display the proxy server state based on the isUsingProxy Boolean variable. 
+ if (isUsingProxy) { + console.log("Cloud proxy service activated"); + } else { + console.log("Proxy service failed") + } + }); + ``` + - useClientEvent + + diff --git a/assets/code/video-sdk/cloud-proxy/import-library.mdx b/assets/code/video-sdk/cloud-proxy/import-library.mdx new file mode 100644 index 000000000..f8b8c2a07 --- /dev/null +++ b/assets/code/video-sdk/cloud-proxy/import-library.mdx @@ -0,0 +1,28 @@ + + ```kotlin + import io.agora.rtc2.Constants.* + ``` + + + ```swift + import AgoraRtcKit + ``` + + + ```csharp + using Agora.Rtc; + ``` + + + ```javascript + import AgoraManager from "../agora_manager/agora_manager.js"; + ``` + + + ```typescript + import AgoraRTC from "agora-rtc-sdk-ng"; + import { useRTCClient, AgoraRTCProvider, useClientEvent } from "agora-rtc-react"; + import AuthenticationWorkflowManager from "../authentication-workflow/authenticationWorkflowManager"; + import config from "../agora-manager/config"; + ``` + diff --git a/assets/code/video-sdk/cloud-proxy/set-cloud-proxy.mdx b/assets/code/video-sdk/cloud-proxy/set-cloud-proxy.mdx new file mode 100644 index 000000000..95ba5bedb --- /dev/null +++ b/assets/code/video-sdk/cloud-proxy/set-cloud-proxy.mdx @@ -0,0 +1,70 @@ + + ```kotlin + override fun joinChannel(channelName: String, token: String?): Int { + // Check if a proxy connection is required + if (directConnectionFailed) { + // Start cloud proxy service and set automatic UDP mode. 
+ val proxyStatus = agoraEngine!!.setCloudProxy(TRANSPORT_TYPE_UDP_PROXY) + if (proxyStatus == 0) { + sendMessage("Proxy service setup successful") + } else { + sendMessage("Proxy service setup failed with error: $proxyStatus") + } + } + return super.joinChannel(channelName, token) + } + ``` + - setCloudProxy + + + ```swift + func setCloudProxy(to proxyType: AgoraCloudProxyType) { + self.agoraEngine.setCloudProxy(proxyType) + } + ``` + + - setCloudProxy(_:) + + + - setCloudProxy(_:) + + + + ```csharp + agoraEngine.SetCloudProxy(proxyType); + ``` + - SetCloudProxy + - CLOUD_PROXY_TYPE + + + ```javascript + if (config.cloudProxy == true) { + // Start cloud proxy service in the forced UDP mode. + agoraEngine.startProxyServer(3); + agoraEngine.on("is-using-cloud-proxy", (isUsingProxy) => { + // Display the proxy server state based on the isUsingProxy Boolean variable. + if (isUsingProxy == true) { + console.log("Cloud proxy service activated"); + } else { + console.log("Proxy service failed"); + } + }); + } + + const stopProxyServer = () => { + agoraEngine.stopProxyServer(); + } + ``` + - startProxyServer + + + ```typescript + const agoraEngine = useRTCClient(); + useEffect(() => { + agoraEngine.startProxyServer(3); + }, []); + ``` + - useRTCClient + - startProxyServer + + diff --git a/assets/code/video-sdk/cloud-proxy/set-variables.mdx b/assets/code/video-sdk/cloud-proxy/set-variables.mdx new file mode 100644 index 000000000..d7b6c57db --- /dev/null +++ b/assets/code/video-sdk/cloud-proxy/set-variables.mdx @@ -0,0 +1,27 @@ + + Use a variable to keep a record of any failed attempts to connect directly. + ```kotlin + var directConnectionFailed = false + ``` + + + ```swift + var proxyState: AgoraProxyType?
+ ``` + + - AgoraProxyType + + + - AgoraProxyType + + + + ```csharp + private CLOUD_PROXY_TYPE proxyType = CLOUD_PROXY_TYPE.UDP_PROXY; + ``` + + + ```javascript + const agoraEngine = agoraManager.getAgoraEngine(); + ``` + diff --git a/assets/code/video-sdk/custom-video-and-audio/configure-engine-audio.mdx b/assets/code/video-sdk/custom-video-and-audio/configure-engine-audio.mdx new file mode 100644 index 000000000..dd67a0e78 --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/configure-engine-audio.mdx @@ -0,0 +1,20 @@ + +Override the inherited `SetupAgoraEngine` and set an audio profile for optimal audio quality: + ```csharp + public override void SetupAgoraEngine() + { + InitTexture(); + base.SetupAgoraEngine(); + agoraEngine.SetAudioProfile(AUDIO_PROFILE_TYPE.AUDIO_PROFILE_MUSIC_HIGH_QUALITY, + AUDIO_SCENARIO_TYPE.AUDIO_SCENARIO_DEFAULT); + SetExternalAudioSource(); + InitEventHandler(); + } + ``` + + - SetAudioProfile + + + - SetAudioProfile + + \ No newline at end of file diff --git a/assets/code/video-sdk/custom-video-and-audio/configure-engine.mdx b/assets/code/video-sdk/custom-video-and-audio/configure-engine.mdx new file mode 100644 index 000000000..93d2c4c59 --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/configure-engine.mdx @@ -0,0 +1,10 @@ + + ```csharp + public override void SetupAgoraEngine() + { + base.SetupAgoraEngine(); + SetExternalVideoSource(); + InitEventHandler(); + } + ``` + \ No newline at end of file diff --git a/assets/code/video-sdk/custom-video-and-audio/create-custom-audio-track.mdx b/assets/code/video-sdk/custom-video-and-audio/create-custom-audio-track.mdx new file mode 100644 index 000000000..8c77d378e --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/create-custom-audio-track.mdx @@ -0,0 +1,37 @@ + + ```typescript + // Create custom audio track using the user's media devices + const createCustomAudioTrack = () => { + navigator.mediaDevices + .getUserMedia({ audio: true }) + 
.then((stream) => { + const audioMediaStreamTracks = stream.getAudioTracks(); + // For demonstration purposes, we used the default audio device. In a real-time scenario, you can use the dropdown to select the audio device of your choice. + setCustomAudioTrack(AgoraRTC.createCustomAudioTrack({ mediaStreamTrack: audioMediaStreamTracks[0] })); + }).catch((error) => console.error(error)); + }; + useEffect(() => { + if (connectionState === "CONNECTED") { + createCustomAudioTrack(); + } + }, [connectionState]); + ``` + - createCustomAudioTrack + + + ```csharp + private void SetExternalAudioSource() + { + lock (_rtcLock) + { + audioTrackID = agoraEngine.CreateCustomAudioTrack(AUDIO_TRACK_TYPE.AUDIO_TRACK_MIXABLE, new AudioTrackConfig(false)); + } + } + ``` + + - CreateCustomAudioTrack + + + - CreateCustomAudioTrack + + \ No newline at end of file diff --git a/assets/code/video-sdk/custom-video-and-audio/create-custom-video-track.mdx b/assets/code/video-sdk/custom-video-and-audio/create-custom-video-track.mdx new file mode 100644 index 000000000..e9c90d21d --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/create-custom-video-track.mdx @@ -0,0 +1,37 @@ + + ```typescript + // Create custom video track using the user's media devices + const createCustomVideoTrack = () => { + navigator.mediaDevices + .getUserMedia({ video: true }) + .then((stream) => { + const videoMediaStreamTracks = stream.getVideoTracks(); + // For demonstration purposes, we used the default video device. In a real-time scenario, you can use the dropdown to select the video device of your choice. 
+ setCustomVideoTrack(AgoraRTC.createCustomVideoTrack({ mediaStreamTrack: videoMediaStreamTracks[0] })); + }) + .catch((error) => console.error(error)); + }; + useEffect(() => { + if (connectionState === "CONNECTED") { + createCustomVideoTrack(); + } + }, [connectionState]); + ``` + - createCustomVideoTrack + + + ```csharp + private void SetExternalVideoSource() + { + var ret = agoraEngine.SetExternalVideoSource(true, false, EXTERNAL_VIDEO_SOURCE_TYPE.VIDEO_FRAME, new SenderOptions()); + videoTrackID = agoraEngine.CreateCustomVideoTrack(); + agoraEngine.DisableVideo(); + LocalView.SetForUser(configData.uid, configData.channelName, VIDEO_SOURCE_TYPE.VIDEO_SOURCE_CUSTOM); + } + ``` + - SetExternalVideoSource + - CreateCustomVideoTrack + - DisableVideo + - SetForUser + + \ No newline at end of file diff --git a/assets/code/video-sdk/custom-video-and-audio/destroy-custom-track-audio.mdx b/assets/code/video-sdk/custom-video-and-audio/destroy-custom-track-audio.mdx new file mode 100644 index 000000000..b16f3d18a --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/destroy-custom-track-audio.mdx @@ -0,0 +1,24 @@ + +Override `DestroyEngine` to destroy the track before you destroy the engine: + ```csharp + public override void DestroyEngine() + { + if (agoraEngine == null) + { + return; + } + // Abort the thread that pushes audio frames into the channel. + _pushAudioFrameThread.Abort(); + // Destroy the custom audio track.
+ agoraEngine.DestroyCustomAudioTrack(audioTrackID); + base.DestroyEngine(); + } + ``` + + - DestroyCustomAudioTrack + + + - DestroyCustomAudioTrack + + + \ No newline at end of file diff --git a/assets/code/video-sdk/custom-video-and-audio/destroy-custom-track-video.mdx b/assets/code/video-sdk/custom-video-and-audio/destroy-custom-track-video.mdx new file mode 100644 index 000000000..4179d9622 --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/destroy-custom-track-video.mdx @@ -0,0 +1,16 @@ + +Override `DestroyEngine` to destroy the track before you destroy the engine: + ```csharp + public override void DestroyEngine() + { + if (agoraEngine == null) + { + return; + } + agoraEngine.DestroyCustomVideoTrack(videoTrackID); + base.DestroyEngine(); + } + ``` + - DestroyCustomVideoTrack + + \ No newline at end of file diff --git a/assets/code/video-sdk/custom-video-and-audio/enable-audio-publishing.mdx b/assets/code/video-sdk/custom-video-and-audio/enable-audio-publishing.mdx new file mode 100644 index 000000000..f33cc4f2b --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/enable-audio-publishing.mdx @@ -0,0 +1,97 @@ + + ```kotlin + fun playCustomAudio() { + // Create a custom audio track + val audioTrackConfig = AudioTrackConfig() + audioTrackConfig.enableLocalPlayback = true + + customAudioTrackId = agoraEngine!!.createCustomAudioTrack( + Constants.AudioTrackType.AUDIO_TRACK_MIXABLE, + audioTrackConfig + ) + + // Set custom audio publishing options + val options = ChannelMediaOptions() + options.publishCustomAudioTrack = true // Enable publishing custom audio + options.publishCustomAudioTrackId = customAudioTrackId + options.publishMicrophoneTrack = false // Disable publishing microphone audio + agoraEngine!!.updateChannelMediaOptions(options) + + // Open the audio file + openAudioFile() + + // Start the pushing task + pushingTask = Thread(PushingTask(this)) + pushingAudio = true + pushingTask?.start() + } + + private fun openAudioFile() { + 
// Open the audio file + try { + inputStream = mContext.resources.assets.open(audioFile) + // Use the inputStream as needed + } catch (e: IOException) { + e.printStackTrace() + } + } + + fun stopCustomAudio() { + pushingAudio = false + pushingTask?.interrupt() + } + ``` + - createCustomAudioTrack + - updateChannelMediaOptions + - setExternalAudioSource + + + + + Add the following line to the `init()` method of `AgoraManager` to set the external audio source to `true` before joining a channel: + + ```swift + self.agoraEngine.setExternalAudioSource(true, sampleRate: 44100, channels: 1) + ``` + + + - setExternalAudioSource(_:sampleRate:channels:localPlayback:publish:) + + + - setExternalAudioSource(_:sampleRate:channels:localPlayback:publish:) + + + + +```javascript +// Retrieve the available audio tracks. +var audioTracks = stream.getAudioTracks(); +console.log("Using audio device: " + audioTracks[0].label); +// Create custom audio track. +channelParameters.localAudioTrack = AgoraRTC.createCustomAudioTrack({ + mediaStreamTrack: audioTracks[0], +}); +``` +- createCustomAudioTrack + + + + ```typescript + const CustomAudioComponent: React.FC<{ customAudioTrack: ILocalAudioTrack | null }> = ({ customAudioTrack }) => { + const agoraContext = useAgoraContext(); + + useEffect(() => { + if (customAudioTrack && agoraContext.localMicrophoneTrack) { + agoraContext.localMicrophoneTrack.stop(); // Stop the default microphone track + customAudioTrack?.play(); // Play the custom audio track for the local user + } + return () => { + customAudioTrack?.stop(); // Stop the custom audio track when the component unmounts + }; + }, [customAudioTrack, agoraContext.localMicrophoneTrack]); + return <></>; + }; + ``` + - play + - stop + diff --git a/assets/code/video-sdk/custom-video-and-audio/enable-video-publishing.mdx b/assets/code/video-sdk/custom-video-and-audio/enable-video-publishing.mdx new file mode 100644 index 000000000..31940aa17 --- /dev/null +++
b/assets/code/video-sdk/custom-video-and-audio/enable-video-publishing.mdx @@ -0,0 +1,105 @@ + + ```kotlin + fun setupCustomVideo () { + // Enable publishing of the captured video from a custom source + val options = ChannelMediaOptions() + options.publishCustomVideoTrack = true + options.publishCameraTrack = false + + agoraEngine!!.updateChannelMediaOptions(options) + + // Configure the external video source. + agoraEngine!!.setExternalVideoSource( + true, + true, + Constants.ExternalVideoSourceType.VIDEO_FRAME + ) + + // Check whether texture encoding is supported + sendMessage(if (agoraEngine!!.isTextureEncodeSupported) "Texture encoding is supported" else "Texture encoding is not supported") + } + ``` + - updateChannelMediaOptions + - setExternalVideoSource + - isTextureEncodeSupported + + + ```swift + self.agoraEngine.setExternalVideoSource(true, useTexture: true, sourceType: .videoFrame) + ``` + + + - setExternalVideoSource(_:useTexture:sourceType:) + + + - setExternalVideoSource(_:useTexture:sourceType:) + + + + ```csharp + private void SetExternalVideoSource() + { + var ret = agoraEngine.SetExternalVideoSource(true, false, EXTERNAL_VIDEO_SOURCE_TYPE.VIDEO_FRAME, new SenderOptions()); + videoTrackID = agoraEngine.CreateCustomVideoTrack(); + agoraEngine.DisableVideo(); + LocalView.SetForUser(configData.uid, configData.channelName, VIDEO_SOURCE_TYPE.VIDEO_SOURCE_CUSTOM); + } + private void InitTexture() + { + rect = new Rect(0, 0, Screen.width, Screen.height); + texture = new Texture2D((int)rect.width, (int)rect.height, TextureFormat.RGBA32, false); + } + public void SetVideoEncoderConfiguration() + { + VideoEncoderConfiguration videoEncoderConfiguration = new VideoEncoderConfiguration(); + videoEncoderConfiguration.dimensions = new VideoDimensions((int)rect.width, (int)rect.height); + agoraEngine.SetVideoEncoderConfiguration(videoEncoderConfiguration); + } + ``` + + +```javascript +const startCustomVideoAndAudio = async (channelParameters) => { + await 
agoraEngine.join( + config.appId, + config.channelName, + config.token, + config.uid + ); + // Create a local audio track from the audio sampled by a microphone. + channelParameters.localAudioTrack = + await AgoraRTC.createMicrophoneAudioTrack(); + + // An object specifying the types of media to request. + var constraints = (window.constraints = { audio: true, video: true }); + // A method to request media stream object. + await navigator.mediaDevices + .getUserMedia(constraints) + .then(function (stream) { + // Get all the available video tracks. + var videoTracks = stream.getVideoTracks(); + console.log("Using video device: " + videoTracks[0].label); + // Create a custom video track. + channelParameters.localVideoTrack = AgoraRTC.createCustomVideoTrack({ + mediaStreamTrack: videoTracks[0], + }); + }) + .catch(function (error) { + console.log(error); + }); + + // Append the local video container to the page body. + document.body.append(channelParameters.localPlayerContainer); + // Publish the local audio and video tracks in the channel. + await agoraEngine.publish([ + channelParameters.localAudioTrack, + channelParameters.localVideoTrack, + ]); + // Play the local video track. 
+ channelParameters.localVideoTrack.play( + channelParameters.localPlayerContainer + ); + }; +``` +- createCustomVideoTrack + diff --git a/assets/code/video-sdk/custom-video-and-audio/import-library-audio.mdx b/assets/code/video-sdk/custom-video-and-audio/import-library-audio.mdx new file mode 100644 index 000000000..bab20fa69 --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/import-library-audio.mdx @@ -0,0 +1,25 @@ + + ```kotlin + import io.agora.rtc2.ChannelMediaOptions + import io.agora.rtc2.Constants + import io.agora.rtc2.audio.AudioTrackConfig + import java.io.IOException + import java.io.InputStream + ``` + + + ```typescript + import { AgoraRTCProvider, useRTCClient, useConnectionState } from "agora-rtc-react"; + import AgoraRTC, {ILocalAudioTrack } from "agora-rtc-sdk-ng"; + import AuthenticationWorkflowManager from "../authentication-workflow/authenticationWorkflowManager"; + import { useAgoraContext } from "../agora-manager/agoraManager"; + import config from "../agora-manager/config"; + ``` + + + ```csharp + using Agora.Rtc; + using RingBuffer; + using System.Threading; + ``` + \ No newline at end of file diff --git a/assets/code/video-sdk/custom-video-and-audio/import-library.mdx b/assets/code/video-sdk/custom-video-and-audio/import-library.mdx new file mode 100644 index 000000000..b8222ce37 --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/import-library.mdx @@ -0,0 +1,37 @@ + + ```kotlin + import io.agora.base.VideoFrame + import io.agora.rtc2.ChannelMediaOptions + import io.agora.rtc2.Constants + import android.graphics.SurfaceTexture + import android.graphics.SurfaceTexture.OnFrameAvailableListener + import java.io.IOException + import java.io.InputStream + ``` + + + ```swift + import AVKit + import AgoraRtcKit + ``` + + + ```csharp + using Agora.Rtc; + ``` + + +```javascript +import AgoraManager from "../agora_manager/agora_manager.js"; +import AgoraRTC from "agora-rtc-sdk-ng"; +``` + + + ```typescript + import { 
AgoraRTCProvider, useRTCClient, useConnectionState } from "agora-rtc-react"; + import AgoraRTC, {ILocalVideoTrack } from "agora-rtc-sdk-ng"; + import AuthenticationWorkflowManager from "../authentication-workflow/authenticationWorkflowManager"; + import { useAgoraContext } from "../agora-manager/agoraManager"; + import config from "../agora-manager/config"; + ``` + diff --git a/assets/code/video-sdk/custom-video-and-audio/push-audio-frames.mdx b/assets/code/video-sdk/custom-video-and-audio/push-audio-frames.mdx new file mode 100644 index 000000000..a6f0b82bb --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/push-audio-frames.mdx @@ -0,0 +1,125 @@ + + ```kotlin + internal class PushingTask(private val manager: CustomVideoAudioManager) : Runnable { + override fun run() { + Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO) + while (manager.pushingAudio) { + val before = System.currentTimeMillis() + manager.agoraEngine?.pushExternalAudioFrame(manager.readBuffer(), + System.currentTimeMillis(), + manager.sampleRate, + manager.numberOfChannels, + Constants.BytesPerSample.TWO_BYTES_PER_SAMPLE, + manager.customAudioTrackId + ) + val now = System.currentTimeMillis() + val consuming = now - before + if (consuming < manager.pushInterval) { + try { + Thread.sleep(manager.pushInterval - consuming) + } catch (e: InterruptedException) { + e.printStackTrace() + } + } + } + } + } + ``` + - pushExternalAudioFrame + + + ```swift + func audioFrameCaptured(buf: CMSampleBuffer) { + agoraEngine.pushExternalAudioFrameSampleBuffer(buf) + } + ``` + + - pushExternalAudioFrameSampleBuffer(_:) + + + - pushExternalAudioFrameSampleBuffer(_:) + + + + ```csharp + private void StartPushAudioFrame() + { + // 1-sec-length buffer + var bufferLength = SAMPLE_RATE * CHANNEL; + _audioBuffer = new RingBuffer(bufferLength, true); + _startConvertSignal = true; + _pushAudioFrameThread = new Thread(PushAudioFrameThread); + _pushAudioFrameThread.Start(); + } + private void 
PushAudioFrameThread() + { + var bytesPerSample = 2; + var type = AUDIO_FRAME_TYPE.FRAME_TYPE_PCM16; + var channels = CHANNEL; + var samples = SAMPLE_RATE / PUSH_FREQ_PER_SEC; + var samplesPerSec = SAMPLE_RATE; + + var freq = 1000 / PUSH_FREQ_PER_SEC; + + var audioFrame = new AudioFrame + { + bytesPerSample = BYTES_PER_SAMPLE.TWO_BYTES_PER_SAMPLE, + type = type, + samplesPerChannel = samples, + samplesPerSec = samplesPerSec, + channels = channels, + RawBuffer = new byte[samples * bytesPerSample * CHANNEL], + renderTimeMs = freq + }; + + double startMillisecond = GetTimestamp(); + long tick = 0; + + while (true) + { + lock (_rtcLock) + { + if (agoraEngine == null) + { + break; + } + + int nRet = -1; + lock (_audioBuffer) + { + if (_audioBuffer.Size > samples * bytesPerSample * CHANNEL) + { + for (var j = 0; j < samples * bytesPerSample * CHANNEL; j++) + { + audioFrame.RawBuffer[j] = _audioBuffer.Get(); + } + nRet = agoraEngine.PushAudioFrame(audioFrame, audioTrackID); + //Debug.Log("PushAudioFrame returns: " + nRet); + + } + } + + if (nRet == 0) + { + tick++; + double nextMillisecond = startMillisecond + tick * freq; + double curMillisecond = GetTimestamp(); + int sleepMillisecond = (int)Math.Ceiling(nextMillisecond - curMillisecond); + //Debug.Log("sleepMillisecond : " + sleepMillisecond); + if (sleepMillisecond > 0) + { + Thread.Sleep(sleepMillisecond); + } + } + } + + } + } + ``` + + - PushAudioFrame + + + - PushAudioFrame + + \ No newline at end of file diff --git a/assets/code/video-sdk/custom-video-and-audio/push-video-frames.mdx b/assets/code/video-sdk/custom-video-and-audio/push-video-frames.mdx new file mode 100644 index 000000000..8e5d1b623 --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/push-video-frames.mdx @@ -0,0 +1,89 @@ + + ```kotlin + private val onFrameAvailableListener = OnFrameAvailableListener { + // Callback to notify that a new stream video frame is available. 
+ if (isJoined) { + // Configure the external video frames and send them to the SDK + val videoFrame: VideoFrame? = null + + // Add code here to convert the surfaceTexture data to a VideoFrame object + + // Send VideoFrame to the SDK + agoraEngine!!.pushExternalVideoFrame(videoFrame) + } + } + ``` + - pushExternalVideoFrame + + + ```swift + func myVideoCapture(_ pixelBuffer: CVPixelBuffer, rotation: Int, timeStamp: CMTime) { + let videoFrame = AgoraVideoFrame() + videoFrame.format = 12 + videoFrame.textureBuf = pixelBuffer + videoFrame.time = timeStamp + videoFrame.rotation = Int32(rotation) + + // Push the video frame to the Agora SDK + let framePushed = self.agoraEngine.pushExternalVideoFrame(videoFrame) + if !framePushed { + print("Frame could not be pushed.") + } + } + ``` + + - AgoraVideoFrame + - pushExternalVideoFrame(_:) + + + - AgoraVideoFrame + - pushExternalVideoFrame(_:) + + + + ```csharp + public IEnumerator ShareScreen() + { + yield return new WaitForEndOfFrame(); + if (agoraEngine != null) + { + texture.ReadPixels(rect, 0, 0); + texture.Apply(); + +#if UNITY_2018_1_OR_NEWER + NativeArray<byte> nativeByteArray = texture.GetRawTextureData<byte>(); + if (shareData?.Length != nativeByteArray.Length) + { + shareData = new byte[nativeByteArray.Length]; + } + nativeByteArray.CopyTo(shareData); +#else + shareData = texture.GetRawTextureData(); +#endif + + ExternalVideoFrame externalVideoFrame = new ExternalVideoFrame(); + externalVideoFrame.type = VIDEO_BUFFER_TYPE.VIDEO_BUFFER_RAW_DATA; + externalVideoFrame.format = VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_RGBA; + externalVideoFrame.buffer = shareData; + externalVideoFrame.stride = (int)rect.width; + externalVideoFrame.height = (int)rect.height; + externalVideoFrame.cropLeft = 10; + externalVideoFrame.cropTop = 10; + externalVideoFrame.cropRight = 10; + externalVideoFrame.cropBottom = 10; + externalVideoFrame.rotation = 180; + externalVideoFrame.timestamp = DateTime.Now.Ticks / 10000; + var ret =
agoraEngine.PushVideoFrame(externalVideoFrame, videoTrackID); + Debug.Log("PushVideoFrame ret = " + ret + " time: " + DateTime.Now.Millisecond); + } + } + // Get timestamp millisecond + private double GetTimestamp() + { + TimeSpan ts = DateTime.UtcNow - new DateTime(1970, 1, 1, 0, 0, 0, 0); + return ts.TotalMilliseconds; + } + ``` + - PushVideoFrame + + \ No newline at end of file diff --git a/assets/code/video-sdk/custom-video-and-audio/read-audio-input.mdx b/assets/code/video-sdk/custom-video-and-audio/read-audio-input.mdx new file mode 100644 index 000000000..ae9fcc6d2 --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/read-audio-input.mdx @@ -0,0 +1,74 @@ + + ```kotlin + private fun readBuffer(): ByteArray? { + // Read the audio file buffer + val byteSize = bufferSize + val buffer = ByteArray(byteSize) + try { + if (inputStream!!.read(buffer) < 0) { + inputStream!!.reset() + return readBuffer() + } + } catch (e: IOException) { + e.printStackTrace() + } + return buffer + } + ``` + + +1. Dynamically access the audio source attached to the scene and play it: + ```csharp + public void SetupAudioSource() + { + + // Find the Canvas GameObject + GameObject canvas = GameObject.Find("Canvas"); + AudioSource audioSource = canvas.GetComponent<AudioSource>(); + if(audioSource) + { + // Play the audio + audioSource.Play(); + } + else + { + Debug.Log("Audio source not found"); + } + + } + ``` +1.
Listen to the `OnAudioFilterRead` callback and extract the audio source raw data that you push into the channel using `StartPushAudioFrame`: + ```csharp + public void OnAudioFilterRead(float[] data, int channels) + { + if (!_startConvertSignal) return; + var rescaleFactor = 32767; + lock (_audioBuffer) + { + foreach (var t in data) + { + var sample = t; + if (sample > 1) sample = 1; + else if (sample < -1) sample = -1; + + var shortData = (short)(sample * rescaleFactor); + var byteArr = BitConverter.GetBytes(shortData); + + _audioBuffer.Put(byteArr[0]); + _audioBuffer.Put(byteArr[1]); + } + } + } + ``` +1. Stop the audio source upon leaving the channel or quitting the : + ```csharp + private void StopAudioFile() + { + // Find the Canvas GameObject + GameObject canvas = GameObject.Find("Canvas"); + AudioSource audioSource = canvas.GetComponent<AudioSource>(); + audioSource.Stop(); + } + ``` + diff --git a/assets/code/video-sdk/custom-video-and-audio/render-custom-video.mdx b/assets/code/video-sdk/custom-video-and-audio/render-custom-video.mdx new file mode 100644 index 000000000..a178df2f8 --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/render-custom-video.mdx @@ -0,0 +1,85 @@ + + ```kotlin + fun customLocalVideoPreview(): TextureView { + // Create TextureView + previewTextureView = TextureView(mContext) + // Add a SurfaceTextureListener + previewTextureView.surfaceTextureListener = surfaceTextureListener + + return previewTextureView + } + + private val surfaceTextureListener: SurfaceTextureListener = object : SurfaceTextureListener { + @RequiresApi(Build.VERSION_CODES.O) + override fun onSurfaceTextureAvailable( + surface: SurfaceTexture, + width: Int, + height: Int + ) { + // Invoked when a TextureView's SurfaceTexture is ready for use.
+ if (mPreviewing) { + // Already previewing custom video + return + } + sendMessage("Surface Texture Available") + mTextureDestroyed = false + + // Set up previewSurfaceTexture + previewSurfaceTexture = SurfaceTexture(true) + previewSurfaceTexture!!.setOnFrameAvailableListener(onFrameAvailableListener) + + // Add code here to: + // * set up and configure the custom video source + // * set SurfaceTexture of the custom video source to previewSurfaceTexture + sendMessage("Add your code to configure a custom video source") + + // Start preview + mPreviewing = true + } + + override fun onSurfaceTextureSizeChanged( + surface: SurfaceTexture, + width: Int, + height: Int + ) { + } + + override fun onSurfaceTextureDestroyed(surface: SurfaceTexture): Boolean { + mTextureDestroyed = true + return false + } + + override fun onSurfaceTextureUpdated(surface: SurfaceTexture) {} + } + ``` + + + does not support rendering video frames captured in the push mode. You need to implement a custom video renderer using methods from outside the SDK. + + For this, [`AVCaptureDevice`](https://developer.apple.com/documentation/avfoundation/avcapturedevice) and [`AVCaptureSession`](https://developer.apple.com/documentation/avfoundation/avcapturesession) can be used to capture frames and manage capturing sessions. + + Have a look at [`CustomAudioVideoView`](https://github.com/AgoraIO/video-sdk-samples-ios/blob/main/custom-video-and-audio/CustomAudioVideoView.swift) for more details. 
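The audio push loops shown earlier (the Kotlin `PushingTask` and the C# `PushAudioFrameThread`) both pace their pushes the same way: compute when the next frame is due and sleep for the remainder of the interval. As a language-neutral sketch of that arithmetic (the helper name `nextSleepMs` is ours, not an SDK API):

```typescript
// Compute how long a frame-push loop should sleep before the next push.
// startMs: time the loop started, tick: frames pushed so far,
// intervalMs: target interval between pushes, nowMs: current time.
function nextSleepMs(startMs: number, tick: number, intervalMs: number, nowMs: number): number {
  const nextDueMs = startMs + tick * intervalMs; // when the next frame is due
  return Math.max(0, Math.ceil(nextDueMs - nowMs)); // never sleep a negative amount
}
```

Clamping to zero matters: if a push overruns its time slot, the loop skips sleeping rather than sleeping for a negative duration, which is exactly what the `if (consuming < manager.pushInterval)` and `if (sleepMillisecond > 0)` checks above guard against.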
+ + + ```typescript + const CustomVideoComponent: React.FC<{ customVideoTrack: ILocalVideoTrack | null }> = ({ customVideoTrack }) => { + const agoraContext = useAgoraContext(); + useEffect(() => { + if (customVideoTrack && agoraContext.localCameraTrack) { + const mediaStreamTrack = customVideoTrack.getMediaStreamTrack(); + agoraContext.localCameraTrack.replaceTrack(mediaStreamTrack, true) + .then(() => console.log("The default local video track has been changed")) + .catch((error) => { console.log(error) }) + } + return () => { + customVideoTrack?.stop(); // Stop the custom video track when the component unmounts + }; + }, [agoraContext.localCameraTrack, customVideoTrack]); + return <></>; + }; + ``` + - getMediaStreamTrack + - replaceTrack + - stop + + \ No newline at end of file diff --git a/assets/code/video-sdk/custom-video-and-audio/set-variables-audio.mdx b/assets/code/video-sdk/custom-video-and-audio/set-variables-audio.mdx new file mode 100644 index 000000000..7c5d9bad5 --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/set-variables-audio.mdx @@ -0,0 +1,40 @@ + + ```kotlin + // Custom audio parameters + private var customAudioTrackId = -1 + private val audioFile = "applause.wav" // raw audio file + private val sampleRate = 44100 + private val numberOfChannels = 2 + private val bitsPerSample = 16 + private val samples = 441 + private val bufferSize = samples * bitsPerSample / 8 * numberOfChannels + private val pushInterval = samples * 1000 / sampleRate + private var inputStream: InputStream? = null + private var pushingTask: Thread? = null + var pushingAudio = false + ``` + + + ```typescript + const [customAudioTrack, setCustomAudioTrack] = useState<ILocalAudioTrack | null>(null); + const connectionState = useConnectionState(); + const [customMediaState, enableCustomMedia] = useState(false); + ``` + + + ```csharp + private const int CHANNEL = 2; + // Please do not change this value: Unity resamples all audio to 48000 Hz.
+ private const int SAMPLE_RATE = 48000; + + // Number of audio frames pushed per second. + private const int PUSH_FREQ_PER_SEC = 20; + + private RingBuffer<byte> _audioBuffer; + private bool _startConvertSignal = false; + private uint audioTrackID = 0; + + private Thread _pushAudioFrameThread; + private System.Object _rtcLock = new System.Object(); + ``` + \ No newline at end of file diff --git a/assets/code/video-sdk/custom-video-and-audio/set-variables.mdx b/assets/code/video-sdk/custom-video-and-audio/set-variables.mdx new file mode 100644 index 000000000..0be6ead1b --- /dev/null +++ b/assets/code/video-sdk/custom-video-and-audio/set-variables.mdx @@ -0,0 +1,39 @@ + + ```kotlin + private lateinit var previewTextureView: TextureView + private var previewSurfaceTexture: SurfaceTexture? = null + private var mTextureDestroyed = false + private var mPreviewing = false + ``` + + + ```swift + // The video device being used, for example, the ultra-wide back camera. + var videoCaptureDevice: AVCaptureDevice + // The audio device being used. + var audioCaptureDevice: AVCaptureDevice + // The AVCaptureVideoPreviewLayer that is updated by pushSource whenever a new frame is captured. + // This object is used to populate the local camera frames. + @Published var previewLayer: AVCaptureVideoPreviewLayer? + /// The AgoraCameraSourcePush object responsible for capturing video frames + /// from the capture device and sending it to the delegate, ``CustomAudioVideoManager``. + public var cameraPushSource: AgoraCameraSourcePush? + public var micPushSource: AgoraAudioSourcePush?
+ ``` + + + ```csharp + public Texture2D texture; + private byte[] shareData = null; + private Rect rect; + private uint videoTrackID = 0; + ``` + + + + ```typescript + const [customVideoTrack, setCustomVideoTrack] = useState<ILocalVideoTrack | null>(null); + const connectionState = useConnectionState(); + const [customMediaState, enableCustomMedia] = useState(false); + ``` + diff --git a/assets/code/video-sdk/encrypt-media-streams/enable-encryption.mdx b/assets/code/video-sdk/encrypt-media-streams/enable-encryption.mdx new file mode 100644 index 000000000..a9a4961cf --- /dev/null +++ b/assets/code/video-sdk/encrypt-media-streams/enable-encryption.mdx @@ -0,0 +1,149 @@ + To enable media stream encryption in your , create an Encryption Config and specify a key, salt, and encryption mode. + + + Call `enableEncryption` and pass the config as a parameter. + + ```kotlin + fun enableEncryption() { + if (encryptionSaltBase64.isBlank() || encryptionKey.isBlank()) return + // Convert the salt string into bytes + val encryptionSalt: ByteArray = Base64.getDecoder().decode(encryptionSaltBase64) + // An object to specify encryption configuration. + val config = EncryptionConfig() + // Specify an encryption mode. + config.encryptionMode = EncryptionConfig.EncryptionMode.AES_128_GCM2 + // Set encryption key and salt. + config.encryptionKey = encryptionKey + System.arraycopy( + encryptionSalt, + 0, + config.encryptionKdfSalt, + 0, + config.encryptionKdfSalt.size + ) + // Call the method to enable media encryption. + if (agoraEngine!!.enableEncryption(true, config) == 0) { + sendMessage("Media encryption enabled") + } + } + ``` + - EncryptionConfig + - enableEncryption + + + ```swift + func enableEncryption(key: String, salt: String, mode: AgoraEncryptionMode) { + // Convert the salt string in the Base64 format into bytes + let encryptedSalt = Data( + base64Encoded: salt, options: .ignoreUnknownCharacters + ) + // An object to specify encryption configuration.
+ let config = AgoraEncryptionConfig() + // Set secret key and salt. + config.encryptionKey = key + config.encryptionKdfSalt = encryptedSalt + // Specify an encryption mode. + config.encryptionMode = mode + // Call the method to enable media encryption. + if agoraEngine.enableEncryption(true, encryptionConfig: config) == 0 { + Task { await self.updateLabel(to: "Media encryption enabled.") } + } else { + Task { await self.updateLabel(to: "Media encryption failed.") } + } + } + ``` + + + - AgoraEncryptionMode + - AgoraEncryptionConfig + - enableEncryption(_:encryptionConfig:) + + + - AgoraEncryptionMode + - AgoraEncryptionConfig + - enableEncryption(_:encryptionConfig:) + + + + ```csharp + void enableEncryption() + { + if (agoraEngine != null) + { + if(configData.encryptionKey == "" || configData.salt == "") + { + Debug.Log("Encryption key or encryption salt were not specified in the config.json file"); + return; + } + // Create an encryption configuration. + var config = new EncryptionConfig + { + // Specify an encryption mode + encryptionMode = ENCRYPTION_MODE.AES_128_GCM2, + // Assign a secret key. + encryptionKey = configData.encryptionKey, + // Assign a salt in Base64 format + encryptionKdfSalt = Convert.FromBase64String(configData.salt) + }; + // Enable the built-in encryption. + agoraEngine.EnableEncryption(true, config); + } + } + ``` + - EnableEncryption + - EncryptionConfig + + + + ```javascript + function base64ToUint8Array(base64Str) { + const raw = window.atob(base64Str); + const result = new Uint8Array(new ArrayBuffer(raw.length)); + for (let i = 0; i < raw.length; i += 1) { + result[i] = raw.charCodeAt(i); + } + return result; + } + + function hex2ascii(hexx) { + const hex = hexx.toString(); // force conversion + let str = ""; + for (let i = 0; i < hex.length; i += 2) { + str += String.fromCharCode(parseInt(hex.substr(i, 2), 16)); + } + return str; + } + + // Convert the Base64-encoded salt string to a Uint8Array.
+ encryptionSaltBase64 = base64ToUint8Array(config.salt); + // Convert the hex-encoded cipher key into an ASCII string. + encryptionKey = hex2ascii(config.cipherKey); + // Set an encryption mode. + encryptionMode = config.encryptionMode; + + agoraManager + .getAgoraEngine() + .setEncryptionConfig(encryptionMode, encryptionKey, encryptionSaltBase64); + ``` + - EncryptionMode + - setEncryptionConfig + + + ```typescript + const stringToUint8Array = (str: string): Uint8Array => { + const encoder = new TextEncoder(); + return encoder.encode(str); + }; + const useMediaEncryption = () => { + const agoraEngine = useRTCClient(); + useEffect(() => + { + const salt = stringToUint8Array(config.salt); + // Start channel encryption + agoraEngine.setEncryptionConfig(config.encryptionMode, config.cipherKey, salt); + }, []); // Empty dependency array ensures the effect runs only once when the component mounts + }; + ``` + - EncryptionMode + - setEncryptionConfig + diff --git a/assets/code/video-sdk/encrypt-media-streams/enable-end-to-end-encryption.mdx b/assets/code/video-sdk/encrypt-media-streams/enable-end-to-end-encryption.mdx new file mode 100644 index 000000000..402c3d7ed --- /dev/null +++ b/assets/code/video-sdk/encrypt-media-streams/enable-end-to-end-encryption.mdx @@ -0,0 +1,174 @@ + + ```javascript + const joinWithE2EEncryption = async ( + localPlayerContainer, + channelParameters, + password, + uid + ) => { + AgoraRTC.setParameter("ENABLE_ENCODED_TRANSFORM", true); + const token = await fetchToken(uid, config.channelName); + await agoraManager + .getAgoraEngine() + .join(config.appId, config.channelName, token, uid); + // Create a local audio track from the audio sampled by a microphone. + channelParameters.localAudioTrack = + await AgoraRTC.createMicrophoneAudioTrack(); + // Create a local video track from the video captured by a camera. + channelParameters.localVideoTrack = await AgoraRTC.createCameraVideoTrack(); + // Append the local video container to the page body.
+ document.body.append(localPlayerContainer); + // Publish the local audio and video tracks in the channel. + await agoraManager + .getAgoraEngine() + .publish([ + channelParameters.localAudioTrack, + channelParameters.localVideoTrack, + ]); + const transceiver = + channelParameters.localVideoTrack.getRTCRtpTransceiver(); + if (!transceiver || !transceiver.sender) { + return; + } + const sender = transceiver.sender; + + var browserName = (function (agent) { + switch (true) { + case agent.indexOf("chrome") > -1 && !!window.chrome: + return "Chrome"; + default: + return "other"; + } + })(window.navigator.userAgent.toLowerCase()); + + if (browserName === "Chrome") { + setEncryptionStream(sender, password); + } + // Play the local video track. + channelParameters.localVideoTrack.play(localPlayerContainer); + }; + + async function setEncryptionStream(sender, password) { + const streams = sender.createEncodedStreams(); + const transformer = new TransformStream({ + async transform(chunk, controller) { + // controller.enqueue(chunk); + // return; + + const originView = new Uint8Array(chunk.data); + + let reservedSize = 40; + + const naluType = originView[4] & 0x1f; + console.log(naluType); + if (naluType !== 7) { + reservedSize = 5; + } + + const payload = originView.subarray(reservedSize, originView.length); + const hashKey = await grindKey(password, 10); + const key = await window.crypto.subtle.importKey( + "raw", + hashKey, + { + name: "AES-GCM", + }, + false, + ["encrypt"] + ); + + const iv = await getIv(password); + + const ciphertext = await window.crypto.subtle.encrypt( + { + name: "AES-GCM", + iv: iv, + tagLength: 128, + }, + key, + payload + ); + + const encryptedView = new Uint8Array( + ciphertext.byteLength + reservedSize + 12 + ); + encryptedView.set(originView.subarray(0, reservedSize)); + encryptedView.set(iv, reservedSize); + encryptedView.set(new Uint8Array(ciphertext), reservedSize + 12); + chunk.data = encryptedView.buffer; + + controller.enqueue(chunk); 
+ }, + }); + + streams.readable.pipeThrough(transformer).pipeTo(streams.writable); + } + + async function setDecryptionStream(receiver, password) { + const streams = receiver.createEncodedStreams(); + const transformer = new TransformStream({ + async transform(chunk, controller) { + // controller.enqueue(chunk); + // return; + + const originView = new Uint8Array(chunk.data); + + let reservedSize = 40; + + const naluType = originView[4] & 0x1f; + if (naluType !== 7) { + // controller.enqueue(chunk); + // return; + reservedSize = 5; + } + + const hashKey = await grindKey(password, 10); + const key = await window.crypto.subtle.importKey( + "raw", + hashKey, + { + name: "AES-GCM", + }, + false, + ["decrypt"] + ); + + const header = originView.subarray(0, reservedSize); + const iv = originView.subarray(reservedSize, reservedSize + 12); + const payload = originView.subarray( + reservedSize + 12, + chunk.data.byteLength + ); + + let decrypted = null; + try { + decrypted = await window.crypto.subtle.decrypt( + { + name: "AES-GCM", + iv: iv, + tagLength: 128, + }, + key, + payload + ); + } catch (e) { + console.log(e); + controller.enqueue(chunk); + return; + } + + const decryptedView = new Uint8Array( + decrypted.byteLength + reservedSize + ); + decryptedView.set(header); + decryptedView.set(new Uint8Array(decrypted), reservedSize); + chunk.data = decryptedView.buffer; + + controller.enqueue(chunk); + }, + }); + + streams.readable.pipeThrough(transformer).pipeTo(streams.writable); + } + ``` + diff --git a/assets/code/video-sdk/encrypt-media-streams/event-handler.mdx b/assets/code/video-sdk/encrypt-media-streams/event-handler.mdx new file mode 100644 index 000000000..a993d6d52 --- /dev/null +++ b/assets/code/video-sdk/encrypt-media-streams/event-handler.mdx @@ -0,0 +1,56 @@ + + ```csharp + // Event handler class to handle the events raised by Agora's RtcEngine instance + internal class MediaEncryptionEventHandler : UserEventHandler + { + private MediaEncryptionManager 
encryptionManager; + public MediaEncryptionEventHandler(MediaEncryptionManager manager):base(manager) + { + encryptionManager = manager; + } + public override void OnEncryptionError(RtcConnection connection, ENCRYPTION_ERROR_TYPE errorType) + { + Debug.Log("Encryption error:" + errorType); + } + } + ``` + - OnEncryptionError + + + + ```kotlin + override fun onEncryptionError(errorType: Int) { + Log.d("Encryption error", errorType.toString()) + } + ``` + - onEncryptionError + + + ```typescript + const useCryptError = () => { + const agoraEngine = useRTCClient(); + useClientEvent(agoraEngine,"crypt-error" , () => { + console.log("decryption failed"); + }); + }; + ``` + + + ```swift + public func rtcEngine( + _ engine: AgoraRtcEngineKit, + didOccur errorType: AgoraEncryptionErrorType + ) { + // encryption error handler + } + ``` + + + - rtcEngine(_:didOccur:) + - AgoraEncryptionErrorType + + + - rtcEngine(_:didOccur:) + - AgoraEncryptionErrorType + + \ No newline at end of file diff --git a/assets/code/video-sdk/encrypt-media-streams/import-library.mdx b/assets/code/video-sdk/encrypt-media-streams/import-library.mdx new file mode 100644 index 000000000..c55f28655 --- /dev/null +++ b/assets/code/video-sdk/encrypt-media-streams/import-library.mdx @@ -0,0 +1,25 @@ + + ```kotlin + import io.agora.rtc2.RtcEngine + import io.agora.rtc2.RtcEngineConfig + import io.agora.rtc2.internal.EncryptionConfig + ``` + + + ```swift + import AgoraRtcKit + ``` + + +```javascript +import AgoraManager from "../agora_manager/agora_manager.js"; +``` + + + ```typescript + import { AgoraRTCProvider, useRTCClient, useClientEvent } from 'agora-rtc-react'; + import config from '../agora-manager/config.ts'; + import AuthenticationWorkflowManager from '../authentication-workflow/authenticationWorkflowManager.tsx'; + import AgoraRTC from 'agora-rtc-sdk-ng'; + ``` + diff --git a/assets/code/video-sdk/encrypt-media-streams/set-variables.mdx 
b/assets/code/video-sdk/encrypt-media-streams/set-variables.mdx new file mode 100644 index 000000000..3f02572ee --- /dev/null +++ b/assets/code/video-sdk/encrypt-media-streams/set-variables.mdx @@ -0,0 +1,18 @@ + + + In a production environment, you retrieve the encryption key and salt from an authentication server. For this code example, you generate them locally. + + ```kotlin + private var encryptionKey = "" // A 32-byte string + private var encryptionSaltBase64 = "" // A 32-byte Base64 string + ``` + + +```javascript +// In a production environment, you retrieve the key and salt from +// an authentication server. For this code example, you generate them locally. + var encryptionKey = ""; + var encryptionSaltBase64 = ""; + var encryptionMode = ""; +``` + diff --git a/assets/code/video-sdk/ensure-channel-quality/event-handler.mdx b/assets/code/video-sdk/ensure-channel-quality/event-handler.mdx new file mode 100644 index 000000000..a7040fb8f --- /dev/null +++ b/assets/code/video-sdk/ensure-channel-quality/event-handler.mdx @@ -0,0 +1,404 @@ + + + ```kotlin + override fun onConnectionStateChanged(state: Int, reason: Int) { + // Occurs when the network connection state changes + sendMessage( + "Connection state changed\n" + + "New state: $state\n" + + "Reason: $reason" + ) + } + + override fun onLastmileQuality(quality: Int) { + // Reports the last-mile network quality of the local user + (mListener as CallQualityManagerListener).onLastMileQuality(quality) + } + + override fun onLastmileProbeResult(result: LastmileProbeResult) { + // Reports the last mile network probe result + agoraEngine!!.stopLastmileProbeTest() + // The result object contains the detailed test results that help you + // manage call quality, for example, the downlink bandwidth.
+ sendMessage("Available down link bandwidth: " + result.downlinkReport.availableBandwidth) + } + + override fun onNetworkQuality(uid: Int, txQuality: Int, rxQuality: Int) { + // Reports the last mile network quality of each user in the channel + (mListener as CallQualityManagerListener).onNetworkQuality( + uid, txQuality, rxQuality + ) + } + + override fun onRtcStats(rtcStats: RtcStats) { + // Reports the statistics of the current session + counter += 1 + var msg = "" + if (counter == 5) msg = + rtcStats.users.toString() + " user(s)" else if (counter == 10) { + msg = "Packet loss rate: " + rtcStats.rxPacketLossRate + counter = 0 + } + if (msg.isNotEmpty()) sendMessage(msg) + } + + override fun onRemoteVideoStateChanged(uid: Int, state: Int, reason: Int, elapsed: Int) { + // Occurs when the remote video stream state changes + val msg = "Remote video state changed:\n" + + "Uid = $uid\n" + + "NewState = $state\n" + + "Reason = $reason\n" + + "Elapsed = $elapsed" + sendMessage(msg) + } + + override fun onRemoteVideoStats(stats: RemoteVideoStats) { + // Reports the statistics of the video stream sent by each remote user + (mListener as CallQualityManagerListener).onRemoteVideoStats( + stats + ) + } + ``` + + - onLocalVideoStats + - onRemoteVideoStats + - onRtcStats + - onNetworkQuality + + + + + ```dart + @override + RtcEngineEventHandler getEventHandler() { + return RtcEngineEventHandler( + // Occurs when the network connection state changes + onConnectionStateChanged: (RtcConnection connection, + ConnectionStateType state, ConnectionChangedReasonType reason) { + messageCallback( + "Connection state changed\n New state: ${state.name}\n Reason: ${reason.name}"); + super.getEventHandler().onConnectionStateChanged!( + connection, state, reason); + }, + // Reports the last-mile network quality of the local user + onLastmileQuality: (QualityType quality) { + networkQuality = quality.index; + Map eventArgs = {}; + eventArgs["quality"] = quality; + 
eventCallback("onLastmileQuality", eventArgs); + }, + // Reports the last mile network probe test result + onLastmileProbeResult: (LastmileProbeResult result) { + agoraEngine!.stopLastmileProbeTest(); + // The result object contains the detailed test results that help you + // manage call quality, for example, the down link jitter. + messageCallback("Downlink jitter: ${result.downlinkReport?.jitter}"); + }, + // Reports the last mile network quality of each user in the channel + onNetworkQuality: (RtcConnection connection, int remoteUid, + QualityType txQuality, QualityType rxQuality) { + // Use downlink network quality to update the network status + networkQuality = rxQuality.index; + + Map eventArgs = {}; + eventArgs["connection"] = connection; + eventArgs["remoteUid"] = remoteUid; + eventArgs["txQuality"] = txQuality; + eventArgs["rxQuality"] = rxQuality; + eventCallback("onNetworkQuality", eventArgs); + }, + // Reports the statistics of the current call + onRtcStats: (RtcConnection connection, RtcStats stats) { + counter += 1; + String msg = ""; + + if (counter == 5) { + msg = "${stats.userCount} user(s)"; + } else if (counter == 10) { + msg = "Last mile delay: ${stats.lastmileDelay}"; + counter = 0; + } + if (msg.isNotEmpty) messageCallback(msg); + }, + // Occurs when the remote video stream state changes + onRemoteVideoStateChanged: (RtcConnection connection, int remoteUid, + RemoteVideoState state, RemoteVideoStateReason reason, int elapsed) { + String msg = "Remote video state changed: \n Uid: $remoteUid" + " \n NewState: $state\n reason: $reason\n elapsed: $elapsed"; + messageCallback(msg); + }, + // Reports the statistics of the video stream sent by each remote user + onRemoteVideoStats: (RtcConnection connection, RemoteVideoStats stats) { + remoteVideoStatsSummary = "Uid: ${stats.uid}" + "\nRenderer frame rate: ${stats.rendererOutputFrameRate}" + "\nReceived bitrate: ${stats.receivedBitrate}" + "\nPublish duration: ${stats.publishDuration}" + "\nFrame 
loss rate: ${stats.frameLossRate}"; + + Map eventArgs = {}; + eventArgs["connection"] = connection; + eventArgs["stats"] = stats; + eventCallback("onRemoteVideoStats", eventArgs); + }, + onTokenPrivilegeWillExpire: (RtcConnection connection, String token) { + super.getEventHandler().onTokenPrivilegeWillExpire!(connection, token); + }, + onJoinChannelSuccess: (RtcConnection connection, int elapsed) { + if (connection.localUid == 0xFFFFFFFF) { + // Echo test started + messageCallback("Audio echo test started"); + return; + } else { + // Joined a channel + isJoined = true; + } + messageCallback( + "Local user uid:${connection.localUid} joined the channel"); + Map eventArgs = {}; + eventArgs["connection"] = connection; + eventArgs["elapsed"] = elapsed; + eventCallback("onJoinChannelSuccess", eventArgs); + super.getEventHandler().onJoinChannelSuccess!(connection, elapsed); + }, + onUserJoined: (RtcConnection connection, int remoteUid, int elapsed) { + super.getEventHandler().onUserJoined!(connection, remoteUid, elapsed); + }, + onUserOffline: (RtcConnection connection, int remoteUid, + UserOfflineReasonType reason) { + super.getEventHandler().onUserOffline!(connection, remoteUid, reason); + }, + ); + } + ``` + + + ```swift + public func rtcEngine(_ engine: AgoraRtcEngineKit, lastmileProbeTest result: AgoraLastmileProbeResult + ) { + engine.stopLastmileProbeTest() + // The result object contains the detailed test results that help you + // manage call quality. 
For example, the downlink jitter. + print("downlink jitter: \(result.downlinkReport.jitter)") + } + + public func rtcEngine(_ engine: AgoraRtcEngineKit, remoteVideoStats stats: AgoraRtcRemoteVideoStats) { + self.callQualities[stats.uid] = """ + Received Bitrate = \(stats.receivedBitrate) + Frame = \(stats.width)x\(stats.height), \(stats.receivedFrameRate)fps + Frame Loss Rate = \(stats.frameLossRate) + Packet Loss Rate = \(stats.packetLossRate) + """ + } + + public func rtcEngine( + _ engine: AgoraRtcEngineKit, localVideoStats stats: AgoraRtcLocalVideoStats, + sourceType: AgoraVideoSourceType + ) { + self.callQualities[self.localUserId] = """ + Captured Frame = \(stats.captureFrameWidth)x\(stats.captureFrameHeight), \(stats.captureFrameRate)fps + Encoded Frame = \(stats.encodedFrameWidth)x\(stats.encodedFrameHeight), \(stats.encoderOutputFrameRate)fps + Sent Data = \(stats.sentFrameRate)fps, bitrate: \(stats.sentBitrate) + Packet Loss Rate = \(stats.txPacketLossRate) + """ + } + ``` + + + - rtcEngine(_:lastmileProbeTest:) + - rtcEngine(_:remoteVideoStats:) + - rtcEngine(_:localVideoStats:sourceType:) + + + - rtcEngine(_:lastmileProbeTest:) + - rtcEngine(_:remoteVideoStats:) + - rtcEngine(_:localVideoStats:sourceType:) + + + + + +```csharp +// Event handler class to handle the events raised by Agora's RtcEngine instance +internal class CallQualityEventHandler : UserEventHandler +{ + private CallQualityManager callQuality; + internal CallQualityEventHandler(CallQualityManager audioSample):base(audioSample) + { + callQuality = audioSample; + } + public override void OnConnectionStateChanged(RtcConnection connection, CONNECTION_STATE_TYPE state, CONNECTION_CHANGED_REASON_TYPE reason) + { + Debug.Log("Connection state changed" + + "\n New state: " + state + + "\n Reason: " + reason); + } + public override void OnLastmileQuality(int quality) + { + callQuality.updateNetworkStatus(quality); + } + public override void OnLastmileProbeResult(LastmileProbeResult result) + { +
callQuality.agoraEngine.StopLastmileProbeTest(); + + Debug.Log("Probe test finished"); + // The result object contains the detailed test results that help you + // manage call quality, for example, the downlink jitter. + Debug.Log("Downlink jitter: " + result.downlinkReport.jitter); + + // Destroy the engine + callQuality.DestroyEngine(); + + } + public override void OnNetworkQuality(RtcConnection connection, uint remoteUid, int txQuality, int rxQuality) + { + // Use downlink network quality to update the network status + callQuality.updateNetworkStatus(rxQuality); + } + public override void OnRtcStats(RtcConnection connection, RtcStats rtcStats) + { + // Combine the session statistics into a single log message + string msg = rtcStats.userCount + " user(s)"; + msg += "\nPacket loss rate: " + rtcStats.rxPacketLossRate; + Debug.Log(msg); + } +} +``` +- OnConnectionStateChanged +- OnLastmileQuality +- OnLastmileProbeResult +- OnNetworkQuality +- OnRtcStats + + + +```csharp +// Event handler class to handle the events raised by Agora's RtcEngine instance +internal class CallQualityEventHandler : UserEventHandler +{ + private CallQualityManager callQuality; + internal CallQualityEventHandler(CallQualityManager videoSample):base(videoSample) + { + callQuality = videoSample; + } + public override void OnConnectionStateChanged(RtcConnection connection, CONNECTION_STATE_TYPE state, CONNECTION_CHANGED_REASON_TYPE reason) + { + Debug.Log("Connection state changed" + + "\n New state: " + state + + "\n Reason: " + reason); + } + public override void OnLastmileQuality(int quality) + { + callQuality.updateNetworkStatus(quality); + } + public override void OnLastmileProbeResult(LastmileProbeResult result) + { + callQuality.agoraEngine.StopLastmileProbeTest(); + + Debug.Log("Probe test finished"); + // The result object contains the detailed test results that help you + // manage call quality, for example, the downlink jitter.
+ Debug.Log("Downlink jitter: " + result.downlinkReport.jitter); + + // Destroy the engine + callQuality.DestroyEngine(); + + } + public override void OnNetworkQuality(RtcConnection connection, uint remoteUid, int txQuality, int rxQuality) + { + // Use downlink network quality to update the network status + callQuality.updateNetworkStatus(rxQuality); + } + public override void OnRtcStats(RtcConnection connection, RtcStats rtcStats) + { + // Combine the session statistics into a single log message + string msg = rtcStats.userCount + " user(s)"; + msg += "\nPacket loss rate: " + rtcStats.rxPacketLossRate; + Debug.Log(msg); + } + public override void OnRemoteVideoStateChanged(RtcConnection connection, uint remoteUid, REMOTE_VIDEO_STATE state, REMOTE_VIDEO_STATE_REASON reason, int elapsed) + { + string msg = "Remote video state changed: \n Uid =" + remoteUid + + " \n NewState =" + state + + " \n reason =" + reason + + " \n elapsed =" + elapsed; + Debug.Log(msg); + } + public override void OnRemoteVideoStats(RtcConnection connection, RemoteVideoStats stats) + { + string msg = "Remote Video Stats: " + + "\n User id =" + stats.uid + + "\n Received bitrate =" + stats.receivedBitrate + + "\n Total frozen time =" + stats.totalFrozenTime; + Debug.Log(msg); + } + +} +``` + - OnConnectionStateChanged + - OnLastmileQuality + - OnLastmileProbeResult + - OnNetworkQuality + - OnRtcStats + - OnRemoteVideoStateChanged + - OnRemoteVideoStats + + + + + ```typescript + const networkQuality = useNetworkQuality(); + const connectionState = useConnectionState(); + ``` + - useNetworkQuality + - useConnectionState + + + + ```javascript + // Get the uplink network condition + agoraEngine.on("network-quality", (quality) => { + if (quality.uplinkNetworkQuality == 1) { + document.getElementById("upLinkIndicator").innerHTML = "Excellent"; + document.getElementById("upLinkIndicator").style.color = "green"; + } else if (quality.uplinkNetworkQuality == 2) { + document.getElementById("upLinkIndicator").innerHTML = "Good"; +
document.getElementById("upLinkIndicator").style.color = "yellow"; + } else if (quality.uplinkNetworkQuality >= 4) { + document.getElementById("upLinkIndicator").innerHTML = "Poor"; + document.getElementById("upLinkIndicator").style.color = "red"; + } + }); + + // Get the downlink network condition + agoraEngine.on("network-quality", (quality) => { + if (quality.downlinkNetworkQuality == 1) { + document.getElementById("downLinkIndicator").innerHTML = "Excellent"; + document.getElementById("downLinkIndicator").style.color = "green"; + } else if (quality.downlinkNetworkQuality == 2) { + document.getElementById("downLinkIndicator").innerHTML = "Good"; + document.getElementById("downLinkIndicator").style.color = "yellow"; + } else if (quality.downlinkNetworkQuality >= 4) { + document.getElementById("downLinkIndicator").innerHTML = "Poor"; + document.getElementById("downLinkIndicator").style.color = "red"; + } + }); + + const handleVSDKEvents = (eventName, ...args) => { + switch (eventName) { + // ... + case "connection-state-change": + // The sample code uses the debug console to show the connection state. In a real-world application, you can add + // a label or an icon to the user interface to show the connection state. + + // Display the current connection state. + console.log("Connection state has changed to: " + args[0]); + // Display the previous connection state. + console.log("Connection state was: " + args[1]); + // Display the connection state change reason.
+ console.log("Connection state change reason : " + args[2]); + } + }; + ``` + - network-quality + diff --git a/assets/code/video-sdk/ensure-channel-quality/implement-call-quality-view.mdx b/assets/code/video-sdk/ensure-channel-quality/implement-call-quality-view.mdx new file mode 100644 index 000000000..d7c476fe2 --- /dev/null +++ b/assets/code/video-sdk/ensure-channel-quality/implement-call-quality-view.mdx @@ -0,0 +1,52 @@ + + ```swift + public override func setupEngine() -> AgoraRtcEngineKit { + let engine = super.setupEngine() + + // Set Audio Scenario + engine.setAudioScenario(.gameStreaming) + + // Enable dual stream mode + engine.setDualStreamMode(.enableSimulcastStream) + engine.setAudioProfile(.default) + + // Set the video configuration + let videoConfig = AgoraVideoEncoderConfiguration( + size: CGSize(width: 640, height: 360), + frameRate: .fps10, + bitrate: AgoraVideoBitrateStandard, + orientationMode: .adaptative, + mirrorMode: .auto + ) + engine.setVideoEncoderConfiguration(videoConfig) + + return engine + } + ``` + + + ```swift + public override func setupEngine() -> AgoraRtcEngineKit { + let engine = super.setupEngine() + + // Set Audio Scenario + engine.setAudioScenario(.gameStreaming) + + // Enable dual stream mode + engine.setDualStreamMode(.enableSimulcastStream) + engine.setAudioProfile(.default) + + // Set the video configuration + let videoConfig = AgoraVideoEncoderConfiguration( + size: CGSize(width: 640, height: 360), + frameRate: .fps10, + bitrate: AgoraVideoBitrateStandard, + orientationMode: .adaptative, + mirrorMode: .auto + ) + engine.setVideoEncoderConfiguration(videoConfig) + + return engine + } + ``` + \ No newline at end of file diff --git a/assets/code/video-sdk/ensure-channel-quality/implement-declarations.mdx b/assets/code/video-sdk/ensure-channel-quality/implement-declarations.mdx new file mode 100644 index 000000000..3cc92bd8f --- /dev/null +++ b/assets/code/video-sdk/ensure-channel-quality/implement-declarations.mdx @@ -0,0 
+1,121 @@ + + ```kotlin + private val baseEventHandler: IRtcEngineEventHandler + ``` + + + ```swift + @Published public var callQualities: [UInt: String] = [:] + ``` + + + + ```csharp + private IAudioDeviceManager _audioDeviceManager; // To manage audio devices. + private IVideoDeviceManager _videoDeviceManager; // To manage video devices. + private DeviceInfo[] _audioRecordingDeviceInfos; // Represent information about audio recording devices. + private DeviceInfo[] _videoDeviceInfos; // Represent information about video devices. + + public string networkStatus = ""; + public List videoDevices; + public List audioDevices; + + [DllImport("user32.dll")] + private static extern IntPtr CreateWindowEx(uint dwExStyle, string lpClassName, string lpWindowName, + uint dwStyle, + int x, int y, + int nWidth, + int nHeight, + IntPtr hWndParent, + IntPtr hMenu, + IntPtr hInstance, + IntPtr lpParam); + + [DllImport("user32.dll")] + private static extern bool ShowWindow(IntPtr hWnd, int nCmdShow); + + [DllImport("user32.dll")] + private static extern bool DestroyWindow(IntPtr hWnd); + + private const uint WS_OVERLAPPEDWINDOW = 0x00CF0000; + private const uint WS_VISIBLE = 0x10000000; + private const int SW_SHOW = 5; + private IntPtr hWnd; + ``` + + + ```csharp + private IAudioDeviceManager _audioDeviceManager; // To manage audio devices. + private DeviceInfo[] _audioRecordingDeviceInfos; // Represent information about audio recording devices. + + public string networkStatus = ""; + public List audioDevices; + ``` + + + + + + ```javascript + // A variable to track the state of device test. + var isDeviceTestRunning = false; + // Variables to hold the Audio tracks for device testing. + var testTracks; + // A variable to reference the audio devices dropdown. + var audioDevicesDropDown; + + let channelParameters = { + // A variable to hold a local audio track. + localAudioTrack: null, + // A variable to hold a remote audio track. 
+ remoteAudioTrack: null, + // A variable to hold the remote user id. + remoteUid: "1", + }; + ``` + + + ```javascript + // A variable to track the state of remote video quality. + var isHighRemoteVideoQuality = false; + // A variable to track the state of device test. + var isDeviceTestRunning = false; + // Variables to hold the Audio/Video tracks for device testing. + var testTracks; + // A variable to reference the audio devices dropdown. + var audioDevicesDropDown; + // A variable to reference the video devices dropdown. + var videoDevicesDropDown; + + let channelParameters = { + // A variable to hold a local audio track. + localAudioTrack: null, + // A variable to hold a local video track. + localVideoTrack: null, + // A variable to hold a remote audio track. + remoteAudioTrack: null, + // A variable to hold a remote video track. + remoteVideoTrack: null, + // A variable to hold the remote user id. + remoteUid: "1", + }; + ``` + + + + ```typescript + const agoraEngine = useRTCClient(); + const remoteUsers = useRemoteUsers(); + const [isHighRemoteVideoQuality, setVideoQualityState] = useState(false); + const numberOfRemoteUsers = remoteUsers.length; + const remoteUser = remoteUsers[numberOfRemoteUsers - 1]; + const [isDeviceTestRunning, setDeviceTestState] = useState(false); + const { localMicrophoneTrack } = useLocalMicrophoneTrack(); + const { localCameraTrack } = useLocalCameraTrack(); + const enabledFeatures = useRef(false); + ``` + - useRTCClient + - useRemoteUsers + - useLocalCameraTrack + - useLocalMicrophoneTrack + diff --git a/assets/code/video-sdk/ensure-channel-quality/swift/implement-labels.mdx b/assets/code/video-sdk/ensure-channel-quality/implement-labels.mdx similarity index 100% rename from assets/code/video-sdk/ensure-channel-quality/swift/implement-labels.mdx rename to assets/code/video-sdk/ensure-channel-quality/implement-labels.mdx diff --git a/assets/code/video-sdk/ensure-channel-quality/swift/implement-network-status.mdx
b/assets/code/video-sdk/ensure-channel-quality/implement-network-status.mdx similarity index 62% rename from assets/code/video-sdk/ensure-channel-quality/swift/implement-network-status.mdx rename to assets/code/video-sdk/ensure-channel-quality/implement-network-status.mdx index 6702d036f..0bb877085 100644 --- a/assets/code/video-sdk/ensure-channel-quality/swift/implement-network-status.mdx +++ b/assets/code/video-sdk/ensure-channel-quality/implement-network-status.mdx @@ -17,4 +17,23 @@ func updateNetworkStatus(quality: Int) { else { networkStatus.backgroundColor = UIColor.white } } ``` -
\ No newline at end of file +
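For reference, the integer quality values reported by the last-mile and network quality callbacks range from 0 (unknown) to 6 (no connection). The sketch below is not part of this PR; `qualityLabel` is an illustrative helper name, and the label strings mirror the `networkLabels` mapping used by the React example later on this page:

```typescript
// Agora network quality values: 0 = unknown ... 6 = no connection.
// Illustrative mapping only; adapt the labels to your UI.
const networkLabels: Record<number, string> = {
  0: "Unknown", 1: "Excellent",
  2: "Good", 3: "Poor",
  4: "Bad", 5: "Very Bad",
  6: "No Connection",
};

// Return a readable label for a quality value, defaulting to "Unknown".
function qualityLabel(quality: number): string {
  return networkLabels[quality] ?? "Unknown";
}
```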
+ + ```csharp + public void updateNetworkStatus(int quality) + { + if (quality > 0 && quality < 3) + { + networkStatus = "Network Quality: Perfect"; + } + else if (quality <= 4) + { + networkStatus = "Network Quality: Good"; + } + else if (quality <= 6) + { + networkStatus = "Network Quality: Poor"; + } + } + ``` + diff --git a/assets/code/video-sdk/ensure-channel-quality/import-library.mdx b/assets/code/video-sdk/ensure-channel-quality/import-library.mdx new file mode 100644 index 000000000..68b4430cd --- /dev/null +++ b/assets/code/video-sdk/ensure-channel-quality/import-library.mdx @@ -0,0 +1,57 @@ + + ```kotlin + import io.agora.rtc2.* + import io.agora.rtc2.video.VideoCanvas + import io.agora.rtc2.internal.LastmileProbeConfig + import io.agora.rtc2.video.VideoEncoderConfiguration + import io.agora.rtc2.IRtcEngineEventHandler.RemoteVideoStats + ``` + + + ```dart + import 'package:agora_rtc_engine/agora_rtc_engine.dart'; + import 'package:permission_handler/permission_handler.dart'; + import 'package:flutter_reference_app/authentication-workflow/agora_manager_authentication.dart'; + import 'package:flutter_reference_app/agora-manager/agora_manager.dart'; + ``` + + + ```swift + import SwiftUI + import AgoraRtcKit + ``` + + + ```csharp + using Agora.Rtc; + using System.Runtime.InteropServices; + using System; + ``` + + +```javascript +import AgoraManager from "../agora_manager/agora_manager.js"; +import AgoraRTC from "agora-rtc-sdk-ng"; +``` + + + ```typescript + import { + AgoraRTCProvider, + useRTCClient, + useRemoteUsers, + useLocalCameraTrack, + useLocalMicrophoneTrack, + useNetworkQuality, + useConnectionState, + useJoin, + LocalVideoTrack, + useAutoPlayAudioTrack, + useVolumeLevel + } from "agora-rtc-react"; + import AgoraRTC, {ILocalAudioTrack, ICameraVideoTrack} from "agora-rtc-sdk-ng"; + import AuthenticationWorkflowManager from "../authentication-workflow/authenticationWorkflowManager"; + import { useState, useRef, useEffect } from "react"; + import 
config from "../agora-manager/config"; + ``` + diff --git a/assets/code/video-sdk/ensure-channel-quality/probe-test.mdx b/assets/code/video-sdk/ensure-channel-quality/probe-test.mdx new file mode 100644 index 000000000..5fb7d397c --- /dev/null +++ b/assets/code/video-sdk/ensure-channel-quality/probe-test.mdx @@ -0,0 +1,125 @@ + + + ```kotlin + private fun startProbeTest() { + if (agoraEngine == null) setupAgoraEngine() + // Configure a LastmileProbeConfig instance. + val config = LastmileProbeConfig() + // Probe the uplink network quality. + config.probeUplink = true + // Probe the down link network quality. + config.probeDownlink = true + // The expected uplink bitrate (bps). The value range is [100000,5000000]. + config.expectedUplinkBitrate = 100000 + // The expected down link bitrate (bps). The value range is [100000,5000000]. + config.expectedDownlinkBitrate = 100000 + agoraEngine!!.startLastmileProbeTest(config) + sendMessage("Running the last mile probe test ...") + // Test results are reported through the onLastmileProbeResult callback + } + ``` + + - LastmileProbeConfig + + - startLastmileProbeTest + + + + ```dart + void startProbeTest() { + // Configure the probe test + LastmileProbeConfig config = const LastmileProbeConfig( + probeUplink: true, + probeDownlink: true, + expectedUplinkBitrate: 100000, // Range 100000-5000000 bps + expectedDownlinkBitrate: 100000, // Range 100000-5000000 bps + ); + agoraEngine!.startLastmileProbeTest(config); + messageCallback("Running the last mile probe test ..."); + // Test results are reported through the onLastmileProbeResult callback + } + ``` + + + ```swift + func startProbeTest() { + // Configure a LastmileProbeConfig instance. + let config = AgoraLastmileProbeConfig() + // Probe the uplink network quality. + config.probeUplink = true + // Probe the downlink network quality. + config.probeDownlink = true + // The expected uplink bitrate (bps). The value range is [100000,5000000]. 
+        config.expectedUplinkBitrate = 100000
+        // The expected downlink bitrate (bps). The value range is [100000,5000000].
+        config.expectedDownlinkBitrate = 100000
+
+        print(agoraEngine.startLastmileProbeTest(config))
+    }
+
+    // Result of the probe test
+    public func rtcEngine(
+        _ engine: AgoraRtcEngineKit,
+        lastmileQuality quality: AgoraNetworkQuality
+    ) {
+        self.lastMileQuality = quality
+    }
+    ```
+
+
+    - AgoraLastmileProbeConfig
+    - startLastmileProbeTest(_:)
+    - rtcEngine(_:lastmileQuality:)
+
+
+    - AgoraLastmileProbeConfig
+    - startLastmileProbeTest(_:)
+    - rtcEngine(_:lastmileQuality:)
+
+
+
+    ```csharp
+    // Probe test to check network quality.
+    public void StartProbeTest()
+    {
+        // Configure a LastmileProbeConfig instance.
+        LastmileProbeConfig config = new LastmileProbeConfig();
+
+        // Probe the uplink network quality.
+        config.probeUplink = true;
+
+        // Probe the downlink network quality.
+        config.probeDownlink = true;
+
+        // The expected uplink bitrate (bps). The value range is [100000,5000000].
+        config.expectedUplinkBitrate = 100000;
+
+        // The expected downlink bitrate (bps). The value range is [100000,5000000].
+        config.expectedDownlinkBitrate = 100000;
+
+        agoraEngine.StartLastmileProbeTest(config);
+        Debug.Log("Running the last mile probe test ...");
+    }
+    ```
+
+    - StartLastmileProbeTest
+    - LastmileProbeConfig
+
+
+    - StartLastmileProbeTest
+    - LastmileProbeConfig
+
+
+
+    ```typescript
+    const networkQuality = useNetworkQuality();
+    const updateNetworkStatus = () => {
+      const networkLabels = {
+        0: 'Unknown', 1: 'Excellent',
+        2: 'Good', 3: 'Poor',
+        4: 'Bad', 5: 'Very Bad',
+        6: 'No Connection'
+      }
+      return <label>{networkLabels[networkQuality.uplinkNetworkQuality]}</label>;
+    };
+    ```
+
\ No newline at end of file
diff --git a/assets/code/video-sdk/ensure-channel-quality/set-audio-video-profile.mdx b/assets/code/video-sdk/ensure-channel-quality/set-audio-video-profile.mdx
new file mode 100644
index 000000000..1b2ebbd63
--- /dev/null
+++ b/assets/code/video-sdk/ensure-channel-quality/set-audio-video-profile.mdx
@@ -0,0 +1,28 @@
+
+    ```js
+    const setAudioProfile = async () => {
+      // Create a local audio track and set an audio profile for the local audio track.
+      channelParameters.localAudioTrack =
+        await AgoraRTC.createMicrophoneAudioTrack({
+          encoderConfig: "high_quality_stereo",
+        });
+    };
+
+    const setVideoProfile = async () => {
+      // Set a video profile.
+ channelParameters.localVideoTrack = await AgoraRTC.createCameraVideoTrack({ + optimizationMode: "detail", + encoderConfig: { + width: 640, + // Specify a value range and an ideal value + height: { ideal: 480, min: 400, max: 500 }, + frameRate: 15, + bitrateMin: 600, + bitrateMax: 1000, + }, + }); + }; + ``` + - encoderConfig + - optimizationMode + diff --git a/assets/code/video-sdk/ensure-channel-quality/set-latency.mdx b/assets/code/video-sdk/ensure-channel-quality/set-latency.mdx new file mode 100644 index 000000000..0fd57c7c7 --- /dev/null +++ b/assets/code/video-sdk/ensure-channel-quality/set-latency.mdx @@ -0,0 +1,65 @@ + + + ```kotlin + // Set the latency level + options.audienceLatencyLevel = Constants.AUDIENCE_LATENCY_LEVEL_ULTRA_LOW_LATENCY; + ``` + + + ```kotlin + // Set the latency level + options.audienceLatencyLevel = Constants.AUDIENCE_LATENCY_LEVEL_LOW_LATENCY; + ``` + + - ChannelMediaOptions + + + + ```dart + // Set the latency level + ChannelMediaOptions options = ChannelMediaOptions( + clientRoleType: clientRole, + channelProfile: ChannelProfileType.channelProfileLiveBroadcasting, + audienceLatencyLevel: AudienceLatencyLevelType.audienceLatencyLevelUltraLowLatency + ); + ``` + + + ```dart + // Set the latency level + ChannelMediaOptions options = ChannelMediaOptions( + clientRoleType: clientRole, + channelProfile: ChannelProfileType.channelProfileLiveBroadcasting, + audienceLatencyLevel: AudienceLatencyLevelType.audienceLatencyLevelLowLatency + ); + ``` + + + + + ```swift + let opt = AgoraRtcChannelMediaOptions() + opt.audienceLatencyLevel = .ultraLowLatency + ``` + + + ```swift + let opt = AgoraRtcChannelMediaOptions() + opt.audienceLatencyLevel = .lowLatency + ``` + + + - audienceLatencyLevel + - AgoraAudienceLatencyLevelType + + + - audienceLatencyLevel + - AgoraAudienceLatencyLevelType + + + + ``` csharp + // Set the latency level + options.audienceLatencyLevel= Constants.AUDIENCE_LATENCY_LEVEL_ULTRA_LOW_LATENCY; + ``` + \ No newline at end 
of file
diff --git a/assets/code/video-sdk/ensure-channel-quality/setup-engine.mdx b/assets/code/video-sdk/ensure-channel-quality/setup-engine.mdx
new file mode 100644
index 000000000..7951a04bc
--- /dev/null
+++ b/assets/code/video-sdk/ensure-channel-quality/setup-engine.mdx
@@ -0,0 +1,367 @@
+
+
+    ```kotlin
+    override fun setupAgoraEngine(): Boolean {
+        try {
+            val config = RtcEngineConfig()
+            config.mContext = mContext
+            config.mAppId = appId
+            config.mEventHandler = iRtcEngineEventHandler
+            // Configure the log file
+            val logConfig = RtcEngineConfig.LogConfig()
+            logConfig.fileSizeInKB = 256 // Range 128-1024 Kb
+            logConfig.level = Constants.LogLevel.getValue(Constants.LogLevel.LOG_LEVEL_WARN)
+            config.mLogConfig = logConfig
+            agoraEngine = RtcEngine.create(config)
+            // Enable video mode
+            agoraEngine!!.enableVideo()
+        } catch (e: Exception) {
+            sendMessage(e.toString())
+            return false
+        }
+
+        // Enable the dual stream mode
+        agoraEngine!!.setDualStreamMode(Constants.SimulcastStreamMode.ENABLE_SIMULCAST_STREAM)
+        // If you set the dual stream mode to AUTO_SIMULCAST_STREAM, the low-quality video
+        // stream is not sent by default; the SDK automatically switches to low-quality after
+        // it receives a request to subscribe to a low-quality video stream.
+
+        // Set an audio profile and an audio scenario.
+ agoraEngine!!.setAudioProfile( + Constants.AUDIO_PROFILE_DEFAULT, + Constants.AUDIO_SCENARIO_GAME_STREAMING + ) + + // Set the video profile + val videoConfig = VideoEncoderConfiguration() + // Set mirror mode + videoConfig.mirrorMode = VideoEncoderConfiguration.MIRROR_MODE_TYPE.MIRROR_MODE_AUTO + // Set frameRate + videoConfig.frameRate = VideoEncoderConfiguration.FRAME_RATE.FRAME_RATE_FPS_10.value + // Set bitrate + videoConfig.bitrate = VideoEncoderConfiguration.STANDARD_BITRATE + // Set dimensions + videoConfig.dimensions = VideoEncoderConfiguration.VD_640x360 + // Set orientation mode + videoConfig.orientationMode = + VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_ADAPTIVE + // Set degradation preference + videoConfig.degradationPrefer = + VideoEncoderConfiguration.DEGRADATION_PREFERENCE.MAINTAIN_BALANCED + // Set compression preference: low latency or quality + videoConfig.advanceOptions.compressionPreference = + VideoEncoderConfiguration.COMPRESSION_PREFERENCE.PREFER_LOW_LATENCY + // Apply the configuration + agoraEngine!!.setVideoEncoderConfiguration(videoConfig) + + // Start the probe test + startProbeTest() + return true + } + ``` + + - setAudioProfile + - enableDualStreamMode + - VideoEncoderConfiguration + - setVideoEncoderConfiguration + - LogConfig + + + + ```kotlin + override fun setupAgoraEngine(): Boolean { + try { + val config = RtcEngineConfig() + config.mContext = mContext + config.mAppId = appId + config.mEventHandler = iRtcEngineEventHandler + // Configure the log file + val logConfig = RtcEngineConfig.LogConfig() + logConfig.fileSizeInKB = 256 // Range 128-1024 Kb + logConfig.level = Constants.LogLevel.getValue(Constants.LogLevel.LOG_LEVEL_WARN) + config.mLogConfig = logConfig + agoraEngine = RtcEngine.create(config) + } catch (e: Exception) { + sendMessage(e.toString()) + return false + } + + // Set an audio profile and an audio scenario. 
+        agoraEngine!!.setAudioProfile(
+            Constants.AUDIO_PROFILE_DEFAULT,
+            Constants.AUDIO_SCENARIO_GAME_STREAMING
+        )
+
+        // Start the probe test
+        startProbeTest()
+        return true
+    }
+    ```
+
+    - setAudioProfile
+    - LogConfig
+
+
+
+
+
+    ```dart
+    @override
+    Future<void> setupAgoraEngine() async {
+      // Retrieve or request camera and microphone permissions
+      await [Permission.microphone, Permission.camera].request();
+
+      // Create an instance of the Agora engine
+      agoraEngine = createAgoraRtcEngine();
+      await agoraEngine!.initialize(RtcEngineContext(
+          appId: appId,
+          logConfig:
+              const LogConfig(fileSizeInKB: 2048, level: LogLevel.logLevelWarn)));
+
+      if (currentProduct != ProductName.voiceCalling) {
+        await agoraEngine!.enableVideo();
+      }
+
+      // Enable the dual stream mode
+      agoraEngine!.enableDualStreamMode(enabled: true);
+
+      // Set audio profile and audio scenario.
+      agoraEngine!.setAudioProfile(
+          profile: AudioProfileType.audioProfileDefault,
+          scenario: AudioScenarioType.audioScenarioChatroom);
+
+      // Set the video configuration
+      VideoEncoderConfiguration videoConfig = const VideoEncoderConfiguration(
+          mirrorMode: VideoMirrorModeType.videoMirrorModeAuto,
+          frameRate: 10,
+          bitrate: standardBitrate,
+          dimensions: VideoDimensions(width: 640, height: 360),
+          orientationMode: OrientationMode.orientationModeAdaptive,
+          degradationPreference: DegradationPreference.maintainBalanced
+      );
+
+      // Apply the video configuration
+      agoraEngine!.setVideoEncoderConfiguration(videoConfig);
+
+      // Start the probe test
+      startProbeTest();
+
+      // Register the event handler
+      agoraEngine!.registerEventHandler(getEventHandler());
+    }
+    ```
+
+
+    ```swift
+    func setupEngine() -> AgoraRtcEngineKit {
+        let engine = super.setupEngine()
+
+        // Set Audio Scenario
+        engine.setAudioScenario(.gameStreaming)
+
+        // Enable dual stream mode
+        engine.setDualStreamMode(.enableSimulcastStream)
+        engine.setAudioProfile(.default)
+
+        // Set the video configuration
+        let videoConfig =
AgoraVideoEncoderConfiguration( + size: CGSize(width: 640, height: 360), + frameRate: .fps10, + bitrate: AgoraVideoBitrateStandard, + orientationMode: .adaptative, + mirrorMode: .auto + ) + engine.setVideoEncoderConfiguration(videoConfig) + + return engine + } + ``` + + + - setAudioScenario(_:) + - setDualStreamMode(_:) + - setAudioProfile(_:) + - AgoraVideoEncoderConfiguration + - setVideoEncoderConfiguration(_:) + + + - setAudioScenario(_:) + - setDualStreamMode(_:) + - setAudioProfile(_:) + - AgoraVideoEncoderConfiguration + - setVideoEncoderConfiguration(_:) + + + + + + ``` csharp + public override void SetupAgoraEngine() + { + base.SetupAgoraEngine(); + + // Specify a path for the log file. + agoraEngine.SetLogFile("/path/to/folder/agorasdk1.log"); + + // Set the log file size. + agoraEngine.SetLogFileSize(256); // Range 128-20480 Kb + + // Specify a log level. + agoraEngine.SetLogLevel(LOG_LEVEL.LOG_LEVEL_WARN); + + // Enable the dual stream mode. + agoraEngine.EnableDualStreamMode(true); + + // Set audio profile and audio scenario. + agoraEngine.SetAudioProfile(AUDIO_PROFILE_TYPE.AUDIO_PROFILE_DEFAULT, AUDIO_SCENARIO_TYPE.AUDIO_SCENARIO_CHATROOM); + + // Set the video profile. + VideoEncoderConfiguration videoConfig = new VideoEncoderConfiguration(); + + // Set mirror mode. + videoConfig.mirrorMode = VIDEO_MIRROR_MODE_TYPE.VIDEO_MIRROR_MODE_DISABLED; + + // Set frame rate. + videoConfig.frameRate = (int)FRAME_RATE.FRAME_RATE_FPS_15; + + // Set bitrate. + videoConfig.bitrate = (int)BITRATE.STANDARD_BITRATE; + + // Set dimensions. + videoConfig.dimensions = new VideoDimensions(640, 360); + + // Set orientation mode. + videoConfig.orientationMode = ORIENTATION_MODE.ORIENTATION_MODE_ADAPTIVE; + + // Set degradation preference. + videoConfig.degradationPreference = DEGRADATION_PREFERENCE.MAINTAIN_BALANCED; + + // Set the latency level + videoConfig.advanceOptions.compressionPreference = COMPRESSION_PREFERENCE.PREFER_LOW_LATENCY; + + // Apply the configuration. 
+ agoraEngine.SetVideoEncoderConfiguration(videoConfig); + + // Attach the event handler + agoraEngine.InitEventHandler(new CallQualityEventHandler(this)); + } + ``` + + + ```csharp + public override void SetupAgoraEngine() + { + base.SetupAgoraEngine(); + + // Specify a path for the log file. + agoraEngine.SetLogFile("/path/to/folder/agorasdk1.log"); + + // Set the log file size. + agoraEngine.SetLogFileSize(256); // Range 128-20480 Kb + + // Specify a log level. + agoraEngine.SetLogLevel(LOG_LEVEL.LOG_LEVEL_WARN); + + // Enable the dual stream mode. + agoraEngine.EnableDualStreamMode(true); + + // Set audio profile and audio scenario. + agoraEngine.SetAudioProfile(AUDIO_PROFILE_TYPE.AUDIO_PROFILE_DEFAULT, AUDIO_SCENARIO_TYPE.AUDIO_SCENARIO_CHATROOM); + + // Set the video profile. + VideoEncoderConfiguration videoConfig = new VideoEncoderConfiguration(); + + // Set mirror mode. + videoConfig.mirrorMode = VIDEO_MIRROR_MODE_TYPE.VIDEO_MIRROR_MODE_DISABLED; + + // Set framerate. + videoConfig.frameRate = (int)FRAME_RATE.FRAME_RATE_FPS_15; + + // Set bitrate. + videoConfig.bitrate = (int)BITRATE.STANDARD_BITRATE; + + // Set dimensions. + videoConfig.dimensions = new VideoDimensions(640, 360); + + // Set orientation mode. + videoConfig.orientationMode = ORIENTATION_MODE.ORIENTATION_MODE_ADAPTIVE; + + // Set degradation preference. + videoConfig.degradationPreference = DEGRADATION_PREFERENCE.MAINTAIN_BALANCED; + + // Set the latency level + videoConfig.advanceOptions.compressionPreference = COMPRESSION_PREFERENCE.PREFER_LOW_LATENCY; + + // Apply the configuration. 
+ agoraEngine.SetVideoEncoderConfiguration(videoConfig); + + // Attach the eventHandler + agoraEngine.InitEventHandler(new CallQualityEventHandler(this)); + } + ``` + + - SetLogFile + - SetLogFileSize + - SetLogLevel + - EnableDualStreamMode + - SetAudioProfile + - SetVideoEncoderConfiguration + - InitEventHandler + + + ```csharp + public override void SetupAgoraEngine() + { + base.SetupAgoraEngine(); + + // Specify a path for the log file. + agoraEngine.SetLogFile("/path/to/folder/agorasdk1.log"); + + // Set the log file size. + agoraEngine.SetLogFileSize(256); // Range 128-20480 Kb + + // Specify a log level. + agoraEngine.SetLogLevel(LOG_LEVEL.LOG_LEVEL_WARN); + + // Set audio profile and audio scenario. + agoraEngine.SetAudioProfile(AUDIO_PROFILE_TYPE.AUDIO_PROFILE_DEFAULT, AUDIO_SCENARIO_TYPE.AUDIO_SCENARIO_CHATROOM); + + // Attach the eventHandler + agoraEngine.InitEventHandler(new CallQualityEventHandler(this)); + } + ``` + - SetLogFile + - SetLogFileSize + - SetLogLevel + - InitEventHandler + + + + + ```typescript + const agoraEngine = useRTCClient(AgoraRTC.createClient({ codec: "vp8", mode: config.selectedProduct })); + + + + + + + const callQualityEssentials = async () => { + try { + AgoraRTC.setLogLevel(2); // Info level + await agoraEngine.enableDualStream(); + } catch (error) { + console.log(error); + } + await localCameraTrack?.setEncoderConfiguration({ + width: 640, + height: { ideal: 480, min: 400, max: 500 }, + frameRate: 15, + bitrateMin: 600, + bitrateMax: 1000, + }); + }; + ``` + - useRTCClient + - AgoraRTCProvider + - setEncoderConfiguration + diff --git a/assets/code/video-sdk/ensure-channel-quality/show-stats.mdx b/assets/code/video-sdk/ensure-channel-quality/show-stats.mdx new file mode 100644 index 000000000..819b53d67 --- /dev/null +++ b/assets/code/video-sdk/ensure-channel-quality/show-stats.mdx @@ -0,0 +1,45 @@ + + ```typescript + const showStatistics = () => { + const localAudioStats = agoraEngine.getLocalAudioStats(); + 
console.log("Local audio stats:", localAudioStats); + + const localVideoStats = agoraEngine.getLocalVideoStats(); + console.log("Local video stats:", localVideoStats); + + const rtcStats = agoraEngine.getRTCStats(); + console.log("Channel statistics:", rtcStats); + }; + ``` + - getLocalAudioStats + - getLocalVideoStats + - getRTCStats + + + +```js + const getStatistics = async (remoteUid) => { + const localAudioStats = agoraEngine.getLocalAudioStats(); + const localVideoStats = agoraEngine.getLocalVideoStats(); + let remoteAudioStats; + let remoteVideoStats; + if (remoteUid !== undefined) { + remoteAudioStats = agoraEngine.getRemoteAudioStats()[remoteUid]; + remoteVideoStats = agoraEngine.getRemoteVideoStats()[remoteUid]; + } + const rtcStats = agoraEngine.getRTCStats(); + return { + localAudioStats, + localVideoStats, + remoteAudioStats, + remoteVideoStats, + rtcStats, + }; + }; +``` +- getLocalAudioStats +- getLocalVideoStats +- getRemoteAudioStats +- getRemoteVideoStats +- getRTCStats + diff --git a/assets/code/video-sdk/ensure-channel-quality/swift/implement-declarations.mdx b/assets/code/video-sdk/ensure-channel-quality/swift/implement-declarations.mdx deleted file mode 100644 index 4e0e95854..000000000 --- a/assets/code/video-sdk/ensure-channel-quality/swift/implement-declarations.mdx +++ /dev/null @@ -1,13 +0,0 @@ - -```swift -var networkStatusLabel: NSTextField! -var networkStatus: NSTextField! -var qualityButton: NSButton! -``` - - -```swift -var networkStatusLabel: UILabel! -var networkStatus: UILabel! 
-``` - \ No newline at end of file diff --git a/assets/code/video-sdk/ensure-channel-quality/switch-quality.mdx b/assets/code/video-sdk/ensure-channel-quality/switch-quality.mdx new file mode 100644 index 000000000..8725a5a65 --- /dev/null +++ b/assets/code/video-sdk/ensure-channel-quality/switch-quality.mdx @@ -0,0 +1,118 @@ + + + ```kotlin + fun setStreamQuality(remoteUid: Int, highQuality: Boolean) { + // Set the stream type of the remote video + if (highQuality) { + agoraEngine!!.setRemoteVideoStreamType(remoteUid, Constants.VIDEO_STREAM_HIGH) + } else { + agoraEngine!!.setRemoteVideoStreamType(remoteUid, Constants.VIDEO_STREAM_LOW) + } + } + ``` + - setRemoteVideoStreamType + + + ```dart + void setVideoQuality(int remoteUid, bool isHighQuality) { + if (isHighQuality) { + agoraEngine!.setRemoteVideoStreamType(uid: remoteUid, + streamType: VideoStreamType.videoStreamHigh); + } else { + agoraEngine!.setRemoteVideoStreamType(uid: remoteUid, + streamType: VideoStreamType.videoStreamLow); + } + } + ``` + + + ```swift + func setStreamQuality(for uid: UInt, to quality: AgoraVideoStreamType) { + agoraEngine.setRemoteVideoStream(uid, type: quality) + } + ``` + + - AgoraVideoStreamType + - setRemoteVideoStream(_:type:) + + + - AgoraVideoStreamType + - setRemoteVideoStream(_:type:) + + + + ```csharp + // Switch between high and low remote user video quality. 
+    public void SetLowStreamQuality()
+    {
+        if (remoteUid > 0)
+        {
+            agoraEngine.SetRemoteVideoStreamType(remoteUid, VIDEO_STREAM_TYPE.VIDEO_STREAM_LOW);
+            Debug.Log("Switching to low-quality video");
+        }
+        else
+        {
+            Debug.Log("No remote user in the channel");
+        }
+    }
+    public void SetHighStreamQuality()
+    {
+        if (remoteUid > 0)
+        {
+            agoraEngine.SetRemoteVideoStreamType(remoteUid, VIDEO_STREAM_TYPE.VIDEO_STREAM_HIGH);
+            Debug.Log("Switching to high-quality video");
+        }
+        else
+        {
+            Debug.Log("No remote user in the channel");
+        }
+    }
+    ```
+    - SetRemoteVideoStreamType
+
+
+    ```javascript
+    const handleVSDKEvents = (eventName, ...args) => {
+      switch (eventName) {
+        case "user-published":
+          if (args[1] == "video") {
+            // ...
+            // Change stream quality on click
+            document.getElementById(remotePlayerContainer.id).addEventListener("click", function () {
+              if (isHighRemoteVideoQuality === false) {
+                agoraManager.setRemoteVideoStreamType(channelParameters.remoteUid, 0);
+                isHighRemoteVideoQuality = true;
+              } else {
+                agoraManager.setRemoteVideoStreamType(channelParameters.remoteUid, 1);
+                isHighRemoteVideoQuality = false;
+              }
+            });
+          }
+      }
+    };
+    // Play the remote video track.
+    channelParameters.remoteVideoTrack.play(remotePlayerContainer);
+    ```
+
+    - setRemoteVideoStreamType
+
+
+
+    ```typescript
+    const setRemoteVideoQuality = () => {
+      if (!remoteUser) {
+        console.log("No remote user in the channel");
+        return;
+      }
+
+      const newQualityState = !isHighRemoteVideoQuality;
+      const streamType = newQualityState ?
0 : 1; + + agoraEngine + .setRemoteVideoStreamType(remoteUser.uid, streamType) + .then(() => setVideoQualityState(newQualityState)) + .catch((error) => console.error(error)); + }; + ``` + + diff --git a/assets/code/video-sdk/ensure-channel-quality/test-hardware.mdx b/assets/code/video-sdk/ensure-channel-quality/test-hardware.mdx new file mode 100644 index 000000000..e9703b6a0 --- /dev/null +++ b/assets/code/video-sdk/ensure-channel-quality/test-hardware.mdx @@ -0,0 +1,407 @@ + + + ```kotlin + fun startEchoTest(): SurfaceView { + if (agoraEngine == null) setupAgoraEngine() + // Set test configuration parameters + val echoConfig = EchoTestConfiguration() + echoConfig.enableAudio = true + echoConfig.enableVideo = true + echoConfig.channelId = channelName + echoConfig.intervalInSeconds = 2 // Interval between recording and playback + // Set up a SurfaceView + val localSurfaceView = SurfaceView(mContext) + localSurfaceView.visibility = View.VISIBLE + // Call setupLocalVideo with a VideoCanvas having uid set to 0. + agoraEngine!!.setupLocalVideo( + VideoCanvas( + localSurfaceView, + VideoCanvas.RENDER_MODE_HIDDEN, + 0 + ) + ) + echoConfig.view = localSurfaceView + + // Get a token from the server or from the config file + if (serverUrl.contains("http")) { // A valid server url is available + // Fetch a token from the server for channelName + fetchToken(channelName, 0, object : TokenCallback { + override fun onTokenReceived(rtcToken: String?) 
{ + // Set the token in the config + echoConfig.token = rtcToken + // Start the echo test + agoraEngine!!.startEchoTest(echoConfig) + } + + override fun onError(errorMessage: String) { + // Handle the error + sendMessage("Error: $errorMessage") + } + }) + } else { // use the token from the config.json file + echoConfig.token = config!!.optString("rtcToken") + // Start the echo test + agoraEngine!!.startEchoTest(echoConfig) + } + return localSurfaceView + } + + fun stopEchoTest() { + agoraEngine!!.stopEchoTest() + destroyAgoraEngine() + } + ``` + + - EchoTestConfiguration + - startEchoTest + - stopEchoTest + - setupLocalVideo + + + + ```kotlin + fun startEchoTest() { + if (agoraEngine == null) setupAgoraEngine() + // Set test configuration parameters + val echoConfig = EchoTestConfiguration() + echoConfig.enableAudio = true + echoConfig.channelId = channelName + echoConfig.intervalInSeconds = 2 // Interval between recording and playback + + // Get a token from the server or from the config file + if (serverUrl.contains("http")) { // A valid server url is available + // Fetch a token from the server for channelName + fetchToken(channelName, 0, object : TokenCallback { + override fun onTokenReceived(rtcToken: String?) 
{ + // Set the token in the config + echoConfig.token = rtcToken + // Start the echo test + agoraEngine!!.startEchoTest(echoConfig) + } + + override fun onError(errorMessage: String) { + // Handle the error + sendMessage("Error: $errorMessage") + } + }) + } else { // use the token from the config.json file + echoConfig.token = config!!.optString("rtcToken") + // Start the echo test + agoraEngine!!.startEchoTest(echoConfig) + } + } + + fun stopEchoTest() { + agoraEngine!!.stopEchoTest() + destroyAgoraEngine() + } + ``` + + - EchoTestConfiguration + - startEchoTest + - stopEchoTest + + + + + ```dart + void startEchoTest() async { + if (agoraEngine == null) setupAgoraEngine(); + + // Get a token for the test + String token; + if (config['serverUrl'].toString().contains('http')){ + // Use the uid 0xFFFFFFFF to get a token for the echo test + // Ensure that the channel name is unique for each user when running the echo test + token = await fetchToken(0xFFFFFFFF, channelName); + } else { + token = config['rtcToken']; + } + + // Set test configuration parameters + EchoTestConfiguration echoConfig = EchoTestConfiguration( + enableAudio: true, + enableVideo: false, + channelId: channelName, + intervalInSeconds: 2, // Interval between recording and playback + token: token, + ); + + // Start the echo test + agoraEngine!.startEchoTest(echoConfig); + } + + void stopEchoTest() { + agoraEngine!.stopEchoTest(); + localUid = config['uid']; + destroyAgoraEngine(); + } + ``` + + + ```swift + func startEchoTest(channel: String) async throws -> Int32 { + let echoConfig = AgoraEchoTestConfiguration() + echoConfig.enableAudio = true + echoConfig.enableVideo = true + echoConfig.channelId = channel + echoConfig.intervalInSeconds = 2 // Interval between recording and playback + + echoConfig.view = echoView + echoConfig.token = <#Token#> + let localCanvas = AgoraRtcVideoCanvas() + localCanvas.view = echoConfig.view + localCanvas.uid = 0 + + agoraEngine.setupLocalVideo(localCanvas) + + return 
agoraEngine.startEchoTest(withConfig: echoConfig)
+    }
+
+    func stopEchoTest() -> Int32 {
+        self.agoraEngine.stopPreview()
+        self.agoraEngine.enableLocalVideo(false)
+        return agoraEngine.stopEchoTest()
+    }
+    ```
+
+
+    - AgoraEchoTestConfiguration
+    - AgoraRtcVideoCanvas
+    - setupLocalVideo(_:)
+    - startEchoTest(withConfig:)
+
+
+    - AgoraEchoTestConfiguration
+    - AgoraRtcVideoCanvas
+    - setupLocalVideo(_:)
+    - startEchoTest(withConfig:)
+
+
+
+
+
+    ```csharp
+    // Get the list of available audio devices.
+    private void GetAudioRecordingDevice()
+    {
+        _audioDeviceManager = agoraEngine.GetAudioDeviceManager();
+        _audioRecordingDeviceInfos = _audioDeviceManager.EnumerateRecordingDevices();
+        audioDevices = new List<string>();
+
+        for (var i = 0; i < _audioRecordingDeviceInfos.Length; i++)
+        {
+            Debug.Log(string.Format("AudioRecordingDevice device index: {0}, name: {1}, id: {2}", i,
+                _audioRecordingDeviceInfos[i].deviceName, _audioRecordingDeviceInfos[i].deviceId));
+            audioDevices.Add(_audioRecordingDeviceInfos[i].deviceName);
+        }
+    }
+
+    // Get the list of available video devices.
+    private void GetVideoDeviceManager()
+    {
+        _videoDeviceManager = agoraEngine.GetVideoDeviceManager();
+        _videoDeviceInfos = _videoDeviceManager.EnumerateVideoDevices();
+
+        videoDevices = new List<string>();
+        for (var i = 0; i < _videoDeviceInfos.Length; i++)
+        {
+            Debug.Log(string.Format("VideoDeviceManager device index: {0}, name: {1}, id: {2}", i,
+                _videoDeviceInfos[i].deviceName, _videoDeviceInfos[i].deviceId));
+            videoDevices.Add(_videoDeviceInfos[i].deviceName);
+        }
+    }
+
+    // Device test to check if the audio and video device is working properly. Only valid before joining the channel.
+    public void StartAudioVideoDeviceTest(string selectedAudioDevice, string selectedVideoDevice)
+    {
+        Debug.Log("Please conduct the device test before joining the channel.");
+        SetupAgoraEngine();
+        foreach (var device in _audioRecordingDeviceInfos)
+        {
+            if (selectedAudioDevice == device.deviceName)
+            {
+                _audioDeviceManager.SetRecordingDevice(device.deviceId);
+            }
+        }
+        _audioDeviceManager.StartAudioDeviceLoopbackTest(500);
+        foreach (var device in _videoDeviceInfos)
+        {
+            if (selectedVideoDevice == device.deviceName)
+            {
+                _videoDeviceManager.SetDevice(device.deviceId);
+            }
+        }
+        hWnd = CreateWindowEx(
+            0,
+            "Static",
+            "My Window",
+            WS_OVERLAPPEDWINDOW | WS_VISIBLE,
+            100,
+            100,
+            640,
+            480,
+            IntPtr.Zero,
+            IntPtr.Zero,
+            Marshal.GetHINSTANCE(typeof(EnsureCallQuality).Module),
+            IntPtr.Zero);
+        ShowWindow(hWnd, SW_SHOW);
+        _videoDeviceManager.StartDeviceTest(hWnd);
+    }
+    public void StopAudioVideoDeviceTest()
+    {
+        DestroyWindow(hWnd);
+        _audioDeviceManager.StopAudioDeviceLoopbackTest();
+        _videoDeviceManager.StopDeviceTest();
+        DestroyEngine();
+    }
+    ```
+    - SetRecordingDevice
+    - StartAudioDeviceLoopbackTest
+    - SetDevice
+    - StopAudioDeviceLoopbackTest
+    - GetAudioDeviceManager
+    - GetVideoDeviceManager
+    - IVideoDeviceManager
+    - EnumerateVideoDevices
+    - IAudioDeviceManager
+    - EnumerateRecordingDevices
+
+
+    ```csharp
+    // Get the list of available audio devices.
+    private void GetAudioRecordingDevice()
+    {
+        _audioDeviceManager = agoraEngine.GetAudioDeviceManager();
+        _audioRecordingDeviceInfos = _audioDeviceManager.EnumerateRecordingDevices();
+        audioDevices = new List<string>();
+
+        for (var i = 0; i < _audioRecordingDeviceInfos.Length; i++)
+        {
+            Debug.Log(string.Format("AudioRecordingDevice device index: {0}, name: {1}, id: {2}", i,
+                _audioRecordingDeviceInfos[i].deviceName, _audioRecordingDeviceInfos[i].deviceId));
+            audioDevices.Add(_audioRecordingDeviceInfos[i].deviceName);
+        }
+    }
+    // Device test to check if the audio device is working properly.
Only valid before joining the channel. + public void StartDeviceTest(string selectedAudioDevice) + { + Debug.Log("Please conduct the device test before joining the channel."); + SetupAgoraEngine(); + foreach (var device in _audioRecordingDeviceInfos) + { + if(selectedAudioDevice == device.deviceName) + { + _audioDeviceManager.SetRecordingDevice(device.deviceId); + } + } + _audioDeviceManager.StartAudioDeviceLoopbackTest(500); + } + public void StopDeviceTest() + { + _audioDeviceManager.StopAudioDeviceLoopbackTest(); + DestroyEngine(); + } + ``` + - SetRecordingDevice + - StartAudioDeviceLoopbackTest + - StopAudioDeviceLoopbackTest + - GetAudioDeviceManager + - IAudioDeviceManager + - EnumerateRecordingDevices + + + + + ```javascript + const getDevices = async () => { + const devices = await AgoraRTC.getDevices(); + const audioDevices = devices.filter(function (device) { + return device.kind === "audioinput"; + }); + return { + audioDevices, + }; + }; + + const createTestTracks = async (camera, mic) => { + const audioTrack = AgoraRTC.createMicrophoneAudioTrack({ + microphoneId: mic, + }); + return { + audioTrack, + }; + }; + ``` + - getDevices + - createMicrophoneAudioTrack + + + ```javascript + const getDevices = async () => { + const devices = await AgoraRTC.getDevices(); + const audioDevices = devices.filter(function (device) { + return device.kind === "audioinput"; + }); + const videoDevices = devices.filter(function (device) { + return device.kind === "videoinput"; + }); + return { + audioDevices, + videoDevices, + }; + }; + + const createTestTracks = async (camera, mic) => { + const videoTrack = AgoraRTC.createCameraVideoTrack({ + cameraId: camera, + }); + const audioTrack = AgoraRTC.createMicrophoneAudioTrack({ + microphoneId: mic, + }); + return { + videoTrack, + audioTrack, + }; + }; + ``` + - getDevices + - createMicrophoneAudioTrack + - createCameraVideoTrack + + + + ```typescript + const handleStartDeviceTest = () => { + setDeviceTestState(true); + }; + 
+    const handleStopDeviceTest = () => {
+      setDeviceTestState(false);
+    };
+    const VideoDeviceTestComponent: React.FC<{ localCameraTrack: ICameraVideoTrack | null }> = ({ localCameraTrack }) => {
+      useJoin({ appid: config.appId, channel: config.channelName, token: config.rtcToken }, true);
+
+      return (
+        <div>
+          <LocalVideoTrack track={localCameraTrack} play={true} />
+        </div>
+      );
+    };
+    const AudioDeviceTestComponent: React.FC<{ localMicrophoneTrack: ILocalAudioTrack }> = ({ localMicrophoneTrack }) => {
+      useAutoPlayAudioTrack(localMicrophoneTrack, true);
+      const volume = useVolumeLevel(localMicrophoneTrack);
+
+      return (
+        <div>
+          <span>local Audio Volume: {Math.floor(volume * 100)}</span>
+        </div>
+      );
+    };
+    ```
+    - useVolumeLevel
+    - useJoin
+    - useAutoPlayAudioTrack
+    - LocalVideoTrack
+
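The `getDevices` snippets above select audio and video inputs by filtering the enumerated device list on its `kind` field. The pattern is plain array filtering and can be sketched standalone; the device objects below are illustrative stand-ins for the `MediaDeviceInfo`-style records the browser returns:

```javascript
// Minimal sketch of the device-filtering pattern used in getDevices.
// These records are hypothetical examples, not real enumerated devices.
const devices = [
  { kind: "audioinput", label: "Built-in Microphone" },
  { kind: "videoinput", label: "FaceTime HD Camera" },
  { kind: "audiooutput", label: "Built-in Speakers" },
];

const audioDevices = devices.filter((device) => device.kind === "audioinput");
const videoDevices = devices.filter((device) => device.kind === "videoinput");

console.log(audioDevices.length); // 1
console.log(videoDevices[0].label); // "FaceTime HD Camera"
```

The same filter works unchanged on the array returned by `AgoraRTC.getDevices()`, since each entry carries the standard `kind` property.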
diff --git a/assets/code/video-sdk/geofencing/combine-geofence.mdx b/assets/code/video-sdk/geofencing/combine-geofence.mdx new file mode 100644 index 000000000..9004d931e --- /dev/null +++ b/assets/code/video-sdk/geofencing/combine-geofence.mdx @@ -0,0 +1,34 @@ + + ```kotlin + // Area codes support bitwise operations + config.mAreaCode = AREA_CODE_NA or AREA_CODE_EU + ``` + + ```kotlin + // Exclude Mainland China from the regions for connection + config.mAreaCode = AREA_CODE_GLOB ^ AREA_CODE_CN + ``` + - List of AreaCodes + + + ```swift + // Your app will only connect to Agora SD-RTN located in Europe and India. + config.areaCode = AgoraAreaCodeType( + rawValue: AgoraAreaCodeType.EUR.rawValue | AgoraAreaCodeType.IN.rawValue + ) + ``` + + + - AgoraAreaCodeType + + + - AgoraAreaCodeType + + + + ```ts + AgoraRTC.setArea({ + areaCode: [AREAS.NORTH_AMERICA, AREAS.ASIA] + }) + ``` + diff --git a/assets/code/video-sdk/geofencing/set-geofence.mdx b/assets/code/video-sdk/geofencing/set-geofence.mdx new file mode 100644 index 000000000..91acfb2c9 --- /dev/null +++ b/assets/code/video-sdk/geofencing/set-geofence.mdx @@ -0,0 +1,94 @@ + + ```kotlin + // Set the engine configuration + val config = RtcEngineConfig() + config.mContext = mContext + config.mAppId = appId + config.mEventHandler = iRtcEngineEventHandler + + // Set the geofencing area code(s) + config.mAreaCode = AREA_CODE_NA + + // Create an RtcEngine instance + agoraEngine = RtcEngine.create(config) + ``` + - RtcEngineConfig + - RtcEngine.create + + + ```swift + let config = AgoraRtcEngineConfig() + // Your app will only connect to Agora SD-RTN located in North America. + config.areaCode = .NA; + + let eng = AgoraRtcEngineKit.sharedEngine(with: config, delegate: self) + // ... 
+ ``` + + + - AgoraRtcEngineConfig + - AgoraRtcEngineConfig/areaCode + - AgoraRtcEngineKit/sharedEngine(with:delegate:) + + + - AgoraRtcEngineConfig + - AgoraRtcEngineConfig/areaCode + - AgoraRtcEngineKit/sharedEngine(with:delegate:) + + + + ```csharp + public override void SetupAgoraEngine() + { + // Set the region of your choice. + region = AREA_CODE.AREA_CODE_CN; + + base.SetupAgoraEngine(); + } + ``` + + - AREA_CODE + - RtcEngineContext + - Initialize + + + - AREA_CODE + - RtcEngineContext + - Initialize + + + + + ```typescript + const useGeofencing = () => { + useEffect(() => { + AgoraRTC.setArea({ + areaCode: [AREAS.NORTH_AMERICA, AREAS.ASIA] + }) + }, []); + }; + + function EnableGeofencing() { + useGeofencing(); + + return ( +
+ +
+ ); + } + ``` + - setArea + +
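The area codes used in these snippets are bitwise flags, which is why the Kotlin examples combine regions with `or` and exclude one with `^`. A small sketch of the flag arithmetic, using hypothetical constant values (each SDK defines the real ones):

```javascript
// Hypothetical flag values for illustration; the SDKs define the real constants.
const AREA_CODE_CN = 1 << 0;       // Mainland China
const AREA_CODE_NA = 1 << 1;       // North America
const AREA_CODE_EU = 1 << 2;       // Europe
const AREA_CODE_GLOB = 0xFFFFFFFF; // All regions

// Combine regions with bitwise OR: North America plus Europe.
const naAndEu = AREA_CODE_NA | AREA_CODE_EU;

// Exclude a region with bitwise XOR: everywhere except Mainland China.
const globExceptCn = AREA_CODE_GLOB ^ AREA_CODE_CN;

console.log((naAndEu & AREA_CODE_EU) !== 0);      // true: EU is included
console.log((globExceptCn & AREA_CODE_CN) === 0); // true: CN is excluded
```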
+ + ```js + // Your app will only connect to Agora SD-RTN located in North America. + AgoraRTC.setArea({ + areaCode:"ASIA" + }) + // You can use [] to include more than one region. + ``` + - setArea + - AREAS + diff --git a/assets/code/video-sdk/get-started-sdk/create-engine.mdx b/assets/code/video-sdk/get-started-sdk/create-engine.mdx new file mode 100644 index 000000000..7907ae8fb --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/create-engine.mdx @@ -0,0 +1,219 @@ + + + ```kotlin + protected open fun setupAgoraEngine(): Boolean { + try { + // Set the engine configuration + val config = RtcEngineConfig() + config.mContext = mContext + config.mAppId = appId + // Assign an event handler to receive engine callbacks + config.mEventHandler = iRtcEngineEventHandler + // Create an RtcEngine instance + agoraEngine = RtcEngine.create(config) + // By default, the video module is disabled, call enableVideo to enable it. + agoraEngine!!.enableVideo() + } catch (e: Exception) { + sendMessage(e.toString()) + return false + } + return true + } + ``` + + - RtcEngine + - RtcEngineConfig + - create + - enableVideo + + + ```kotlin + protected open fun setupAgoraEngine(): Boolean { + try { + // Set the engine configuration + val config = RtcEngineConfig() + config.mContext = mContext + config.mAppId = appId + // Assign an event handler to receive engine callbacks + config.mEventHandler = iRtcEngineEventHandler + // Create an RtcEngine instance + agoraEngine = RtcEngine.create(config) + } catch (e: Exception) { + sendMessage(e.toString()) + return false + } + return true + } + ``` + + - RtcEngine + - RtcEngineConfig + - create + + + + + ```dart + Future setupAgoraEngine() async { + // Retrieve or request camera and microphone permissions + await [Permission.microphone, Permission.camera].request(); + + // Create an instance of the Agora engine + agoraEngine = createAgoraRtcEngine(); + await agoraEngine!.initialize(RtcEngineContext(appId: appId)); + + if (currentProduct != 
ProductName.voiceCalling) { + await agoraEngine!.enableVideo(); + } + + // Register the event handler + agoraEngine!.registerEventHandler(getEventHandler()); + } + ``` + + + ```swift + // The Agora RTC Engine Kit for the session. + public var agoraEngine: AgoraRtcEngineKit { + if let engine { return engine } + let engine = setupEngine() + self.engine = engine + return engine + } + + open func setupEngine() -> AgoraRtcEngineKit { + let eng = AgoraRtcEngineKit.sharedEngine(withAppId: appId, delegate: self) + if DocsAppConfig.shared.product != .voice { + eng.enableVideo() + } else { eng.enableAudio() } + eng.setClientRole(role) + return eng + } + ``` + + + - sharedEngine(withAppId:delegate:) + - enableVideo() + - enableAudio() + - setClientRole(_:) + + + - sharedEngine(withAppId:delegate:) + - enableVideo() + - enableAudio() + - setClientRole(_:) + + + + + ```csharp + // Define a public function called SetupAgoraEngine to setup the video SDK engine. + public virtual void SetupAgoraEngine() + { + if(_appID == "" || _token == "") + { + Debug.Log("Please set an app ID and a token in the config file."); + return; + } + // Create an instance of the video SDK engine. + agoraEngine = Agora.Rtc.RtcEngine.CreateAgoraRtcEngine(); + + RtcEngineContext context = new RtcEngineContext(_appID, 0, CHANNEL_PROFILE_TYPE.CHANNEL_PROFILE_COMMUNICATION, + AUDIO_SCENARIO_TYPE.AUDIO_SCENARIO_DEFAULT, region, null); + + agoraEngine.Initialize(context); + + // Enable the audio module. + agoraEngine.EnableAudio(); + + // Set the user role as broadcaster. + agoraEngine.SetClientRole(CLIENT_ROLE_TYPE.CLIENT_ROLE_BROADCASTER); + + // Attach the eventHandler + InitEventHandler(); + + } + ``` + - CreateAgoraRtcEngine + + - Initialize + + - EnableAudio + + - SetClientRole + + + + ```csharp + // Define a public function called SetupAgoraEngine to setup the video SDK engine. 
+ public virtual void SetupAgoraEngine() + { + if(_appID == "" || _token == "") + { + Debug.Log("Please set an app ID and a token in the config file."); + return; + } + // Create an instance of the video SDK engine. + agoraEngine = Agora.Rtc.RtcEngine.CreateAgoraRtcEngine(); + + // Set context configuration based on the product type + CHANNEL_PROFILE_TYPE channelProfile = configData.product == "Video Calling" + ? CHANNEL_PROFILE_TYPE.CHANNEL_PROFILE_COMMUNICATION + : CHANNEL_PROFILE_TYPE.CHANNEL_PROFILE_LIVE_BROADCASTING; + + RtcEngineContext context = new RtcEngineContext(_appID, 0, channelProfile, + AUDIO_SCENARIO_TYPE.AUDIO_SCENARIO_DEFAULT, region, null); + + agoraEngine.Initialize(context); + + // Enable the video module. + agoraEngine.EnableVideo(); + + // Set the user role as broadcaster. + agoraEngine.SetClientRole(CLIENT_ROLE_TYPE.CLIENT_ROLE_BROADCASTER); + + // Attach the eventHandler + InitEventHandler(); + + } + ``` + - CreateAgoraRtcEngine + + - Initialize + + - EnableVideo + + - SetClientRole + + + + ```javascript + const AgoraRTCManager = async (eventsCallback) => { + let agoraEngine = null; + + // Set up the signaling engine with the provided App ID, UID, and configuration + const setupAgoraEngine = async () => { + agoraEngine = new AgoraRTC.createClient({ mode: "rtc", codec: "vp9" }); + }; + + await setupAgoraEngine(); + + const getAgoraEngine = () => { + return agoraEngine; + }; + }; + ``` + - AgoraRTC.createClient + - IAgoraRTCRemoteUser + + + ```typescript + const agoraEngine = useRTCClient(AgoraRTC.createClient({ codec: "vp8", mode: config.selectedProduct })); + + }/> + + ``` + - useRTCClient + - AgoraRTCProvider + + diff --git a/assets/code/video-sdk/get-started-sdk/declare-variables.mdx b/assets/code/video-sdk/get-started-sdk/declare-variables.mdx new file mode 100644 index 000000000..5d41d3aae --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/declare-variables.mdx @@ -0,0 +1,127 @@ + + ```kotlin + protected var agoraEngine: 
RtcEngine? = null // The RtcEngine instance + protected var mListener: AgoraManagerListener? = null // The event handler for AgoraEngine events + protected var config: JSONObject? // Configuration parameters from the config.json file + protected val appId: String // Your App ID from Agora console + var channelName: String // The name of the channel to join + var localUid: Int // UID of the local user + var remoteUids = HashSet<Int>() // An object to store uids of remote users + var isJoined = false // Status of the video call + private set + var isBroadcaster = true // Local user role + ``` + + + ```dart + late Map<String, dynamic> config; // Configuration parameters + int localUid = -1; + String appId = "", channelName = ""; + List<int> remoteUids = []; // Uids of remote users in the channel + bool isJoined = false; // Indicates if the local user has joined the channel + bool isBroadcaster = true; // Client role + RtcEngine? agoraEngine; // Agora engine instance + ``` + + + ```swift + // The Agora App ID for the session. + public let appId: String + // The client's role in the session. + public var role: AgoraClientRole = .audience { + didSet { agoraEngine.setClientRole(role) } + } + + // The set of all users in the channel. + @Published public var allUsers: Set<UInt> = [] + + // Integer ID of the local user. + @Published public var localUserId: UInt = 0 + + private var engine: AgoraRtcEngineKit? + ``` + + + + ```csharp + // Define some variables to be used later. + internal string _appID; + internal string _channelName; + internal string _token; + internal uint remoteUid; + internal IRtcEngine agoraEngine; + internal VideoSurface LocalView; + internal VideoSurface RemoteView; + internal ConfigData configData; + internal AREA_CODE region = AREA_CODE.AREA_CODE_GLOB; + internal string userRole = ""; + + #if (UNITY_2018_3_OR_NEWER && UNITY_ANDROID) + // Define an ArrayList of permissions required for Android devices. 
+ private ArrayList permissionList = new ArrayList() { Permission.Camera, Permission.Microphone }; + #endif + ``` + + + ```csharp + // Define some variables to be used later. + internal string _appID; + internal string _channelName; + internal string _token; + internal uint remoteUid; + internal IRtcEngine agoraEngine; + internal ConfigData configData; + internal AREA_CODE region = AREA_CODE.AREA_CODE_GLOB; + + #if (UNITY_2018_3_OR_NEWER && UNITY_ANDROID) + // Define an ArrayList of permissions required for Android devices. + private ArrayList permissionList = new ArrayList() { Permission.Microphone }; + #endif + ``` + + + + + ```javascript + let channelParameters = { + // A variable to hold a local audio track. + localAudioTrack: null, + // A variable to hold a local video track. + localVideoTrack: null, + // A variable to hold a remote audio track. + remoteAudioTrack: null, + // A variable to hold a remote video track. + remoteVideoTrack: null, + // A variable to hold the remote user id. + remoteUid: null, + }; + + const AgoraRTCManager = async (eventsCallback) => { + let agoraEngine = null; + // Setup done in later steps + }; + ``` + + + ```javascript + let channelParameters = { + // A variable to hold a local audio track. + localAudioTrack: null, + // A variable to hold a remote audio track. 
+ remoteAudioTrack: null, + // A variable to hold the remote user id. + remoteUid: null, + }; + + const AgoraRTCManager = async (eventsCallback) => { + let agoraEngine = null; + // Setup done in later steps + }; + ``` + + + + ```typescript + const [joined, setJoined] = useState(false); + ``` + diff --git a/assets/code/video-sdk/get-started-sdk/destroy.mdx b/assets/code/video-sdk/get-started-sdk/destroy.mdx new file mode 100644 index 000000000..bceeeb02c --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/destroy.mdx @@ -0,0 +1,67 @@ + + ```kotlin + protected fun destroyAgoraEngine() { + // Release the RtcEngine instance to free up resources + RtcEngine.destroy() + agoraEngine = null + } + ``` + + - destroy + + + ```dart + void destroyAgoraEngine() { + // Release the RtcEngine instance to free up resources + if (agoraEngine != null) { + agoraEngine!.release(); + agoraEngine = null; + } + } + ``` + + + ```swift + func destroyAgoraEngine() { + AgoraRtcEngineKit.destroy() + } + ``` + + + - destroy() + + + - destroy() + + + + ```csharp + // Use this function to destroy the engine + public virtual void DestroyEngine() + { + if (agoraEngine != null) + { + // Destroy the engine. + agoraEngine.LeaveChannel(); + agoraEngine.Dispose(); + agoraEngine = null; + } + } + ``` + + - Dispose + + + - Dispose + + + + ```typescript + useEffect(() => { + return () => { + localCameraTrack?.close(); + localMicrophoneTrack?.close(); + }; + }, []); + ``` + diff --git a/assets/code/video-sdk/get-started-sdk/handle-events.mdx b/assets/code/video-sdk/get-started-sdk/handle-events.mdx new file mode 100644 index 000000000..b039ece01 --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/handle-events.mdx @@ -0,0 +1,347 @@ + + ```kotlin + protected open val iRtcEngineEventHandler: IRtcEngineEventHandler? + get() = object : IRtcEngineEventHandler() { + // Listen for a remote user joining the channel. 
+ override fun onUserJoined(uid: Int, elapsed: Int) { + sendMessage("Remote user joined $uid") + // Save the uid of the remote user. + remoteUids.add(uid) + if (isBroadcaster && (currentProduct == ProductName.INTERACTIVE_LIVE_STREAMING + || currentProduct == ProductName.BROADCAST_STREAMING) + ) { + // Remote video does not need to be rendered + } else { + // Set up and return a SurfaceView for the new user + setupRemoteVideo(uid) + } + } + + override fun onJoinChannelSuccess(channel: String, uid: Int, elapsed: Int) { + // Set the joined status to true. + isJoined = true + sendMessage("Joined Channel $channel") + // Save the uid of the local user. + localUid = uid + mListener!!.onJoinChannelSuccess(channel, uid, elapsed) + } + + override fun onUserOffline(uid: Int, reason: Int) { + sendMessage("Remote user offline $uid $reason") + // Update the list of remote Uids + remoteUids.remove(uid) + // Notify the UI + mListener!!.onRemoteUserLeft(uid) + } + + override fun onError(err: Int) { + when (err) { + ErrorCode.ERR_TOKEN_EXPIRED -> sendMessage("Your token has expired") + ErrorCode.ERR_INVALID_TOKEN -> sendMessage("Your token is invalid") + else -> sendMessage("Error code: $err") + } + } + } + ``` + + - IRtcEngineEventHandler + - For a list of error codes, see onError + + + + + ```csharp + // Init event handler to receive callbacks + public virtual void InitEventHandler() + { + agoraEngine.InitEventHandler(new UserEventHandler(this)); + } + // An event handler class to deal with video SDK events + internal class UserEventHandler : IRtcEngineEventHandler + { + internal readonly AgoraManager agoraManager; + + internal UserEventHandler(AgoraManager videoSample) + { + agoraManager = videoSample; + } + // This callback is triggered when the local user joins the channel. 
+ public override void OnJoinChannelSuccess(RtcConnection connection, int elapsed) + { + Debug.Log("You joined channel: " +connection.channelId); + } + // This callback is triggered when a remote user leaves the channel or drops offline. + public override void OnUserOffline(RtcConnection connection, uint uid, USER_OFFLINE_REASON_TYPE reason) + { + Debug.Log("OnUserOffline"); + } + public override void OnUserJoined(RtcConnection connection, uint uid, int elapsed) + { + Debug.Log("OnUserJoined"); + } + } + ``` + - OnJoinChannelSuccess + - OnUserOffline + - OnUserJoined + + + ```csharp + // Init event handler to receive callbacks + public virtual void InitEventHandler() + { + agoraEngine.InitEventHandler(new UserEventHandler(this)); + } + // An event handler class to deal with video SDK events + internal class UserEventHandler : IRtcEngineEventHandler + { + internal readonly AgoraManager agoraManager; + + internal UserEventHandler(AgoraManager videoSample) + { + agoraManager = videoSample; + } + // This callback is triggered when the local user joins the channel. + public override void OnJoinChannelSuccess(RtcConnection connection, int elapsed) + { + Debug.Log("You joined channel: " +connection.channelId); + } + + // This callback is triggered when a remote user leaves the channel or drops offline. + public override void OnUserOffline(RtcConnection connection, uint uid, USER_OFFLINE_REASON_TYPE reason) + { + agoraManager.DestroyVideoView(uid); + } + public override void OnUserJoined(RtcConnection connection, uint uid, int elapsed) + { + agoraManager.MakeVideoView(uid, connection.channelId); + // Save the remote user ID in a variable. 
+ agoraManager.remoteUid = uid; + } + } + ``` + - OnJoinChannelSuccess + - OnUserOffline + - OnUserJoined + + + + ```dart + RtcEngineEventHandler getEventHandler() { + return RtcEngineEventHandler( + // Occurs when the network connection state changes + onConnectionStateChanged: (RtcConnection connection, + ConnectionStateType state, ConnectionChangedReasonType reason) { + if (reason == + ConnectionChangedReasonType.connectionChangedLeaveChannel) { + remoteUids.clear(); + isJoined = false; + } + // Notify the UI + Map eventArgs = {}; + eventArgs["connection"] = connection; + eventArgs["state"] = state; + eventArgs["reason"] = reason; + eventCallback("onConnectionStateChanged", eventArgs); + }, + // Occurs when a local user joins a channel + onJoinChannelSuccess: (RtcConnection connection, int elapsed) { + isJoined = true; + messageCallback( + "Local user uid:${connection.localUid} joined the channel"); + // Notify the UI + Map eventArgs = {}; + eventArgs["connection"] = connection; + eventArgs["elapsed"] = elapsed; + eventCallback("onJoinChannelSuccess", eventArgs); + }, + // Occurs when a remote user joins the channel + onUserJoined: (RtcConnection connection, int remoteUid, int elapsed) { + remoteUids.add(remoteUid); + messageCallback("Remote user uid:$remoteUid joined the channel"); + // Notify the UI + Map eventArgs = {}; + eventArgs["connection"] = connection; + eventArgs["remoteUid"] = remoteUid; + eventArgs["elapsed"] = elapsed; + eventCallback("onUserJoined", eventArgs); + }, + // Occurs when a remote user leaves the channel + onUserOffline: (RtcConnection connection, int remoteUid, + UserOfflineReasonType reason) { + remoteUids.remove(remoteUid); + messageCallback("Remote user uid:$remoteUid left the channel"); + // Notify the UI + Map eventArgs = {}; + eventArgs["connection"] = connection; + eventArgs["remoteUid"] = remoteUid; + eventArgs["reason"] = reason; + eventCallback("onUserOffline", eventArgs); + }, + ); + } + ``` + + + +```swift +func rtcEngine( 
+ _ engine: AgoraRtcEngineKit, didJoinChannel channel: String, + withUid uid: UInt, elapsed: Int +) { + // The delegate is telling us that the local user has successfully joined the channel. + self.localUserId = uid + self.allUsers.insert(uid) +} + +func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) { + // The delegate is telling us that a remote user has joined the channel. + self.allUsers.insert(uid) +} + +func rtcEngine(_ engine: AgoraRtcEngineKit, didOfflineOfUid uid: UInt, reason: AgoraUserOfflineReason) { + // The delegate is telling us that a remote user has left the channel. + self.allUsers.remove(uid) +} +``` + + + +```swift +func rtcEngine( + _ engine: AgoraRtcEngineKit, didJoinChannel channel: String, + withUid uid: UInt, elapsed: Int +) { + // The delegate is telling us that the local user has successfully joined the channel. + self.localUserId = uid + if self.role == .broadcaster { + self.allUsers.insert(uid) + } +} + +func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) { + // The delegate is telling us that a remote user has joined the channel. + self.allUsers.insert(uid) +} + +func rtcEngine(_ engine: AgoraRtcEngineKit, didOfflineOfUid uid: UInt, reason: AgoraUserOfflineReason) { + // The delegate is telling us that a remote user has left the channel. 
+ self.allUsers.remove(uid) +} +``` + + + + - AgoraRtcEngineDelegate + - rtcEngine(_:didJoinChannel:withUid:elapsed:) + - rtcEngine(_:didJoinedOfUid:elapsed:) + - rtcEngine(_:didOfflineOfUid:reason:) + + + - AgoraRtcEngineDelegate + - rtcEngine(_:didJoinChannel:withUid:elapsed:) + - rtcEngine(_:didJoinedOfUid:elapsed:) + - rtcEngine(_:didOfflineOfUid:reason:) + + + + ```typescript + useClientEvent(agoraEngine, "user-joined", (user) => { + console.log("The user" , user.uid , " has joined the channel"); + }); + + useClientEvent(agoraEngine, "user-left", (user) => { + console.log("The user" , user.uid , " has left the channel"); + }); + + useClientEvent(agoraEngine, "user-published", (user, mediaType) => { + console.log("The user" , user.uid , " has published media in the channel"); + }); + ``` + - useClientEvent + + + + ```javascript + // Event Listeners + agoraEngine.on("user-published", async (user, mediaType) => { + // Subscribe to the remote user when the SDK triggers the "user-published" event. + await agoraEngine.subscribe(user, mediaType); + console.log("subscribe success"); + eventsCallback("user-published", user, mediaType); + }); + + // Listen for the "user-unpublished" event. + agoraEngine.on("user-unpublished", (user) => { + console.log(user.uid + " has left the channel"); + }); + ``` + + The `eventsCallback` callback can be used by the UI to handle all events. The sample project uses the following callback: + + ```javascript + const handleVSDKEvents = (eventName, ...args) => { + switch (eventName) { + case "user-published": + if (args[1] == "video") { + // Retrieve the remote video track. + channelParameters.remoteVideoTrack = args[0].videoTrack; + // Retrieve the remote audio track. + channelParameters.remoteAudioTrack = args[0].audioTrack; + // Save the remote user id for reuse. + channelParameters.remoteUid = args[0].uid.toString(); + // Specify the ID of the DIV container. You can use the uid of the remote user. 
+ remotePlayerContainer.id = args[0].uid.toString(); + remotePlayerContainer.textContent = + "Remote user " + args[0].uid.toString(); + // Append the remote container to the page body. + document.body.append(remotePlayerContainer); + // Play the remote video track. + channelParameters.remoteVideoTrack.play(remotePlayerContainer); + } + // Subscribe to and play the remote audio track if the remote user publishes only the audio track. + if (args[1] == "audio") { + // Get the RemoteAudioTrack object in the AgoraRTCRemoteUser object. + channelParameters.remoteAudioTrack = args[0].audioTrack; + // Play the remote audio track. No need to pass any DOM element. + channelParameters.remoteAudioTrack.play(); + } + } + }; + ``` + + + ```javascript + // Event Listeners + agoraEngine.on("user-published", async (user, mediaType) => { + // Subscribe to the remote user when the SDK triggers the "user-published" event. + await agoraEngine.subscribe(user, mediaType); + console.log("subscribe success"); + eventsCallback("user-published", user, mediaType); + }); + + // Listen for the "user-unpublished" event. + agoraEngine.on("user-unpublished", (user) => { + console.log(user.uid + " has left the channel"); + }); + ``` + + The `eventsCallback` callback can be used by the UI to handle all events. The sample project uses the following callback: + + ```javascript + const handleVSDKEvents = (eventName, ...args) => { + switch (eventName) { + case "user-published": + // Subscribe to and play the remote audio track if the remote user publishes only the audio track. + if (args[1] == "audio") { + // Get the RemoteAudioTrack object in the AgoraRTCRemoteUser object. + channelParameters.remoteAudioTrack = args[0].audioTrack; + // Play the remote audio track. No need to pass any DOM element. 
+ channelParameters.remoteAudioTrack.play(); + } + } + }; + ``` + + diff --git a/assets/code/video-sdk/get-started-sdk/import-library.mdx b/assets/code/video-sdk/get-started-sdk/import-library.mdx new file mode 100644 index 000000000..e4bca08bb --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/import-library.mdx @@ -0,0 +1,51 @@ + + ```kotlin + import io.agora.rtc2.video.VideoCanvas + import io.agora.rtc2.* + ``` + + + ```dart + import 'dart:convert'; + import 'package:flutter/services.dart'; + import 'package:agora_rtc_engine/agora_rtc_engine.dart'; + import 'package:permission_handler/permission_handler.dart'; + ``` + + + ```swift + import AgoraRtcKit + import SwiftUI + ``` + + + ```javascript + import AgoraRTC from "agora-rtc-sdk-ng"; + import config from "./config.json"; + ``` + + + + ```csharp + using UnityEngine.UI; + using Agora.Rtc; + using System; + using System.IO; + ``` + + + ```typescript + import { + LocalVideoTrack, + RemoteUser, + useJoin, + useLocalCameraTrack, + useLocalMicrophoneTrack, + usePublish, + useRTCClient, + useRemoteUsers, + useClientEvent + } from "agora-rtc-react"; + import { IMicrophoneAudioTrack, ICameraVideoTrack } from "agora-rtc-sdk-ng"; + ``` + \ No newline at end of file diff --git a/assets/code/video-sdk/get-started-sdk/join-channel.mdx b/assets/code/video-sdk/get-started-sdk/join-channel.mdx new file mode 100644 index 000000000..01aae9084 --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/join-channel.mdx @@ -0,0 +1,365 @@ + + + + + ```kotlin + open fun joinChannel(channelName: String, token: String?): Int { + // Ensure that necessary Android permissions have been granted + if (!checkSelfPermission()) { + sendMessage("Permissions were not granted") + return -1 + } + this.channelName = channelName + + // Create an RTCEngine instance + if (agoraEngine == null) setupAgoraEngine() + + val options = ChannelMediaOptions() + + // For a Video/Voice call, set the channel profile as COMMUNICATION. 
+ options.channelProfile = Constants.CHANNEL_PROFILE_COMMUNICATION + // Set the client role to broadcaster or audience + options.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER + // Start local preview. + agoraEngine!!.startPreview() + + // Join the channel using a token. + agoraEngine!!.joinChannel(token, channelName, localUid, options) + return 0 + } + ``` + + + + ```kotlin + open fun joinChannel(channelName: String, token: String?): Int { + // Ensure that necessary Android permissions have been granted + if (!checkSelfPermission()) { + sendMessage("Permissions were not granted") + return -1 + } + this.channelName = channelName + + // Create an RTCEngine instance + if (agoraEngine == null) setupAgoraEngine() + + val options = ChannelMediaOptions() + // For a Video/Voice call, set the channel profile as COMMUNICATION. + options.channelProfile = Constants.CHANNEL_PROFILE_COMMUNICATION + // Set the client role to broadcaster or audience + options.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER + + // Join the channel using a token. + agoraEngine!!.joinChannel(token, channelName, localUid, options) + return 0 + } + ``` + + + + + ```kotlin + open fun joinChannel(channelName: String, token: String?): Int { + // Ensure that necessary Android permissions have been granted + if (!checkSelfPermission()) { + sendMessage("Permissions were not granted") + return -1 + } + this.channelName = channelName + + // Create an RTCEngine instance + if (agoraEngine == null) setupAgoraEngine() + + val options = ChannelMediaOptions() + // Set the channel profile as LIVE_BROADCASTING. + options.channelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING + + // Set ultra-low latency for Interactive live streaming + options.audienceLatencyLevel = + Constants.AUDIENCE_LATENCY_LEVEL_ULTRA_LOW_LATENCY + + // Set the client role as BROADCASTER or AUDIENCE according to the scenario. 
+ if (isBroadcaster) { // Broadcasting Host or Video-calling client + options.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER + // Start local preview. + agoraEngine!!.startPreview() + } else { // Audience + options.clientRoleType = Constants.CLIENT_ROLE_AUDIENCE + } + + // Join the channel with a token. + agoraEngine!!.joinChannel(token, channelName, localUid, options) + return 0 + } + ``` + + + + + ```kotlin + open fun joinChannel(channelName: String, token: String?): Int { + // Ensure that necessary Android permissions have been granted + if (!checkSelfPermission()) { + sendMessage("Permissions were not granted") + return -1 + } + this.channelName = channelName + + // Create an RTCEngine instance + if (agoraEngine == null) setupAgoraEngine() + + val options = ChannelMediaOptions() + // Set the channel profile as LIVE_BROADCASTING. + options.channelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING + + // Set Low latency for Broadcast streaming + options.audienceLatencyLevel = + Constants.AUDIENCE_LATENCY_LEVEL_LOW_LATENCY + + // Set the client role as BROADCASTER or AUDIENCE according to the scenario. + if (isBroadcaster) { // Broadcasting Host or Video-calling client + options.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER + // Start local preview. + agoraEngine!!.startPreview() + } else { // Audience + options.clientRoleType = Constants.CLIENT_ROLE_AUDIENCE + } + + // Join the channel with a token. + agoraEngine!!.joinChannel(token, channelName, localUid, options) + return 0 + } + ``` + + + - joinChannel + + - ChannelMediaOptions + + - startPreview + + + + + + ```dart + Future join({ + String channelName = '', + String token = '', + int uid = -1, + ClientRoleType clientRole = ClientRoleType.clientRoleBroadcaster, + }) async { + channelName = (channelName.isEmpty) ? this.channelName : channelName; + token = (token.isEmpty) ? config['rtcToken'] : token; + uid = (uid == -1) ? 
localUid : uid; + + // Set up Agora engine + if (agoraEngine == null) await setupAgoraEngine(); + + // Enable the local video preview + await agoraEngine!.startPreview(); + + // Set channel options including the client role and channel profile + ChannelMediaOptions options = ChannelMediaOptions( + clientRoleType: clientRole, + channelProfile: ChannelProfileType.channelProfileCommunication, + ); + + // Join a channel + await agoraEngine!.joinChannel( + token: token, + channelId: channelName, + options: options, + uid: uid, + ); + } + ``` + + + + ```csharp + public virtual void Join() + { + // Create an instance of the engine. + SetupAgoraEngine(); + + // Setup local video view. + SetupLocalVideo(); + + // Join the channel using the specified token and channel name. + agoraEngine.JoinChannel(configData.rtcToken, configData.channelName); + } + ``` + - JoinChannel + + + ```csharp + public virtual void Join() + { + // Create an instance of the engine. + SetupAgoraEngine(); + + // Join the channel using the specified token and channel name. + agoraEngine.JoinChannel(configData.rtcToken, configData.channelName); + } + ``` + - JoinChannel + + + + + +```swift +func joinVideoCall( + _ channel: String, token: String? = nil, uid: UInt = 0 +) async -> Int32 { + /// See ``AgoraManager/checkForPermissions()``, or Apple's docs for details of this method. + if await !AgoraManager.checkForPermissions() { + await self.updateLabel(key: "invalid-permissions") + return -3 + } + + let opt = AgoraRtcChannelMediaOptions() + opt.channelProfile = .communication + + return self.agoraEngine.joinChannel( + byToken: token, channelId: channel, + uid: uid, mediaOptions: opt + ) +} +``` + + + + +```swift +func joinVoiceCall( + _ channel: String, token: String? = nil, uid: UInt = 0 +) async -> Int32 { + /// See ``AgoraManager/checkForPermissions()``, or Apple's docs for details of this method. 
+ if await !AgoraManager.checkForPermissions() { + await self.updateLabel(key: "invalid-permissions") + return -3 + } + + let opt = AgoraRtcChannelMediaOptions() + opt.channelProfile = .communication + + return self.agoraEngine.joinChannel( + byToken: token, channelId: channel, + uid: uid, mediaOptions: opt + ) +} +``` + + + + +```swift +func joinBroadcastStream( + _ channel: String, token: String? = nil, + uid: UInt = 0, isBroadcaster: Bool = true +) async -> Int32 { + /// See ``AgoraManager/checkForPermissions()``, or Apple's docs for details of this method. + if isBroadcaster, await !AgoraManager.checkForPermissions() { + await self.updateLabel(key: "invalid-permissions") + return -3 + } + + let opt = AgoraRtcChannelMediaOptions() + opt.channelProfile = .liveBroadcasting + opt.clientRoleType = isBroadcaster ? .broadcaster : .audience + opt.audienceLatencyLevel = isBroadcaster ? .ultraLowLatency : .lowLatency + + return self.agoraEngine.joinChannel( + byToken: token, channelId: channel, + uid: uid, mediaOptions: opt + ) +} +``` + + + + + - AgoraRtcChannelMediaOptions + - joinChannel(byToken:channelId:uid:mediaOptions:joinSuccess:) + + + - AgoraRtcChannelMediaOptions + - joinChannel(byToken:channelId:uid:mediaOptions:joinSuccess:) + + + + + ```javascript + const join = async (localPlayerContainer, channelParameters) => { + await agoraEngine.join( + config.appId, + config.channelName, + config.token, + config.uid + ); + // Create a local audio track from the audio sampled by a microphone. + channelParameters.localAudioTrack = + await AgoraRTC.createMicrophoneAudioTrack(); + // Create a local video track from the video captured by a camera. + channelParameters.localVideoTrack = await AgoraRTC.createCameraVideoTrack(); + // Append the local video container to the page body. + document.body.append(localPlayerContainer); + // Publish the local audio and video tracks in the channel. 
+ await getAgoraEngine().publish([ + channelParameters.localAudioTrack, + channelParameters.localVideoTrack, + ]); + // Play the local video track. + channelParameters.localVideoTrack.play(localPlayerContainer); + }; + ``` + - join + - createMicrophoneAudioTrack + - createCameraVideoTrack + - publish + - play + + + ```javascript + const join = async (localPlayerContainer, channelParameters) => { + await agoraEngine.join( + config.appId, + config.channelName, + config.token, + config.uid + ); + // Create a local audio track from the audio sampled by a microphone. + channelParameters.localAudioTrack = + await AgoraRTC.createMicrophoneAudioTrack(); + // Append the local container to the page body. + document.body.append(localPlayerContainer); + // Publish the local audio tracks in the channel. + await getAgoraEngine().publish([ + channelParameters.localAudioTrack, + ]); + }; + ``` + - join + - createMicrophoneAudioTrack + - publish + + + + ```typescript + // Publish local tracks + usePublish([localMicrophoneTrack, localCameraTrack]); + + // Join the Agora channel with the specified configuration + useJoin({ + appid: config.appId, + channel: config.channelName, + token: config.rtcToken, + uid: config.uid, + }); + ``` + - usePublish + + - useJoin + diff --git a/assets/code/video-sdk/get-started-sdk/leave-channel.mdx b/assets/code/video-sdk/get-started-sdk/leave-channel.mdx new file mode 100644 index 000000000..892482a63 --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/leave-channel.mdx @@ -0,0 +1,121 @@ + + ```kotlin + fun leaveChannel() { + if (!isJoined) { + sendMessage("Join a channel first") + } else { + // To leave a channel, call the `leaveChannel` method + agoraEngine!!.leaveChannel() + sendMessage("You left the channel") + + // Set the `joined` status to false + isJoined = false + // Destroy the engine instance + destroyAgoraEngine() + } + } + ``` + + - leaveChannel + + + + + ```csharp + // Define a public function called Leave() to leave the channel. 
+ public virtual void Leave() + { + if (agoraEngine != null) + { + // Leave the channel and clean up resources + agoraEngine.LeaveChannel(); + agoraEngine.DisableVideo(); + LocalView.SetEnable(false); + DestroyVideoView(remoteUid); + DestroyEngine(); + } + } + ``` + - LeaveChannel + - DisableVideo + - SetEnable + + + ```csharp + // Define a public function called Leave() to leave the channel. + public virtual void Leave() + { + if (agoraEngine != null) + { + // Leave the channel and clean up resources + agoraEngine.LeaveChannel(); + DestroyEngine(); + } + } + ``` + - LeaveChannel + + + + ```dart + Future leave() async { + // Clear saved remote Uids + remoteUids.clear(); + + // Leave the channel + if (agoraEngine != null) { + await agoraEngine!.leaveChannel(); + } + isJoined = false; + + // Destroy the Agora engine instance + destroyAgoraEngine(); + } + ``` + + + ```swift + func leaveChannel( + leaveChannelBlock: ((AgoraChannelStats) -> Void)? = nil + ) -> Int32 { + let leaveErr = self.agoraEngine.leaveChannel(leaveChannelBlock) + self.agoraEngine.stopPreview() + self.allUsers.removeAll() + return leaveErr + } + ``` + + + - leaveChannel(_:) + - stopPreview() + + + - leaveChannel(_:) + - stopPreview() + + + + + ```javascript + const leave = async (channelParameters) => { + // Destroy the local audio and video tracks. + channelParameters.localAudioTrack.close(); + channelParameters.localVideoTrack.close(); + // Leave the channel. + await agoraEngine.leave(); + }; + ``` + - leave + + + ```javascript + const leave = async (channelParameters) => { + // Destroy the local audio track. 
+ channelParameters.localAudioTrack.close(); + await agoraEngine.leave(); + }; + ``` + - leave + + + diff --git a/assets/code/video-sdk/get-started-sdk/local-video.mdx b/assets/code/video-sdk/get-started-sdk/local-video.mdx new file mode 100644 index 000000000..53c9eb923 --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/local-video.mdx @@ -0,0 +1,102 @@ + + ```kotlin + val localVideo: SurfaceView + get() { + // Create a SurfaceView object for the local video + val localSurfaceView = SurfaceView(mContext) + localSurfaceView.visibility = View.VISIBLE + // Call setupLocalVideo with a VideoCanvas having uid set to 0. + agoraEngine!!.setupLocalVideo( + VideoCanvas( + localSurfaceView, + VideoCanvas.RENDER_MODE_HIDDEN, + 0 + ) + ) + return localSurfaceView + } + ``` + + - setupLocalVideo + + + + ```dart + AgoraVideoView localVideoView() { + return AgoraVideoView( + controller: VideoViewController( + rtcEngine: agoraEngine!, + canvas: const VideoCanvas(uid: 0), // Use uid = 0 for local view + ), + ); + } + ``` + + + ```swift + func createLocalCanvasView() { + // Create and return the video view + var canvas = AgoraRtcVideoCanvas() + let canvasView = UIView() + canvas.view = canvasView + + agoraEngine.startPreview() + agoraEngine.setupLocalVideo(canvas) + } + ``` + + - AgoraRtcVideoCanvas + - startPreview() + - setupLocalVideo(_:) + + See [`AgoraVideoCanvasView`](https://github.com/AgoraIO/video-sdk-samples-ios/blob/main/agora-manager/AgoraVideoCanvasView.swift) for a full implementation. + + + + ```csharp + public virtual void SetupLocalVideo() + { + // Set the local video view. + LocalView.SetForUser(configData.uid, _channelName); + + // Start rendering local video. 
+ LocalView.SetEnable(true); + + } + ``` + - SetForUser + - SetEnable + + + ```swift + func createLocalCanvasView() { + // Create and return the video view + var canvas = AgoraRtcVideoCanvas() + let canvasView = NSView() + canvas.view = canvasView + + agoraEngine.startPreview() + agoraEngine.setupLocalVideo(canvas) + } + ``` + + - AgoraRtcVideoCanvas + - startPreview() + - setupLocalVideo(_:) + + See [`AgoraVideoCanvasView`](https://github.com/AgoraIO/video-sdk-samples-macos/blob/main/agora-manager/AgoraVideoCanvasView.swift) for a full implementation. + + + + ```javascript + + ``` + + + ```typescript +
+ +
+ ``` + - LocalAudioTrack +
diff --git a/assets/code/video-sdk/get-started-sdk/remote-video.mdx b/assets/code/video-sdk/get-started-sdk/remote-video.mdx new file mode 100644 index 000000000..c1bb7c418 --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/remote-video.mdx @@ -0,0 +1,121 @@ + + ```kotlin + protected fun setupRemoteVideo(remoteUid: Int) { + // Create a new SurfaceView + val remoteSurfaceView = SurfaceView(mContext) + remoteSurfaceView.setZOrderMediaOverlay(true) + // Create a VideoCanvas using the remoteSurfaceView + val videoCanvas = VideoCanvas( + remoteSurfaceView, + VideoCanvas.RENDER_MODE_FIT, remoteUid + ) + agoraEngine!!.setupRemoteVideo(videoCanvas) + // Set the visibility + remoteSurfaceView.visibility = View.VISIBLE + // Notify the UI to display the video + mListener!!.onRemoteUserJoined(remoteUid, remoteSurfaceView) + } + ``` + - VideoCanvas + - setupRemoteVideo + + + + ```csharp + // Dynamically create views for the remote users + public void MakeVideoView(uint uid, string channelName) + { + // Create and configure a remote user's video view + AgoraUI agoraUI = new AgoraUI(); + GameObject userView = agoraUI.MakeRemoteView(uid.ToString()); + userView.AddComponent(); + + VideoSurface videoSurface = userView.AddComponent(); + videoSurface.SetForUser(uid, channelName, VIDEO_SOURCE_TYPE.VIDEO_SOURCE_REMOTE); + videoSurface.OnTextureSizeModify += (int width, int height) => + { + float scale = (float)height / (float)width; + videoSurface.transform.localScale = new Vector3(-5, 5 * scale, 1); + Debug.Log("OnTextureSizeModify: " + width + " " + height); + }; + videoSurface.SetEnable(true); + + RemoteView = videoSurface; + } + + // Destroy the remote user's video view when they leave + public void DestroyVideoView(uint uid) + { + var userView = GameObject.Find(uid.ToString()); + if (!ReferenceEquals(userView, null)) + { + userView.SetActive(false); // Deactivate the GameObject + } + } + ``` + - VideoSurface + - SetForUser + + + + ```dart + AgoraVideoView remoteVideoView(int 
remoteUid) { + return AgoraVideoView( + controller: VideoViewController.remote( + rtcEngine: agoraEngine!, + canvas: VideoCanvas(uid: remoteUid), + connection: RtcConnection(channelId: channelName), + ), + ); + } + ``` + + + ```swift + func createRemoteCanvasView(with uid: UInt) { + // Create and return the video view + var canvas = AgoraRtcVideoCanvas() + canvas.uid = uid + let canvasView = UIView() + canvas.view = canvasView + + agoraEngine.setupRemoteVideo(canvas) + } + ``` + + - AgoraRtcVideoCanvas + - setupRemoteVideo(_:) + +See [`AgoraVideoCanvasView`](https://github.com/AgoraIO/video-sdk-samples-ios/blob/main/agora-manager/AgoraVideoCanvasView.swift) for a full implementation. + + + + ```swift + func createRemoteCanvasView(with uid: UInt) { + // Create and return the video view + var canvas = AgoraRtcVideoCanvas() + canvas.uid = uid + let canvasView = NSView() + canvas.view = canvasView + + agoraEngine.setupRemoteVideo(canvas) + } + ``` + + - AgoraRtcVideoCanvas + - setupRemoteVideo(_:) + +See [`AgoraVideoCanvasView`](https://github.com/AgoraIO/video-sdk-samples-macos/blob/main/agora-manager/AgoraVideoCanvasView.swift) for a full implementation. + + + + ```typescript + const remoteUsers = useRemoteUsers(); + {remoteUsers.map((remoteUser) => ( +
+ +
+ ))} + ``` + - RemoteVideoTrack +
diff --git a/assets/code/video-sdk/get-started-sdk/request-permissions.mdx b/assets/code/video-sdk/get-started-sdk/request-permissions.mdx new file mode 100644 index 000000000..e44d6a293 --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/request-permissions.mdx @@ -0,0 +1,102 @@ + + + ```kotlin + companion object { + protected const val PERMISSION_REQ_ID = 22 + protected val REQUESTED_PERMISSIONS = arrayOf( + Manifest.permission.RECORD_AUDIO, + Manifest.permission.CAMERA + ) + } + + protected fun checkSelfPermission(): Boolean { + return ContextCompat.checkSelfPermission( + mContext, + REQUESTED_PERMISSIONS[0] + ) == PackageManager.PERMISSION_GRANTED && + ContextCompat.checkSelfPermission( + mContext, + REQUESTED_PERMISSIONS[1] + ) == PackageManager.PERMISSION_GRANTED + } + + if (!checkSelfPermission()) { + ActivityCompat.requestPermissions(activity, REQUESTED_PERMISSIONS, PERMISSION_REQ_ID) + } + ``` + + + ```kotlin + companion object { + protected const val PERMISSION_REQ_ID = 22 + protected val REQUESTED_PERMISSIONS = arrayOf( + Manifest.permission.RECORD_AUDIO + ) + } + + protected fun checkSelfPermission(): Boolean { + // Only the RECORD_AUDIO permission is requested for voice calling + return ContextCompat.checkSelfPermission( + mContext, + REQUESTED_PERMISSIONS[0] + ) == PackageManager.PERMISSION_GRANTED + } + + if (!checkSelfPermission()) { + ActivityCompat.requestPermissions(activity, REQUESTED_PERMISSIONS, PERMISSION_REQ_ID) + } + ``` + + + + + + ```csharp + // Define a private function called CheckPermissions() to check for required permissions. + public void CheckPermissions() + { + #if (UNITY_2018_3_OR_NEWER && UNITY_ANDROID) + // Check for each permission in the permission list and request the user to grant it if necessary. 
+ foreach (string permission in permissionList) + { + if (!Permission.HasUserAuthorizedPermission(permission)) + { + Permission.RequestUserPermission(permission); + } + } + #endif + } + ``` + + + ```swift + static func checkForPermissions() async -> Bool { + var hasPermissions = await self.avAuthorization(mediaType: .video) + // Break out, because camera permissions have been denied or restricted. + if !hasPermissions { return false } + hasPermissions = await self.avAuthorization(mediaType: .audio) + return hasPermissions + } + + static func avAuthorization(mediaType: AVMediaType) async -> Bool { + let mediaAuthorizationStatus = AVCaptureDevice.authorizationStatus(for: mediaType) + switch mediaAuthorizationStatus { + case .denied, .restricted: return false + case .authorized: return true + case .notDetermined: + return await withCheckedContinuation { continuation in + AVCaptureDevice.requestAccess(for: mediaType) { granted in + continuation.resume(returning: granted) + } + } + @unknown default: return false + } + } + ``` + + - authorizationStatus(for:) + - requestAccess(for:completionHandler:) + \ No newline at end of file diff --git a/assets/code/video-sdk/get-started-sdk/set-user-role.mdx b/assets/code/video-sdk/get-started-sdk/set-user-role.mdx new file mode 100644 index 000000000..6565084a8 --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/set-user-role.mdx @@ -0,0 +1,68 @@ + + + ```csharp + public virtual void SetClientRole(string role) + { + if(agoraEngine == null) + { + Debug.Log("Click join and then change the client role!"); + return; + } + userRole = role; + if (role == "Host") + { + agoraEngine.SetClientRole(CLIENT_ROLE_TYPE.CLIENT_ROLE_BROADCASTER); + } + else if(role == "Audience") + { + agoraEngine.SetClientRole(CLIENT_ROLE_TYPE.CLIENT_ROLE_AUDIENCE); + } + } + ``` + - SetClientRole + + + + + ```typescript + const agoraEngine = useRTCClient(); + const handleRoleChange = (event: React.ChangeEvent) => + { + setRole(event.target.value); + 
if(event.target.value === "host") + { + agoraEngine.setClientRole("host").then(() => { + // Your code to handle the resolution of the promise + console.log("Client role set to host successfully"); + }).catch((error) => { + // Your code to handle any errors + console.error("Error setting client role:", error); + }); + } + else + { + agoraEngine.setClientRole("audience").then(() => { + // Your code to handle the resolution of the promise + console.log("Client role set to audience successfully"); + }).catch((error) => { + // Your code to handle any errors + console.error("Error setting client role:", error); + }); + } + }; + ``` + + + ```swift + agoraEngine.setClientRole(role) + ``` + + + - setClientRole(_:) + - AgoraClientRole + + + - setClientRole(_:) + - AgoraClientRole + + diff --git a/assets/code/video-sdk/get-started-sdk/setup-audio-video-tracks.mdx b/assets/code/video-sdk/get-started-sdk/setup-audio-video-tracks.mdx new file mode 100644 index 000000000..7eccdd628 --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/setup-audio-video-tracks.mdx @@ -0,0 +1,9 @@ + + ```typescript + const { isLoading: isLoadingCam, localCameraTrack } = useLocalCameraTrack(); + const { isLoading: isLoadingMic, localMicrophoneTrack } = useLocalMicrophoneTrack(); + ``` + - useLocalCameraTrack + - useLocalMicrophoneTrack + + \ No newline at end of file diff --git a/assets/code/video-sdk/get-started-sdk/swift/create-ui.mdx b/assets/code/video-sdk/get-started-sdk/swift/create-ui.mdx deleted file mode 100644 index 6ceef5810..000000000 --- a/assets/code/video-sdk/get-started-sdk/swift/create-ui.mdx +++ /dev/null @@ -1,278 +0,0 @@ - - - -``` swift -import Cocoa -import AppKit -import Foundation - -class ViewController: NSViewController { - - // The video feed for the local user is displayed here - var localView: NSView! - // The video feed for the remote user is displayed here - var remoteView: NSView! - // Click to join or leave a call - var joinButton: NSButton! 
- // Track if the local user is in a call - var joined: Bool = false - - override func viewDidLoad() { - super.viewDidLoad() - initViews() - } - - func joinChannel() - - func leaveChannel() {} - - func initViews() { - - // Initializes the remote video view. This view displays video when a remote host joins the channel. - remoteView = NSView() - remoteView.frame = CGRect(x: 20, y: 80, width: 150, height: 150) - self.view.addSubview(remoteView) - localView = NSView() - localView.frame = CGRect(x: 300, y: 80, width: 150, height: 150) - self.view.addSubview(localView) - - joinButton = NSButton() - joinButton.frame = CGRect(x: 200 , y: 10, width: 50, height: 20) - joinButton.title = "Join" - joinButton.target = self - joinButton.action = #selector(buttonAction) - self.view.addSubview(joinButton) - } - - - @objc func buttonAction(sender: NSButton!) { - if !joined { - joinChannel() - // Check if successfully joined the channel and set button title accordingly - if joined { joinButton.title = "Leave" } - } else { - leaveChannel() - // Check if successfully left the channel and set button title accordingly - if !joined { joinButton.title = "Join"} - } - } - -} -``` - - - -``` swift -import Cocoa -import AppKit -import Foundation - -class ViewController: NSViewController { - - // The video feed for the local user is displayed here - var localView: NSView! - // The video feed for the remote user is displayed here - var remoteView: NSView! - // Click to join or leave a call - var joinButton: NSButton! - // Choose to be broadcaster or audience - var role: NSSegmentedControl! - // Track if the local user is in a call - var joined: Bool = false - - override func viewDidLoad() { - super.viewDidLoad() - initViews() - } - - func joinChannel() - - func leaveChannel() {} - - func initViews() { - - // Initializes the remote video view. This view displays video when a remote host joins the channel. 
- remoteView = NSView() - remoteView.frame = CGRect(x: 20, y: 80, width: 150, height: 150) - self.view.addSubview(remoteView) - localView = NSView() - localView.frame = CGRect(x: 300, y: 80, width: 150, height: 150) - self.view.addSubview(localView) - - joinButton = NSButton() - joinButton.frame = CGRect(x: 200 , y:10, width:50, height:20) - joinButton.title = "Join" - joinButton.target = self - joinButton.action = #selector(buttonAction) - self.view.addSubview(joinButton) - - // Selector to be the host or the audience - role = NSSegmentedControl( - labels: ["Broadcast", "Audience"], trackingMode: .selectOne, target: self, - action: #selector(roleAction) - ) - role.frame = CGRect(x: 20, y: 10, width: 160, height: 20) - role.selectedSegment = 0 - self.view.addSubview(role) - } - - - @objc func buttonAction(sender: NSButton!) { - if !joined { - joinChannel() - // Check if successfully joined the channel and set button title accordingly - if joined { joinButton.title = "Leave" } - } else { - leaveChannel() - // Check if successfully left the channel and set button title accordingly - if !joined { joinButton.title = "Join"} - } - } - - @objc func roleAction(sender: NSSegmentedControl!) {} -} -``` - - - - - - - ``` swift - import UIKit - import AVFoundation - - class ViewController: UIViewController { - // The video feed for the local user is displayed here - var localView: UIView! - // The video feed for the remote user is displayed here - var remoteView: UIView! - // Click to join or leave a call - var joinButton: UIButton! - - // Track if the local user is in a call - var joined: Bool = false { - didSet { - DispatchQueue.main.async { - self.joinButton.setTitle( self.joined ? 
"Leave" : "Join", for: .normal) - } - } - } - - override func viewDidLoad() { - super.viewDidLoad() - initViews() - } - - func joinChannel() async { } - - func leaveChannel() {} - - override func viewDidLayoutSubviews() { - super.viewDidLayoutSubviews() - remoteView.frame = CGRect(x: 20, y: 50, width: 350, height: 330) - localView.frame = CGRect(x: 20, y: 400, width: 350, height: 330) - } - - func initViews() { - // Initializes the remote video view. This view displays video when a remote host joins the channel. - remoteView = UIView() - self.view.addSubview(remoteView) - // Initializes the local video window. This view displays video when the local user is a host. - localView = UIView() - self.view.addSubview(localView) - // Button to join or leave a channel - joinButton = UIButton(type: .system) - joinButton.frame = CGRect(x: 140, y: 700, width: 100, height: 50) - joinButton.setTitle("Join", for: .normal) - - joinButton.addTarget(self, action: #selector(buttonAction), for: .touchUpInside) - self.view.addSubview(joinButton) - } - - @objc func buttonAction(sender: UIButton!) { - if !joined { - sender.isEnabled = false - Task { - await joinChannel() - sender.isEnabled = true - } - } else { - leaveChannel() - } - } - } - ``` - - - ``` swift - import AVFoundation - import UIKit - - class ViewController: UIViewController { - // The video feed for the local user is displayed here - var localView: UIView! - // The video feed for the remote user is displayed here - var remoteView: UIView! - // Click to join or leave a call - var joinButton: UIButton! - // Choose to be broadcaster or audience - var role: UISegmentedControl! 
- // Track if the local user is in a call - var joined: Bool = false - - override func viewDidLoad() { - super.viewDidLoad() - initViews() - } - - func joinChannel() async { } - - func leaveChannel() {} - - override func viewDidLayoutSubviews() { - super.viewDidLayoutSubviews() - remoteView.frame = CGRect(x: 20, y: 50, width: 350, height: 330) - localView.frame = CGRect(x: 20, y: 400, width: 350, height: 330) - } - - func initViews() { - // Initializes the remote video view. This view displays video when a remote host joins the channel. - remoteView = UIView() - self.view.addSubview(remoteView) - // Initializes the local video window. This view displays video when the local user is a host. - localView = UIView() - self.view.addSubview(localView) - // Button to join or leave a channel - joinButton = UIButton(type: .system) - joinButton.frame = CGRect(x: 140, y: 700, width: 100, height: 50) - joinButton.setTitle("Join", for: .normal) - - joinButton.addTarget(self, action: #selector(buttonAction), for: .touchUpInside) - self.view.addSubview(joinButton) - - // Selector to be the host or the audience - role = UISegmentedControl(items: ["Broadcast", "Audience"]) - role.frame = CGRect(x: 20, y: 740, width: 350, height: 40) - role.selectedSegmentIndex = 0 - role.addTarget(self, action: #selector(roleAction), for: .valueChanged) - self.view.addSubview(role) - } - - @objc func buttonAction(sender: UIButton!) { - if !joined { - sender.isEnabled = false - Task { - await joinChannel() - sender.isEnabled = true - } - } else { - leaveChannel() - } - } - - @objc func roleAction(sender: UISegmentedControl!) 
{} - } - ``` - - \ No newline at end of file diff --git a/assets/code/video-sdk/get-started-sdk/swift/join-and-leave.mdx b/assets/code/video-sdk/get-started-sdk/swift/join-and-leave.mdx deleted file mode 100644 index 76abf28e5..000000000 --- a/assets/code/video-sdk/get-started-sdk/swift/join-and-leave.mdx +++ /dev/null @@ -1,239 +0,0 @@ - - - - ``` swift - func joinChannel() { - - let option = AgoraRtcChannelMediaOptions() - - // Set the client role option as broadcaster or audience. - if self.userRole == .broadcaster { - option.clientRoleType = .broadcaster - setupLocalVideo() - } else { - option.clientRoleType = .audience - } - - // For a video call scenario, set the channel profile as communication. - option.channelProfile = .communication - - // Join the channel with a temp token. Pass in your token and channel name here - let result = agoraEngine.joinChannel( - byToken: token, channelId: channelName, uid: 0, mediaOptions: option, - joinSuccess: { (channel, uid, elapsed) in } - ) - // Check if joining the channel was successful and set joined Bool accordingly - if result == 0 { - joined = true - showMessage(title: "Success", text: "Successfully joined the channel as \(self.userRole)") - } - } - - func leaveChannel() { - agoraEngine.stopPreview() - let result = agoraEngine.leaveChannel(nil) - // Check if leaving the channel was successful and set joined Bool accordingly - if result == 0 { joined = false } - } - ``` - - - - ``` swift - func joinChannel() { - - let option = AgoraRtcChannelMediaOptions() - - // Set the client role option as broadcaster or audience. - if self.userRole == .broadcaster { - option.clientRoleType = .broadcaster - setupLocalVideo() - } else { - option.clientRoleType = .audience - } - // For a live streaming scenario, set the channel profile as liveBroadcasting. - option.channelProfile = .liveBroadcasting - // Join the channel with a temp token. 
Pass in your token and channel name here - let result = agoraEngine.joinChannel( - byToken: token, channelId: channelName, uid: 0, mediaOptions: option, - joinSuccess: { (channel, uid, elapsed) in } - ) - // Check if joining the channel was successful and set joined Bool accordingly - if result == 0 { - joined = true - showMessage(title: "Success", text: "Successfully joined the channel as \(self.userRole)") - } - } - - func leaveChannel() { - agoraEngine.stopPreview() - let result = agoraEngine.leaveChannel(nil) - // Check if leaving the channel was successful and set joined Bool accordingly - if result == 0 { joined = false } - } - ``` - - - ``` swift - func joinChannel() { - - let option = AgoraRtcChannelMediaOptions() - - // Set the client role option as broadcaster or audience. - if self.userRole == .broadcaster { - option.clientRoleType = .broadcaster - setupLocalVideo() - } else { - option.clientRoleType = .audience - option.audienceLatencyLevel = .lowLatency - } - // For a live streaming scenario, set the channel profile as liveBroadcasting. - option.channelProfile = .liveBroadcasting - // Join the channel with a temp token. Pass in your token and channel name here - let result = agoraEngine.joinChannel( - byToken: token, channelId: channelName, uid: 0, mediaOptions: option, - joinSuccess: { (channel, uid, elapsed) in } - ) - // Check if joining the channel was successful and set joined Bool accordingly - if result == 0 { - joined = true - showMessage(title: "Success", text: "Successfully joined the channel as \(self.userRole)") - } - } - - func leaveChannel() { - agoraEngine.stopPreview() - let result = agoraEngine.leaveChannel(nil) - // Check if leaving the channel was successful and set joined Bool accordingly - if (result == 0) { joined = false } - } - ``` - For , use `AgoraAudienceLatencyLevelLowLatency` for audience roles. This ensures low latency, which is a feature of and its use is subject to special [pricing](../reference/pricing#unit-pricing). 
- - - - - - - ``` swift - func joinChannel() async { - if await !self.checkForPermissions() { - showMessage(title: "Error", text: "Permissions were not granted") - return - } - - let option = AgoraRtcChannelMediaOptions() - - // Set the client role option as broadcaster or audience. - if self.userRole == .broadcaster { - option.clientRoleType = .broadcaster - setupLocalVideo() - } else { - option.clientRoleType = .audience - } - - // For a video call scenario, set the channel profile as communication. - option.channelProfile = .communication - - // Join the channel with a temp token. Pass in your token and channel name here - let result = agoraEngine.joinChannel( - byToken: token, channelId: channelName, uid: 0, mediaOptions: option, - joinSuccess: { (channel, uid, elapsed) in } - ) - // Check if joining the channel was successful and set joined Bool accordingly - if result == 0 { - joined = true - showMessage(title: "Success", text: "Successfully joined the channel as \(self.userRole)") - } - } - - func leaveChannel() { - agoraEngine.stopPreview() - let result = agoraEngine.leaveChannel(nil) - // Check if leaving the channel was successful and set joined Bool accordingly - if result == 0 { joined = false } - } - ``` - - - - ``` swift - func joinChannel() async { - if await !self.checkForPermissions() { - showMessage(title: "Error", text: "Permissions were not granted") - return - } - - let option = AgoraRtcChannelMediaOptions() - - // Set the client role option as broadcaster or audience. - if self.userRole == .broadcaster { - option.clientRoleType = .broadcaster - setupLocalVideo() - } else { - option.clientRoleType = .audience - } - // For a live streaming scenario, set the channel profile as liveBroadcasting. - option.channelProfile = .liveBroadcasting - // Join the channel with a temp token. 
Pass in your token and channel name here - let result = agoraEngine.joinChannel( - byToken: token, channelId: channelName, uid: 0, mediaOptions: option, - joinSuccess: { (channel, uid, elapsed) in } - ) - // Check if joining the channel was successful and set joined Bool accordingly - if result == 0 { - joined = true - showMessage(title: "Success", text: "Successfully joined the channel as \(self.userRole)") - } - } - - func leaveChannel() { - agoraEngine.stopPreview() - let result = agoraEngine.leaveChannel(nil) - // Check if leaving the channel was successful and set joined Bool accordingly - if result == 0 { joined = false } - } - ``` - - - ``` swift - func joinChannel() { - if !self.checkForPermissions() { - showMessage(title: "Error", text: "Permissions were not granted") - return - } - - let option = AgoraRtcChannelMediaOptions() - - // Set the client role option as broadcaster or audience. - if self.userRole == .broadcaster { - option.clientRoleType = .broadcaster - setupLocalVideo() - } else { - option.clientRoleType = .audience - option.audienceLatencyLevel = .lowLatency - } - // For a live streaming scenario, set the channel profile as liveBroadcasting. - option.channelProfile = .liveBroadcasting - // Join the channel with a temp token. Pass in your token and channel name here - let result = agoraEngine.joinChannel( - byToken: token, channelId: channelName, uid: 0, mediaOptions: option, - joinSuccess: { (channel, uid, elapsed) in } - ) - // Check if joining the channel was successful and set joined Bool accordingly - if result == 0 { - joined = true - showMessage(title: "Success", text: "Successfully joined the channel as \(self.userRole)") - } - } - - func leaveChannel() { - agoraEngine.stopPreview() - let result = agoraEngine.leaveChannel(nil) - // Check if leaving the channel was successful and set joined Bool accordingly - if (result == 0) { joined = false } - } - ``` - For , use `AgoraAudienceLatencyLevelLowLatency` for audience roles. 
This ensures low latency, which is a feature of and its use is subject to special [pricing](../reference/pricing#unit-pricing). - - \ No newline at end of file diff --git a/assets/code/video-sdk/get-started-sdk/swift/role-action.mdx b/assets/code/video-sdk/get-started-sdk/swift/role-action.mdx deleted file mode 100644 index 3451eb33e..000000000 --- a/assets/code/video-sdk/get-started-sdk/swift/role-action.mdx +++ /dev/null @@ -1,14 +0,0 @@ - -``` swift -@objc func roleAction(sender: NSSegmentedControl!) { - self.userRole = sender.selectedSegment == 0 ? .broadcaster : .audience -} -``` - - -``` swift -@objc func roleAction(sender: UISegmentedControl!) { - self.userRole = sender.selectedSegmentIndex == 0 ? .broadcaster : .audience -} -``` - \ No newline at end of file diff --git a/assets/code/video-sdk/get-started-sdk/swift/show-message.mdx b/assets/code/video-sdk/get-started-sdk/swift/show-message.mdx deleted file mode 100644 index fc02562b3..000000000 --- a/assets/code/video-sdk/get-started-sdk/swift/show-message.mdx +++ /dev/null @@ -1,26 +0,0 @@ - -``` swift -func showMessage(title: String, text: String, delay: Int = 2) -> Void { - let deadlineTime = DispatchTime.now() + .seconds(delay) - DispatchQueue.main.asyncAfter(deadline: deadlineTime, execute: { - let alert: NSAlert = NSAlert() - alert.messageText = title - alert.informativeText = text - alert.alertStyle = .informational - alert.runModal() - }) -} -``` - - -``` swift -func showMessage(title: String, text: String, delay: Int = 2) -> Void { - let deadlineTime = DispatchTime.now() + .seconds(delay) - DispatchQueue.main.asyncAfter(deadline: deadlineTime, execute: { - let alert = UIAlertController(title: title, message: text, preferredStyle: .alert) - self.present(alert, animated: true) - alert.dismiss(animated: true, completion: nil) - }) -} -``` - \ No newline at end of file diff --git a/assets/code/video-sdk/get-started-sdk/swift/view-did-disappear.mdx 
b/assets/code/video-sdk/get-started-sdk/swift/view-did-disappear.mdx deleted file mode 100644 index f1055a74e..000000000 --- a/assets/code/video-sdk/get-started-sdk/swift/view-did-disappear.mdx +++ /dev/null @@ -1,19 +0,0 @@ - -``` swift - override func viewDidDisappear() - { - super.viewDidDisappear() - leaveChannel() - DispatchQueue.global(qos: .userInitiated).async {AgoraRtcEngineKit.destroy()} - } -``` - - -``` swift - override func viewDidDisappear(_ animated: Bool) { - super.viewDidDisappear(animated) - leaveChannel() - DispatchQueue.global(qos: .userInitiated).async {AgoraRtcEngineKit.destroy()} - } -``` - \ No newline at end of file diff --git a/assets/code/video-sdk/live-streaming-multiple-channels/import-library.mdx b/assets/code/video-sdk/live-streaming-multiple-channels/import-library.mdx new file mode 100644 index 000000000..9a26cf75d --- /dev/null +++ b/assets/code/video-sdk/live-streaming-multiple-channels/import-library.mdx @@ -0,0 +1,43 @@ + + ```kotlin + import io.agora.rtc2.* + import io.agora.rtc2.video.ChannelMediaInfo + import io.agora.rtc2.video.ChannelMediaRelayConfiguration + import io.agora.rtc2.video.VideoCanvas + ``` + + + ```swift + import AgoraRtcKit + ``` + + + ```csharp + using Agora.Rtc; + ``` + + + ```typescript + import { + AgoraRTCProvider, + useRTCClient, + useConnectionState, + usePublish, + useJoin, + useRemoteUsers, + RemoteUser, + useLocalCameraTrack, + useLocalMicrophoneTrack + } from "agora-rtc-react"; + import AgoraRTC, { IAgoraRTCClient, ChannelMediaRelayState, ChannelMediaRelayError, ChannelMediaRelayEvent } from "agora-rtc-sdk-ng"; + import config from "../agora-manager/config"; + import {useState} from 'react'; + import AuthenticationWorkflowManager from "../authentication-workflow/authenticationWorkflowManager"; + ``` + + +```js +import AgoraManager from "../agora_manager/agora_manager.js"; +import AgoraRTC from "agora-rtc-sdk-ng"; +``` + diff --git 
a/assets/code/video-sdk/live-streaming-multiple-channels/join-a-second-channel.mdx b/assets/code/video-sdk/live-streaming-multiple-channels/join-a-second-channel.mdx new file mode 100644 index 000000000..803eeda53 --- /dev/null +++ b/assets/code/video-sdk/live-streaming-multiple-channels/join-a-second-channel.mdx @@ -0,0 +1,236 @@ + + ```kotlin + fun joinSecondChannel() { + // Create an RtcEngineEx instance + // This interface class contains multi-channel methods + agoraEngineEx = RtcEngineEx.create(mContext, appId, secondChannelEventHandler) as RtcEngineEx + // By default, the video module is disabled, call enableVideo to enable it. + agoraEngineEx.enableVideo() + + if (isSecondChannelJoined) { + agoraEngineEx.leaveChannelEx(rtcSecondConnection) + } else { + val mediaOptions = ChannelMediaOptions() + if (!isBroadcaster) { // Audience Role + mediaOptions.autoSubscribeAudio = true + mediaOptions.autoSubscribeVideo = true + mediaOptions.clientRoleType = Constants.CLIENT_ROLE_AUDIENCE + } else { // Host Role + mediaOptions.publishCameraTrack = true + mediaOptions.publishMicrophoneTrack = true + mediaOptions.channelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING + mediaOptions.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER + } + rtcSecondConnection = RtcConnection() + rtcSecondConnection!!.channelId = secondChannelName + rtcSecondConnection!!.localUid = secondChannelUid + + if (isValidURL(serverUrl)) { + fetchToken(secondChannelName, secondChannelUid, object : TokenCallback { + override fun onTokenReceived(rtcToken: String?) 
{
+                    // Handle the received rtcToken
+                    if (rtcToken != null) secondChannelToken = rtcToken
+                    agoraEngineEx.joinChannelEx(
+                        secondChannelToken,
+                        rtcSecondConnection,
+                        mediaOptions,
+                        secondChannelEventHandler
+                    )
+                }
+
+                override fun onError(errorMessage: String) {
+                    // Handle the error
+                    sendMessage("Error: $errorMessage")
+                }
+            })
+        } else {
+            agoraEngineEx.joinChannelEx(
+                secondChannelToken,
+                rtcSecondConnection,
+                mediaOptions,
+                secondChannelEventHandler
+            )
+        }
+    }
+}
+```
+    - RtcEngineEx
+    - RtcConnection
+    - joinChannelEx
+    - leaveChannelEx
+
+
+    The `joinChannelEx` method lets you join a second channel. If you have already joined a second channel, call `leaveChannelEx` to leave it.
+
+    ```swift
+    func joinChannelEx(token: String?) -> Int32 {
+        let mediaOptions = AgoraRtcChannelMediaOptions()
+        mediaOptions.channelProfile = .liveBroadcasting
+        mediaOptions.clientRoleType = .audience
+        mediaOptions.autoSubscribeAudio = true
+        mediaOptions.autoSubscribeVideo = true
+
+        return agoraEngine.joinChannelEx(
+            byToken: token, connection: self.secondConnection,
+            delegate: nil, mediaOptions: mediaOptions
+        )
+    }
+    ```
+
+    Take a look at [`ChannelRelayView`](https://github.com/AgoraIO/video-sdk-samples-ios/blob/main/live-streaming-over-multiple-channels/ChannelRelayView.swift) for an example of how to use the `joinChannelEx` and `leaveChannelEx` methods to implement this behavior.
+
+
+    - joinChannelEx(byToken:connection:delegate:mediaOptions:joinSuccess:)
+
+
+    - joinChannelEx(byToken:connection:delegate:mediaOptions:joinSuccess:)
+
+
+
+    ```csharp
+    // Method to join the second channel.
+ public void JoinSecondChannel() + { + if (agoraEngineEx != null) + { + if (string.IsNullOrEmpty(configData.secondChannelToken) || string.IsNullOrEmpty(configData.secondChannelName)) + { + Debug.Log("please specify a valid channel name and a token for the second channel"); + return; + } + ChannelMediaOptions mediaOptions = new ChannelMediaOptions(); + mediaOptions.autoSubscribeAudio.SetValue(true); + mediaOptions.autoSubscribeVideo.SetValue(true); + mediaOptions.publishCameraTrack.SetValue(true); + mediaOptions.publishMicrophoneTrack.SetValue(true); + mediaOptions.clientRoleType.SetValue(CLIENT_ROLE_TYPE.CLIENT_ROLE_BROADCASTER); + mediaOptions.channelProfile.SetValue(CHANNEL_PROFILE_TYPE.CHANNEL_PROFILE_LIVE_BROADCASTING); + rtcSecondConnection = new RtcConnection(); + rtcSecondConnection.channelId = configData.secondChannelName; + rtcSecondConnection.localUid = configData.secondChannelUID; + agoraEngineEx.JoinChannelEx(configData.secondChannelToken, rtcSecondConnection, mediaOptions); + } + else + { + Debug.Log("Engine was not initialized"); + } + } + ``` + - ChannelMediaOptions + - RtcConnection + - JoinChannelEx + + + + ```typescript + const JoinSecondChannel = ({ agoraEngineSubscriber }: { agoraEngineSubscriber: IAgoraRTCClient }) => { + const [joinSecondChannelVisible, setJoinSecondChannelVisible] = useState(false); + const remoteUsers = useRemoteUsers(); + const { isLoading: isLoadingCam, localCameraTrack } = useLocalCameraTrack(joinSecondChannelVisible); + const { isLoading: isLoadingMic, localMicrophoneTrack } = useLocalMicrophoneTrack(joinSecondChannelVisible); + + const connection = useConnectionState(agoraEngineSubscriber); + useJoin({ + appid: config.appId, + channel: config.secondChannel, + token: config.secondChannelToken, + uid: config.secondChannelUID, + }, joinSecondChannelVisible, agoraEngineSubscriber); + + usePublish([localMicrophoneTrack, localCameraTrack], (connection == "CONNECTED") && joinSecondChannelVisible , agoraEngineSubscriber); + 
const handleButtonClick = () => {
+      setJoinSecondChannelVisible((prev) => !prev);
+      // You can perform any other logic here if needed
+    };
+    return (
+      <div>
+        <button type="button" onClick={handleButtonClick}>
+          {joinSecondChannelVisible ? "Leave Second Channel" : "Join Second Channel"}
+        </button>
+        {remoteUsers.map((remoteUser) => (
+          <div key={remoteUser.uid}>
+            <RemoteUser user={remoteUser} playVideo={true} playAudio={true} />
+          </div>
+        ))}
+      </div>
+ ) + } + ``` + - useLocalCameraTrack + - useLocalMicrophoneTrack + - useConnectionState + - usePublish + - useJoin + - useRemoteUsers +
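As an aside, the publish gating used by `usePublish` above — publish only while the connection state is `"CONNECTED"` and the second-channel toggle is on — can be expressed as a pure helper. The sketch below is illustrative only; `shouldPublishSecondChannel` is a name of our choosing, not part of the Agora SDK:

```typescript
// Illustrative helper (not an SDK API): mirrors the publish condition
// passed to usePublish in the component above.
export function shouldPublishSecondChannel(
  connectionState: string,
  joinSecondChannelVisible: boolean
): boolean {
  // Publish local tracks only while the subscriber client is connected
  // and the user has toggled the second channel on.
  return connectionState === "CONNECTED" && joinSecondChannelVisible;
}
```

Keeping the condition in one place makes it easy to unit-test the toggle behavior without mounting the component.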
+
+```js
+const handleMultipleChannels = async (isMultipleChannel, clientRole, secondChannelName, secondChannelToken, channelParameters) => {
+  if (isMultipleChannel == false) {
+    // Create an Agora engine instance.
+    agoraEngineSubscriber = AgoraRTC.createClient({
+      mode: "live",
+      codec: "vp9",
+    });
+    // Set up an event handler to subscribe to users who publish in the second channel.
+    agoraEngineSubscriber.on("user-published", async (user, mediaType) => {
+      // Subscribe to the remote user when the SDK triggers the "user-published" event.
+      await agoraEngineSubscriber.subscribe(user, mediaType);
+      console.log("Subscribe success!");
+      if (clientRole == "") {
+        // Default to the host role when no role is specified.
+        await agoraEngineSubscriber.setClientRole("host");
+      }
+      // Play the remote video only when the local user joined as an audience member.
+      else if (clientRole == "audience" && mediaType == "video") {
+        // Dynamically create a container in the form of a DIV element to play the second channel remote video track.
+        const container = document.createElement("div");
+        // Set the container size.
+        container.style.width = "640px";
+        container.style.height = "480px";
+        container.style.padding = "15px 5px 5px 5px";
+        // Specify the container id and text.
+        container.id = user.uid.toString();
+        container.textContent =
+          "Remote user from the second channel " + user.uid.toString();
+        // Append the container to page body.
+        document.body.append(container);
+        // Play the remote video in the container.
+        user.videoTrack.play(container);
+      }
+    });
+    // Listen for the "user-unpublished" event.
+    agoraEngineSubscriber.on("user-unpublished", (user) => {
+      console.log(user.uid + " has left the channel");
+    });
+    // Set the user role.
+    agoraEngineSubscriber.setClientRole(clientRole);
+    // Join the new channel.
+    await agoraEngineSubscriber.join(
+      config.appId,
+      secondChannelName,
+      secondChannelToken,
+      config.uid
+    );
+    // An audience member cannot publish audio and video tracks in the channel.
+ if (clientRole != "audience") { + await agoraEngineSubscriber.publish([ + channelParameters.localAudioTrack, + channelParameters.localVideoTrack, + ]); + } + isMultipleChannel = true; + // Update the button text. + document.getElementById("multiple-channels").innerHTML = + "Leave Second Channel"; + } else { + isMultipleChannel = false; + // Leave the channel. + await agoraEngineSubscriber.leave(); + } +}; +``` + diff --git a/assets/code/video-sdk/live-streaming-multiple-channels/leave-second-channel.mdx b/assets/code/video-sdk/live-streaming-multiple-channels/leave-second-channel.mdx new file mode 100644 index 000000000..ed8c23d28 --- /dev/null +++ b/assets/code/video-sdk/live-streaming-multiple-channels/leave-second-channel.mdx @@ -0,0 +1,35 @@ + + + ```kotlin + agoraEngineEx.leaveChannelEx(rtcSecondConnection) + ``` + - leaveChannelEx + + + ```csharp + // Method to leave the second channel. + public void LeaveSecondChannel() + { + if (agoraEngineEx != null) + { + agoraEngineEx.LeaveChannelEx(rtcSecondConnection); + } + } + ``` + - LeaveChannelEx + + + + ```swift + func leaveChannelEx() -> Int32 { + agoraEngine.leaveChannelEx(self.secondConnection, leaveChannelBlock: nil) + } + ``` + + + - leaveChannelEx(_:leaveChannelBlock:) + + + - leaveChannelEx(_:leaveChannelBlock:) + + \ No newline at end of file diff --git a/assets/code/video-sdk/live-streaming-multiple-channels/monitor-channel-media-relay-state.mdx b/assets/code/video-sdk/live-streaming-multiple-channels/monitor-channel-media-relay-state.mdx new file mode 100644 index 000000000..0dc316afd --- /dev/null +++ b/assets/code/video-sdk/live-streaming-multiple-channels/monitor-channel-media-relay-state.mdx @@ -0,0 +1,111 @@ + + ```kotlin + override val iRtcEngineEventHandler: IRtcEngineEventHandler + get() = object : IRtcEngineEventHandler() { + override fun onChannelMediaRelayStateChanged(state: Int, code: Int) { + if (state == 2) { + mediaRelaying = true + } else if (state == 3) { + mediaRelaying = false + } + 
}
+        }
+    ```
+    - onChannelMediaRelayStateChanged
+
+
+
+    Use the following callback to respond to connection and failure events:
+
+    ```swift
+    func rtcEngine(
+        _ engine: AgoraRtcEngineKit,
+        channelMediaRelayStateDidChange state: AgoraChannelMediaRelayState,
+        error: AgoraChannelMediaRelayError
+    ) {
+        switch state {
+        case .connecting:
+            // Channel media relay is connecting.
+            break
+        case .running:
+            // Channel media relay is running.
+            break
+        case .failure:
+            // Channel media relay failure.
+            break
+        default: return
+        }
+    }
+    ```
+
+
+    - rtcEngine(_:channelMediaRelayStateDidChange:error:)
+
+
+    - rtcEngine(_:channelMediaRelayStateDidChange:error:)
+
+
+
+
+    ```csharp
+    // Event handler class to handle the events raised by Agora's RtcEngine instance
+    internal class MultiChannelLiveStreamingEventHandler : UserEventHandler
+    {
+        private MultiChannelLiveStreamingManager multiChannelLiveStreamingManager;
+
+        internal MultiChannelLiveStreamingEventHandler(MultiChannelLiveStreamingManager videoSample) : base(videoSample)
+        {
+            multiChannelLiveStreamingManager = videoSample;
+        }
+
+        public override void OnChannelMediaRelayStateChanged(int state, int code)
+        {
+            // This example shows messages in the debug console when the relay state changes;
+            // a production-level app needs to handle state changes properly.
+            switch (state)
+            {
+                case 1: // RELAY_STATE_CONNECTING:
+                    Debug.Log("Channel media relay connecting.");
+                    break;
+                case 2: // RELAY_STATE_RUNNING:
+                    Debug.Log("Channel media relay running.");
+                    break;
+                case 3: // RELAY_STATE_FAILURE:
+                    Debug.Log("Channel media relay failure. 
Error code: " + code);
+                    break;
+            }
+        }
+    }
+    ```
+    - OnChannelMediaRelayStateChanged
+
+
+```typescript
+const useChannelMediaRelayState = () => {
+  const agoraEngine = useRTCClient();
+  useClientEvent(agoraEngine, "channel-media-relay-state", (state: ChannelMediaRelayState, code: ChannelMediaRelayError) => {
+    console.log("Channel media relay state changed:", state);
+    if (code)
+    {
+      console.error("Channel media relay error:", code);
+    }
+  });
+};
+
+const useChannelMediaRelayEvent = () => {
+  const agoraEngine = useRTCClient();
+  useClientEvent(agoraEngine, "channel-media-relay-event", (event: ChannelMediaRelayEvent) => {
+    console.log("Channel media relay event:", event);
+  });
+};
+```
+- useClientEvent
+
+
+```js
+agoraEngine.on("channel-media-relay-state", (state) =>
+{
+  console.log("The current state is: " + state);
+});
+```
+
diff --git a/assets/code/video-sdk/live-streaming-multiple-channels/receive-callbacks-from-second-channel.mdx b/assets/code/video-sdk/live-streaming-multiple-channels/receive-callbacks-from-second-channel.mdx
new file mode 100644
index 000000000..49a06425f
--- /dev/null
+++ b/assets/code/video-sdk/live-streaming-multiple-channels/receive-callbacks-from-second-channel.mdx
@@ -0,0 +1,94 @@
+
+    ```kotlin
+    // Callbacks for the second channel
+    private val secondChannelEventHandler: IRtcEngineEventHandler =
+        object : IRtcEngineEventHandler() {
+            override fun onJoinChannelSuccess(channel: String, uid: Int, elapsed: Int) {
+                isSecondChannelJoined = true
+                sendMessage("Joined channel $secondChannelName, uid: $uid")
+                val eventArgs = mutableMapOf<String, Any>()
+                eventArgs["channel"] = channel
+                eventArgs["uid"] = uid
+                mListener?.onEngineEvent("onJoinChannelSuccess2", eventArgs)
+            }
+
+            override fun onLeaveChannel(stats: RtcStats) {
+                isSecondChannelJoined = false
+                sendMessage("Left the channel $secondChannelName")
+                val eventArgs = mutableMapOf<String, Any>()
+                eventArgs["stats"] = stats
+                mListener?.onEngineEvent("onLeaveChannel2", eventArgs)
+            }
+
+            override
fun onUserJoined(uid: Int, elapsed: Int) {
+                sendMessage(String.format("user %d joined!", uid))
+
+                // Create surfaceView for remote video
+                val remoteSurfaceView = SurfaceView(mContext)
+                remoteSurfaceView.setZOrderMediaOverlay(true)
+
+                // Setup remote video to render
+                agoraEngineEx.setupRemoteVideoEx(
+                    VideoCanvas(
+                        remoteSurfaceView,
+                        VideoCanvas.RENDER_MODE_HIDDEN, uid
+                    ), rtcSecondConnection
+                )
+
+                val eventArgs = mutableMapOf<String, Any>()
+                eventArgs["uid"] = uid
+                eventArgs["surfaceView"] = remoteSurfaceView
+                mListener?.onEngineEvent("onUserJoined2", eventArgs)
+            }
+
+            override fun onUserOffline(uid: Int, reason: Int) {
+                val eventArgs = mutableMapOf<String, Any>()
+                eventArgs["uid"] = uid
+                mListener?.onEngineEvent("onUserOffline2", eventArgs)
+            }
+        }
+    }
+    ```
+    - IRtcEngineEventHandler
+    - setupRemoteVideoEx
+
+
+
+    Use the following callbacks, with a separate delegate, to receive streams and state-change notifications from the secondary channel:
+
+    ```swift
+    public class ExDelegate: NSObject, AgoraRtcEngineDelegate {
+
+        let connection: AgoraRtcConnection
+
+        // Catch remote streams from the secondary channel
+        public func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) {
+            // remote user joined channel
+        }
+        // Catch when the local user leaves the secondary channel
+        public func rtcEngine(_ engine: AgoraRtcEngineKit, didLeaveChannelWith stats: AgoraChannelStats) {
+            // local user left channel
+        }
+        // Catch remote streams ended/left from the secondary channel
+        public func rtcEngine(_ engine: AgoraRtcEngineKit, didOfflineOfUid uid: UInt, reason: AgoraUserOfflineReason) {
+            // remote user left channel, or connection lost
+        }
+    }
+    ```
+
+    You will need to set the delegate in `joinChannelEx`.
+ + + - AgoraRtcEngineDelegate + - rtcEngine(_:didJoinedOfUid:elapsed:) + - rtcEngine(_:didLeaveChannelWith:) + - rtcEngine(_:didOfflineOfUid:reason:) + + + - AgoraRtcEngineDelegate + - rtcEngine(_:didJoinedOfUid:elapsed:) + - rtcEngine(_:didLeaveChannelWith:) + - rtcEngine(_:didOfflineOfUid:reason:) + + + \ No newline at end of file diff --git a/assets/code/video-sdk/live-streaming-multiple-channels/set-variables.mdx b/assets/code/video-sdk/live-streaming-multiple-channels/set-variables.mdx new file mode 100644 index 000000000..1028d2dfb --- /dev/null +++ b/assets/code/video-sdk/live-streaming-multiple-channels/set-variables.mdx @@ -0,0 +1,72 @@ + + ```kotlin + // Channel media relay variables + private var destinationChannelName: String // Name of the destination channel + private var destinationChannelToken: String // Access token for the destination channel + private var destinationChannelUid = 0 // User ID that the user uses in the destination channel + private var sourceChannelToken: String // Access token for the source channel, Generate using channelName and uid = 0 + private var mediaRelaying = false + + // Multi channel streaming variables + private lateinit var agoraEngineEx: RtcEngineEx + private var rtcSecondConnection: RtcConnection? 
= null + private var secondChannelName: String // Name of the second channel" + private var secondChannelUid = 0 // Uid for the second channel + private var secondChannelToken: String // Access token for the second channel + private var isSecondChannelJoined = false // Track connection state of the second channel + ``` + + + ```swift + // channel id for the primary channel + var primaryChannel: String + // channel id for the secondary channel + var secondaryChannel: String + // Can be any number, the range is arbitrary + var destUid: UInt = .random(in: 1000...5000) + + /// AgoraRtcConnection object for joining and leaving secondary channels + var secondConnection: AgoraRtcConnection { + AgoraRtcConnection( + channelId: self.secondaryChannel, + localUid: Int(self.destUid) + ) + } + ``` + + + - AgoraRtcConnection + + + - AgoraRtcConnection + + + + + ```csharp + private RtcConnection rtcSecondConnection; + internal IRtcEngineEx agoraEngineEx; + ``` + + +```js +// A variable to track the co-hosting state. +var isCoHost = false; +// The destination channel name you want to join. +var destChannelName = ""; +//In a production app, the user adds the channel name and you retrieve the +// authentication token from a token server. +var destChannelToken = ""; +// A variable to track the multiple channel state. +var isMultipleChannel = false; +// Local user role. +var clientRole = "host"; +// The second channel name you want to join. +var secondChannelName = ""; +//In a production app, the user adds the channel name and you retrieve the +// authentication token from a token server. +var secondChannelToken = ""; +// A variable to create a second instance of Agora engine. 
+var agoraEngineSubscriber; +``` + diff --git a/assets/code/video-sdk/live-streaming-multiple-channels/start-stop-channel-media-relay.mdx b/assets/code/video-sdk/live-streaming-multiple-channels/start-stop-channel-media-relay.mdx new file mode 100644 index 000000000..fbd78fd5b --- /dev/null +++ b/assets/code/video-sdk/live-streaming-multiple-channels/start-stop-channel-media-relay.mdx @@ -0,0 +1,272 @@ + + ```kotlin + fun channelRelay() { + if (agoraEngine == null) { + return + } + + if (mediaRelaying) { + agoraEngine!!.stopChannelMediaRelay() + } else { + // Configure the source channel information + val srcChannelInfo = ChannelMediaInfo(channelName, sourceChannelToken, 0) + val mediaRelayConfiguration = ChannelMediaRelayConfiguration() + mediaRelayConfiguration.setSrcChannelInfo(srcChannelInfo) + + // Configure the destination channel information. + val destChannelInfo = ChannelMediaInfo(destinationChannelName, destinationChannelToken, destinationChannelUid) + mediaRelayConfiguration.setDestChannelInfo(destinationChannelName, destChannelInfo) + + // Start relaying media streams across channels + agoraEngine!!.startOrUpdateChannelMediaRelay(mediaRelayConfiguration) + } + } + ``` + - ChannelMediaRelayConfiguration + - startOrUpdateChannelMediaRelay + - stopChannelMediaRelay + - pauseAllChannelMediaRelay + - resumeAllChannelMediaRelay + + + + You can use `startOrUpdateChannelMediaRelay`, `stopChannelMediaRelay` methods to manage the state of relaying media streams across channels. + + ```swift + func setupMediaRelay( + sourceToken: String?, destinationToken: String? + ) -> Int32 { + // Configure the source channel information. + let srcChannelInfo = AgoraChannelMediaRelayInfo(token: sourceToken) + srcChannelInfo.channelName = self.primaryChannel + srcChannelInfo.uid = 0 + let mediaRelayConfiguration = AgoraChannelMediaRelayConfiguration() + mediaRelayConfiguration.sourceInfo = srcChannelInfo + + // Configure the destination channel information. 
+ let destChannelInfo = AgoraChannelMediaRelayInfo(token: destinationToken) + destChannelInfo.channelName = self.secondaryChannel + destChannelInfo.uid = self.destUid + mediaRelayConfiguration.setDestinationInfo( + destChannelInfo, forChannelName: self.secondaryChannel + ) + + // Start relaying media streams across channels + return agoraEngine.startOrUpdateChannelMediaRelay(mediaRelayConfiguration) + } + + func stopMediaRelay() -> Int32 { + agoraEngine.stopChannelMediaRelay() + } + ``` + + + - startOrUpdateChannelMediaRelay(_:) + - stopChannelMediaRelay() + + Have a look at [`ChannelRelayView`](https://github.com/AgoraIO/video-sdk-samples-ios/blob/main/live-streaming-over-multiple-channels/ChannelRelayView.swift) for details on how to implement a relay media toggle using `startOrUpdateChannelMediaRelay`, `stopChannelMediaRelay` methods. + + + - startOrUpdateChannelMediaRelay(_:) + - stopChannelMediaRelay() + + Have a look at [`ChannelRelayView`](https://github.com/AgoraIO/video-sdk-samples-macos/blob/main/live-streaming-over-multiple-channels/ChannelRelayView.swift) for details on how to implement a relay media toggle using `startOrUpdateChannelMediaRelay`, `stopChannelMediaRelay` methods. + + + + + ```csharp + // Method to relay media to the destination channel. + public void StartChannelRelay() + { + if (agoraEngine != null) + { + if (string.IsNullOrEmpty(configData.destChannelName) || string.IsNullOrEmpty(configData.destToken)) + { + Debug.Log("Specify a valid destination channel name and token."); + return; + } + + // Configure a ChannelMediaRelayConfiguration instance to add source and destination channels. + ChannelMediaRelayConfiguration mediaRelayConfiguration = new ChannelMediaRelayConfiguration(); + + // Configure the source channel information. + mediaRelayConfiguration.srcInfo = new ChannelMediaInfo + { + channelName = configData.channelName, + uid = configData.uid, + token = configData.rtcToken + }; + + // Set up the destination channel information. 
+ mediaRelayConfiguration.destInfos = new ChannelMediaInfo[1]; + mediaRelayConfiguration.destInfos[0] = new ChannelMediaInfo + { + channelName = configData.destChannelName, + uid = configData.destUID, + token = configData.destToken + }; + + // Number of destination channels. + mediaRelayConfiguration.destCount = 1; + + // Start media relaying + agoraEngine.StartOrUpdateChannelMediaRelay(mediaRelayConfiguration); + } + else + { + Debug.Log("Agora Engine is not initialized. Click 'Join' to join the primary channel and then join the second channel."); + } + } + + // Method to stop media relaying. + public void StopChannelRelay() + { + if (agoraEngine != null) + { + agoraEngine.StopChannelMediaRelay(); + } + } + ``` + - ChannelMediaRelayConfiguration + - StartOrUpdateChannelMediaRelay + - StopChannelMediaRelay + + + + ```typescript + const ChannelMediaRelay = () => { + const agoraEngine = useRTCClient(); + const channelMediaConfig = AgoraRTC.createChannelMediaRelayConfiguration(); + const [isRelayRunning, setIsRelayRunning] = useState(false); + const connectionState = useConnectionState(); + + // Channel media relay events. 
+ useChannelMediaRelayState(); + useChannelMediaRelayEvent(); + + if(config.destChannelName === "" || config.destChannelToken === "") + { + console.log("Please specify a valid channel name and a valid token for the destination channel in the config file"); + return; + } + channelMediaConfig.setSrcChannelInfo({ + channelName: config.channelName, + token: config.rtcToken, + uid: 0, + }); + channelMediaConfig.addDestChannelInfo({ + channelName: config.destChannelName, + token: config.destChannelToken, + uid: config.destUID, + }); + + const startChannelMediaRelay = () => { + agoraEngine + .startChannelMediaRelay(channelMediaConfig) + .then(() => { + console.log("Channel relay started successfully"); + setIsRelayRunning(true); + }) + .catch((e) => { + console.log(`startChannelMediaRelay failed`, e); + }); + }; + + const stopChannelMediaRelay = () => { + agoraEngine.stopChannelMediaRelay() + .then(() => { + console.log("Channel relay stopped successfully"); + setIsRelayRunning(false); + }) + .catch((e) => { + console.log(`stopChannelMediaRelay failed`, e); + }); + }; + + return ( +
+      <div>
+        <button type="button" onClick={isRelayRunning ? stopChannelMediaRelay : startChannelMediaRelay}>
+          {isRelayRunning ? "Stop Channel Media Relay" : "Start Channel Media Relay"}
+        </button>
+      </div>
+ ); + }; + ``` + - createChannelMediaRelayConfiguration + - startChannelMediaRelay + - stopChannelMediaRelay + - updateChannelMediaRelay + +
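The web implementations in this section guard the relay with the same pre-flight checks: the destination channel name and token must be non-empty, and the destination `uid` must be 0 or a 32-bit unsigned integer. A pure-function sketch of that validation follows; `validateRelayDestination` is illustrative only, not an SDK API:

```typescript
// Illustrative validation (not an SDK API): pre-flight checks for the
// destination channel settings used with startChannelMediaRelay.
export function validateRelayDestination(
  channelName: string,
  token: string,
  uid: number
): string | null {
  if (channelName === "") return "destination channel name is empty";
  if (token === "") return "destination channel token is empty";
  // The destination uid must be 0 or a 32-bit unsigned integer.
  if (!Number.isInteger(uid) || uid < 0 || uid > 0xffffffff) {
    return "uid must be 0 or a 32-bit unsigned integer";
  }
  // null means the configuration looks valid.
  return null;
}
```

Running this check before calling `startChannelMediaRelay` surfaces configuration mistakes as readable messages instead of SDK errors.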
+ +```js +const handleChannelMediaRelay = ( + isCoHost, + destUID, + destChannelName, + destChannelToken +) => { + const channelMediaConfig = AgoraRTC.createChannelMediaRelayConfiguration(); + if (!isCoHost) { + // Set the source channel information. + // Set channelName as the source channel name. Set uid as the ID of the host whose stream is relayed. + // The token is generated with the source channel name. + // Assign the token you generated for the source channel. + console.log("entering handleChannelMediaRelay"); + channelMediaConfig.setSrcChannelInfo({ + channelName: config.channelName, + token: config.token, + uid: config.uid, + }); + // Set the destination channel information. You can set a maximum of four destination channels. + // Set channelName as the destination channel name. Set uid as 0 or a 32-bit unsigned integer. + // To avoid UID conflicts, the uid must be different from any other user IDs in the destination channel. + // Assign the token you generated for the destination channel. + channelMediaConfig.addDestChannelInfo({ + channelName: destChannelName, + token: destChannelToken, + uid: destUID, + }); + // Start media relaying. + agoraManager + .getAgoraEngine() + .startChannelMediaRelay(channelMediaConfig) + .then(() => { + // Update the button text. + document.getElementById(`coHost`).innerHTML = + "Stop Channel Media Relay"; + console.log(`startChannelMediaRelay success`); + }) + .catch((e) => { + console.log(`startChannelMediaRelay failed`, e); + }); + } else { + // Remove a destination channel. + channelMediaConfig.removeDestChannelInfo(destChannelName); + // Update the configurations of the media stream relay. + agoraManager + .getAgoraEngine() + .updateChannelMediaRelay(channelMediaConfig) + .then(() => { + console.log("updateChannelMediaRelay success"); + }) + .catch((e) => { + console.log("updateChannelMediaRelay failed", e); + }); + //Stop the relay. 
+ agoraManager + .getAgoraEngine() + .stopChannelMediaRelay() + .then(() => { + console.log("stop media relay success"); + }) + .catch((e) => { + console.log("stop media relay failed", e); + }); + // Update the button text. + document.getElementById(`coHost`).innerHTML = "Start Channel Media Relay"; + } +}; +``` + diff --git a/assets/code/video-sdk/live-streaming-multiple-channels/swift/configure-buttons.mdx b/assets/code/video-sdk/live-streaming-multiple-channels/swift/configure-buttons.mdx index 1a4bd9b0c..0e9653c79 100644 --- a/assets/code/video-sdk/live-streaming-multiple-channels/swift/configure-buttons.mdx +++ b/assets/code/video-sdk/live-streaming-multiple-channels/swift/configure-buttons.mdx @@ -1,5 +1,5 @@ -``` swift +```swift channelRelayBtn = NSButton() channelRelayBtn.frame = CGRect(x: 255, y: 10, width: 150, height: 20) channelRelayBtn.title = "Start Channel Media Relay" @@ -9,7 +9,7 @@ self.view.addSubview(channelRelayBtn) ``` -``` swift +```swift channelRelayBtn = UIButton(type: .system) channelRelayBtn.frame = CGRect(x: 100, y: 550, width: 200, height: 50) channelRelayBtn.setTitle("Start Channel Media Relay", for: .normal) diff --git a/assets/code/video-sdk/live-streaming-multiple-channels/swift/create-ui.mdx b/assets/code/video-sdk/live-streaming-multiple-channels/swift/create-ui.mdx index 10ecc34c8..adaa456fc 100644 --- a/assets/code/video-sdk/live-streaming-multiple-channels/swift/create-ui.mdx +++ b/assets/code/video-sdk/live-streaming-multiple-channels/swift/create-ui.mdx @@ -1,10 +1,10 @@ -``` swift +```swift var channelRelayBtn: NSButton! ``` -``` swift +```swift var channelRelayBtn: UIButton! 
``` \ No newline at end of file diff --git a/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-configure-buttons.mdx b/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-configure-buttons.mdx index cb3d7e4ad..c80e99325 100644 --- a/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-configure-buttons.mdx +++ b/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-configure-buttons.mdx @@ -1,5 +1,5 @@ -``` swift +```swift secondChannelBtn = NSButton() secondChannelBtn.frame = CGRect(x: 255, y: 40, width: 180, height: 50) secondChannelBtn.title = "Join second channel" @@ -9,7 +9,7 @@ self.view.addSubview(secondChannelBtn) ``` -``` swift +```swift secondChannelBtn = UIButton(type: .system) secondChannelBtn.frame = CGRect(x: 120, y: 600, width: 180, height: 50) secondChannelBtn.setTitle("Join second channel", for: .normal) diff --git a/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-create-ui.mdx b/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-create-ui.mdx index c3c14e352..9d6511316 100644 --- a/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-create-ui.mdx +++ b/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-create-ui.mdx @@ -1,10 +1,10 @@ -``` swift +```swift var secondChannelBtn: NSButton! ``` -``` swift +```swift var secondChannelBtn: UIButton! 
``` \ No newline at end of file diff --git a/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-join-second-channel.mdx b/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-join-second-channel.mdx index 1c2a95ade..cce82c2ad 100644 --- a/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-join-second-channel.mdx +++ b/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-join-second-channel.mdx @@ -1,5 +1,5 @@ -``` swift +```swift @objc func secondChannelBtnClicked() { if isSecondChannelJoined { let result = agoraEngine.leaveChannelEx(rtcSecondConnection, leaveChannelBlock: nil) @@ -41,7 +41,7 @@ ``` -``` swift +```swift @objc func secondChannelBtnClicked() { if isSecondChannelJoined { let result = agoraEngine.leaveChannelEx(rtcSecondConnection, leaveChannelBlock: nil) diff --git a/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-second-channel-delegate.mdx b/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-second-channel-delegate.mdx index 120f73df0..7e1f3714e 100644 --- a/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-second-channel-delegate.mdx +++ b/assets/code/video-sdk/live-streaming-multiple-channels/swift/mc-second-channel-delegate.mdx @@ -1,5 +1,5 @@ -``` swift +```swift class SecondChannelDelegate: NSObject, AgoraRtcEngineDelegate { func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinChannel channel: String, withUid uid: UInt, elapsed: Int) { print("Join Channel Success - joined channel: \(channel) uid: \(uid)") @@ -19,7 +19,7 @@ class SecondChannelDelegate: NSObject, AgoraRtcEngineDelegate { ``` -``` swift +```swift class SecondChannelDelegate: NSObject, AgoraRtcEngineDelegate { func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinChannel channel: String, withUid uid: UInt, elapsed: Int) { print("Join Channel Success - joined channel: \(channel) uid: \(uid)") diff --git a/assets/code/video-sdk/live-streaming-multiple-channels/swift/monitor-relay-state.mdx 
b/assets/code/video-sdk/live-streaming-multiple-channels/swift/monitor-relay-state.mdx index 2dab6d3bb..0371c5c6c 100644 --- a/assets/code/video-sdk/live-streaming-multiple-channels/swift/monitor-relay-state.mdx +++ b/assets/code/video-sdk/live-streaming-multiple-channels/swift/monitor-relay-state.mdx @@ -1,5 +1,5 @@ -``` swift +```swift func rtcEngine(_ engine: AgoraRtcEngineKit, channelMediaRelayStateDidChange state: AgoraChannelMediaRelayState, error: AgoraChannelMediaRelayError) { // This example shows toast messages when the relay state changes, @@ -34,7 +34,7 @@ func rtcEngine(_ engine: AgoraRtcEngineKit, ``` - ``` swift + ```swift func rtcEngine(_ engine: AgoraRtcEngineKit, channelMediaRelayStateDidChange state: AgoraChannelMediaRelayState, error: AgoraChannelMediaRelayError) { // This example shows toast messages when the relay state changes, diff --git a/assets/code/video-sdk/play-media/configure-engine.mdx b/assets/code/video-sdk/play-media/configure-engine.mdx new file mode 100644 index 000000000..a3d694669 --- /dev/null +++ b/assets/code/video-sdk/play-media/configure-engine.mdx @@ -0,0 +1,21 @@ + +```typescript +function MediaPlaying() { + const agoraEngine = useRTCClient(AgoraRTC.createClient({ codec: "vp8", mode: config.selectedProduct })); + + return ( +
+    <div>
+      <h1>Stream media to a channel</h1>
+      <AgoraRTCProvider client={agoraEngine}>
+        {/* Add the video-call and media-player components here */}
+      </AgoraRTCProvider>
+    </div>
+ ); +} +``` + - useRTCClient + - AgoraRTCProvider + +
\ No newline at end of file diff --git a/assets/code/video-sdk/play-media/destroy-media-player.mdx b/assets/code/video-sdk/play-media/destroy-media-player.mdx new file mode 100644 index 000000000..d6a87d34b --- /dev/null +++ b/assets/code/video-sdk/play-media/destroy-media-player.mdx @@ -0,0 +1,52 @@ + + ```kotlin + fun destroyMediaPlayer(){ + // Destroy the media player instance and clean up + if (mediaPlayer != null) { + mediaPlayer?.stop() + mediaPlayer?.unRegisterPlayerObserver(mediaPlayerObserver) + mediaPlayer?.destroy() + mediaPlayer = null + } + } + ``` + - stop + - unregisterPlayerSourceObserver + - destroyMediaPlayer + + + ```csharp + public void DestroyMediaPlayer() + { + mediaPlayer.Dispose(); // Dispose of the media player instance + mediaPlayer = null; // Set the media player reference to null + } + + public override void DestroyEngine() + { + // Destroy the media player + if (mediaPlayer != null) + { + mediaPlayer.Stop(); + DestroyMediaPlayer(); + } + + base.DestroyEngine(); // Call the base class's engine cleanup method + } + ``` + * Dispose + + + + ```swift + agoraEngine.destroyMediaPlayer(mediaPlayer) + ``` + + + - destroyMediaPlayer(_:) + + + - destroyMediaPlayer(_:) + + + \ No newline at end of file diff --git a/assets/code/video-sdk/play-media/display-media.mdx b/assets/code/video-sdk/play-media/display-media.mdx new file mode 100644 index 000000000..0df9b3366 --- /dev/null +++ b/assets/code/video-sdk/play-media/display-media.mdx @@ -0,0 +1,68 @@ + + ```kotlin + fun mediaPlayerSurfaceView(): SurfaceView { + // Sets up and returns a SurfaceView to display the media player output + // Instantiate a SurfaceView + val videoSurfaceView = SurfaceView(mContext) + // Create a VideoCanvas using the SurfaceView + val videoCanvas = VideoCanvas( + videoSurfaceView, + Constants.RENDER_MODE_HIDDEN, + 0 + ) + // Set the source type and media player Id + videoCanvas.sourceType = Constants.VIDEO_SOURCE_MEDIA_PLAYER + videoCanvas.mediaPlayerId = 
mediaPlayer?.mediaPlayerId ?: 0 + // Setup the SurfaceView + agoraEngine?.setupLocalVideo(videoCanvas) + + return videoSurfaceView + } + ``` + - getMediaPlayerId + - setupLocalVideo + + + ```swift + canvas.sourceType = .mediaPlayer + canvas.mediaPlayerId = mediaPlayer.getMediaPlayerId() + agoraEngine.setupLocalVideo(canvas) + ``` + + + - sourceType + - getMediaPlayerId() + - setupLocalVideo(_:) + + + - sourceType + - getMediaPlayerId() + - setupLocalVideo(_:) + + + + + ```csharp + public void PreviewMediaTrack(bool previewMedia) + { + GameObject localViewGo = LocalView.gameObject; + + // Add a VideoSurface component to the local view game object + LocalView = localViewGo.AddComponent<VideoSurface>(); + + if (previewMedia) + { + // Setup local view to display the media file. + LocalView.SetForUser((uint)mediaPlayer.GetId(), "", VIDEO_SOURCE_TYPE.VIDEO_SOURCE_MEDIA_PLAYER); + } + else + { + // Setup local view to display the local video. + LocalView.SetForUser(0, "", VIDEO_SOURCE_TYPE.VIDEO_SOURCE_CAMERA_PRIMARY); + } + } + ``` + * VideoSurface + * SetForUser + + \ No newline at end of file diff --git a/assets/code/video-sdk/play-media/event-handler.mdx b/assets/code/video-sdk/play-media/event-handler.mdx new file mode 100644 index 000000000..36a08aa69 --- /dev/null +++ b/assets/code/video-sdk/play-media/event-handler.mdx @@ -0,0 +1,191 @@ + + ```kotlin + private val mediaPlayerObserver: IMediaPlayerObserver = object : IMediaPlayerObserver { + override fun onPlayerStateChanged(state: MediaPlayerState, error: MediaPlayerError) { + // Reports changes in playback state + if (state == MediaPlayerState.PLAYER_STATE_OPEN_COMPLETED) { + // Read media duration for updating play progress + mediaDuration = mediaPlayer?.duration ?: 0 + } + + // Notify the UI + mediaPlayerListener.onPlayerStateChanged(state, error) + } + + override fun onPositionChanged(position: Long) { + if (mediaDuration > 0) { + // Calculate the progress percentage + val result = (position.toFloat() /
mediaDuration.toFloat() * 100).toInt() + // Notify the UI of the progress + mediaPlayerListener.onProgress(result) + } + } + + override fun onPlayerEvent(eventCode: MediaPlayerEvent, elapsedTime: Long, message: String) { + // Required to implement IMediaPlayerObserver + } + + override fun onMetaData(type: MediaPlayerMetadataType, data: ByteArray) { + // Occurs when the media metadata is received + } + + override fun onPlayBufferUpdated(playCachedBuffer: Long) { + // Reports the playback duration that the buffered data can support + } + + override fun onPreloadEvent(src: String, event: MediaPlayerPreloadEvent) { + // Reports the events of preloaded media resources + } + + override fun onPlayerSrcInfoChanged(from: SrcInfo, to: SrcInfo) { + // Occurs when the video bitrate of the media resource changes + } + + override fun onPlayerInfoUpdated(info: PlayerUpdatedInfo) { + // Occurs when information related to the media player changes + } + + override fun onAudioVolumeIndication(volume: Int) { + // Reports the volume of the media player + } + + override fun onAgoraCDNTokenWillExpire() { + // Required to implement IMediaPlayerObserver + } + } + ``` + - IMediaPlayerObserver + + +```csharp +// Internal class for handling media player events +internal class PlayMediaEventHandler : IMediaPlayerSourceObserver +{ + private PlayMediaManager playMediaManager; + + internal PlayMediaEventHandler(PlayMediaManager refPlayMedia) + { + playMediaManager = refPlayMedia; + } + + public override void OnPlayerSourceStateChanged(MEDIA_PLAYER_STATE state, MEDIA_PLAYER_ERROR error) + { + Debug.Log(state.ToString()); + playMediaManager.state = state; + + if (state == MEDIA_PLAYER_STATE.PLAYER_STATE_OPEN_COMPLETED) + { + // Media file opened successfully. Get the duration of the file to set up the progress bar. 
+ playMediaManager.mediaPlayer.GetDuration(ref playMediaManager.mediaDuration); + } + else if (state == MEDIA_PLAYER_STATE.PLAYER_STATE_PLAYING) + { + playMediaManager.PreviewMediaTrack(true); + playMediaManager.PublishMediaFile(); + } + else if (state == MEDIA_PLAYER_STATE.PLAYER_STATE_PLAYBACK_ALL_LOOPS_COMPLETED) + { + playMediaManager.PreviewMediaTrack(false); + playMediaManager.UnpublishMediaFile(); + // Clean up + playMediaManager.mediaPlayer.Dispose(); + playMediaManager.mediaPlayer = null; + } + else if (state == MEDIA_PLAYER_STATE.PLAYER_STATE_PAUSED) + { + playMediaManager.PreviewMediaTrack(false); + playMediaManager.UnpublishMediaFile(); + } + else if (state == MEDIA_PLAYER_STATE.PLAYER_STATE_FAILED) + { + Debug.Log("Media player failed :" + error); + } + } + + public override void OnPositionChanged(long position) + { + if (playMediaManager.mediaDuration > 0) + { + // Update the ProgressBar + playMediaManager.position = position; + } + } + + public override void OnPlayerEvent(MEDIA_PLAYER_EVENT eventCode, long elapsedTime, string message) + { + // Required to implement IMediaPlayerObserver + } + + public override void OnMetaData(byte[] type, int length) + { + // Required to implement IMediaPlayerObserver + } + + public override void OnAudioVolumeIndication(int volume) + { + // Required to implement IMediaPlayerObserver + } + + public override void OnPlayBufferUpdated(Int64 playCachedBuffer) + { + // Required to implement IMediaPlayerObserver + } + + public override void OnPlayerInfoUpdated(PlayerUpdatedInfo info) + { + // Required to implement IMediaPlayerObserver + } + + public override void OnPlayerSrcInfoChanged(SrcInfo from, SrcInfo to) + { + // Required to implement IMediaPlayerObserver + } + + public override void OnPreloadEvent(string src, PLAYER_PRELOAD_EVENT @event) + { + // Required to implement IMediaPlayerObserver + } +} +``` +* IMediaPlayerSourceObserver + + + + ```swift + public func AgoraRtcMediaPlayer( + _ playerKit: 
AgoraRtcMediaPlayerProtocol, + didChangedTo state: AgoraMediaPlayerState, + error: AgoraMediaPlayerError + ) { + switch state { + case .openCompleted: + // Media file opened successfully + // Update the UI, and start playing + break + case .playBackAllLoopsCompleted: + // Media file finished playing + break + case .playing: + // Media started playing + break + default: break + } + } + + public func AgoraRtcMediaPlayer( + _ playerKit: AgoraRtcMediaPlayerProtocol, + didChangedTo position: Int + ) { + // Progress as a percentage + let progress = Float(position) / Float(playerKit.getDuration()) + } + ``` + + + - AgoraRtcMediaPlayerDelegate + - AgoraRtcMediaPlayer(_:didChangedTo:error:) + - AgoraRtcMediaPlayer(_:didChangedTo:) + + + - AgoraRtcMediaPlayerDelegate + - AgoraRtcMediaPlayer(_:didChangedTo:error:) + - AgoraRtcMediaPlayer(_:didChangedTo:) + + diff --git a/assets/code/video-sdk/play-media/import-library.mdx b/assets/code/video-sdk/play-media/import-library.mdx new file mode 100644 index 000000000..342a6c8c9 --- /dev/null +++ b/assets/code/video-sdk/play-media/import-library.mdx @@ -0,0 +1,24 @@ + + ```kotlin + + ``` + + + ```swift + import AgoraRtcKit + ``` + + +```javascript +import AgoraManager from "../agora_manager/agora_manager.js"; +import AgoraRTC from "agora-rtc-sdk-ng"; +``` + + + ```typescript + import { AgoraRTCProvider, useRTCClient, usePublish, useConnectionState } from "agora-rtc-react"; + import AgoraRTC, { IBufferSourceAudioTrack } from "agora-rtc-sdk-ng"; + import AuthenticationWorkflowManager from "../authentication-workflow/authenticationWorkflowManager"; + import config from "../agora-manager/config"; + ``` + diff --git a/assets/code/video-sdk/play-media/play-pause-resume.mdx b/assets/code/video-sdk/play-media/play-pause-resume.mdx new file mode 100644 index 000000000..f98be18b5 --- /dev/null +++ b/assets/code/video-sdk/play-media/play-pause-resume.mdx @@ -0,0 +1,110 @@ + + ```kotlin + fun playMedia() { + // Start publishing the media player video +
updateChannelPublishOptions(true) + mediaPlayer?.play() + } + + fun pauseMedia() { + mediaPlayer?.pause() + } + + fun resumeMedia() { + mediaPlayer?.resume() + } + ``` + - play + - pause + - resume + + + ```csharp + public void PauseMediaFile() + { + mediaPlayer.Pause(); // Pause the media playback + } + + public void ResumeMediaFile() + { + mediaPlayer.Resume(); // Resume paused media playback + } + + public void PlayMediaFile() + { + mediaPlayer.Play(); // Start or resume playing the media file + } + ``` + * Play + * Resume + * Pause + + + ```swift + func playMedia() { self.mediaPlayer?.play() } + func pauseMedia() { self.mediaPlayer?.pause() } + func resumeMedia() { self.mediaPlayer?.resume() } + ``` + + + - play() + - pause() + - resume() + + + - play() + - pause() + - resume() + + + +1. Play and publish the audio file: + ```typescript + const PlayAudioFile: React.FC<{ track: IBufferSourceAudioTrack }> = ({ track }) => { + usePublish([track]); + + useEffect(() => { + track.startProcessAudioBuffer(); + track.play(); // to play the track for the local user + return () => { + track.stopProcessAudioBuffer(); + track.stop(); + }; + }, [track]); + + return
<div>Audio file playing</div>
; + }; + ``` + - usePublish + - startProcessAudioBuffer + - play + - stopProcessAudioBuffer + - stop + +2. Process the selected audio file: + ```typescript + const MediaPlayingComponent: React.FC = () => { + const [isMediaPlaying, setMediaPlaying] = useState(false); + const [audioFileTrack, setAudioFileTrack] = useState<IBufferSourceAudioTrack | null>(null); + const connectionState = useConnectionState(); + + // Event handler for selecting an audio file + const handleFileChange = (event: React.ChangeEvent<HTMLInputElement>) => { + if (event.target.files && event.target.files.length > 0) { + const selectedFile = event.target.files[0]; + try + { + AgoraRTC.createBufferSourceAudioTrack({ source: selectedFile }) + .then((track) => {setAudioFileTrack(track)}) + .catch((error) => {console.error(error);}) + } catch (error) { + console.error("Error creating buffer source audio track:", error); + } + } + }; + } + ``` + - useConnectionState + - BufferSourceAudioTrackInitConfig + - createBufferSourceAudioTrack +
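The observers on each platform above report play progress to the UI as a percentage of position over duration (see the Kotlin `onPositionChanged` and C# `OnPositionChanged` callbacks). A minimal standalone sketch of that calculation; `progressPercent` is a hypothetical helper for illustration, not an Agora SDK API:

```typescript
// Hypothetical helper (not an Agora SDK API): converts a playback position
// and total duration (both in milliseconds) into a whole-number percentage,
// mirroring the calculation done in the onPositionChanged observers above.
function progressPercent(position: number, duration: number): number {
  if (duration <= 0) return 0; // No duration yet: the file has not been opened
  const percent = Math.floor((position / duration) * 100);
  return Math.max(0, Math.min(100, percent)); // Clamp to a 0-100 progress bar range
}
```

A UI progress bar can then be driven directly from this value each time the player reports a new position.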
\ No newline at end of file diff --git a/assets/code/video-sdk/play-media/set-variables.mdx b/assets/code/video-sdk/play-media/set-variables.mdx new file mode 100644 index 000000000..deb3de495 --- /dev/null +++ b/assets/code/video-sdk/play-media/set-variables.mdx @@ -0,0 +1,39 @@ + + ```kotlin + private var mediaPlayer: IMediaPlayer? = null // Instance of the media player + private lateinit var mediaPlayerListener: MediaPlayerListener + ``` + + + ```swift + var mediaPlayer: AgoraRtcMediaPlayerProtocol? + ``` + + + - AgoraRtcMediaPlayerProtocol + + + - AgoraRtcMediaPlayerProtocol + + + + ```csharp + // Internal variables for managing media playback + internal IMediaPlayer mediaPlayer; // Instance of the media player + internal bool isMediaPlaying = false; + internal long mediaDuration = 0; + internal MEDIA_PLAYER_STATE state; + internal long position; + ``` + + +```javascript +const streamMedia = async () => { + // Create an audio track from a source file + const track = await AgoraRTC.createBufferSourceAudioTrack({ + source: "./sample.mp3", + }); + // ... + }; +``` + diff --git a/assets/code/video-sdk/play-media/start-streaming.mdx b/assets/code/video-sdk/play-media/start-streaming.mdx new file mode 100644 index 000000000..ccad9ee66 --- /dev/null +++ b/assets/code/video-sdk/play-media/start-streaming.mdx @@ -0,0 +1,102 @@ + + + 1. Set up a media player + + ```kotlin + fun setupMediaPlayer(listener: MediaPlayerListener){ + if (mediaPlayer == null) { + // Create an instance of the media player + mediaPlayer = agoraEngine?.createMediaPlayer() + // Set the mediaPlayerObserver to receive callbacks + mediaPlayer?.registerPlayerObserver(mediaPlayerObserver) + // A listener to notify the UI + mediaPlayerListener = listener + } + } + ``` + - IMediaPlayer + - createMediaPlayer + - registerPlayerObserver + + + 1. 
Open a media file + + ```kotlin + fun openMediaFile(mediaLocation: String) { + // Opens the media file at mediaLocation url + // Supports URI files starting with content:// + if (mediaPlayer != null) { + // Open the media file + mediaPlayer?.open(mediaLocation, 0) + } + } + ``` + - open + + + ```swift + func startStreaming(from url: URL) { + mediaPlayer = agoraEngine.createMediaPlayer(with: self) + mediaPlayer!.open(url.path, startPos: 0) + } + ``` + + + - createMediaPlayer(with:) + - AgoraRtcMediaPlayerProtocol + - open(_:startPos:) + + + - createMediaPlayer(with:) + - AgoraRtcMediaPlayerProtocol + - open(_:startPos:) + + + + ```csharp + public void InitMediaPlayer() + { + // Check if the engine exists. + if (agoraEngine == null) + { + // Log a message if the Agora engine is not initialized + Debug.Log("Please click `Join` and then `Play Media` to play the video file"); + return; + } + + // Create an instance of the media player + mediaPlayer = agoraEngine.CreateMediaPlayer(); + + // Create an instance of mediaPlayerObserver class + PlayMediaEventHandler mediaPlayerObserver = new PlayMediaEventHandler(this); + + // Set the mediaPlayerObserver to receive callbacks + mediaPlayer.InitEventHandler(mediaPlayerObserver); + + // Open the media file specified in the configuration data + mediaPlayer.Open(configData.videoFileURL, 0); + } + ``` + * IMediaPlayer + * CreateMediaPlayer + * InitEventHandler + * Open + + + + +```javascript +const streamMedia = async () => { + // Create an audio track from a source file + const track = await AgoraRTC.createBufferSourceAudioTrack({ + source: "./sample.mp3", + }); + // Play the track + track.startProcessAudioBuffer({ loop: false }); + track.play(); + }; +``` +- createBufferSourceAudioTrack +- BufferSourceAudioTrackInitConfig + + diff --git a/assets/code/video-sdk/play-media/swift/configure-buttons.mdx b/assets/code/video-sdk/play-media/swift/configure-buttons.mdx index c1298a7b6..19aeaba14 100644 --- 
a/assets/code/video-sdk/play-media/swift/configure-buttons.mdx +++ b/assets/code/video-sdk/play-media/swift/configure-buttons.mdx @@ -1,6 +1,6 @@ -``` swift +```swift mediaPlayerBtn = NSButton() mediaPlayerBtn.frame = CGRect(x: 300, y: 60, width: 150, height: 200) mediaPlayerBtn.title = "Open Media File" @@ -15,7 +15,7 @@ self.view.addSubview(mediaProgressView) ``` -``` swift +```swift mediaPlayerBtn = UIButton(type: .system) mediaPlayerBtn.frame = CGRect(x: 100, y:550, width:200, height:50) mediaPlayerBtn.setTitle("Open Media File", for: .normal) diff --git a/assets/code/video-sdk/play-media/swift/create-ui.mdx b/assets/code/video-sdk/play-media/swift/create-ui.mdx index 0dc920b1a..85d6b668b 100644 --- a/assets/code/video-sdk/play-media/swift/create-ui.mdx +++ b/assets/code/video-sdk/play-media/swift/create-ui.mdx @@ -1,12 +1,12 @@ -``` swift +```swift var mediaPlayerBtn: NSButton! var mediaProgressView: NSProgressIndicator! ``` -``` swift +```swift var mediaPlayerBtn: UIButton! var mediaProgressView: UIProgressView! ``` diff --git a/assets/code/video-sdk/play-media/swift/open-play-pause-media.mdx b/assets/code/video-sdk/play-media/swift/open-play-pause-media.mdx index 218523776..244fcd777 100644 --- a/assets/code/video-sdk/play-media/swift/open-play-pause-media.mdx +++ b/assets/code/video-sdk/play-media/swift/open-play-pause-media.mdx @@ -1,6 +1,6 @@ - ``` swift + ```swift @objc func mediaPlayerBtnClicked(sender: NSButton!) { // Initialize the mediaPlayer and open a media file if (mediaPlayer == nil) { @@ -41,7 +41,7 @@ ``` - ``` swift + ```swift @objc func mediaPlayerBtnClicked(sender: UIButton!) 
{ // Initialize the mediaPlayer and open a media file if (mediaPlayer == nil) { diff --git a/assets/code/video-sdk/play-media/update-channel-publish-options.mdx b/assets/code/video-sdk/play-media/update-channel-publish-options.mdx new file mode 100644 index 000000000..89d3d8fb4 --- /dev/null +++ b/assets/code/video-sdk/play-media/update-channel-publish-options.mdx @@ -0,0 +1,79 @@ + + ```kotlin + fun updateChannelPublishOptions(publishMediaPlayer: Boolean) { + val channelOptions = ChannelMediaOptions() + // Start or stop publishing the media player tracks + channelOptions.publishMediaPlayerAudioTrack = publishMediaPlayer + channelOptions.publishMediaPlayerVideoTrack = publishMediaPlayer + // Stop or start publishing the microphone and camera tracks + channelOptions.publishMicrophoneTrack = !publishMediaPlayer + channelOptions.publishCameraTrack = !publishMediaPlayer + // Specify the media player Id for publishing + if (publishMediaPlayer) channelOptions.publishMediaPlayerId = mediaPlayer?.mediaPlayerId + // Implement the settings + agoraEngine?.updateChannelMediaOptions(channelOptions) + } + ``` + - updateChannelMediaOptions + + + ```swift + func updateChannelPublishOptions(publishingMedia: Bool) -> Int32 { + let channelOptions = AgoraRtcChannelMediaOptions() + channelOptions.publishMediaPlayerAudioTrack = publishingMedia + channelOptions.publishMediaPlayerVideoTrack = publishingMedia + // If publishing media player, set the media player ID + if publishingMedia { channelOptions.publishMediaPlayerId = Int(mediaPlayer!.getMediaPlayerId()) } + // Set the regular camera to false if publishing media track + channelOptions.publishMicrophoneTrack = true + channelOptions.publishCameraTrack = !publishingMedia + + return agoraEngine.updateChannel(with: channelOptions) + } + ``` + + + - AgoraRtcChannelMediaOptions + - getMediaPlayerId() + - updateChannel(with:) + + + - AgoraRtcChannelMediaOptions + - getMediaPlayerId() + - updateChannel(with:) + + + + + ```csharp + public 
void PublishMediaFile() + { + // Configure channel options to publish the media player's audio and video tracks + ChannelMediaOptions channelOptions = new ChannelMediaOptions(); + channelOptions.publishMediaPlayerAudioTrack.SetValue(true); + channelOptions.publishMediaPlayerVideoTrack.SetValue(true); + channelOptions.publishMicrophoneTrack.SetValue(false); + channelOptions.publishCameraTrack.SetValue(false); + channelOptions.publishMediaPlayerId.SetValue(mediaPlayer.GetId()); + + // Update the channel's media options with the configured options + agoraEngine.UpdateChannelMediaOptions(channelOptions); + } + public void UnpublishMediaFile() + { + // Configure channel options to unpublish the media player's audio and video tracks + ChannelMediaOptions channelOptions = new ChannelMediaOptions(); + channelOptions.publishMediaPlayerAudioTrack.SetValue(false); + channelOptions.publishMediaPlayerVideoTrack.SetValue(false); + channelOptions.publishMicrophoneTrack.SetValue(true); + channelOptions.publishCameraTrack.SetValue(true); + + // Update the channel's media options with the configured options + agoraEngine.UpdateChannelMediaOptions(channelOptions); + } + ``` + * ChannelMediaOptions + * GetId + * UpdateChannelMediaOptions + + \ No newline at end of file diff --git a/assets/code/video-sdk/product-workflow/import-library.mdx b/assets/code/video-sdk/product-workflow/import-library.mdx new file mode 100644 index 000000000..b22ceb064 --- /dev/null +++ b/assets/code/video-sdk/product-workflow/import-library.mdx @@ -0,0 +1,52 @@ + + ```kotlin + import io.agora.rtc2.ChannelMediaOptions + import io.agora.rtc2.Constants + import io.agora.rtc2.ScreenCaptureParameters + import io.agora.rtc2.video.VideoCanvas + ``` + + + ```swift + import ReplayKit + import AgoraRtcKit + ``` + + + ```swift + import AgoraRtcKit + ``` + + Make sure to also add the ScreenCapture plugin from the Swift Package. 
+ + + + ```csharp + using Agora.Rtc; + ``` + + +```javascript +import AgoraManager from "../agora_manager/agora_manager.js"; +import AgoraRTC from "agora-rtc-sdk-ng"; +``` + + + ```typescript + import React, { useState, useRef, useEffect } from "react"; + import { + AgoraRTCProvider, + useRTCClient, + useRemoteUsers, + useConnectionState, + useJoin, + usePublish, + useLocalScreenTrack, + useTrackEvent, + } from "agora-rtc-react"; + import AgoraRTC, { DeviceInfo, IAgoraRTCError } from "agora-rtc-sdk-ng"; + import AuthenticationWorkflowManager from "../authentication-workflow/authenticationWorkflowManager"; + import config from "../agora-manager/config"; + import { useAgoraContext } from "../agora-manager/agoraManager"; + ``` + diff --git a/assets/code/video-sdk/product-workflow/ios-extension.mdx b/assets/code/video-sdk/product-workflow/ios-extension.mdx new file mode 100644 index 000000000..4bdc8963d --- /dev/null +++ b/assets/code/video-sdk/product-workflow/ios-extension.mdx @@ -0,0 +1,80 @@ + iOS uses a separate target for screen sharing that uses `ReplayKit`. With the `ReplayKit` package product you can get the screen stream back into the main app with ease. + + 1. Create a new Broadcast Upload Extension target, add Agora's ReplayKit package product to the new target and delete the `SampleHandler` swift file that is automatically created. + + 1. Modify the `screenSharer/Info.plist` file to contain the following key/value pair: + + ```xml + NSExtensionPrincipalClass + AgoraReplayKitHandler + ``` + + 1. Add an [`RPSystemBroadcastPickerView`](https://developer.apple.com/documentation/replaykit/rpsystembroadcastpickerview) to your main app target. + + 1. Set up screen sharing app socket: + + ```swift + func setupScreenSharing() { + let capParams = AgoraScreenCaptureParameters2() + capParams.captureAudio = false + capParams.captureVideo = true + agoraEngine.startScreenCapture(capParams) + } + ``` + + - AgoraScreenCaptureParameters2 + - startScreenCapture(_:) + + 1. 
Listen for the app extension start and stop: + + ```swift + public func rtcEngine( + _ engine: AgoraRtcEngineKit, localVideoStateChangedOf state: AgoraVideoLocalState, + error: AgoraLocalVideoStreamError, sourceType: AgoraVideoSourceType + ) { + // This delegate method catches whenever a screen is being shared + // from a broadcast extension + if sourceType == .screen { + let connection = AgoraRtcConnection( + channelId: <#Channel Name#>, + localUid: screenShareID + ) + switch state { + case .capturing: + self.publishScreenCaptureTrack(connection) + case .encoding: break + case .stopped, .failed: + // The broadcast extension has finished capturing frames + agoraEngine.leaveChannelEx(connection) + @unknown default: break + } + } + } + ``` + + - rtcEngine(_:localVideoStateChangedOf:error:sourceType:) + - AgoraRtcConnection + - leaveChannelEx(_:leaveChannelBlock:) + + 1. Publish the screen capture: + + ```swift + fileprivate func publishScreenCaptureTrack(_ connection: AgoraRtcConnection) { + /* The broadcast extension has started capturing frames */ + let mediaOptions = AgoraRtcChannelMediaOptions() + mediaOptions.publishCameraTrack = false + mediaOptions.publishMicrophoneTrack = false + mediaOptions.publishScreenCaptureAudio = false + mediaOptions.publishScreenCaptureVideo = true + mediaOptions.clientRoleType = .broadcaster + mediaOptions.autoSubscribeAudio = false + + agoraEngine.joinChannelEx( + byToken: <#Screen share token#>, connection: connection, + delegate: nil, mediaOptions: mediaOptions + ) + } + ``` + + - AgoraRtcChannelMediaOptions + - joinChannelEx(byToken:connection:delegate:mediaOptions:joinSuccess:) diff --git a/assets/code/video-sdk/product-workflow/macos-screencapture.mdx b/assets/code/video-sdk/product-workflow/macos-screencapture.mdx new file mode 100644 index 000000000..028937706 --- /dev/null +++ b/assets/code/video-sdk/product-workflow/macos-screencapture.mdx @@ -0,0 +1,88 @@ + +1. 
**Get available screens and windows** + + ```swift + func getScreensAndWindows() -> [AgoraScreenCaptureSourceInfo]? { + agoraEngine.getScreenCaptureSources( + withThumbSize: .zero, + iconSize: .zero, + includeScreen: true + ) + } + ``` + + - getScreenCaptureSources(withThumbSize:iconSize:includeScreen:) + - AgoraScreenCaptureSourceInfo + + `getScreenCaptureSources` fetches all the available windows currently running on your mac, and if `includeScreen` is set to true, it also fetches the available screens. + + The `AgoraScreenCaptureSourceInfo` object contains useful information such as the application it comes from (found in sourceName), as well as a snapshot of that screen/window, and a thumbnail of the application source. + +1. **Start capturing the screen or window** + + The `AgoraScreenCaptureSourceInfo/sourceId` property shows its matching `CGWindowID` for either the window or screen. + + ```swift + func startScreenShare(displayId: CGWindowID) { + let params = AgoraScreenCaptureParameters() + params.dimensions = AgoraVideoDimension1920x1080 + params.frameRate = AgoraVideoFrameRate.fps15.rawValue + self.agoraEngine.startScreenCapture( + byDisplayId: displayId, regionRect: .zero, + captureParams: params + ) + } + + func startScreenShare(windowId: CGWindowID) { + let params = AgoraScreenCaptureParameters() + params.dimensions = AgoraVideoDimension1920x1080 + params.frameRate = AgoraVideoFrameRate.fps15.rawValue + self.agoraEngine.startScreenCapture( + byWindowId: windowId, regionRect: .zero, + captureParams: params + ) + } + ``` + + - AgoraScreenCaptureParameters + - startScreenCapture(byWindowId:regionRect:captureParams:) + - startScreenCapture(byDisplayId:regionRect:captureParams:) + +1. **Listen for screenshare update events** + + Switch between publishing your camera feed or screen share feed. 
+ + ```swift + public func rtcEngine( + _ engine: AgoraRtcEngineKit, localVideoStateChangedOf state: AgoraVideoLocalState, + error: AgoraLocalVideoStreamError, sourceType: AgoraVideoSourceType + ) { + if sourceType == .screen { + let newChannelOpt = AgoraRtcChannelMediaOptions() + switch state { + case .capturing: + newChannelOpt.publishScreenTrack = true + newChannelOpt.publishCameraTrack = false + case .stopped, .failed: + newChannelOpt.publishScreenTrack = false + newChannelOpt.publishCameraTrack = true + default: return + } + agoraEngine.updateChannel(with: newChannelOpt) + } + } + ``` + + - rtcEngine(_:localVideoStateChangedOf:error:sourceType:) + - AgoraRtcChannelMediaOptions + - updateChannel(with:) + +1. **Stop sharing** + + ```swift + func stopScreenShare() { + self.agoraEngine.stopScreenCapture() + } + ``` + + - stopScreenCapture() \ No newline at end of file diff --git a/assets/code/video-sdk/product-workflow/media-device-changed.mdx b/assets/code/video-sdk/product-workflow/media-device-changed.mdx new file mode 100644 index 000000000..86bc8889c --- /dev/null +++ b/assets/code/video-sdk/product-workflow/media-device-changed.mdx @@ -0,0 +1,86 @@ + + + + ```typescript + const OnMicrophoneChangedHook: React.FC = () => { + const agoraContext = useAgoraContext(); + + useEffect(() => { + const onMicrophoneChanged = (changedDevice: DeviceInfo) => { + if (changedDevice.state === "ACTIVE") { + agoraContext.localMicrophoneTrack?.setDevice(changedDevice.device.deviceId) + .catch((error: IAgoraRTCError) => console.error(error)); + } else if (changedDevice.device.label === agoraContext.localMicrophoneTrack?.getTrackLabel()) { + AgoraRTC.getMicrophones() + .then((devices) => agoraContext.localMicrophoneTrack?.setDevice(devices[0].deviceId)) + .catch((error) => console.error(error)); + } + }; + AgoraRTC.onMicrophoneChanged = onMicrophoneChanged; + + return () => { + AgoraRTC.onMicrophoneChanged = undefined; + }; + }, [agoraContext.localMicrophoneTrack]); + return null; 
+ }; + + const OnCameraChangedHook: React.FC = () => { + const agoraContext = useAgoraContext(); + + useEffect(() => { + const onCameraChanged = (changedDevice: DeviceInfo) => { + if (changedDevice.state === "ACTIVE") { + agoraContext.localCameraTrack?.setDevice(changedDevice.device.deviceId) + .catch((error) => console.error(error)); + } else if (changedDevice.device.label === agoraContext.localCameraTrack?.getTrackLabel()) { + AgoraRTC.getCameras() + .then((devices) => agoraContext.localCameraTrack?.setDevice(devices[0].deviceId)) + .catch((error) => console.error(error)); + } + }; + + AgoraRTC.onCameraChanged = onCameraChanged; + return () => { + AgoraRTC.onCameraChanged = undefined; + }; + }, [agoraContext.localCameraTrack]); + return null; + }; + ``` + - onMicrophoneChanged + - onCameraChanged + + + + + ```typescript + const OnMicrophoneChangedHook: React.FC = () => { + const agoraContext = useAgoraContext(); + + useEffect(() => { + const onMicrophoneChanged = (changedDevice: DeviceInfo) => { + if (changedDevice.state === "ACTIVE") { + agoraContext.localMicrophoneTrack?.setDevice(changedDevice.device.deviceId) + .catch((error: IAgoraRTCError) => console.error(error)); + } else if (changedDevice.device.label === agoraContext.localMicrophoneTrack?.getTrackLabel()) { + AgoraRTC.getMicrophones() + .then((devices) => agoraContext.localMicrophoneTrack?.setDevice(devices[0].deviceId)) + .catch((error) => console.error(error)); + } + }; + AgoraRTC.onMicrophoneChanged = onMicrophoneChanged; + + return () => { + AgoraRTC.onMicrophoneChanged = undefined; + }; + }, [agoraContext.localMicrophoneTrack]); + return null; + }; + + ``` + - onMicrophoneChanged + + + + \ No newline at end of file diff --git a/assets/code/video-sdk/product-workflow/microphone-camera-change.mdx b/assets/code/video-sdk/product-workflow/microphone-camera-change.mdx new file mode 100644 index 000000000..bcc609577 --- /dev/null +++ b/assets/code/video-sdk/product-workflow/microphone-camera-change.mdx 
@@ -0,0 +1,71 @@ + +```javascript + AgoraRTC.onAutoplayFailed = () => { + // Create button for the user interaction. + const btn = document.createElement("button"); + // Set the button text. + btn.innerText = "Click me to resume the audio/video playback"; + // Remove the button when the click event occurs. + btn.onclick = () => { + btn.remove(); + }; + // Append the button to the UI. + document.body.append(btn); + }; + AgoraRTC.onMicrophoneChanged = async (changedDevice) => { + eventsCallback("microphone-changed", changedDevice) + }; + + AgoraRTC.onCameraChanged = async (changedDevice) => { + eventsCallback("camera-changed", changedDevice) + }; +``` +- onAudioAutoplayFailed + +In `eventsCallback` you can handle the events as follows: + +```javascript + const handleVSDKEvents = async (eventName, ...args) => { + // The changed device info is passed as the first argument + const [changedDevice] = args; + switch (eventName) { + // ... other cases + case "microphone-changed": + // When plugging in a device, switch to a device that is newly plugged in. + if (changedDevice.state === "ACTIVE") { + channelParameters.localAudioTrack.setDevice( + changedDevice.device.deviceId + ); + // Switch to an existing device when the current device is unplugged. + } else if ( + changedDevice.device.label === + channelParameters.localAudioTrack.getTrackLabel() + ) { + const oldMicrophones = await AgoraRTC.getMicrophones(); + oldMicrophones[0] && + channelParameters.localAudioTrack.setDevice( + oldMicrophones[0].deviceId + ); + } + break; + case "camera-changed": + // When plugging in a device, switch to a device that is newly plugged in. + if (changedDevice.state === "ACTIVE") { + channelParameters.localVideoTrack.setDevice( + changedDevice.device.deviceId + ); + // Switch to an existing device when the current device is unplugged.
} else if ( + changedDevice.device.label === + channelParameters.localVideoTrack.getTrackLabel() + ) { + const oldCameras = await AgoraRTC.getCameras(); + oldCameras[0] && + channelParameters.localVideoTrack.setDevice(oldCameras[0].deviceId); + } + } + }; +``` +- onMicrophoneChanged +- onCameraChanged +- IMicrophoneAudioTrack.setDevice +- ICameraVideoTrack.setDevice + + diff --git a/assets/code/video-sdk/product-workflow/mute-local-video.mdx b/assets/code/video-sdk/product-workflow/mute-local-video.mdx new file mode 100644 index 000000000..7345b4456 --- /dev/null +++ b/assets/code/video-sdk/product-workflow/mute-local-video.mdx @@ -0,0 +1,21 @@ + +```typescript +const MuteVideoComponent: React.FC = () => { + const agoraContext = useAgoraContext(); + const [isMuteVideo, setMuteVideo] = useState(false); + + const toggleMuteVideo = () => { + agoraContext.localCameraTrack + ?.setEnabled(isMuteVideo) + .then(() => setMuteVideo((prev) => !prev)) + .catch((error) => console.error(error)); + }; + + return ( + <button onClick={toggleMuteVideo}>{isMuteVideo ? "Unmute Video" : "Mute Video"}</button> + ); +}; +``` + - setEnabled + + \ No newline at end of file diff --git a/assets/code/video-sdk/product-workflow/mute-remote-user.mdx b/assets/code/video-sdk/product-workflow/mute-remote-user.mdx new file mode 100644 index 000000000..e91a6d8e8 --- /dev/null +++ b/assets/code/video-sdk/product-workflow/mute-remote-user.mdx @@ -0,0 +1,77 @@ + + ```kotlin + fun mute(muted: Boolean) { + // Stop or resume publishing the local audio stream + agoraEngine?.muteLocalAudioStream(muted) + // Stop or resume subscribing to the audio streams of all remote users + agoraEngine?.muteAllRemoteAudioStreams(muted) + // Stop or resume subscribing to the audio stream of a specified user + // agoraEngine?.muteRemoteAudioStream(remoteUid, muted) + } + ``` + - muteLocalAudioStream + + - muteRemoteAudioStream + + - muteAllRemoteAudioStreams + + - enableLocalAudio + + + + ```csharp + public void MuteRemoteAudio(bool value) + { + if (remoteUid > 0) + { + // Pass the uid of the remote user you
want to mute. + agoraEngine.MuteRemoteAudioStream(Convert.ToUInt32(remoteUid), value); + } + else + { + Debug.Log("No remote user in the channel"); + } + } + ``` + + - MuteRemoteAudioStream + + + - MuteRemoteAudioStream + + + + ```swift + func muteRemoteUser(uid: UInt, isMuted: Bool) { + self.agoraEngine.muteRemoteAudioStream(uid, mute: isMuted) + } + ``` + + + - muteRemoteAudioStream(_:​mute:) + + + - muteRemoteAudioStream(_:​mute:) + + + +```javascript + // Mute and unmute the local video. + document.getElementById("muteVideo").onclick = async function () { + if (isMuteVideo == false) { + // Mute the local video. + channelParameters.localVideoTrack.setEnabled(false); + // Update the button text. + document.getElementById(`muteVideo`).innerHTML = "Unmute Video"; + isMuteVideo = true; + } else { + // Unmute the local video. + channelParameters.localVideoTrack.setEnabled(true); + // Update the button text. + document.getElementById(`muteVideo`).innerHTML = "Mute Video"; + isMuteVideo = false; + } + }; +``` +- setEnabled + diff --git a/assets/code/video-sdk/product-workflow/override-broadcast-started.mdx b/assets/code/video-sdk/product-workflow/override-broadcast-started.mdx new file mode 100644 index 000000000..a5c39395d --- /dev/null +++ b/assets/code/video-sdk/product-workflow/override-broadcast-started.mdx @@ -0,0 +1,49 @@ + + ```swift + override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) 
{ + guard let channel = UserDefaults(suiteName: "group.uk.rocketar.Docs-Examples")?.string(forKey: "channel") else { + // Failed to get channel + self.broadcastFinished() + return + } + let channelMediaOptions = AgoraRtcChannelMediaOptions() + channelMediaOptions.publishMicrophoneTrack = false + channelMediaOptions.publishCameraTrack = false + channelMediaOptions.publishCustomVideoTrack = true + channelMediaOptions.publishCustomAudioTrack = true + channelMediaOptions.autoSubscribeAudio = false + channelMediaOptions.autoSubscribeVideo = false + channelMediaOptions.clientRoleType = .broadcaster + + engine.joinChannel( + byToken: DocsAppConfig.shared.rtcToken, channelId: channel, + uid: DocsAppConfig.shared.screenShareId, + mediaOptions: channelMediaOptions + ) + } + ``` + + + ```csharp + public void PublishScreenTrack() + { + // Publish the screen track + ChannelMediaOptions channelOptions = new ChannelMediaOptions(); + channelOptions.publishScreenTrack.SetValue(true); + channelOptions.publishMicrophoneTrack.SetValue(true); + channelOptions.publishSecondaryScreenTrack.SetValue(true); + channelOptions.publishCameraTrack.SetValue(false); + agoraEngine.UpdateChannelMediaOptions(channelOptions); + } + + public void UnPublishScreenTrack() + { + // Unpublish the screen track. 
+ ChannelMediaOptions channelOptions = new ChannelMediaOptions(); + channelOptions.publishScreenTrack.SetValue(false); + channelOptions.publishCameraTrack.SetValue(true); + channelOptions.publishMicrophoneTrack.SetValue(true); + agoraEngine.UpdateChannelMediaOptions(channelOptions); + } + ``` + \ No newline at end of file diff --git a/assets/code/video-sdk/product-workflow/preview-screen-track.mdx b/assets/code/video-sdk/product-workflow/preview-screen-track.mdx new file mode 100644 index 000000000..f76f25d2c --- /dev/null +++ b/assets/code/video-sdk/product-workflow/preview-screen-track.mdx @@ -0,0 +1,40 @@ + + ```kotlin + fun screenShareSurfaceView(): SurfaceView { + // Create render view by RtcEngine + val surfaceView = SurfaceView(mContext) + // Setup and return a SurfaceView to render your screen sharing preview + agoraEngine?.startPreview(Constants.VideoSourceType.VIDEO_SOURCE_SCREEN_PRIMARY) + agoraEngine?.setupLocalVideo(VideoCanvas(surfaceView, Constants.RENDER_MODE_FIT, 0)) + return surfaceView + } + ``` + - startPreview + - setupLocalVideo + + + + ```csharp + public void PlayScreenTrackLocally(bool isScreenSharing, GameObject localViewGo) + { + if (isScreenSharing) + { + // Update the VideoSurface component of the local view GameObject. + LocalView = localViewGo.AddComponent(); + // Render the screen sharing track on the local view GameObject. + LocalView.SetForUser(0, "", VIDEO_SOURCE_TYPE.VIDEO_SOURCE_SCREEN_PRIMARY); + } + else + { + // Update the VideoSurface component of the local view GameObject. + LocalView = localViewGo.AddComponent(); + // Render the local video track on the local view GameObject. 
+ LocalView.SetForUser(0, "", VIDEO_SOURCE_TYPE.VIDEO_SOURCE_CAMERA_PRIMARY); + } + } + ``` + + - SetForUser + - VIDEO_SOURCE_TYPE + + \ No newline at end of file diff --git a/assets/code/video-sdk/product-workflow/publish-screen-track.mdx b/assets/code/video-sdk/product-workflow/publish-screen-track.mdx new file mode 100644 index 000000000..a6d2c6f1e --- /dev/null +++ b/assets/code/video-sdk/product-workflow/publish-screen-track.mdx @@ -0,0 +1,40 @@ + + ```kotlin + private fun updateMediaPublishOptions(publishScreen: Boolean) { + val mediaOptions = ChannelMediaOptions() + mediaOptions.publishCameraTrack = !publishScreen + mediaOptions.publishMicrophoneTrack = !publishScreen + mediaOptions.publishScreenCaptureVideo = publishScreen + mediaOptions.publishScreenCaptureAudio = publishScreen + agoraEngine!!.updateChannelMediaOptions(mediaOptions) + } + ``` + - updateChannelMediaOptions + + + ```csharp + public void PublishScreenTrack() + { + // Publish the screen track + ChannelMediaOptions channelOptions = new ChannelMediaOptions(); + channelOptions.publishScreenTrack.SetValue(true); + channelOptions.publishMicrophoneTrack.SetValue(true); + channelOptions.publishSecondaryScreenTrack.SetValue(true); + channelOptions.publishCameraTrack.SetValue(false); + agoraEngine.UpdateChannelMediaOptions(channelOptions); + } + + public void UnPublishScreenTrack() + { + // Unpublish the screen track. 
+ ChannelMediaOptions channelOptions = new ChannelMediaOptions(); + channelOptions.publishScreenTrack.SetValue(false); + channelOptions.publishCameraTrack.SetValue(true); + channelOptions.publishMicrophoneTrack.SetValue(true); + agoraEngine.UpdateChannelMediaOptions(channelOptions); + } + ``` + - UpdateChannelMediaOptions + - ChannelMediaOptions + + \ No newline at end of file diff --git a/assets/code/video-sdk/product-workflow/screen-sharer-target.mdx b/assets/code/video-sdk/product-workflow/screen-sharer-target.mdx new file mode 100644 index 000000000..c4ad6dc5e --- /dev/null +++ b/assets/code/video-sdk/product-workflow/screen-sharer-target.mdx @@ -0,0 +1,76 @@ + + ```swift + class SampleHandler: RPBroadcastSampleHandler, AgoraRtcEngineDelegate { + var engine: AgoraRtcEngineKit { + let config = AgoraRtcEngineConfig() + config.appId = DocsAppConfig.shared.appId + config.channelProfile = .liveBroadcasting + let agoraEngine = AgoraRtcEngineKit.sharedEngine(with: config, delegate: self) + agoraEngine.enableVideo() + agoraEngine.setExternalVideoSource(true, useTexture: true, sourceType: .videoFrame) + let videoConfig = AgoraVideoEncoderConfiguration( + size: videoDimension, frameRate: .fps10, bitrate: AgoraVideoBitrateStandard, + orientationMode: .adaptative, mirrorMode: .auto + ) + agoraEngine.setVideoEncoderConfiguration(videoConfig) + + agoraEngine.setAudioProfile(.default) + agoraEngine.setExternalAudioSource(true, sampleRate: 44100, channels: 2) + return agoraEngine + } + } + ``` + + + ```csharp + private void StartScreenCaptureAndroid(long sourceId) + { + // Configure screen capture parameters for Android. + var parameters2 = new ScreenCaptureParameters2(); + parameters2.captureAudio = true; + parameters2.captureVideo = true; + // Start screen sharing. + agoraEngine.StartScreenCapture(parameters2); + } + + private void StartScreenCaptureWindows(long sourceId) + { + // Configure screen capture parameters for Windows. 
+ agoraEngine.StartScreenCaptureByDisplayId((uint)sourceId, default(Rectangle), + new ScreenCaptureParameters { captureMouseCursor = true, frameRate = 30 }); + } + // Share the screen + public void StartSharing() + { + if (agoraEngine == null) + { + Debug.Log("Join a channel to start screen sharing"); + return; + } + + // Get a list of shareable screens and windows. + var captureSources = GetScreenCaptureSources(); + + if (captureSources != null && captureSources.Length > 0) + { + var sourceId = captureSources[0].sourceId; + + // Start screen sharing based on platform. +#if UNITY_ANDROID || UNITY_IPHONE + StartScreenCaptureAndroid(sourceId); +#else + StartScreenCaptureWindows(sourceId); +#endif + + // Publish the screen track. + PublishScreenTrack(); + } + else + { + Debug.LogWarning("No screen capture sources found."); + } + } + ``` + - StartScreenCaptureByDisplayId + - StartScreenCapture + diff --git a/assets/code/video-sdk/product-workflow/setup-engine.mdx b/assets/code/video-sdk/product-workflow/setup-engine.mdx new file mode 100644 index 000000000..2640684e2 --- /dev/null +++ b/assets/code/video-sdk/product-workflow/setup-engine.mdx @@ -0,0 +1,20 @@ + +```typescript +export function ProductWorkflow() { + const agoraEngine = useRTCClient(AgoraRTC.createClient({ codec: "vp8", mode: config.selectedProduct })); + + return ( +
+

Screen share, volume control and mute

+ + + + + +
+ ); +} +``` + - useRTCClient + - AgoraRTCProvider +
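The `ProductWorkflow` component above creates the client with `AgoraRTC.createClient({ codec: "vp8", mode: config.selectedProduct })`. As a minimal sketch only (the helper name and fallback behavior are assumptions, not part of the sample), you could validate the selected product before handing it to `createClient`, since the Web SDK only accepts `"rtc"` and `"live"` channel modes:

```javascript
// Hypothetical helper, not part of the sample project above.
// Builds the options object passed to AgoraRTC.createClient and falls
// back to "rtc" when the selected product is not a valid channel mode.
const VALID_MODES = ["rtc", "live"];

function makeClientConfig(selectedProduct) {
  const mode = VALID_MODES.includes(selectedProduct) ? selectedProduct : "rtc";
  return { codec: "vp8", mode };
}
```

The fallback keeps the client constructible even if `config.selectedProduct` carries a product name rather than a channel mode.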
\ No newline at end of file diff --git a/assets/code/video-sdk/product-workflow/setup-volume.mdx b/assets/code/video-sdk/product-workflow/setup-volume.mdx new file mode 100644 index 000000000..64a0b139f --- /dev/null +++ b/assets/code/video-sdk/product-workflow/setup-volume.mdx @@ -0,0 +1,153 @@ + + + ```kotlin + fun adjustVolume(volumeParameter: VolumeTypes, volume: Int) { + when (volumeParameter) { + VolumeTypes.PLAYBACK_SIGNAL_VOLUME -> { + agoraEngine?.adjustPlaybackSignalVolume(volume) + } + VolumeTypes.RECORDING_SIGNAL_VOLUME -> { + agoraEngine?.adjustRecordingSignalVolume(volume) + } + VolumeTypes.USER_PLAYBACK_SIGNAL_VOLUME -> { + if (remoteUids.size > 0) { + val remoteUid = remoteUids.first() // the uid of the remote user + agoraEngine?.adjustUserPlaybackSignalVolume(remoteUid, volume) + } + } + VolumeTypes.AUDIO_MIXING_VOLUME -> { + agoraEngine?.adjustAudioMixingVolume(volume) + } + VolumeTypes.AUDIO_MIXING_PLAYOUT_VOLUME -> { + agoraEngine?.adjustAudioMixingPlayoutVolume(volume) + } + VolumeTypes.AUDIO_MIXING_PUBLISH_VOLUME -> { + agoraEngine?.adjustAudioMixingPublishVolume(volume) + } + VolumeTypes.CUSTOM_AUDIO_PLAYOUT_VOLUME -> { + agoraEngine?.adjustAudioMixingPlayoutVolume(volume) + } + VolumeTypes.CUSTOM_AUDIO_PUBLISH_VOLUME -> { + val trackId = 0 // use the id of your custom audio track + agoraEngine?.adjustCustomAudioPublishVolume(trackId, volume) + } + } + } + ``` + - adjustPlaybackSignalVolume + + - adjustRecordingSignalVolume + + - adjustUserPlaybackSignalVolume + + - adjustAudioMixingVolume + + - adjustAudioMixingPlayoutVolume + + - adjustAudioMixingPublishVolume + + - setInEarMonitoringVolume + + - adjustCustomAudioPlayoutVolume + + - adjustCustomAudioPublishVolume + + + ```swift + func setVolume(for id: UInt, to volume: Int) -> Int32 { + if id == self.localUserId { + return self.agoraEngine.adjustRecordingSignalVolume(volume) + } else { + return self.agoraEngine.adjustUserPlaybackSignalVolume(id, volume: Int32(volume)) + } + } + ``` + + + - 
adjustRecordingSignalVolume(_:) + - adjustUserPlaybackSignalVolume(_:volume:) + + + - adjustRecordingSignalVolume(_:) + - adjustUserPlaybackSignalVolume(_:volume:) + + + + ```csharp + public void ChangeVolume(int volume) + { + // Adjust the volume of the recorded signal. + agoraEngine.AdjustRecordingSignalVolume(volume); + } + ``` + + - AdjustRecordingSignalVolume + + + - AdjustRecordingSignalVolume + + + +```javascript + // Set an event listener on the range slider. + document + .getElementById("localAudioVolume") + .addEventListener("change", function (evt) { + console.log("Volume of local audio :" + evt.target.value); + // Set the local audio volume. + channelParameters.localAudioTrack.setVolume(parseInt(evt.target.value)); + }); + + // Set an event listener on the range slider. + document + .getElementById("remoteAudioVolume") + .addEventListener("change", function (evt) { + console.log("Volume of remote audio :" + evt.target.value); + // Set the remote audio volume. + channelParameters.remoteAudioTrack.setVolume(parseInt(evt.target.value)); + }); +``` +- setVolume + + + ```typescript + const RemoteAndLocalVolumeComponent: React.FC = () => { + const agoraContext = useAgoraContext(); + const remoteUsers = useRemoteUsers(); + const numberOfRemoteUsers = remoteUsers.length; + const remoteUser = remoteUsers[numberOfRemoteUsers - 1]; + + const handleLocalAudioVolumeChange = (evt: React.ChangeEvent) => { + const volume = parseInt(evt.target.value); + console.log("Volume of local audio:", volume); + agoraContext.localMicrophoneTrack?.setVolume(volume); + }; + + const handleRemoteAudioVolumeChange = (evt: React.ChangeEvent) => { + if (remoteUser) { + const volume = parseInt(evt.target.value); + console.log("Volume of remote audio:", volume); + remoteUser.audioTrack?.setVolume(volume); + } else { + console.log("No remote user in the channel"); + } + }; + + return ( + <> +
+ + +
+
+ + +
+ + ); + } + ``` + - useRemoteUsers + - localMicrophoneTrack.setVolume + - remoteUser.audioTrack.setVolume +
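The slider handlers above pass `parseInt(evt.target.value)` straight to `setVolume`. As a sketch only, you may want to sanitize the raw slider value first; the `[0, 100]` default range below is an illustrative assumption, not a range mandated by the SDK (check the `setVolume` documentation for your track type):

```javascript
// Sketch: sanitize a raw slider value before passing it to setVolume.
// The default [0, 100] range is an assumption for illustration.
function sanitizeVolume(rawValue, min = 0, max = 100) {
  const volume = parseInt(rawValue, 10);
  // Fall back to the minimum for non-numeric input.
  if (Number.isNaN(volume)) return min;
  // Clamp into [min, max].
  return Math.min(max, Math.max(min, volume));
}
```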
diff --git a/assets/code/video-sdk/product-workflow/start-sharing.mdx b/assets/code/video-sdk/product-workflow/start-sharing.mdx new file mode 100644 index 000000000..60148fc13 --- /dev/null +++ b/assets/code/video-sdk/product-workflow/start-sharing.mdx @@ -0,0 +1,167 @@ + + ```kotlin + fun startScreenSharing() { + // Set screen capture parameters + val screenCaptureParameters = ScreenCaptureParameters() + screenCaptureParameters.captureVideo = true + screenCaptureParameters.captureAudio = true + screenCaptureParameters.videoCaptureParameters.framerate = 15 + screenCaptureParameters.audioCaptureParameters.captureSignalVolume = 100 + + // Start screen sharing + agoraEngine!!.startScreenCapture(screenCaptureParameters) + // Update channel media options to publish the screen sharing video stream + updateMediaPublishOptions(true) + } + ``` + - ScreenCaptureParameters + - startScreenCapture + + + ```swift + class SampleHandler: RPBroadcastSampleHandler, AgoraRtcEngineDelegate { + var engine: AgoraRtcEngineKit { + let config = AgoraRtcEngineConfig() + config.appId = DocsAppConfig.shared.appId + config.channelProfile = .liveBroadcasting + let agoraEngine = AgoraRtcEngineKit.sharedEngine(with: config, delegate: self) + agoraEngine.enableVideo() + agoraEngine.setExternalVideoSource(true, useTexture: true, sourceType: .videoFrame) + let videoConfig = AgoraVideoEncoderConfiguration( + size: videoDimension, frameRate: .fps10, bitrate: AgoraVideoBitrateStandard, + orientationMode: .adaptative, mirrorMode: .auto + ) + agoraEngine.setVideoEncoderConfiguration(videoConfig) + + agoraEngine.setAudioProfile(.default) + agoraEngine.setExternalAudioSource(true, sampleRate: 44100, channels: 2) + return agoraEngine + } + } + ``` + + + 1. 
Get a list of shareable screens + + ```csharp + // Get the list of shareable screens + private ScreenCaptureSourceInfo[] GetScreenCaptureSources() + { + SIZE targetSize = new SIZE(360, 660); + return agoraEngine.GetScreenCaptureSources(targetSize, targetSize, true); + } + ``` + For more details, see the following: + - GetScreenCaptureSources + - ScreenCaptureSourceInfo + + 1. Start screen capture and share the screen + + ```csharp + private void StartScreenCaptureAndroid(long sourceId) + { + // Configure screen capture parameters for Android. + var parameters2 = new ScreenCaptureParameters2(); + parameters2.captureAudio = true; + parameters2.captureVideo = true; + // Start screen sharing. + agoraEngine.StartScreenCapture(parameters2); + } + + private void StartScreenCaptureWindows(long sourceId) + { + // Configure screen capture parameters for Windows. + agoraEngine.StartScreenCaptureByDisplayId((uint)sourceId, default(Rectangle), + new ScreenCaptureParameters { captureMouseCursor = true, frameRate = 30 }); + } + // Share the screen + public void StartSharing() + { + if (agoraEngine == null) + { + Debug.Log("Join a channel to start screen sharing"); + return; + } + + // Get a list of shareable screens and windows. + var captureSources = GetScreenCaptureSources(); + + if (captureSources != null && captureSources.Length > 0) + { + var sourceId = captureSources[0].sourceId; + + // Start screen sharing based on platform. + #if UNITY_ANDROID || UNITY_IPHONE + StartScreenCaptureAndroid(sourceId); + #else + StartScreenCaptureWindows(sourceId); + #endif + + // Publish the screen track. + PublishScreenTrack(); + } + else + { + Debug.LogWarning("No screen capture sources found."); + } + } + ``` + - StartScreenCaptureByDisplayId + - StartScreenCapture + + +```javascript + const startScreenShare = async (channelParameters, screenPlayerContainer) => { + // Create a screen track for screen sharing. 
+ channelParameters.screenTrack = await AgoraRTC.createScreenVideoTrack(); + await agoraManager + .getAgoraEngine() + .unpublish([channelParameters.localVideoTrack]); + channelParameters.localVideoTrack.close(); + // Replace the video track with the screen track. + await agoraManager + .getAgoraEngine() + .publish([channelParameters.screenTrack]); + // Play the screen track. + channelParameters.screenTrack.play(screenPlayerContainer); + }; +``` +- createScreenVideoTrack + + + ```typescript + const ShareScreenComponent: React.FC<{ setScreenSharing: React.Dispatch> }> = ({ + setScreenSharing, + }) => { + const screenShareClient = useRef(AgoraRTC.createClient({ codec: "vp8", mode: "rtc" })); + const { screenTrack, isLoading, error } = useLocalScreenTrack(true, {}, "disable", screenShareClient.current); + + useJoin({ + appid: config.appId, + channel: config.channelName, + token: config.rtcToken, + uid: 0, + }, true, screenShareClient.current); + + useTrackEvent(screenTrack, "track-ended", () => { + setScreenSharing(false); + }); + useEffect(() => { + if (error) setScreenSharing(false); + }, [error, setScreenSharing]); + + usePublish([screenTrack], screenTrack !== null, screenShareClient.current); + + if (isLoading) { + return

Sharing screen...

; + } + return <>; + }; + ``` + - useRTCClient + - useLocalScreenTrack + - useJoin + - useTrackEvent + - usePublish + +
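The `startScreenShare` snippet above follows a fixed ordering: unpublish the camera track, close it, then publish the screen track. The sketch below isolates that ordering against a stand-in engine object; the function and track names are illustrative, and in the real sample the calls go through the `AgoraRTCClient` instance:

```javascript
// Sketch of the publish/unpublish ordering used in startScreenShare above.
// "engine" is a stand-in for the AgoraRTCClient; names are illustrative.
async function swapToScreenTrack(engine, channelParameters, screenTrack) {
  // Remove the camera track from the channel first...
  await engine.unpublish([channelParameters.localVideoTrack]);
  channelParameters.localVideoTrack.close();
  // ...then publish the screen track in its place.
  channelParameters.screenTrack = screenTrack;
  await engine.publish([screenTrack]);
}
```

Closing the camera track before publishing the screen track releases the capture device while only one video track is published at a time.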
diff --git a/assets/code/video-sdk/product-workflow/stop-sharing.mdx b/assets/code/video-sdk/product-workflow/stop-sharing.mdx new file mode 100644 index 000000000..a25c929ed --- /dev/null +++ b/assets/code/video-sdk/product-workflow/stop-sharing.mdx @@ -0,0 +1,41 @@ + + ```kotlin + fun stopScreenSharing() { + agoraEngine!!.stopScreenCapture() + // Restore camera and microphone publishing + updateMediaPublishOptions(false) + } + ``` + - stopScreenCapture + + + ```csharp + public void StopSharing() + { + // Stop screen sharing. + agoraEngine.StopScreenCapture(); + + // Publish the local video track when you stop sharing your screen. + UnPublishScreenTrack(); + + } + ``` + - StopScreenCapture + + +```javascript + const stopScreenShare = async (channelParameters, localPlayerContainer) => { + // Replace the screen track with the video track. + await agoraManager + .getAgoraEngine() + .unpublish([channelParameters.screenTrack]); + channelParameters.screenTrack.close(); + channelParameters.localVideoTrack = await AgoraRTC.createCameraVideoTrack(); + await agoraManager + .getAgoraEngine() + .publish([channelParameters.localVideoTrack]); + // Play the video track. 
+ channelParameters.localVideoTrack.play(localPlayerContainer); + }; +``` + diff --git a/assets/code/video-sdk/raw-video-audio/configure-engine.mdx b/assets/code/video-sdk/raw-video-audio/configure-engine.mdx new file mode 100644 index 000000000..3fe89ddbf --- /dev/null +++ b/assets/code/video-sdk/raw-video-audio/configure-engine.mdx @@ -0,0 +1,77 @@ + + + ```csharp + public override void SetupAgoraEngine() + { + var bufferLength = SAMPLE_RATE * CHANNEL; // 1-sec-length buffer + _audioBuffer = new RingBuffer(bufferLength, true); + var canvas = GameObject.Find("Canvas"); + var aud = canvas.AddComponent(); + SetupAudio(aud, "externalClip"); + base.SetupAgoraEngine(); + InitializeTexture(); + agoraEngine.InitEventHandler(new UserEventHandler(this)); + agoraEngine.RegisterVideoFrameObserver(new RawAudioVideoEventHandler(this), + VIDEO_OBSERVER_FRAME_TYPE.FRAME_TYPE_RGBA, + VIDEO_OBSERVER_POSITION.POSITION_POST_CAPTURER | + VIDEO_OBSERVER_POSITION.POSITION_PRE_RENDERER | + VIDEO_OBSERVER_POSITION.POSITION_PRE_ENCODER, + OBSERVER_MODE.RAW_DATA); + SetVideoEncoderConfiguration(); + agoraEngine.SetPlaybackAudioFrameParameters(SAMPLE_RATE, CHANNEL, + RAW_AUDIO_FRAME_OP_MODE_TYPE.RAW_AUDIO_FRAME_OP_MODE_READ_ONLY, 1024); + agoraEngine.SetRecordingAudioFrameParameters(SAMPLE_RATE, CHANNEL, + RAW_AUDIO_FRAME_OP_MODE_TYPE.RAW_AUDIO_FRAME_OP_MODE_READ_ONLY, 1024); + agoraEngine.SetMixedAudioFrameParameters(SAMPLE_RATE, CHANNEL, 1024); + agoraEngine.SetEarMonitoringAudioFrameParameters(SAMPLE_RATE, CHANNEL, + RAW_AUDIO_FRAME_OP_MODE_TYPE.RAW_AUDIO_FRAME_OP_MODE_READ_ONLY, 1024); + agoraEngine.RegisterAudioFrameObserver(new RawAudioEventHandler(this), + AUDIO_FRAME_POSITION.AUDIO_FRAME_POSITION_PLAYBACK | + AUDIO_FRAME_POSITION.AUDIO_FRAME_POSITION_RECORD | + AUDIO_FRAME_POSITION.AUDIO_FRAME_POSITION_MIXED | + AUDIO_FRAME_POSITION.AUDIO_FRAME_POSITION_BEFORE_MIXING | + AUDIO_FRAME_POSITION.AUDIO_FRAME_POSITION_EAR_MONITORING, + OBSERVER_MODE.RAW_DATA); + } + ``` + - 
InitEventHandler + - RegisterVideoFrameObserver + - SetPlaybackAudioFrameParameters + - SetRecordingAudioFrameParameters + - SetEarMonitoringAudioFrameParameters + - RegisterAudioFrameObserver + + + ```csharp + public override void SetupAgoraEngine() + { + var bufferLength = SAMPLE_RATE * CHANNEL; // 1-sec-length buffer + _audioBuffer = new RingBuffer(bufferLength, true); + var canvas = GameObject.Find("Canvas"); + var aud = canvas.AddComponent(); + SetupAudio(aud, "externalClip"); + base.SetupAgoraEngine(); + agoraEngine.InitEventHandler(new UserEventHandler(this)); + agoraEngine.SetPlaybackAudioFrameParameters(SAMPLE_RATE, CHANNEL, + RAW_AUDIO_FRAME_OP_MODE_TYPE.RAW_AUDIO_FRAME_OP_MODE_READ_ONLY, 1024); + agoraEngine.SetRecordingAudioFrameParameters(SAMPLE_RATE, CHANNEL, + RAW_AUDIO_FRAME_OP_MODE_TYPE.RAW_AUDIO_FRAME_OP_MODE_READ_ONLY, 1024); + agoraEngine.SetMixedAudioFrameParameters(SAMPLE_RATE, CHANNEL, 1024); + agoraEngine.SetEarMonitoringAudioFrameParameters(SAMPLE_RATE, CHANNEL, + RAW_AUDIO_FRAME_OP_MODE_TYPE.RAW_AUDIO_FRAME_OP_MODE_READ_ONLY, 1024); + agoraEngine.RegisterAudioFrameObserver(new RawAudioEventHandler(this), + AUDIO_FRAME_POSITION.AUDIO_FRAME_POSITION_PLAYBACK | + AUDIO_FRAME_POSITION.AUDIO_FRAME_POSITION_RECORD | + AUDIO_FRAME_POSITION.AUDIO_FRAME_POSITION_MIXED | + AUDIO_FRAME_POSITION.AUDIO_FRAME_POSITION_BEFORE_MIXING | + AUDIO_FRAME_POSITION.AUDIO_FRAME_POSITION_EAR_MONITORING, + OBSERVER_MODE.RAW_DATA); + } + ``` + - InitEventHandler + - SetPlaybackAudioFrameParameters + - SetRecordingAudioFrameParameters + - SetEarMonitoringAudioFrameParameters + - RegisterAudioFrameObserver + + \ No newline at end of file diff --git a/assets/code/video-sdk/raw-video-audio/import-library.mdx b/assets/code/video-sdk/raw-video-audio/import-library.mdx new file mode 100644 index 000000000..b2207f0b4 --- /dev/null +++ b/assets/code/video-sdk/raw-video-audio/import-library.mdx @@ -0,0 +1,31 @@ + + + ```kotlin + import io.agora.rtc2.Constants + import 
io.agora.rtc2.IAudioFrameObserver + import io.agora.rtc2.audio.AudioParams + import io.agora.base.VideoFrame + import io.agora.rtc2.video.IVideoFrameObserver + ``` + + + ```kotlin + import io.agora.rtc2.Constants + import io.agora.rtc2.IAudioFrameObserver + import io.agora.rtc2.audio.AudioParams + ``` + + + + + ```swift + import AgoraRtcKit + ``` + + + ```csharp + using Agora.Rtc; + using RingBuffer; + using UnityEngine.UI; + ``` + \ No newline at end of file diff --git a/assets/code/video-sdk/raw-video-audio/modify-audio-video.mdx b/assets/code/video-sdk/raw-video-audio/modify-audio-video.mdx new file mode 100644 index 000000000..803ecb5e1 --- /dev/null +++ b/assets/code/video-sdk/raw-video-audio/modify-audio-video.mdx @@ -0,0 +1,176 @@ + + In this example, you modify the captured video frame buffer to crop and scale the frame and play a zoomed-in version of the video. + + ```kotlin + private fun modifyVideoBuffer(videoFrame: VideoFrame) { + if (isZoomed) { + // Read the videoFrame buffer + var buffer = videoFrame.buffer + + val w = buffer.width + val h = buffer.height + val cropX = (w - 320) / 2 + val cropY = (h - 240) / 2 + val cropWidth = 320 + val cropHeight = 240 + val scaleWidth = 320 + val scaleHeight = 240 + + // modify the buffer + buffer = buffer.cropAndScale( + cropX, cropY, + cropWidth, cropHeight, + scaleWidth, scaleHeight + ) + + // replace the videoFrame buffer with the modified buffer + videoFrame.replaceBuffer(buffer, 270, videoFrame.timestampNs) + } + } + ``` + - VideoFrame + - AudioFrame + + + + To modify the video frame: + + ```swift + public class ModifyVideoFrameDelegate: NSObject, AgoraVideoFrameDelegate { + public func onCapture( + _ videoFrame: AgoraOutputVideoFrame, sourceType: AgoraVideoSourceType + ) -> Bool { + // Change the video frame immediately after recording it + true + } + + // Indicate the video frame mode of the observer + public func getVideoFrameProcessMode() -> AgoraVideoFrameProcessMode { + // The process mode of the
video frame: readOnly, readWrite + // Default is `.readOnly` function is required to change the output. + .readWrite + } + } + ``` + + + - onCapture(_:sourceType:) + - pixelBuffer + + + - onCapture(_:sourceType:) + - pixelBuffer + + + To modify the audio frame: + + ```swift + public class ModifyAudioFrameDelegate: NSObject, AgoraAudioFrameDelegate { + public func onRecordAudioFrame(_ frame: AgoraAudioFrame, channelId: String) -> Bool { + true + } + } + ``` + + + - onRecordAudioFrame(_:channelId:) + - buffer + + + - onRecordAudioFrame(_:channelId:) + - buffer + + + +1. Convert the raw audio data to a float array: + ```csharp + internal static float[] ConvertByteToFloat16(byte[] byteArray) + { + var floatArray = new float[byteArray.Length / 2]; + for (var i = 0; i < floatArray.Length; i++) + { + floatArray[i] = BitConverter.ToInt16(byteArray, i * 2) / 32768f; // -Int16.MinValue + } + + return floatArray; + } + ``` +1. Set up an audio source to play the recorded audio frame: + ```csharp + void SetupAudio(AudioSource aud, string clipName) + { + _audioClip = AudioClip.Create(clipName, + SAMPLE_RATE / PULL_FREQ_PER_SEC * CHANNEL, + CHANNEL, SAMPLE_RATE, true, + OnAudioRead); + aud.clip = _audioClip; + aud.loop = true; + if (isPlaying) + { + aud.Play(); + } + } + ``` +1. Feed the raw audio frame data to the audio source: + ```csharp + private void OnAudioRead(float[] data) + { + lock (_audioBuffer) + { + for (var i = 0; i < data.Length; i++) + { + if (_audioBuffer.Count > 0) + { + data[i] = _audioBuffer.Get(); + _readCount += 1; + } + } + Debug.Log(string.Format("{0},{1},{2},{3},{4},{5},{6},{7},{8}", data[0], data[1], data[2], data[3], data[4], data[5], data[6], data[7], data[8])); + } + + //Debug.LogFormat("buffer length remains: {0}", _writeCount - _readCount); + } + ``` + +4. 
Resize the captured video frame: + ```csharp + public void ResizeVideoFrame() + { + if (!_isTextureAttach) + { + var rd = LocalView.GetComponent(); + rd.texture = _texture; + _isTextureAttach = true; + } + else if (VideoBuffer != null && VideoBuffer.Length != 0 && !_needResize) + { + lock (VideoBuffer) + { + _texture.LoadRawTextureData(VideoBuffer); + _texture.Apply(); + } + } + else if (_needResize) + { + Debug.Log("Resized frame ==> (Width: " + _videoFrameWidth + " Height: " + _videoFrameHeight + ")"); + _texture.Reinitialize(_videoFrameWidth, _videoFrameHeight); + _texture.Apply(); + _needResize = false; + } + } + ``` + +1. Configure the video encoder according to the resized frame: + ```csharp + // Set video encoder configuration + public void SetVideoEncoderConfiguration() + { + VideoEncoderConfiguration config = new VideoEncoderConfiguration(); + config.dimensions = new VideoDimensions(_videoFrameWidth, _videoFrameHeight); + agoraEngine.SetVideoEncoderConfiguration(config); + } + ``` + - VideoEncoderConfiguration + - SetVideoEncoderConfiguration + + \ No newline at end of file diff --git a/assets/code/video-sdk/raw-video-audio/register-video-audio-frame-observers.mdx b/assets/code/video-sdk/raw-video-audio/register-video-audio-frame-observers.mdx new file mode 100644 index 000000000..492c273c9 --- /dev/null +++ b/assets/code/video-sdk/raw-video-audio/register-video-audio-frame-observers.mdx @@ -0,0 +1,106 @@ + + + To receive callbacks declared in `IVideoFrameObserver` and `IAudioFrameObserver`, you must register the video and audio frame observers with the Agora Engine before joining a channel. To specify the format of audio frames captured by each `IAudioFrameObserver` callback, use the `setRecordingAudioFrameParameters`, `setMixedAudioFrameParameters` and `setPlaybackAudioFrameParameters` methods. 
+ + ```kotlin + override fun joinChannel(channelName: String, token: String?): Int { + // Register the video frame observer + agoraEngine!!.registerVideoFrameObserver(iVideoFrameObserver) + // Register the audio frame observer + agoraEngine!!.registerAudioFrameObserver(iAudioFrameObserver) + + agoraEngine!!.setRecordingAudioFrameParameters( + sampleRate, numberOfChannels, + Constants.RAW_AUDIO_FRAME_OP_MODE_READ_WRITE, samplesPerCall + ) + agoraEngine!!.setPlaybackAudioFrameParameters( + sampleRate, numberOfChannels, + Constants.RAW_AUDIO_FRAME_OP_MODE_READ_WRITE, samplesPerCall + ) + agoraEngine!!.setMixedAudioFrameParameters( + sampleRate, + numberOfChannels, + samplesPerCall + ) + + return super.joinChannel(channelName, token) + } + ``` + - registerAudioFrameObserver + - registerVideoFrameObserver + - setRecordingAudioFrameParameters + - setMixedAudioFrameParameters + - setPlaybackAudioFrameParameters + + + + To receive callbacks declared in `IAudioFrameObserver`, you must register the audio frame observer with the Agora Engine before joining a channel. To specify the format of audio frames captured by each `IAudioFrameObserver` callback, use the `setRecordingAudioFrameParameters`, `setMixedAudioFrameParameters` and `setPlaybackAudioFrameParameters` methods. 
+ + ```kotlin + override fun joinChannel(channelName: String, token: String?): Int { + // Register the audio frame observer + agoraEngine!!.registerAudioFrameObserver(iAudioFrameObserver) + + agoraEngine!!.setRecordingAudioFrameParameters( + sampleRate, numberOfChannels, + Constants.RAW_AUDIO_FRAME_OP_MODE_READ_WRITE, samplesPerCall + ) + agoraEngine!!.setPlaybackAudioFrameParameters( + sampleRate, numberOfChannels, + Constants.RAW_AUDIO_FRAME_OP_MODE_READ_WRITE, samplesPerCall + ) + agoraEngine!!.setMixedAudioFrameParameters( + sampleRate, + numberOfChannels, + samplesPerCall + ) + + return super.joinChannel(channelName, token) + } + ``` + - registerAudioFrameObserver + - setRecordingAudioFrameParameters + - setMixedAudioFrameParameters + - setPlaybackAudioFrameParameters + + + + To receive the callbacks that you declared in `AgoraVideoFrameDelegate` and `AgoraAudioFrameDelegate`, you must register the video and audio frame observers with the Agora Engine before joining a channel. To specify the format of the audio frames captured by each `AgoraAudioFrameDelegate` callback, use the `setRecordingAudioFrameParametersWithSampleRate`, `setPlaybackAudioFrameParametersWithSampleRate` and `setMixedAudioFrameParametersWithSampleRate` methods. 
+ + To do these, add the following lines to the `init()` method of `AgoraManager`: + + ```swift + // Video Setup + self.videoFrameDelegate = ModifyVideoFrameDelegate(modifyController: self) + agoraEngine.setVideoFrameDelegate(videoFrameDelegate) + + // Audio Setup + self.audioFrameDelegate = ModifyAudioFrameDelegate(modifyController: self) + agoraEngine.setAudioFrameDelegate(audioFrameDelegate) + agoraEngine.setRecordingAudioFrameParametersWithSampleRate( + 44100, channel: 1, mode: .readWrite, samplesPerCall: 4410 + ) + agoraEngine.setMixedAudioFrameParametersWithSampleRate( + 44100, channel: 1, samplesPerCall: 4410 + ) + agoraEngine.setPlaybackAudioFrameParametersWithSampleRate( + 44100, channel: 1, mode: .readWrite, samplesPerCall: 4410 + ) + ``` + + + - setVideoFrameDelegate + - setAudioFrameDelegate + - setRecordingAudioFrameParametersWithSampleRate + - setMixedAudioFrameParametersWithSampleRate + - setPlaybackAudioFrameParametersWithSampleRate + + + - setVideoFrameDelegate + - setAudioFrameDelegate + - setRecordingAudioFrameParametersWithSampleRate + - setMixedAudioFrameParametersWithSampleRate + - setPlaybackAudioFrameParametersWithSampleRate + + + \ No newline at end of file diff --git a/assets/code/video-sdk/raw-video-audio/set-audio-frame-observer.mdx b/assets/code/video-sdk/raw-video-audio/set-audio-frame-observer.mdx new file mode 100644 index 000000000..7235b7c04 --- /dev/null +++ b/assets/code/video-sdk/raw-video-audio/set-audio-frame-observer.mdx @@ -0,0 +1,168 @@ + + ```kotlin + private val iAudioFrameObserver: IAudioFrameObserver = object : IAudioFrameObserver { + override fun onRecordAudioFrame( + channelId: String?, + type: Int, + samplesPerChannel: Int, + bytesPerSample: Int, + channels: Int, + samplesPerSec: Int, + buffer: ByteBuffer?, + renderTimeMs: Long, + avsync_type: Int + ): Boolean { + // Gets the captured audio frame. + // Add code here to process the recorded audio. 
+ return false + } + + override fun onPlaybackAudioFrame( + channelId: String?, + type: Int, + samplesPerChannel: Int, + bytesPerSample: Int, + channels: Int, + samplesPerSec: Int, + buffer: ByteBuffer?, + renderTimeMs: Long, + avsync_type: Int + ): Boolean { + // Gets the audio frame for playback. + // Add code here to process the playback audio. + // return true to indicate that Data has been processed + return false + } + + override fun onMixedAudioFrame( + channelId: String?, + type: Int, + samplesPerChannel: Int, + bytesPerSample: Int, + channels: Int, + samplesPerSec: Int, + buffer: ByteBuffer?, + renderTimeMs: Long, + avsync_type: Int + ): Boolean { + // Retrieves the mixed captured and playback audio frame. + return false + } + + override fun onEarMonitoringAudioFrame( + type: Int, + samplesPerChannel: Int, + bytesPerSample: Int, + channels: Int, + samplesPerSec: Int, + buffer: ByteBuffer?, + renderTimeMs: Long, + avsync_type: Int + ): Boolean { + return false + } + + override fun onPlaybackAudioFrameBeforeMixing( + channelId: String?, + userId: Int, + type: Int, + samplesPerChannel: Int, + bytesPerSample: Int, + channels: Int, + samplesPerSec: Int, + buffer: ByteBuffer?, + renderTimeMs: Long, + avsync_type: Int + ): Boolean { + // Retrieves the audio frame of a specified user before mixing. 
+ return false + } + + override fun getObservedAudioFramePosition(): Int { + return 0 + } + + override fun getRecordAudioParams(): AudioParams { + return AudioParams(sampleRate,numberOfChannels, 0 ,samplesPerCall) + } + + override fun getPlaybackAudioParams(): AudioParams { + return AudioParams(sampleRate,numberOfChannels, 0 ,samplesPerCall) + } + + override fun getMixedAudioParams(): AudioParams { + return AudioParams(sampleRate,numberOfChannels, 0 ,samplesPerCall) + } + + override fun getEarMonitoringAudioParams(): AudioParams { + return AudioParams(sampleRate,numberOfChannels, 0 ,samplesPerCall) + } + } + ``` + - IAudioFrameObserver + + + + ```swift + extension ModifyAudioFrameDelegate: AgoraAudioFrameDelegate { + public func onRecordAudioFrame(_ frame: AgoraAudioFrame, channelId: String) -> Bool { + // Change the audio frame immediately after recording it + true + } + public func onPlaybackAudioFrame(_ frame: AgoraAudioFrame, channelId: String) -> Bool { + // Change the audio frame just before playback + true + } + } + ``` + + + - AgoraAudioFrameDelegate + + + - AgoraAudioFrameDelegate + + + + ```csharp + // Internal class for handling audio events + internal class RawAudioEventHandler : IAudioFrameObserver + { + private RawAudioVideoManager _agoraAudioRawData; + + internal RawAudioEventHandler(RawAudioVideoManager agoraAudioRawData) + { + _agoraAudioRawData = agoraAudioRawData; + } + + public override bool OnRecordAudioFrame(string channelId, AudioFrame audioFrame) + { + var floatArray = RawAudioVideoManager.ConvertByteToFloat16(audioFrame.RawBuffer); + + lock (_agoraAudioRawData._audioBuffer) + { + _agoraAudioRawData._audioBuffer.Put(floatArray); + _agoraAudioRawData._writeCount += floatArray.Length; + _agoraAudioRawData._count++; + } + return true; + } + public override bool OnPlaybackAudioFrame(string channelId, AudioFrame audioFrame) + { + return true; + } + public override bool OnPlaybackAudioFrameBeforeMixing(string channel_id, uint uid, AudioFrame 
audio_frame) + { + return false; + } + + public override bool OnPlaybackAudioFrameBeforeMixing(string channel_id, + string uid, + AudioFrame audio_frame) + { + return false; + } + } + ``` + - IAudioFrameObserver + diff --git a/assets/code/video-sdk/raw-video-audio/set-variables.mdx b/assets/code/video-sdk/raw-video-audio/set-variables.mdx new file mode 100644 index 000000000..b6e917755 --- /dev/null +++ b/assets/code/video-sdk/raw-video-audio/set-variables.mdx @@ -0,0 +1,101 @@ + + + ```kotlin + private var isZoomed = false + // Set the format of the captured raw audio data. + private val sampleRate = 16000 + private val numberOfChannels = 1 + private val samplesPerCall = 1024 + ``` + + + ```kotlin + // Set the format of the captured raw audio data. + private val sampleRate = 16000 + private val numberOfChannels = 1 + private val samplesPerCall = 1024 + ``` + + + + ```swift + var videoFrameDelegate: ModifyVideoFrameDelegate? + var audioFrameDelegate: ModifyAudioFrameDelegate? + ``` + + + - AgoraVideoFrameDelegate + - AgoraAudioFrameDelegate + + + - AgoraVideoFrameDelegate + - AgoraAudioFrameDelegate + + + + + ```csharp + internal byte[] VideoBuffer = new byte[0]; + private bool _needResize = false; + public int _videoFrameWidth = 1080; + public int VideoFrameWidth + { + set + { + if (value != _videoFrameWidth) + { + _needResize = true; + } + } + + get + { + return _videoFrameWidth; + } + } + + public int _videoFrameHeight = 720; + public int VideoFrameHeight + { + set + { + if (value != _videoFrameHeight) + { + _needResize = true; + } + } + + get + { + return _videoFrameHeight; + } + } + private bool _isTextureAttach = false; + private Texture2D _texture; + public int CHANNEL = 2; + public int PULL_FREQ_PER_SEC = 100; + public int SAMPLE_RATE = 48000; + internal int _count; + internal int _writeCount; + internal int _readCount; + internal RingBuffer _audioBuffer; + internal AudioClip _audioClip; + internal bool isPlaying = false; + ``` + + + ```csharp + private 
bool _isTextureAttach = false; + private Texture2D _texture; + public int CHANNEL = 2; + public int PULL_FREQ_PER_SEC = 100; + public int SAMPLE_RATE = 48000; + internal int _count; + internal int _writeCount; + internal int _readCount; + internal RingBuffer _audioBuffer; + internal AudioClip _audioClip; + internal bool isPlaying = false; + ``` + + \ No newline at end of file diff --git a/assets/code/video-sdk/raw-video-audio/set-video-frame-observer.mdx b/assets/code/video-sdk/raw-video-audio/set-video-frame-observer.mdx new file mode 100644 index 000000000..c7b6bb945 --- /dev/null +++ b/assets/code/video-sdk/raw-video-audio/set-video-frame-observer.mdx @@ -0,0 +1,104 @@ + + ```kotlin + private val iVideoFrameObserver: IVideoFrameObserver = object : IVideoFrameObserver { + override fun onCaptureVideoFrame(sourceType: Int, videoFrame: VideoFrame): Boolean { + modifyVideoBuffer(videoFrame) + return true + } + + override fun onPreEncodeVideoFrame(sourceType: Int, videoFrame: VideoFrame?): Boolean { + return false + } + + override fun onMediaPlayerVideoFrame(videoFrame: VideoFrame, i: Int): Boolean { + return false + } + + override fun onRenderVideoFrame(s: String, i: Int, videoFrame: VideoFrame): Boolean { + return true + } + + override fun getVideoFrameProcessMode(): Int { + // The process mode of the video frame. 0 means read-only, and 1 means read-and-write. 
+ return 1 + } + + override fun getVideoFormatPreference(): Int { + return 1 + } + + override fun getRotationApplied(): Boolean { + return false + } + + override fun getMirrorApplied(): Boolean { + return false + } + + override fun getObservedFramePosition(): Int { + return 0 + } + } + ``` + - IVideoFrameObserver + + + ```swift + extension ModifyVideoFrameDelegate: AgoraVideoFrameDelegate { + public func onCapture( + _ videoFrame: AgoraOutputVideoFrame, sourceType: AgoraVideoSourceType + ) -> Bool { + // Change the video frame immediately after recording it + true + } + + // Indicate the video frame mode of the observer + public func getVideoFrameProcessMode() -> AgoraVideoFrameProcessMode { + // The process mode of the video frame: readOnly, readWrite + // Default is `.readOnly` function is required to change the output. + .readWrite + } + } + ``` + + + - AgoraVideoFrameDelegate + + + - AgoraVideoFrameDelegate + + + + ```csharp + // Internal class for handling media player events + internal class RawAudioVideoEventHandler : IVideoFrameObserver + { + private RawAudioVideoManager rawAudioVideoManager; + + internal RawAudioVideoEventHandler(RawAudioVideoManager refRawAudioVideoManager) + { + rawAudioVideoManager = refRawAudioVideoManager; + } + + public override bool OnCaptureVideoFrame(VIDEO_SOURCE_TYPE type, VideoFrame videoFrame) + { + rawAudioVideoManager.VideoFrameWidth = videoFrame.width; + rawAudioVideoManager.VideoFrameHeight = videoFrame.height; + lock (rawAudioVideoManager.VideoBuffer) + { + rawAudioVideoManager.VideoBuffer = videoFrame.yBuffer; + } + return true; + } + + public override bool OnRenderVideoFrame(string channelId, uint uid, VideoFrame videoFrame) + { + Debug.Log("OnRenderVideoFrameHandler-----------" + " uid:" + uid + " width:" + videoFrame.width + + " height:" + videoFrame.height); + return true; + } + } + ``` + - IVideoFrameObserver + + diff --git a/assets/code/video-sdk/raw-video-audio/swift/register-frame-observers.mdx 
b/assets/code/video-sdk/raw-video-audio/swift/register-frame-observers.mdx index c68e616ce..8abbc28f3 100644 --- a/assets/code/video-sdk/raw-video-audio/swift/register-frame-observers.mdx +++ b/assets/code/video-sdk/raw-video-audio/swift/register-frame-observers.mdx @@ -1,6 +1,6 @@ -``` swift +```swift agoraEngine.setAudioFrameDelegate(self) agoraEngine.setVideoFrameDelegate(self) @@ -22,7 +22,7 @@ agoraEngine.setMixedAudioFrameParametersWithSampleRate( -``` swift +```swift agoraEngine.setAudioFrameDelegate(self) // Set the format of the captured raw audio data. diff --git a/assets/code/video-sdk/raw-video-audio/swift/unregister-frame-observers.mdx b/assets/code/video-sdk/raw-video-audio/swift/unregister-frame-observers.mdx index dee28f9ea..7a276bc3c 100644 --- a/assets/code/video-sdk/raw-video-audio/swift/unregister-frame-observers.mdx +++ b/assets/code/video-sdk/raw-video-audio/swift/unregister-frame-observers.mdx @@ -1,13 +1,13 @@ -``` swift +```swift agoraEngine.setAudioFrameDelegate(nil) agoraEngine.setVideoFrameDelegate(nil) ``` -``` swift +```swift agoraEngine.setAudioFrameDelegate(nil) ``` diff --git a/assets/code/video-sdk/raw-video-audio/unregister-video-audio-frame-observers.mdx b/assets/code/video-sdk/raw-video-audio/unregister-video-audio-frame-observers.mdx new file mode 100644 index 000000000..4a4606363 --- /dev/null +++ b/assets/code/video-sdk/raw-video-audio/unregister-video-audio-frame-observers.mdx @@ -0,0 +1,57 @@ + + ```kotlin + override fun leaveChannel() { + agoraEngine!!.registerVideoFrameObserver(null) + agoraEngine!!.registerAudioFrameObserver(null) + + super.leaveChannel() + } + ``` + + + ```swift + agoraEngine.setAudioFrameDelegate(nil) + agoraEngine.setVideoFrameDelegate(nil) + ``` + + + - setVideoFrameDelegate + - setAudioFrameDelegate + + + - setVideoFrameDelegate + - setAudioFrameDelegate + + + + + ```csharp + public override void DestroyEngine() + { + if (agoraEngine == null) + { + return; + } + 
agoraEngine.UnRegisterVideoFrameObserver(); + agoraEngine.UnRegisterAudioFrameObserver(); + base.DestroyEngine(); + } + ``` + - UnRegisterVideoFrameObserver + - UnRegisterAudioFrameObserver + + + ```csharp + public override void DestroyEngine() + { + if (agoraEngine == null) + { + return; + } + agoraEngine.UnRegisterAudioFrameObserver(); + base.DestroyEngine(); + } + ``` + - UnRegisterAudioFrameObserver + + \ No newline at end of file diff --git a/assets/code/video-sdk/spatial-audio/import-library.mdx b/assets/code/video-sdk/spatial-audio/import-library.mdx new file mode 100644 index 000000000..393079017 --- /dev/null +++ b/assets/code/video-sdk/spatial-audio/import-library.mdx @@ -0,0 +1,35 @@ + + ```kotlin + import io.agora.spatialaudio.ILocalSpatialAudioEngine + import io.agora.spatialaudio.LocalSpatialAudioConfig + import io.agora.spatialaudio.RemoteVoicePositionInfo + ``` + + + ```swift + import AgoraRtcKit + ``` + + +```javascript +import AgoraManager from "../agora_manager/agora_manager.js"; +import AgoraRTC from "agora-rtc-sdk-ng"; +``` + + + ```typescript + import { useEffect, useRef, useState } from "react"; + import AgoraRTC, { IBufferSourceAudioTrack, UID } from "agora-rtc-sdk-ng"; + import { + SpatialAudioExtension, + SpatialAudioProcessor + } from "agora-extension-spatial-audio"; + import { + useConnectionState, + useRemoteUsers, + useRTCClient, + AgoraRTCProvider + } from "agora-rtc-react"; + import AuthenticationWorkflowManager from "../authentication-workflow/authenticationWorkflowManager"; + ``` + diff --git a/assets/code/video-sdk/spatial-audio/play-media.mdx b/assets/code/video-sdk/spatial-audio/play-media.mdx new file mode 100644 index 000000000..9d3995d3e --- /dev/null +++ b/assets/code/video-sdk/spatial-audio/play-media.mdx @@ -0,0 +1,49 @@ + +Add a method to allow playing media files with spatial audio effects: +```javascript +const playMediaWithSpatialAudio = async () => { + const processor = spatialAudioExtension.createProcessor(); + 
processors.set("media-player", processor); + + const track = await AgoraRTC.createBufferSourceAudioTrack({ + source: "./sample.mp3", + }); + + // Define the spatial position for the local audio player. + const mockLocalPlayerNewPosition = { + position: [0, 0, 0], + forward: [0, 0, 0], + }; + + // Update the spatial position for the local audio player. + processor.updatePlayerPositionInfo(mockLocalPlayerNewPosition); + + track.startProcessAudioBuffer({ loop: true }); + track.pipe(processor).pipe(track.processorDestination); + track.play(); + return track; +}; +``` + +You can call this method in the UI as follows: +```javascript +document.getElementById("playAudioFile").onclick = + async function localPlayerStart() { + if (isMediaPlaying) { + channelParameters.mediaPlayerTrack.stop(); + isMediaPlaying = false; + document.getElementById("playAudioFile").textContent = + "Play audio file"; + return; + } + + let track = await agoraManager.playMediaWithSpatialAudio(); + console.log(track); + + isMediaPlaying = true; + document.getElementById("playAudioFile").textContent = + "Stop playing audio"; + channelParameters.mediaPlayerTrack = track; + }; +``` + diff --git a/assets/code/video-sdk/spatial-audio/remove-spatial.mdx b/assets/code/video-sdk/spatial-audio/remove-spatial.mdx new file mode 100644 index 000000000..1eea20390 --- /dev/null +++ b/assets/code/video-sdk/spatial-audio/remove-spatial.mdx @@ -0,0 +1,37 @@ + + ```csharp + public override void Leave() + { + if(localSpatial != null) + { + localSpatial.ClearRemotePositions(); + } + base.Leave(); + } + ``` + + - ClearRemotePositions + + + - ClearRemotePositions + + + + ```typescript + const cleanupFunction = () => { + try { + const disablePromises = Array.from(processors.current.values()).map(async (processor) => { + if (processor) { + await processor.disable(); + } + }); + + Promise.all(disablePromises).catch((reason) => console.log(reason)); + processors.current.clear(); + AgoraRTC.registerExtensions([]); + } catch (error) {
+ console.error("Error in cleanup:", error); + } + }; + ``` + \ No newline at end of file diff --git a/assets/code/video-sdk/spatial-audio/set-variables.mdx b/assets/code/video-sdk/spatial-audio/set-variables.mdx new file mode 100644 index 000000000..ba4296069 --- /dev/null +++ b/assets/code/video-sdk/spatial-audio/set-variables.mdx @@ -0,0 +1,58 @@ + + ```kotlin + // Instance of the spatial audio engine + private var spatialAudioEngine: ILocalSpatialAudioEngine? = null + ``` + - ILocalSpatialAudioEngine + + + ```swift + var localSpatial: AgoraLocalSpatialAudioKit! + ``` + + - AgoraLocalSpatialAudioKit + + + - AgoraLocalSpatialAudioKit + + + + + ```csharp + private ILocalSpatialAudioEngine localSpatial; + ``` + + +```javascript +var distance = 0; // Used to define and change the spatial position +var isMediaPlaying = false; + +const processors = new Map(); +const spatialAudioExtension = new SpatialAudioExtension({ + assetsPath: "/node_modules/agora-extension-spatial-audio/external/", +}); + +const mockLocalUserNewPosition = { + // In a production app, the position can be generated by + // dragging the local user's avatar in a 3D scene.
+ position: [1, 1, 1], // Coordinates in the world coordinate system + forward: [1, 0, 0], // The unit vector of the front axis + right: [0, 1, 0], // The unit vector of the right axis + up: [0, 0, 1], // The unit vector of the vertical axis +}; +``` + + + ```typescript + const [isMediaPlaying, setMediaPlaying] = useState(false); + const [isRegistered, setRegistered] = useState(false); + const [audioFileTrack, setAudioFileTrack] = useState(null); + const remoteUsers = useRemoteUsers(); + const numberOfRemoteUsers = remoteUsers.length; + const remoteUser = remoteUsers[numberOfRemoteUsers - 1]; + const extension = useRef(null); + const processors = useRef>(new Map()); + const [distance, setDistance] = useState(0); + const mediaPlayerKey = "media-player"; + ``` + diff --git a/assets/code/video-sdk/spatial-audio/setup-local.mdx b/assets/code/video-sdk/spatial-audio/setup-local.mdx new file mode 100644 index 000000000..6ee629292 --- /dev/null +++ b/assets/code/video-sdk/spatial-audio/setup-local.mdx @@ -0,0 +1,98 @@ + + ```kotlin + // Define the position of the local user + val pos = floatArrayOf(0.0f, 0.0f, 0.0f) + val forward = floatArrayOf(1.0f, 0.0f, 0.0f) + val right = floatArrayOf(0.0f, 1.0f, 0.0f) + val up = floatArrayOf(0.0f, 0.0f, 1.0f) + // Set the position of the local user + spatialAudioEngine?.updateSelfPosition(pos, forward, right, up) + ``` + - ILocalSpatialAudioEngine.updateSelfPosition + + + ```swift + func updateLocalUser() { + // Self position at origin, x-right, y-up, facing -Z axis + let pos: [NSNumber] = [0, 0, 0] + let right: [NSNumber] = [1, 0, 0] + let up: [NSNumber] = [0, 1, 0] + let forward: [NSNumber] = [0, 0, -1] + + self.localSpatial.updateSelfPosition( + pos, + axisForward: forward, + axisRight: right, + axisUp: up + ) + } + ``` + + - updateSelfPosition(_:axisForward:axisRight:axisUp:) + + + - updateSelfPosition(_:axisForward:axisRight:axisUp:) + + + + + ```javascript + spatialAudioExtension.updateSelfPosition( + 
mockLocalUserNewPosition.position, + mockLocalUserNewPosition.forward, + mockLocalUserNewPosition.right, + mockLocalUserNewPosition.up + ); + ``` + - [updateSelfPosition](#updateselfposition) + + + 1. Process and play the audio file to test the local spatial audio features: + ```typescript + const AudioFileTrack: React.FC<{ track: IBufferSourceAudioTrack }> = ({ track }) => { + useEffect(() => { + track.startProcessAudioBuffer({ loop: true }); + track.play(); // to play the track for the local user + return () => { + track.stopProcessAudioBuffer(); + track.stop(); + }; + }, [track]); + return
<p>Audio file is playing. Use +/- to change the spatial audio position</p>
; + }; + const PlayMediaFile = () => { + const processor = processors.current.get(mediaPlayerKey); + if(!processor) + { + const processorRef = extension.current!.createProcessor(); + processors.current.set(mediaPlayerKey, processorRef); + AgoraRTC.createBufferSourceAudioTrack({ + source: "../src/assets/sample.wav", // Replace with the actual audio file path + }) + .then((track) => + { + track.pipe(processorRef).pipe(track.processorDestination); + setAudioFileTrack(track); + }) + .catch((error) => console.log(error)); + } + setMediaPlaying(!isMediaPlaying); + }; + ``` + - createBufferSourceAudioTrack + - BufferSourceAudioTrackInitConfig + - pipe + + 2. Update the spatial audio position of the audio file: + ```typescript + if (isMediaPlaying) { + // update the spatial position of the audio file. + const processorRef = processors.current.get(mediaPlayerKey); + processorRef?.updatePlayerPositionInfo({ + position: [distance, 0, 0], + forward: [1, 0, 0], + }); + } + ``` + +
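In both the vanilla JavaScript and React examples above, `distance` is stepped in fixed increments from +/- controls before being passed to `updatePlayerPositionInfo` or `updateRemotePosition`. The stepping logic can be sketched as a small pure helper; the `stepDistance` name and the clamp range of -50 to 50 meters are illustrative assumptions, not part of the SDK:

```typescript
// Illustrative helper: step the spatial-audio distance in fixed increments
// while keeping it inside a plausible range. The bounds are an assumption
// chosen to match the 50 m audio reception range used elsewhere in this
// guide, not an SDK limit.
function stepDistance(
  current: number,
  delta: number,
  min: number = -50,
  max: number = 50
): number {
  const next = current + delta;
  return Math.min(max, Math.max(min, next));
}

// Hypothetical wiring for the +/- handlers:
let distance = 0;
distance = stepDistance(distance, +5); // move the audio source further away
distance = stepDistance(distance, -5); // move it back toward the listener
```

Keeping the clamp in one place avoids pushing the source outside the audio reception range set with `setAudioRecvRange`.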
diff --git a/assets/code/video-sdk/spatial-audio/setup-remote.mdx b/assets/code/video-sdk/spatial-audio/setup-remote.mdx new file mode 100644 index 000000000..2f7580a84 --- /dev/null +++ b/assets/code/video-sdk/spatial-audio/setup-remote.mdx @@ -0,0 +1,147 @@ + + ```kotlin + fun updateRemoteSpatialAudioPosition(remoteUid: Int, front: Float, right: Float, top: Float) { + // Define a remote user's spatial position + val positionInfo = RemoteVoicePositionInfo() + // The three values represent the front, right, and top coordinates + positionInfo.position = floatArrayOf(front, right, top) + positionInfo.forward = floatArrayOf(0.0f, 0.0f, -1.0f) + + // Update the spatial position of the specified remote user + spatialAudioEngine?.updateRemotePosition(remoteUid, positionInfo) + sendMessage("Spatial position of remote user ${remoteUid} updated") + } + ``` + - RemoteVoicePositionInfo + - ILocalSpatialAudioEngine.updateRemotePosition + + + ```swift + func updateRemoteUser(_ uid: UInt, position: [NSNumber], forward: [NSNumber]) { + let positionInfo = AgoraRemoteVoicePositionInfo() + positionInfo.position = position + positionInfo.forward = forward + + self.localSpatial.updateRemotePosition( + uid, positionInfo: positionInfo + ) + } + ``` + + + - AgoraRemoteVoicePositionInfo + - updateRemotePosition(_:positionInfo:) + + + - AgoraRemoteVoicePositionInfo + - updateRemotePosition(_:positionInfo:) + + + + ```csharp + public void UpdateSpatialAudioPosition(float sourceDistance) + { + if (remoteUid < 1) + { + Debug.Log("No remote user in the channel"); + return; + } + // Set the coordinates in the world coordinate system. + // This parameter is an array of length 3 + // The three values represent the front, right, and top coordinates + float[] position = new float[] { sourceDistance, 4.0F, 0.0F }; + // Set the unit vector of the x-axis in the coordinate system. 
+ // This parameter is an array of length 3, + // The three values represent the front, right, and top coordinates + float[] forward = new float[] { 1.0F, 0.0F, 0.0F }; + // Update the spatial position of the specified remote user + RemoteVoicePositionInfo remotePosInfo = new RemoteVoicePositionInfo(position, forward); + int res = localSpatial.UpdateRemotePosition((uint)remoteUid, remotePosInfo); + if (res == 0) + { + Debug.Log("Remote user spatial position updated"); + } + else + { + Debug.Log("Updating position failed with error: " + res); + } + } + ``` + + - RemoteVoicePositionInfo + - UpdateRemotePosition + + + - RemoteVoicePositionInfo + - UpdateRemotePosition + + + + + Add a method `updatePosition` to update remote positions: + + ```javascript + function updatePosition(distance, channelParameters) { + if (isMediaPlaying) { + const processor = processors.get("media-player"); + processor.updatePlayerPositionInfo({ + position: [distance, 0, 0], + forward: [1, 0, 0], + }); + } else { + const processor = processors.get(channelParameters.remoteUid); + processor.updateRemotePosition({ + position: [distance, 0, 0], + forward: [1, 0, 0], + }); + } + } + ``` + - [updateRemotePosition](#updateremoteposition) + + You can call `updatePosition` in the UI as follows: + + ```javascript + document.getElementById("decreaseDistance").onclick = async function () { + distance -= 5; + document.getElementById("distanceLabel").textContent = distance; + agoraManager.updatePosition(distance, channelParameters); + }; + + document.getElementById("increaseDistance").onclick = async function () { + distance += 5; + document.getElementById("distanceLabel").textContent = distance; + agoraManager.updatePosition(distance, channelParameters); + }; + ``` + + + 1.
Create processors for each remote user: + ```typescript + if (remoteUser && !processors.current.has(remoteUser.uid)) { + console.log("Initializing spatial audio processor..."); + try { + const processor = extension.createProcessor(); + processors.current.set(remoteUser.uid, processor); + remoteUser.audioTrack?.pipe(processor).pipe(remoteUser.audioTrack.processorDestination); + await processor.enable(); + } catch (error) { + console.error("Error enabling spatial extension:", error); + } + } + ``` + - pipe + + 1. Update the spatial position of a remote user: + ```typescript + else if (remoteUser) { + // Update the spatial position of the remote user. + const processorRef = processors.current.get(remoteUser.uid); + processorRef?.updateRemotePosition({ + position: [distance, 0, 0], + forward: [1, 0, 0], + }); + } + ``` + + diff --git a/assets/code/video-sdk/spatial-audio/setup-spatial.mdx b/assets/code/video-sdk/spatial-audio/setup-spatial.mdx new file mode 100644 index 000000000..0ac3d70cb --- /dev/null +++ b/assets/code/video-sdk/spatial-audio/setup-spatial.mdx @@ -0,0 +1,151 @@ + + ```kotlin + private fun configureSpatialAudioEngine() { + // Enable spatial audio + agoraEngine!!.enableSpatialAudio(true) + + // Create and initialize the spatial audio engine + val localSpatialAudioConfig = LocalSpatialAudioConfig() + localSpatialAudioConfig.mRtcEngine = agoraEngine + spatialAudioEngine = ILocalSpatialAudioEngine.create() + spatialAudioEngine?.initialize(localSpatialAudioConfig) + + // Set the audio reception range of the local user in meters + spatialAudioEngine?.setAudioRecvRange(50F) + + // Set the length of unit distance in meters + spatialAudioEngine?.setDistanceUnit(1F) + } + ``` + - enableSpatialAudio + - LocalSpatialAudioConfig + - ILocalSpatialAudioEngine.create + - ILocalSpatialAudioEngine.initialize + - ILocalSpatialAudioEngine.setAudioRecvRange + - ILocalSpatialAudioEngine.setDistanceUnit + + + + ```swift + func configureSpatialAudioEngine() { + 
agoraEngine.setAudioProfile(.speechStandard, scenario: .gameStreaming) + + // The next line is only required if using bluetooth headphones from iOS/iPadOS + agoraEngine.setParameters(#"{"che.audio.force_bluetooth_a2dp":true}"#) + + agoraEngine.enableSpatialAudio(true) + let localSpatialAudioConfig = AgoraLocalSpatialAudioConfig() + localSpatialAudioConfig.rtcEngine = agoraEngine + localSpatial = AgoraLocalSpatialAudioKit.sharedLocalSpatialAudio(with: localSpatialAudioConfig) + + // By default Agora subscribes to the audio streams of all remote users. + // Unsubscribe all remote users; otherwise, the audio reception range you set + // is invalid. + localSpatial.muteLocalAudioStream(false) + localSpatial.muteAllRemoteAudioStreams(false) + + // Set the audio reception range, in meters, of the local user + localSpatial.setAudioRecvRange(50) + // Set the length, in meters, of unit distance + localSpatial.setDistanceUnit(1) + } + ``` + + + - setAudioProfile(_:scenario:) + - enableSpatialAudio(_:) + - AgoraLocalSpatialAudioConfig + - sharedLocalSpatialAudio(with:) + - muteLocalAudioStream(_:) + - muteAllRemoteAudioStreams(_:) + - setAudioRecvRange(_:) + - setDistanceUnit(_:) + + + - setAudioProfile(_:scenario:) + - enableSpatialAudio(_:) + - AgoraLocalSpatialAudioConfig + - sharedLocalSpatialAudio(with:) + - muteLocalAudioStream(_:) + - muteAllRemoteAudioStreams(_:) + - setAudioRecvRange(_:) + - setDistanceUnit(_:) + + + + + ```csharp + private void ConfigureSpatialAudioEngine() + { + agoraEngine.EnableSpatialAudio(true); + LocalSpatialAudioConfig localSpatialAudioConfig = new LocalSpatialAudioConfig(); + localSpatialAudioConfig.rtcEngine = agoraEngine; + localSpatial = agoraEngine.GetLocalSpatialAudioEngine(); + localSpatial.Initialize(); + // By default Agora subscribes to the audio streams of all remote users. + // Unsubscribe all remote users; otherwise, the audio reception range you set + // is invalid. 
+ localSpatial.MuteLocalAudioStream(true); + localSpatial.MuteAllRemoteAudioStreams(true); + + // Set the audio reception range, in meters, of the local user + localSpatial.SetAudioRecvRange(50); + + // Set the length, in meters, of unit distance + localSpatial.SetDistanceUnit(1); + + // Update self position + float[] pos = new float[] { 0.0F, 0.0F, 0.0F }; + float[] forward = new float[] { 1.0F, 0.0F, 0.0F }; + float[] right = new float[] { 0.0F, 1.0F, 0.0F }; + float[] up = new float[] { 0.0F, 0.0F, 1.0F }; + // Set the position of the local user + localSpatial.UpdateSelfPosition(pos, forward, right, up); + } + ``` + + - GetLocalSpatialAudioEngine + - LocalSpatialAudioConfig + - Initialize + - MuteLocalAudioStream + - MuteAllRemoteAudioStreams + - SetAudioRecvRange + - SetDistanceUnit + - UpdateSelfPosition + + + - GetLocalSpatialAudioEngine + - LocalSpatialAudioConfig + - Initialize + - MuteLocalAudioStream + - MuteAllRemoteAudioStreams + - SetAudioRecvRange + - SetDistanceUnit + - UpdateSelfPosition + + + +```javascript +// Enable spatial audio +AgoraRTC.registerExtensions([spatialAudioExtension]); +``` + + + 1. Create an instance of the spatial audio engine: + ```typescript + const extension = new SpatialAudioExtension({ + assetsPath: "./node_modules/agora-extension-spatial-audio/external/", + }); + ``` + 2. 
Register the extension with the engine: + ```typescript + const initializeSpatialProcessor = async () => { + if(!isRegistered) + { + console.log("Registering spatial audio extension..."); + AgoraRTC.registerExtensions([extension]); + setRegistered(true); + } + }; + ``` + diff --git a/assets/code/video-sdk/virtual-background/blur-background.mdx b/assets/code/video-sdk/virtual-background/blur-background.mdx new file mode 100644 index 000000000..b1f3ca434 --- /dev/null +++ b/assets/code/video-sdk/virtual-background/blur-background.mdx @@ -0,0 +1,75 @@ + + ```kotlin + fun setBlurBackground() { + val virtualBackgroundSource = VirtualBackgroundSource() + virtualBackgroundSource.backgroundSourceType = VirtualBackgroundSource.BACKGROUND_BLUR + virtualBackgroundSource.blurDegree = VirtualBackgroundSource.BLUR_DEGREE_MEDIUM + setBackground(virtualBackgroundSource) + } + + private fun setBackground(virtualBackgroundSource: VirtualBackgroundSource) { + // Set processing properties for background + val segmentationProperty = SegmentationProperty() + segmentationProperty.modelType = SegmentationProperty.SEG_MODEL_AI + // Use SEG_MODEL_GREEN if you have a green background + segmentationProperty.greenCapacity = + 0.5f // Accuracy for identifying green colors (range 0-1) + + // Enable or disable virtual background + agoraEngine!!.enableVirtualBackground( + true, + virtualBackgroundSource, segmentationProperty + ) + } + ``` + - VirtualBackgroundSource + - SegmentationProperty + - enableVirtualBackground + + + + ```swift + func blurBackground() { + let virtualBackgroundSource = AgoraVirtualBackgroundSource() + virtualBackgroundSource.backgroundSourceType = .blur + virtualBackgroundSource.blurDegree = .high + + let segData = AgoraSegmentationProperty() + segData.modelType = .agoraAi + + agoraEngine.enableVirtualBackground(true, backData: virtualBackgroundSource, segData: segData) + } + ``` + + + - AgoraVirtualBackgroundSource + - AgoraSegmentationProperty + - enableVirtualBackground + 
+ + - AgoraVirtualBackgroundSource + - AgoraSegmentationProperty + - enableVirtualBackground + + + + ```typescript + const blurBackground = () => { + processor.current?.setOptions({ type: "blur", blurDegree: 2 }); + }; + ``` + + +```js +// Blur the user's actual background +async function setBackgroundBlurring(channelParameters) { + if (channelParameters.localVideoTrack) { + let processor = await getProcessorInstance(channelParameters); + processor.setOptions({ type: "blur", blurDegree: 2 }); + await processor.enable(); + + isVirtualBackGroundEnabled = true; + } +} +``` + diff --git a/assets/code/video-sdk/virtual-background/color-background.mdx b/assets/code/video-sdk/virtual-background/color-background.mdx new file mode 100644 index 000000000..fd00c7ca0 --- /dev/null +++ b/assets/code/video-sdk/virtual-background/color-background.mdx @@ -0,0 +1,101 @@ + + ```kotlin + fun setSolidBackground() { + val virtualBackgroundSource = VirtualBackgroundSource() + virtualBackgroundSource.backgroundSourceType = VirtualBackgroundSource.BACKGROUND_COLOR + virtualBackgroundSource.color = 0x0000FF + setBackground(virtualBackgroundSource) + } + ``` + - VirtualBackgroundSource + + + ```swift + func colorBackground() { + let virtualBackgroundSource = AgoraVirtualBackgroundSource() + virtualBackgroundSource.backgroundSourceType = .color + virtualBackgroundSource.color = convertColorToHex(.red) + + let segData = AgoraSegmentationProperty() + segData.modelType = .agoraAi + + agoraEngine.enableVirtualBackground(true, backData: virtualBackgroundSource, segData: segData) + } + ``` + + For converting the color to a hex Integer: + + + ```swift + func convertColorToHex(_ color: NSColor) -> UInt { + var red: CGFloat = 0 + var green: CGFloat = 0 + var blue: CGFloat = 0 + var alpha: CGFloat = 0 + + color.getRed(&red, green: &green, blue: &blue, alpha: &alpha) + + let redInt = UInt(red * 255) + let greenInt = UInt(green * 255) + let blueInt = UInt(blue * 255) + + let hexValue = (redInt << 16) | 
(greenInt << 8) | blueInt + + return hexValue + } + ``` + + + ```swift + func convertColorToHex(_ color: UIColor) -> UInt { + var red: CGFloat = 0 + var green: CGFloat = 0 + var blue: CGFloat = 0 + var alpha: CGFloat = 0 + + color.getRed(&red, green: &green, blue: &blue, alpha: &alpha) + + let redInt = UInt(red * 255) + let greenInt = UInt(green * 255) + let blueInt = UInt(blue * 255) + + let hexValue = (redInt << 16) | (greenInt << 8) | blueInt + + return hexValue + } + ``` + + + + - AgoraVirtualBackgroundSource + - AgoraSegmentationProperty + - enableVirtualBackground + + + - AgoraVirtualBackgroundSource + - AgoraSegmentationProperty + - enableVirtualBackground + + + + + ```typescript + const colorBackground = () => { + processor.current?.setOptions({ type: "color", color: "#00ff00" }); + }; + ``` + + +```js +// Set a solid color as the background +async function setBackgroundColor(channelParameters) { + if (channelParameters.localVideoTrack) { + let processor = await getProcessorInstance(channelParameters); + processor.setOptions({ type: "color", color: "#00ff00" }); + await processor.enable(); + + isVirtualBackGroundEnabled = true; + } +} +``` + diff --git a/assets/code/video-sdk/virtual-background/configure-engine.mdx b/assets/code/video-sdk/virtual-background/configure-engine.mdx new file mode 100644 index 000000000..c0ec078e5 --- /dev/null +++ b/assets/code/video-sdk/virtual-background/configure-engine.mdx @@ -0,0 +1,26 @@ + +```typescript +function VirtualBackground() { + const agoraEngine = useRTCClient(AgoraRTC.createClient({ codec: "vp8", mode: config.selectedProduct })); + + return ( +
+    <div>
+      <h1>Virtual Background</h1>
+      <AgoraRTCProvider client={agoraEngine}>
+        <AuthenticationWorkflowManager>
+          <VirtualBackgroundComponent />
+        </AuthenticationWorkflowManager>
+      </AgoraRTCProvider>
+    </div>
+  );
+}
+```
+  - useRTCClient
+  - AgoraRTCProvider
+
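Note that the native snippets in this patch (Kotlin, Swift, C#) express the background color as a `0xRRGGBB` integer such as `0x0000FF`, while the web extension's `setOptions` takes a CSS string such as `"#00ff00"`. A minimal sketch of converting between the two forms — `cssHexToInt` is a hypothetical helper, not part of any Agora SDK:

```javascript
// Hypothetical helper, not part of the Agora SDK: convert a CSS hex color
// string such as "#00ff00" into the 0xRRGGBB integer the native SDKs expect.
function cssHexToInt(hex) {
  return parseInt(hex.replace(/^#/, ""), 16);
}

console.log(cssHexToInt("#00ff00") === 0x00ff00); // true
console.log(cssHexToInt("#0000FF") === 0x0000ff); // true
```

The reverse direction is a `toString(16)` call plus zero-padding, should you need to display a native color value as a CSS string.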
+
+```js
+const agoraManager = await AgoraManager(eventsCallback);
+```
+
diff --git a/assets/code/video-sdk/virtual-background/device-compatibility.mdx b/assets/code/video-sdk/virtual-background/device-compatibility.mdx
new file mode 100644
index 000000000..672c068dd
--- /dev/null
+++ b/assets/code/video-sdk/virtual-background/device-compatibility.mdx
@@ -0,0 +1,48 @@
+
+    ```kotlin
+    fun isFeatureAvailable(): Boolean {
+        return agoraEngine!!.isFeatureAvailableOnDevice(
+            Constants.FEATURE_VIDEO_VIRTUAL_BACKGROUND
+        )
+    }
+    ```
+    - isFeatureAvailableOnDevice
+
+
+    ```swift
+    guard agoraEngine.isFeatureAvailable(onDevice: .videoPreprocessVirtualBackground) else {
+        // Device doesn't support virtual background
+        return
+    }
+    ```
+
+    - isFeatureAvailable
+
+    - isFeatureAvailable
+
+
+    ```typescript
+    const checkCompatibility = () => {
+      if (!extension.current.checkCompatibility()) {
+        console.error("Does not support virtual background!");
+        return;
+      }
+    };
+    ```
+
+Add the following code after `const extension = new VirtualBackgroundExtension();`:
+```js
+// Check browser compatibility for the virtual background extension
+if (!extension.checkCompatibility()) {
+  console.error("Does not support Virtual Background!");
+  // Handle exit code
+}
+```
+
diff --git a/assets/code/video-sdk/virtual-background/image-background.mdx b/assets/code/video-sdk/virtual-background/image-background.mdx
new file mode 100644
index 000000000..90105d8d2
--- /dev/null
+++ b/assets/code/video-sdk/virtual-background/image-background.mdx
@@ -0,0 +1,66 @@
+
+    ```kotlin
+    fun setImageBackground() {
+        val virtualBackgroundSource = VirtualBackgroundSource()
+        virtualBackgroundSource.backgroundSourceType = VirtualBackgroundSource.BACKGROUND_IMG
+        virtualBackgroundSource.source = ""
+        setBackground(virtualBackgroundSource)
+    }
+    ```
+    - VirtualBackgroundSource
+
+
+    For this example, you 
should include an image `"background_ss.jpg"` in your app's bundle.
+
+    ```swift
+    func imageBackground() {
+        let virtualBackgroundSource = AgoraVirtualBackgroundSource()
+        virtualBackgroundSource.backgroundSourceType = .img
+        virtualBackgroundSource.source = Bundle.main.path(forResource: "background_ss", ofType: "jpg")
+
+        let segData = AgoraSegmentationProperty()
+        segData.modelType = .agoraAi
+
+        agoraEngine.enableVirtualBackground(true, backData: virtualBackgroundSource, segData: segData)
+    }
+    ```
+
+    - AgoraVirtualBackgroundSource
+    - AgoraSegmentationProperty
+    - enableVirtualBackground
+
+
+    - AgoraVirtualBackgroundSource
+    - AgoraSegmentationProperty
+    - enableVirtualBackground
+
+
+    ```typescript
+    const imageBackground = () => {
+      const image = new Image();
+      image.onload = () => {
+        processor.current?.setOptions({ type: "img", source: image });
+      };
+      image.src = demoImage;
+    };
+    ```
+
+
+```js
+// Set an image as the background
+async function setBackgroundImage(channelParameters) {
+  const imgElement = document.createElement("img");
+
+  imgElement.onload = async () => {
+    let processor = await getProcessorInstance(channelParameters);
+    processor.setOptions({ type: "img", source: imgElement });
+    await processor.enable();
+
+    isVirtualBackGroundEnabled = true;
+  };
+  imgElement.src = "./background.jpg";
+}
+```
+
diff --git a/assets/code/video-sdk/virtual-background/import-library.mdx b/assets/code/video-sdk/virtual-background/import-library.mdx
new file mode 100644
index 000000000..4b4cb3536
--- /dev/null
+++ b/assets/code/video-sdk/virtual-background/import-library.mdx
@@ -0,0 +1,61 @@
+
+    ```kotlin
+    import io.agora.rtc2.Constants
+    import io.agora.rtc2.video.SegmentationProperty
+    import io.agora.rtc2.video.VirtualBackgroundSource
+    ```
+
+
+    ```swift
+    import AgoraRtcKit
+    ```
+
+    ---
+
+    You must also add the virtual background plugin to your app target.
+
+    i. 
**Swift Package Manager** + + Add the product "VirtualBackground" to your app target. This is part of the AgoraRtcEngine Swift Package. + + ii. **CocoaPods** + + Include "VirtualBackground" in the subspecs in your Podfile: + + + ```rb + target 'Your App' do + pod 'AgoraRtcEngine_iOS', '~> 4.2', :subspecs => ['RtcBasic', 'VirtualBackground'] + end + ``` + + + ```rb + target 'Your App' do + pod 'AgoraRtcEngine_macOS', '~> 4.2', :subspecs => ['RtcBasic', 'VirtualBackground'] + end + ``` + + + --- + + +```typescript +import AgoraRTC from "agora-rtc-sdk-ng"; +import AuthenticationWorkflowManager from "../authentication-workflow/authenticationWorkflowManager"; +import VirtualBackgroundExtension, { IVirtualBackgroundProcessor } from "agora-extension-virtual-background"; +import { useConnectionState } from 'agora-rtc-react'; +import { useAgoraContext } from "../agora-manager/agoraManager"; +import wasm from "agora-extension-virtual-background/wasms/agora-wasm.wasm?url"; +import demoImage from '../assets/image.webp'; +``` + + +```js +import AgoraManager from "../agora_manager/agora_manager.js"; +import AgoraRTC from "agora-rtc-sdk-ng"; +import VirtualBackgroundExtension from "agora-extension-virtual-background"; +``` + diff --git a/assets/code/video-sdk/virtual-background/reset-background.mdx b/assets/code/video-sdk/virtual-background/reset-background.mdx new file mode 100644 index 000000000..f8b0fb5d8 --- /dev/null +++ b/assets/code/video-sdk/virtual-background/reset-background.mdx @@ -0,0 +1,60 @@ + + ```kotlin + fun removeBackground() { + // Disable virtual background + agoraEngine!!.enableVirtualBackground( + false, + VirtualBackgroundSource(), SegmentationProperty() + ) + } + ``` + + + ```swift + agoraEngine.enableVirtualBackground(false, backData: nil, segData: nil) + ``` + + + - enableVirtualBackground + + + - enableVirtualBackground + + + + +```typescript +function VirtualBackgroundComponent() { + const [isVirtualBackground, setVirtualBackground] = useState(false); 
+  const connectionState = useConnectionState();
+
+  return (
+    <div>
+      {isVirtualBackground ? (
+        <div>
+          <button onClick={blurBackground}>Blur background</button>
+          <button onClick={colorBackground}>Color background</button>
+          <button onClick={imageBackground}>Image background</button>
+          <button onClick={() => setVirtualBackground(false)}>Remove virtual background</button>
+        </div>
+      ) : (
+        <button onClick={() => setVirtualBackground(true)} disabled={connectionState !== "CONNECTED"}>
+          Set virtual background
+        </button>
+      )}
+    </div>
+  );
+}
+```
+  - useConnectionState
+
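The web implementation relies on the SDK's processor pipeline: a processor is piped between the local video track and its `processorDestination`, and enabling or disabling the processor turns the virtual background on or off. The sketch below illustrates that pipe/enable/disable pattern with hypothetical `MockProcessor` and `MockDestination` stand-ins; it is illustrative only and does not use the real Agora classes:

```javascript
// Hypothetical stand-ins for the SDK's track/processor pipeline; the real
// classes come from agora-rtc-sdk-ng and agora-extension-virtual-background.
class MockProcessor {
  constructor() {
    this.enabled = false;
    this.destination = null;
  }
  pipe(dest) {
    // Connect the next pipeline stage and return it, so calls can be chained.
    this.destination = dest;
    return dest;
  }
  unpipe() {
    this.destination = null;
  }
  async enable() { this.enabled = true; }
  async disable() { this.enabled = false; }
  process(frame) {
    // Transform the frame only while enabled, then forward it downstream.
    const out = this.enabled ? `${frame}+virtual-background` : frame;
    return this.destination ? this.destination.process(out) : out;
  }
}

class MockDestination {
  process(frame) { return frame; } // Terminal stage: render the frame as-is.
}

async function demo() {
  const processor = new MockProcessor();
  processor.pipe(new MockDestination());

  await processor.enable();                  // Virtual background on
  const processed = processor.process("frame1");

  await processor.disable();                 // Mirrors disableBackground()
  const raw = processor.process("frame2");
  return { processed, raw };
}
```

In the real SDK the same shape appears as `localVideoTrack.pipe(processor).pipe(localVideoTrack.processorDestination)` followed by `processor.enable()` or `processor.disable()`.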
+
+```js
+// Disable background
+async function disableBackground(channelParameters) {
+  let processor = await getProcessorInstance(channelParameters);
+  processor.disable();
+
+  isVirtualBackGroundEnabled = false;
+}
+```
+
diff --git a/assets/code/video-sdk/virtual-background/set-virtual-background.mdx b/assets/code/video-sdk/virtual-background/set-virtual-background.mdx
new file mode 100644
index 000000000..fd3fc833c
--- /dev/null
+++ b/assets/code/video-sdk/virtual-background/set-virtual-background.mdx
@@ -0,0 +1,115 @@
+
+    ```csharp
+    public void setVirtualBackground(bool enableVirtualBackground, string option)
+    {
+        if (agoraEngine == null)
+        {
+            Debug.Log("Please join a channel to enable virtual background");
+            return;
+        }
+        VirtualBackgroundSource virtualBackgroundSource = new VirtualBackgroundSource();
+
+        // Set the type of virtual background
+        if (option == "Blur")
+        {   // Set background blur
+            virtualBackgroundSource.background_source_type = BACKGROUND_SOURCE_TYPE.BACKGROUND_BLUR;
+            virtualBackgroundSource.blur_degree = BACKGROUND_BLUR_DEGREE.BLUR_DEGREE_HIGH;
+            Debug.Log("Blur background enabled");
+        }
+        else if (option == "Color")
+        {   // Set a solid background color
+            virtualBackgroundSource.background_source_type = BACKGROUND_SOURCE_TYPE.BACKGROUND_COLOR;
+            virtualBackgroundSource.color = 0x0000FF;
+            Debug.Log("Color background enabled");
+        }
+        else if (option == "Image")
+        {   // Set a background image
+            virtualBackgroundSource.background_source_type = BACKGROUND_SOURCE_TYPE.BACKGROUND_IMG;
+            virtualBackgroundSource.source = "Assets/agora.png";
+            Debug.Log("Image background enabled");
+        }
+
+        // Set processing properties for the background
+        SegmentationProperty segmentationProperty = new SegmentationProperty();
+        segmentationProperty.modelType = SEG_MODEL_TYPE.SEG_MODEL_AI; // Use SEG_MODEL_GREEN if you have a green background
+        segmentationProperty.greenCapacity = 0.5F; // Accuracy for identifying green colors (range 0-1)
+
+        // Enable or disable the virtual background
+        agoraEngine.EnableVirtualBackground(enableVirtualBackground, virtualBackgroundSource, segmentationProperty);
+    }
+    ```
+    - VirtualBackgroundSource
+    - EnableVirtualBackground
+    - SegmentationProperty
+
+
+```typescript
+useEffect(() => {
+    const initializeVirtualBackgroundProcessor = async () => {
+      AgoraRTC.registerExtensions([extension.current]);
+
+      checkCompatibility();
+
+      if (agoraContext.localCameraTrack) {
+        console.log("Initializing virtual background processor...");
+        try {
+          processor.current = extension.current.createProcessor();
+          await processor.current.init(wasm);
+          agoraContext.localCameraTrack.pipe(processor.current).pipe(agoraContext.localCameraTrack.processorDestination);
+          processor.current.setOptions({ type: "color", color: "#00ff00" });
+          await processor.current.enable();
+          setSelectedOption('color');
+        } catch (error) {
+          console.error("Error initializing virtual background:", error);
+        }
+      }
+    };
+
+    void initializeVirtualBackgroundProcessor();
+
+    return () => {
+      const disableVirtualBackground = async () => {
+        processor.current?.unpipe();
+        agoraContext.localCameraTrack?.unpipe();
+        await processor.current?.disable();
+      };
+      void disableVirtualBackground();
+    };
+  }, [agoraContext.localCameraTrack]);
+```
+  - pipe
+  - unpipe
+
+
+```js
+// Create a VirtualBackgroundExtension instance
+const extension = new VirtualBackgroundExtension();
+
+// Register the extension
+AgoraRTC.registerExtensions([extension]);
+let processor = null;
+
+// Initialization
+async function getProcessorInstance(channelParameters) {
+  if (!processor && channelParameters.localVideoTrack) {
+    // Create a VirtualBackgroundProcessor instance
+    processor = extension.createProcessor();
+
+    try {
+      // Initialize the extension and pass in the URL of the Wasm file
+      await processor.init("./assets/wasms");
+    } catch (e) {
+      console.log("Failed to load the WASM resource!");
+      return null;
+    }
+    // Inject the extension into the video processing pipeline in the 
SDK + channelParameters.localVideoTrack + .pipe(processor) + .pipe(channelParameters.localVideoTrack.processorDestination); + } + return processor; + } +``` + diff --git a/assets/code/voice-sdk/get-started-sdk/swift/create-ui.mdx b/assets/code/voice-sdk/get-started-sdk/swift/create-ui.mdx index ed008c1e4..59158e8ac 100644 --- a/assets/code/voice-sdk/get-started-sdk/swift/create-ui.mdx +++ b/assets/code/voice-sdk/get-started-sdk/swift/create-ui.mdx @@ -1,5 +1,5 @@ -``` swift +```swift import Cocoa import AppKit import Foundation @@ -47,7 +47,7 @@ class ViewController: NSViewController { - ``` swift + ```swift import UIKit import AVFoundation diff --git a/assets/code/voice-sdk/get-started-sdk/swift/show-message.mdx b/assets/code/voice-sdk/get-started-sdk/swift/show-message.mdx index fc02562b3..6aba83486 100644 --- a/assets/code/voice-sdk/get-started-sdk/swift/show-message.mdx +++ b/assets/code/voice-sdk/get-started-sdk/swift/show-message.mdx @@ -1,5 +1,5 @@ -``` swift +```swift func showMessage(title: String, text: String, delay: Int = 2) -> Void { let deadlineTime = DispatchTime.now() + .seconds(delay) DispatchQueue.main.asyncAfter(deadline: deadlineTime, execute: { @@ -13,7 +13,7 @@ func showMessage(title: String, text: String, delay: Int = 2) -> Void { ``` -``` swift +```swift func showMessage(title: String, text: String, delay: Int = 2) -> Void { let deadlineTime = DispatchTime.now() + .seconds(delay) DispatchQueue.main.asyncAfter(deadline: deadlineTime, execute: { diff --git a/assets/code/voice-sdk/get-started-sdk/swift/view-did-disappear.mdx b/assets/code/voice-sdk/get-started-sdk/swift/view-did-disappear.mdx index f1055a74e..9ec4b7c43 100644 --- a/assets/code/voice-sdk/get-started-sdk/swift/view-did-disappear.mdx +++ b/assets/code/voice-sdk/get-started-sdk/swift/view-did-disappear.mdx @@ -1,5 +1,5 @@ -``` swift +```swift override func viewDidDisappear() { super.viewDidDisappear() @@ -9,7 +9,7 @@ ``` -``` swift +```swift override func viewDidDisappear(_ 
animated: Bool) { super.viewDidDisappear(animated) leaveChannel() diff --git a/assets/images/chat/chat-call-logic-android.svg b/assets/images/chat/chat-call-logic-android.svg index 39eff30bf..8da563373 100644 --- a/assets/images/chat/chat-call-logic-android.svg +++ b/assets/images/chat/chat-call-logic-android.svg @@ -1 +1 @@ -Your appAgoraUserUserChat SDKChat SDKChatServerChatServerInitializeOpen appGet a ChatClient instanceagoraChatClient = ChatClient.getInstance()Initialize the instance:agoraChatClient.init(context, options)Add message event callbacks:agoraChatClient.chatManager().addMessageListener(...)Add connection event callbacks:agoraChatClient.addConnectionListener(...)Log inJoin a chatRetrieve authentication token for the userLog in to the chat server:agoraChatClient.loginWithAgoraToken(...)onConnected() callbackSend messagesSend a messageCreate a ChatMessageSend the ChatMessage:agoraChatClient.chatManager().sendMessage(message)Receive messagesonMessageReceived(messages) callbackDisplay messageCloseLeave the chatLog out:agoraChatClient.logout(...) \ No newline at end of file +Your appAgoraUserUserChat SDKChat SDKChatServerChatServerInitializeOpen appGet a ChatClient instanceagoraChatClient = ChatClient.getInstance()Initialize the instance:agoraChatClient.init(context, options)Add message event callbacks:agoraChatClient.chatManager().addMessageListener(...)Add connection event callbacks:agoraChatClient.addConnectionListener(...)Log inJoin a chatRetrieve authentication token for the userLog in to the chat server:agoraChatClient.loginWithAgoraToken(...)onConnected() callbackSend messagesSend a messageCreate a ChatMessageSend the ChatMessage:agoraChatClient.chatManager().sendMessage(message)Receive messagesonMessageReceived(messages) callbackDisplay messageCloseLeave the chatLog out:agoraChatClient.logout(...) 
\ No newline at end of file diff --git a/assets/images/chat/chat-call-logic-flutter.svg b/assets/images/chat/chat-call-logic-flutter.svg index 8753c4cf9..1d3d56b62 100644 --- a/assets/images/chat/chat-call-logic-flutter.svg +++ b/assets/images/chat/chat-call-logic-flutter.svg @@ -1 +1 @@ -Your appAgoraUserUserChat SDKChat SDKChatServerChatServerInitializeOpen appGet a ChatClient instanceagoraChatClient = ChatClient.getInstanceInitialize the instance:agoraChatClient.init(options)Add message event callbacks:agoraChatClient.chatManager.addEventHandler(...)Add connection event callbacks:agoraChatClient.addConnectionEventHandler(...)Log inJoin a chatRetrieve authentication token for the userLog in to the chat server:agoraChatClient.loginWithAgoraToken(userId, token)onConnected() callbackSend messagesSend a messageCreate a ChatMessageSend the ChatMessage:agoraChatClient.chatManager.sendMessage(message)Receive messagesonMessagesReceived(List<ChatMessage> messages) callbackDisplay messageCloseLeave the chatLog out:agoraChatClient.logout(...) \ No newline at end of file +Your appAgoraUserUserChat SDKChat SDKChatServerChatServerInitializeOpen appGet a ChatClient instanceagoraChatClient = ChatClient.getInstanceInitialize the instance:agoraChatClient.init(options)Add message event callbacks:agoraChatClient.chatManager.addEventHandler(...)Add connection event callbacks:agoraChatClient.addConnectionEventHandler(...)Log inJoin a chatRetrieve authentication token for the userLog in to the chat server:agoraChatClient.loginWithAgoraToken(userId, token)onConnected() callbackSend messagesSend a messageCreate a ChatMessageSend the ChatMessage:agoraChatClient.chatManager.sendMessage(message)Receive messagesonMessagesReceived(List<ChatMessage> messages) callbackDisplay messageCloseLeave the chatLog out:agoraChatClient.logout(...) 
\ No newline at end of file diff --git a/assets/images/chat/chat-call-logic-unity.svg b/assets/images/chat/chat-call-logic-unity.svg index 2cb29b852..4e7cb397f 100644 --- a/assets/images/chat/chat-call-logic-unity.svg +++ b/assets/images/chat/chat-call-logic-unity.svg @@ -1 +1 @@ -Your appAgoraUserUserChat SDKChat SDKChatServerChatServerInitializeOpen appInit a chat SDK instance:SDKClient.Instance.InitWithOptions(options);Add message event callbacks:SDKClient.Instance.ChatManager.AddChatManagerDelegate(this);Log inJoin a chatRetrieve authentication token for the userLog in to the chat server:SDKClient.Instance.LoginWithAgoraToken(...)onConnected() callbackSend messagesSend a messageCreate a ChatMessageSend the ChatMessage: SDKClient.Instance.ChatManager.SendMessage(...)Receive messagesonMessageReceived(messages) callbackDisplay messageCloseLeave the chatLog out:agoraChatClient.Logout(...) \ No newline at end of file +Your appAgoraUserUserChat SDKChat SDKChatServerChatServerInitializeOpen appInit a chat SDK instance:SDKClient.Instance.InitWithOptions(options);Add message event callbacks:SDKClient.Instance.ChatManager.AddChatManagerDelegate(this);Log inJoin a chatRetrieve authentication token for the userLog in to the chat server:SDKClient.Instance.LoginWithAgoraToken(...)onConnected() callbackSend messagesSend a messageCreate a ChatMessageSend the ChatMessage: SDKClient.Instance.ChatManager.SendMessage(...)Receive messagesonMessageReceived(messages) callbackDisplay messageCloseLeave the chatLog out:agoraChatClient.Logout(...) 
\ No newline at end of file diff --git a/assets/images/chat/chat-call-logic-windows.svg b/assets/images/chat/chat-call-logic-windows.svg index 57b2508b8..9e6776848 100644 --- a/assets/images/chat/chat-call-logic-windows.svg +++ b/assets/images/chat/chat-call-logic-windows.svg @@ -1 +1 @@ -Your appAgoraUserUserChat SDKChat SDKAgoraChatAgoraChatInitializeOpen appGet a AgoraChat SDKClient instanceSDKClient.InstanceInitialize the instance:SDKClient.Instance.InitWithOptions(options)Add message event callbacks:SDKClient.Instance.ChatManager.AddChatManagerDelegate(...)Add connection event callbacks:SDKClient.Instance.AddConnectionDelegate(...)Log inJoin a chatRetrieve authentication token for the userLog in to Agora Chat:SDKClient.Instance.LoginWithAgoraToken(...)onConnected() callbackSend messagesSend a messageCreate a ChatMessageSend the ChatMessage:SDKClient.Instance.ChatManager.SendMessage(...)Receive messagesonMessageReceived(messages) callbackDisplay messageCloseLeave the chatLog out:SDKClient.Instance.Logout(...) \ No newline at end of file +Your appAgoraUserUserChat SDKChat SDKAgoraChatAgoraChatInitializeOpen appGet a AgoraChat SDKClient instanceSDKClient.InstanceInitialize the instance:SDKClient.Instance.InitWithOptions(options)Add message event callbacks:SDKClient.Instance.ChatManager.AddChatManagerDelegate(...)Add connection event callbacks:SDKClient.Instance.AddConnectionDelegate(...)Log inJoin a chatRetrieve authentication token for the userLog in to Agora Chat:SDKClient.Instance.LoginWithAgoraToken(...)onConnected() callbackSend messagesSend a messageCreate a ChatMessageSend the ChatMessage:SDKClient.Instance.ChatManager.SendMessage(...)Receive messagesonMessageReceived(messages) callbackDisplay messageCloseLeave the chatLog out:SDKClient.Instance.Logout(...) 
\ No newline at end of file diff --git a/assets/images/extensions-marketplace/geofencing.svg b/assets/images/extensions-marketplace/geofencing.svg new file mode 100644 index 000000000..6aa7d43e0 --- /dev/null +++ b/assets/images/extensions-marketplace/geofencing.svg @@ -0,0 +1 @@ +Audio and video dataCapturePre-processEncodeTransmitDecodePost-processPlayAI Noise suppressionAudio data \ No newline at end of file diff --git a/assets/images/extensions-marketplace/ncs-worflow.svg b/assets/images/extensions-marketplace/ncs-worflow.svg new file mode 100644 index 000000000..949ba5855 --- /dev/null +++ b/assets/images/extensions-marketplace/ncs-worflow.svg @@ -0,0 +1 @@ +Implemented by youProvided by AgoraProvided by ActiveFenceUserUserWeb serverWeb serverAppAppSD-RTNSD-RTNActiveFenceActiveFenceLoginLogin authenticationJoin channelJoin channel withActiveFence activatedMonitor contentin channelContent matchesworkflow riskAction webhookAct on webhook. For example, useChannel Management REST APIto remove user from channelLog user outLogout \ No newline at end of file diff --git a/assets/images/interactive-live-streaming/ils-call-logic-android.svg b/assets/images/interactive-live-streaming/ils-call-logic-android.svg index 5a79b96f7..92170cfb3 100644 --- a/assets/images/interactive-live-streaming/ils-call-logic-android.svg +++ b/assets/images/interactive-live-streaming/ils-call-logic-android.svg @@ -1,490 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK engine:agoraEngine = RtcEngine.createEnable the video module:agoraEngine.enableVideo()Set the channel profile:agoraEngine.setChannelProfile(CHANNEL_PROFILE_LIVE_BROADCASTING)HostStart a live streaming eventRetrieve authentication token to join channelSet the role as host:agoraEngine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER)Setup local video:agoraEngine.setupLocalVideo(VideoCanvas)Start local preview:agoraEngine.startPreview()Join a channel as host:agoraEngine.joinChannelSend 
data streamAudienceJoin a live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine.setClientRole(CLIENT_ROLE_AUDIENCE)Join the channel:agoraEngine.joinChannelRetrieve streaming from the hosts:agoraEngine.setupRemoteVideoReceive data streamLeave the live streaming eventLeave the channel:agoraEngine.leaveChannel()Close appClean up local resourcesagoraEngine.destroy() \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK engine:agoraEngine = RtcEngine.createEnable the video module:agoraEngine.enableVideo()Set the channel profile:agoraEngine.setChannelProfile(CHANNEL_PROFILE_LIVE_BROADCASTING)HostStart a live streaming eventRetrieve authentication token to join channelSet the role as host:agoraEngine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER)Setup local video:agoraEngine.setupLocalVideo(VideoCanvas)Start local preview:agoraEngine.startPreview()Join a channel as host:agoraEngine.joinChannelSend data streamAudienceJoin a live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine.setClientRole(CLIENT_ROLE_AUDIENCE)Join the channel:agoraEngine.joinChannelRetrieve streaming from the hosts:agoraEngine.setupRemoteVideoReceive data streamLeave the live streaming eventLeave the channel:agoraEngine.leaveChannel()Close appClean up local resourcesagoraEngine.destroy() \ No newline at end of file diff --git a/assets/images/interactive-live-streaming/ils-call-logic-flutter.svg b/assets/images/interactive-live-streaming/ils-call-logic-flutter.svg index c3f5006ad..38b2888b0 100644 --- a/assets/images/interactive-live-streaming/ils-call-logic-flutter.svg +++ b/assets/images/interactive-live-streaming/ils-call-logic-flutter.svg @@ -1,488 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK engineagoraEngine = RtcEngine.createEnable the video module:agoraEngine.enableVideoHostStart 
a live streaming eventRetrieve authentication token to join channelEnable live streaming in the channel:agoraEngine.setChannelProfile(ChannelProfile.LiveBroadcasting)Set the role as host:agoraEngine.setClientRole(ClientRole.Broadcaster)Join a channel as host:agoraEngine.joinChannelon "joinChannelSuccess"PublishSend data streamWidget = RtcLocalView.SurfaceView()AudienceJoin a live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine.setClientRole(ClientRole.Audience)Join the channel:agoraEngine.joinChannelon "joinChannelSuccess"Retrieve streaming from the hosts:on "userJoined"Receive data streamWidget = RtcRemoteView.SurfaceView()Leave broadcastagoraEngine.leaveChannel()Close appClean up local resourcesagoraEngine.destroy() \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK engineagoraEngine = RtcEngine.createEnable the video module:agoraEngine.enableVideoHostStart a live streaming eventRetrieve authentication token to join channelEnable live streaming in the channel:agoraEngine.setChannelProfile(ChannelProfile.LiveBroadcasting)Set the role as host:agoraEngine.setClientRole(ClientRole.Broadcaster)Join a channel as host:agoraEngine.joinChannelon "joinChannelSuccess"PublishSend data streamWidget = RtcLocalView.SurfaceView()AudienceJoin a live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine.setClientRole(ClientRole.Audience)Join the channel:agoraEngine.joinChannelon "joinChannelSuccess"Retrieve streaming from the hosts:on "userJoined"Receive data streamWidget = RtcRemoteView.SurfaceView()Leave broadcastagoraEngine.leaveChannel()Close appClean up local resourcesagoraEngine.destroy() \ No newline at end of file diff --git a/assets/images/interactive-live-streaming/ils-call-logic-ios.svg b/assets/images/interactive-live-streaming/ils-call-logic-ios.svg index b925b4213..1c32a1c60 100644 --- 
a/assets/images/interactive-live-streaming/ils-call-logic-ios.svg +++ b/assets/images/interactive-live-streaming/ils-call-logic-ios.svg @@ -1,484 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine:agoraEngine = AgoraRtcEngineKit.sharedEngineStart video in the engine:agoraEngine.enableVideo()HostStart a live streaming eventIn an live streaming event, only the hosts broadcast to the channel:agoraEngine.setClientRole(.broadcaster)Start local video:agoraEngine.setupLocalVideo(videoCanvas)Join the channel:agoraEngine?.joinChannelSend data streamAudienceJoin live streaming eventIn an live streaming event, the audience views the stream sent by channel hosts:agoraEngine.setClientRole(.audience)Join the channel:agoraEngine?.joinChannelRetrieve streaming from the hosts:agoraEngine.setupRemoteVideo(videoCanvas)Receive data streamsLeave live streaming eventStop local video:agoraEngine.stopPreview()Leave the channel:agoraEngine.leaveChannel(nil)Close appClean up local resources:AgoraRtcEngineKit.destroy() \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine:agoraEngine = AgoraRtcEngineKit.sharedEngineStart video in the engine:agoraEngine.enableVideo()HostStart a live streaming eventIn an live streaming event, only the hosts broadcast to the channel:agoraEngine.setClientRole(.broadcaster)Start local video:agoraEngine.setupLocalVideo(videoCanvas)Join the channel:agoraEngine?.joinChannelSend data streamAudienceJoin live streaming eventIn an live streaming event, the audience views the stream sent by channel hosts:agoraEngine.setClientRole(.audience)Join the channel:agoraEngine?.joinChannelRetrieve streaming from the hosts:agoraEngine.setupRemoteVideo(videoCanvas)Receive data streamsLeave live streaming eventStop local video:agoraEngine.stopPreview()Leave the channel:agoraEngine.leaveChannel(nil)Close appClean up local resources:AgoraRtcEngineKit.destroy() \ No newline at 
end of file diff --git a/assets/images/interactive-live-streaming/ils-call-logic-template.svg b/assets/images/interactive-live-streaming/ils-call-logic-template.svg index da5be5cdf..947f772d7 100644 --- a/assets/images/interactive-live-streaming/ils-call-logic-template.svg +++ b/assets/images/interactive-live-streaming/ils-call-logic-template.svg @@ -1,484 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine.Start video in the engine.HostStart ILS eventIn an ILS event, only the hosts broadcast to the channel.Start local video.Join the channel.Send data stream.AudienceJoin ILS eventIn an ILS event, the audience views the broadcast made by channel hosts.Join the channel.Retrieve streaming from the other user.Receive data streamsLeave ILS eventStop local video.Leave the channel.Close appClean up local resources. \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine.Start video in the engine.HostStart ILS eventIn an ILS event, only the hosts broadcast to the channel.Start local video.Join the channel.Send data stream.AudienceJoin ILS eventIn an ILS event, the audience views the broadcast made by channel hosts.Join the channel.Retrieve streaming from the other user.Receive data streamsLeave ILS eventStop local video.Leave the channel.Close appClean up local resources. 
\ No newline at end of file diff --git a/assets/images/interactive-live-streaming/ils-call-logic-unity.svg b/assets/images/interactive-live-streaming/ils-call-logic-unity.svg index 70d108cd5..20af4dabe 100644 --- a/assets/images/interactive-live-streaming/ils-call-logic-unity.svg +++ b/assets/images/interactive-live-streaming/ils-call-logic-unity.svg @@ -1,494 +1 @@ -Your gameAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen gameInitiate the Agora Video SDK engine:agoraEngine=IRtcEngine.GetEngine()Setup the local video stream:agoraEngine.EnableVideo()agoraEngine.EnableVideoObserver()HostStart a live streaming eventRetrieve authentication token to join channelEnable live streaming in the channel:agoraEngine.SetChannelProfile(CHANNEL_PROFILE.CHANNEL_PROFILE_LIVE_BROADCASTING)Set the user role as host:agoraEngine.SetClientRole(CLIENT_ROLE_TYPE.CLIENT_ROLE_BROADCASTER)Join a channel as host:agoraEngine.JoinChannelByKey()Send data streamAudienceJoin the live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine.SetClientRole(CLIENT_ROLE_TYPE.CLIENT_ROLE_AUDIENCE)Join the channel:agoraEngine.JoinChannelByKey()A callback to start remote video:onUserJoined()Retrieve streaming from the hosts:RemoteView.SetForUser(uid)Receive data StreamLeave the live streaming eventLeave the channel:agoraEngine.leaveChannel()Stop local video stream:agoraEngine.DisableVideo()Disable the video observer:agoraEngine.DisableVideoObserver()Close gameClean up local resources:agoraEngine.destroy() \ No newline at end of file +Your gameAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen gameInitiate the Agora Video SDK engine:agoraEngine=IRtcEngine.GetEngine()Setup the local video stream:agoraEngine.EnableVideo()agoraEngine.EnableVideoObserver()HostStart a live streaming eventRetrieve authentication token to join channelEnable live streaming in the channel:agoraEngine.SetChannelProfile(CHANNEL_PROFILE.CHANNEL_PROFILE_LIVE_BROADCASTING)Set the user role as 
host:agoraEngine.SetClientRole(CLIENT_ROLE_TYPE.CLIENT_ROLE_BROADCASTER)Join a channel as host:agoraEngine.JoinChannelByKey()Send data streamAudienceJoin the live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine.SetClientRole(CLIENT_ROLE_TYPE.CLIENT_ROLE_AUDIENCE)Join the channel:agoraEngine.JoinChannelByKey()A callback to start remote video:onUserJoined()Retrieve streaming from the hosts: RemoteView.SetForUser(uid)Receive data streamLeave the live streaming eventLeave the channel:agoraEngine.leaveChannel()Stop local video stream:agoraEngine.DisableVideo()Disable the video observer:agoraEngine.DisableVideoObserver()Close gameClean up local resources:agoraEngine.destroy() \ No newline at end of file diff --git a/assets/images/interactive-live-streaming/ils-call-logic-web.svg b/assets/images/interactive-live-streaming/ils-call-logic-web.svg index 3bbb766c5..46e2f8dc1 100644 --- a/assets/images/interactive-live-streaming/ils-call-logic-web.svg +++ b/assets/images/interactive-live-streaming/ils-call-logic-web.svg @@ -1,484 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Video SDK engine:agoraEngine = AgoraRTC.createClientSet the required event listners:agoraEngine.on("user-published")agoraEngine.on("user-unpublished")HostStart live streaming eventRetrieve authentication token to join channelSet the user role as host:agoraEngine.setClientRole("host")Join a channel as host:agoraEngine.joinCreate local media tracks :AgoraRTC.createMicrophoneAudioTrackAgoraRTC.createCameraVideoTrackPush local media tracks to the channel:agoraEngine.publishStop the remote video and play the local video:rtc.localVideoTrack.playrtc.remoteVideoTrack.stopRetrieve streaming from the other user:agoraEngine.on("user-published")AudienceJoin live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine.setClientRole("audience")Join the live streaming 
event:agoraEngine.joinRetrieve streaming from the other user:agoraEngine.on("user-published")agoraEngine.subscribeStop the local video and play the remote video:rtc.localVideoTrack.stoprtc.remoteVideoTrack.playReceive data streamLeave live streaming eventagoraEngine.leave \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Video SDK engine:agoraEngine = AgoraRTC.createClientStart video in the engine:App.initStart local media:stream = AgoraRTC.createStreamstream.initstream.playHostStart live streaming eventRetrieve authentication token to join channelSet the user role as host:agoraEngine.setClientRole("host")Join a channel as host:agoraEngine.joinPush local media to the channel:agoraEngine.publishAudienceJoin live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine.setClientRole("audience")Join the live streaming event:agoraEngine.joinRetrieve streaming from the other user:agoraEngine.on("stream-added")agoraEngine.subscribeagoraEngine.on("stream-subscribed")Receive data streamLeave live streaming eventagoraEngine.leave \ No newline at end of file diff --git a/assets/images/interactive-live-streaming/live-streaming-over-multiple-channels.svg b/assets/images/interactive-live-streaming/live-streaming-over-multiple-channels.svg new file mode 100644 index 000000000..90d3f2dde --- /dev/null +++ b/assets/images/interactive-live-streaming/live-streaming-over-multiple-channels.svg @@ -0,0 +1 @@ +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora engineJoin a channelStart live streamingSet the user role as hostJoin a channel as hostPublish local media to the channelChannel media relayStart multi-channel live streamingSet the source channel info:Source channel name, token, and uidSet the destination channel info:Destination channel name, token, and uidCall the method tostart media relayingRelay stream to thedestination channelReport the media 
relaying state with a callback functionJoin multiple channelsStart multi-channel live streamingCreate a new channelSet the user role as hostfor the new channelJoin the new channelJoin acceptedPublish to the new channelLeave the live streaming eventStop media relayingLeave all the channels \ No newline at end of file diff --git a/assets/images/iot/iot-channel-quality.svg b/assets/images/iot/iot-channel-quality.svg index 9e5b5be9a..84e5ce9bd 100644 --- a/assets/images/iot/iot-channel-quality.svg +++ b/assets/images/iot/iot-channel-quality.svg @@ -1 +1 @@ -Your appAgoraUserUserIoT SDKIoT SDKSD-RTNSD-RTNSetup Agora engineInstantiate the Agora RTC engineSet options includinglog file path and logging levelVerify license and initialize the engineCreate a connectionJoin a channelJoin a channelSet bandwidth estimation parametersSpecify the audio codec, sampling rate,and the number of channelsCall the join channel methodon join channel success callbackSend and receive audio and videoStart threads to send audio and video dataReceive audio and video dataRender audio and video framesDetect and respond to network bandwidth changesOn target bitrate changed callbackAdjust the sending bit rate and resolutionOn key frame generation request callbackSend a key frameManage audio and video streamsMute local audio or videoCall mute local audio ormute local videoMute remote audio or videoCall mute remote audio ormute remote videoOn user mute audio callbackSuspend or resume audio feedOn user mute video callbackSuspend or resume video feed \ No newline at end of file +Your appAgoraUserUserIoT SDKIoT SDKSD-RTNSD-RTNSetup Agora engineInstantiate the Agora RTC engineSet options includinglog file path and logging levelVerify license and initialize the engineCreate a connectionJoin a channelJoin a channelSet bandwidth estimation parametersSpecify the audio codec, sampling rate,and the number of channelsCall the join channel methodon join channel success callbackSend and receive audio and 
videoStart threads to send audio and video dataReceive audio and video dataRender audio and video framesDetect and respond to network bandwidth changesOn target bitrate changed callbackAdjust the sending bit rate and resolutionOn key frame generation request callbackSend a key frameManage audio and video streamsMute local audio or videoCall mute local audio ormute local videoMute remote audio or videoCall mute remote audio ormute remote videoOn user mute audio callbackSuspend or resume audio feedOn user mute video callbackSuspend or resume video feed \ No newline at end of file diff --git a/assets/images/iot/iot-get-started.svg b/assets/images/iot/iot-get-started.svg index bfdcf514e..6d11e9764 100644 --- a/assets/images/iot/iot-get-started.svg +++ b/assets/images/iot/iot-get-started.svg @@ -1 +1 @@ -Your appAgoraUserUserIoT SDKIoT SDKSD-RTNSD-RTNSet up Agora engineInstantiate the Agora engineSet engine optionsVerify license and initialize the engineCreate a connectionJoin a channelJoin a channelCall the method to join a channelOn join channel success callbackSend audio and videoStart thread to send audio dataStart thread to send video dataReceive audio and videoOn audio data callbackRender audio frameOn video data callbackRender video frameLeave channelLeave channelCall the leave channel methodClean upDestroy the connectionCall the finish method to release resources +Your appAgoraUserUserIoT SDKIoT SDKSD-RTNSD-RTNSet up Agora engineInstantiate the Agora engineSet engine optionsVerify license and initialize the engineCreate a connectionJoin a channelJoin a channelCall the method to join a channelOn join channel success callbackSend audio and videoStart thread to send audio dataStart thread to send video dataReceive audio and videoOn audio data callbackRender audio frameOn video data callbackRender video frameLeave channelLeave channelCall the leave channel methodClean upDestroy the connectionCall the finish method to release resources \ No newline at end of file 
diff --git a/assets/images/iot/iot-licensing.svg b/assets/images/iot/iot-licensing.svg index ad925f79f..d958d4972 100644 --- a/assets/images/iot/iot-licensing.svg +++ b/assets/images/iot/iot-licensing.svg @@ -1 +1 @@ -Your IoT deviceAgoraYouYouIoT SDKIoT SDKAgora SalesAgora SalesREST APIREST APIContact Agora sales to request a licenseSend license informationActivate the licenseSend activation infoWrite a license to the device \ No newline at end of file +Your IoT deviceAgoraYouYouIoT SDKIoT SDKAgora SalesAgora SalesREST APIREST APIContact Agora sales to request a licenseSend license informationActivate the licenseSend activation infoWrite a license to the device \ No newline at end of file diff --git a/assets/images/iot/iot-multi-channel.svg b/assets/images/iot/iot-multi-channel.svg index 9b6dec9c1..814d8610e 100644 --- a/assets/images/iot/iot-multi-channel.svg +++ b/assets/images/iot/iot-multi-channel.svg @@ -1 +1 @@ -Your appAgoraUserUserIoT SDKIoT SDKSD-RTNSD-RTNSetup Agora engineInstantiate the Agora engineSet engine optionsVerify license and initializeCreate multiple connectionsCreate connection-1Create connection-2Stream to multiple channelsJoin channelsJoin channel-1 using connectionId-1Join channel-2 using connectionId-2Send audio and video data usingconnectionId-1 to stream to channel-1Send audio and video data usingconnectionId-2 to stream to channel-2Push multiple streams to a single channelJoin channelJoin channel-1 usingconnectionId-1 and userId-1Join channel-1 usingconnectionId-2 and userId-2Send audio and video datausing connectionId-1Send audio and video datausing connectionId-2Leave channel(s)Leave channelCall leave channel using connectionId-1Call leave channel using connectionId-2Clean upCall destroy connectionusing connectionId-1Call destroy connectionusing connectionId-2Call the finish method to release resources \ No newline at end of file +Your appAgoraUserUserIoT SDKIoT SDKSD-RTNSD-RTNSetup Agora engineInstantiate the Agora engineSet engine 
optionsVerify license and initializeCreate multiple connectionsCreate connection-1Create connection-2Stream to multiple channelsJoin channelsJoin channel-1 using connectionId-1Join channel-2 using connectionId-2Send audio and video data usingconnectionId-1 to stream to channel-1Send audio and video data usingconnectionId-2 to stream to channel-2Push multiple streams to a single channelJoin channelJoin channel-1 usingconnectionId-1 and userId-1Join channel-1 usingconnectionId-2 and userId-2Send audio and video datausing connectionId-1Send audio and video datausing connectionId-2Leave channel(s)Leave channelCall leave channel using connectionId-1Call leave channel using connectionId-2Clean upCall destroy connectionusing connectionId-1Call destroy connectionusing connectionId-2Call the finish method to release resources \ No newline at end of file diff --git a/assets/images/notification-center-service/ncs-cloud-recording-workflow.svg b/assets/images/notification-center-service/ncs-cloud-recording-workflow.svg index c15877eee..e27e9fd5d 100644 --- a/assets/images/notification-center-service/ncs-cloud-recording-workflow.svg +++ b/assets/images/notification-center-service/ncs-cloud-recording-workflow.svg @@ -1 +1 @@ -Implemented by youProvided by AgoraUserUserWeb serverWeb serverYour app serverYour app serverYourthird-partycloud storageYourthird-partycloud storageAgoraCloud RecordingAgoraCloud RecordingStart recordingGet recording resourcesresource IDStart recordingNotification 40: `recorder_started` <status>200 OKNotifications sent and acknowledged for all recording eventsStop recordingEnd recordingNotification 30: `uploader_started` <status>200 OKUpload recording fileNotification 31: `uploaded` <status>200 OKNotification 11: `session_exit` <exitStatus>200 OK \ No newline at end of file +Implemented by youProvided by AgoraUserUserWeb serverWeb serverYour app serverYour app serverYourthird-partycloud storageYourthird-partycloud storageAgoraCloud RecordingAgoraCloud 
RecordingStart recordingGet recording resourcesresource IDStart recordingNotification 40: `recorder_started` <status>200 OKNotifications sent and acknowledged for all recording eventsStop recordingEnd recordingNotification 30: `uploader_started` <status>200 OKUpload recording fileNotification 31: `uploaded` <status>200 OKNotification 11: `session_exit` <exitStatus>200 OK \ No newline at end of file diff --git a/assets/images/notification-center-service/ncs-media-pull.svg b/assets/images/notification-center-service/ncs-media-pull.svg index 420cfe301..66a6c67fb 100644 --- a/assets/images/notification-center-service/ncs-media-pull.svg +++ b/assets/images/notification-center-service/ncs-media-pull.svg @@ -1 +1 @@ -Implemented by youProvided by AgoraUserUserWeb serverWeb serverYour app serverYour app serverAgoraMedia PullAgoraMedia PullStart pulling mediainto an Agora channelCreate player APIResponse containing player parametersNotification eventType=1: Player createdAuthenticate notification signature200 OKChange player statusUpdate player APIResponse containing player parametersNotification eventType=4: Player status changedAuthenticate notification signature200 OKStop playing mediaDelete player APIResponse confirming player deleteNotification eventType=2: Player destroyedAuthenticate notification signature200 OK \ No newline at end of file +Implemented by youProvided by AgoraUserUserWeb serverWeb serverYour app serverYour app serverAgoraMedia PullAgoraMedia PullStart pulling mediainto an Agora channelCreate player APIResponse containing player parametersNotification eventType=1: Player createdAuthenticate notification signature200 OKChange player statusUpdate player APIResponse containing player parametersNotification eventType=4: Player status changedAuthenticate notification signature200 OKStop playing mediaDelete player APIResponse confirming player deleteNotification eventType=2: Player destroyedAuthenticate notification signature200 OK \ No newline at end of 
file diff --git a/assets/images/notification-center-service/ncs-media-push.svg b/assets/images/notification-center-service/ncs-media-push.svg index aab6eab79..2ea74fd43 100644 --- a/assets/images/notification-center-service/ncs-media-push.svg +++ b/assets/images/notification-center-service/ncs-media-push.svg @@ -1 +1 @@ -Implemented by youProvided by AgoraUserUserWeb serverWeb serverYour app serverYour app serverAgoraMedia PushAgoraMedia PushStart media pushCreate a converterResponse confirming conveter creationNotification eventType=1: Converter createdAuthenticate notification signature200 OKChange converter configurationUpdate converter APIResponse confirming configuration updateNotification eventType=2: Converter configuration changedAuthenticate notification signature200 OKNotification eventType=3: Converter status changedAuthenticate notification signature200 OKStop media pushDelete converter APIResponse confirming converter deleteNotification eventType=4: Converter destroyedAuthenticate notification signature200 OK \ No newline at end of file +Implemented by youProvided by AgoraUserUserWeb serverWeb serverYour app serverYour app serverAgoraMedia PushAgoraMedia PushStart media pushCreate a converterResponse confirming converter creationNotification eventType=1: Converter createdAuthenticate notification signature200 OKChange converter configurationUpdate converter APIResponse confirming configuration updateNotification eventType=2: Converter configuration changedAuthenticate notification signature200 OKNotification eventType=3: Converter status changedAuthenticate notification signature200 OKStop media pushDelete converter APIResponse confirming converter deleteNotification eventType=4: Converter destroyedAuthenticate notification signature200 OK \ No newline at end of file diff --git a/assets/images/others/authentication-logic.svg b/assets/images/others/authentication-logic.svg index 7e964a030..71c783045 100644 ---
a/assets/images/others/authentication-logic.svg +++ b/assets/images/others/authentication-logic.svg @@ -1,450 +1 @@ -Implemented by youProvided by AgoraToken serverToken serverAppAppSD-RTNSD-RTNJoin a channel with authenticationRequest a token using channel name,role, token type, and user IDValidate user againstinternal securityGenerate a token and return it to the clientJoin a channel with uid,channel name, and tokenValidatethe tokenTrigger the callback afteradding user to the channelRenew tokenTrigger event: Token Privilege will ExpireRequest a fresh token using channel name,role, token type, and user IDValidate user request against internal logicGenerate a fresh token and return it to the clientSend the fresh tokenwith a RenewToken request \ No newline at end of file +Implemented by youProvided by AgoraToken serverToken serverAppAppSD-RTNSD-RTNJoin a channel with authenticationRequest a token using channel name,role, token type, and user IDValidate user againstinternal securityGenerate a token and return it to the clientJoin a channel with uid,channel name, and tokenValidatethe tokenTrigger the callback afteradding user to the channelRenew tokenTrigger event: Token Privilege will ExpireRequest a fresh token using channel name,role, token type, and user IDValidate user request against internal logicGenerate a fresh token and return it to the clientSend the fresh tokenwith a RenewToken request \ No newline at end of file diff --git a/assets/images/others/documentation_way_of_working.svg b/assets/images/others/documentation_way_of_working.svg index 32777d4db..f80109bbe 100644 --- a/assets/images/others/documentation_way_of_working.svg +++ b/assets/images/others/documentation_way_of_working.svg @@ -1,510 +1 @@ -Product OwnerProduct OwnerProject ManagerProject ManagerDeveloperDeveloperTesterTesterTechnical WriterTechnical WriterDevRellerDevRellerRequest implementation of new featuresInform DevRel about new featuresFeature implementationCreate JIRA issues for each 
feature requirementThis includes documentation issues andusage details for the featureloop[For each JIRA issue]Implementation and documentation tasks completed in parallelRequest implementationRequest documentationImplement featureWrite documentation from requirement,implementation and Developer inputloop[Validate]Request implementation validationValidate implementation against requirementIf necessary, request changesUpdateImplementation approvedloop[Validate]Request documentation validationRequest documentation validationValidate documentation against HLDTechnical validationIf necessary, request changesIf necessary, request changesUpdatePeer review if feature requiresa large documentation changeUpdateDocumentation approvedHLD implementation complete \ No newline at end of file +Welcome to PlantUML! You can start with a simple UML Diagram like: Bob->Alice: Hello Or class Example You will find more information about PlantUML syntax onhttps://plantuml.com (Details by typinglicensekeyword) PlantUML 1.2023.12[From documentation_way_of_working.puml (line 4) ] @startuml !include ../images/agora_skin.iumlcannot include ../images/agora_skin.iuml \ No newline at end of file diff --git a/assets/images/others/media-stream-encryption.svg b/assets/images/others/media-stream-encryption.svg index 692c72a8d..be214faa5 100644 --- a/assets/images/others/media-stream-encryption.svg +++ b/assets/images/others/media-stream-encryption.svg @@ -1,462 +1 @@ -Implemented by youProvided by AgoraUserUserAppAppDeveloper'sAuthentication SystemDeveloper'sAuthentication SystemAPIAPIStart the appInitiate the Video SDK engineMedia stream encryptionLogin to the developer'sauthentication system.Request a 32-byte keyGenerate a 32-bytestring using OpenSSLRequested keyRequest a 32-byte salt inBase64 formatGenerate a 32-bytestring in Base64format using OpenSSLRequested saltCreate a encryption configuration usingthe received salt and keyCall the method to set theencryption configurationSelect a channel to 
joinJoin a channel with user Id, channel name, and tokenJoin acceptedEncrypted media stream \ No newline at end of file +Implemented by youProvided by AgoraUserUserAppAppDeveloper'sAuthentication SystemDeveloper'sAuthentication SystemAPIAPIStart the appInitiate the Video SDK engineMedia stream encryptionLogin to the developer'sauthentication system.Request a 32-byte keyGenerate a 32-bytestring using OpenSSLRequested keyRequest a 32-byte salt inBase64 formatGenerate a 32-bytestring in Base64format using OpenSSLRequested saltCreate an encryption configuration usingthe received salt and keyCall the method to set theencryption configurationSelect a channel to joinJoin a channel with user Id, channel name, and tokenJoin acceptedEncrypted media stream \ No newline at end of file diff --git a/assets/images/others/play-media.svg b/assets/images/others/play-media.svg index edbf6e3e4..576767e5c 100644 --- a/assets/images/others/play-media.svg +++ b/assets/images/others/play-media.svg @@ -1,468 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppUse Video SDK to create an instance of Agora EngineEnable audio and video in the engineJoinJoin a channelRetrieve authentication token to join a channelJoin the channelPlay media filesSelect media fileUse Video SDK to create an instance of Media PlayerOpen media file using Media PlayerOpen media file completedPlay media fileSet up local video panelto display Media Player outputUpdate channel media optionsto publish Media Player outputPlay the media file on the Media PlayerPause or resume playCall pause or resume methodsMedia file played completelyResume publishing of camera and microphone \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppUse Video SDK to create an instance of Agora EngineEnable audio and video in the engineJoinJoin a channelRetrieve authentication token to join a channelJoin the channelPlay media filesSelect media fileUse Video SDK to create an instance of Media 
PlayerOpen media file using Media PlayerOpen media file completedPlay media fileSet up local video panelto display Media Player outputUpdate channel media optionsto publish Media Player outputPlay the media file on the Media PlayerPause or resume playCall pause or resume methodsMedia file played completelyResume publishing of camera and microphone \ No newline at end of file diff --git a/assets/images/shared/ncs-worflow.svg b/assets/images/shared/ncs-worflow.svg new file mode 100644 index 000000000..6dff5e073 --- /dev/null +++ b/assets/images/shared/ncs-worflow.svg @@ -0,0 +1 @@ +Implemented by youProvided by AgoraUserUserWeb serverWeb serverAppAppSD-RTNSD-RTNLoginLogin authenticationNo notificationCreate channelCreate channelOpen channelNotification 101:`channel name` opened at `time`.Authenticate notificationsignature200 OKNotifications sent and acknowledged for all channel eventsClose channelNotification 102:`channel name` destroyed at `time`.Authenticate notificationsignature200 OKLogoutLogoutNo notification \ No newline at end of file diff --git a/assets/images/video-calling/geofencing.svg b/assets/images/video-calling/geofencing.svg index bf480faab..1ded373b7 100644 --- a/assets/images/video-calling/geofencing.svg +++ b/assets/images/video-calling/geofencing.svg @@ -1,444 +1 @@ -Implemented by youProvided by AgoraUserUserAppAppSD-RTNSD-RTNStart the appGeofencingSet SD-RTN region in the Agoraengine configurationInitiate the Agora engineConnect to SD-RTN in aspecific regionSuccess responseSelect a channel to joinJoin a channel with userId, channel name, and tokenJoin accepted \ No newline at end of file +Implemented by youProvided by AgoraUserUserAppAppSD-RTNSD-RTNStart the appGeofencingSet SD-RTN region in the Agoraengine configurationInitiate the Agora engineConnect to SD-RTN in aspecific regionSuccess responseSelect a channel to joinJoin a channel with userId, channel name, and tokenJoin accepted \ No newline at end of file diff --git 
a/assets/images/video-calling/process-raw-video-audio.svg b/assets/images/video-calling/process-raw-video-audio.svg index b01432ed0..7c93f7595 100644 --- a/assets/images/video-calling/process-raw-video-audio.svg +++ b/assets/images/video-calling/process-raw-video-audio.svg @@ -1,478 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppCreate an instance of Agora Engine using Video SDKEnable audio and video in Agora EngineSetup raw data processingSetup the audio frame observerSetup the video frame observerJoinJoin a channelRegister the video frame observerRegister the audio frame observerSet audio frame parametersRetrieve authentication token to join a channelJoin the channelProcess raw audio and video dataGet the raw data in the callbacksSend the processed data back with the callbacksLeaveLeave the channelUnegister the video frame observerUnegister the audio frame observerLeave the channel \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppCreate an instance of Agora Engine using Video SDKEnable audio and video in Agora EngineSetup raw data processingSetup the audio frame observerSetup the video frame observerJoinJoin a channelRegister the video frame observerRegister the audio frame observerSet audio frame parametersRetrieve authentication token to join a channelJoin the channelProcess raw audio and video dataGet the raw data in the callbacksSend the processed data back with the callbacksLeaveLeave the channelUnregister the video frame observerUnregister the audio frame observerLeave the channel \ No newline at end of file diff --git a/assets/images/video-calling/video-call-logic-android.svg b/assets/images/video-calling/video-call-logic-android.svg index 22e0fb1be..cd0dd558f 100644 --- a/assets/images/video-calling/video-call-logic-android.svg +++ b/assets/images/video-calling/video-call-logic-android.svg @@ -1,470 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK 
engine:agoraEngine = RtcEngine.createSetup the local video stream:agoraEngine.enableVideo()agoraEngine.setupLocalVideo(VideoCanvas)UserJoin a callRetrieve authentication token to join channelJoin the channel:agoraEngine.joinChannel()Retrieve streaming from the other user:agoraEngine.setupRemoteVideo(VideoCanvas)Receive and send data streamLeave the callLeave the channel:agoraEngine.leaveChannel()Close appClean up local resources:agoraEngine.destroy() \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK engine:agoraEngine = RtcEngine.createSetup the local video stream:agoraEngine.enableVideo()agoraEngine.setupLocalVideo(VideoCanvas)UserJoin a callRetrieve authentication token to join channelJoin the channel:agoraEngine.joinChannel()Retrieve streaming from the other user:agoraEngine.setupRemoteVideo(VideoCanvas)Receive and send data streamLeave the callLeave the channel:agoraEngine.leaveChannel()Close appClean up local resources:agoraEngine.destroy() \ No newline at end of file diff --git a/assets/images/video-calling/video-call-logic-flutter.svg b/assets/images/video-calling/video-call-logic-flutter.svg index 2767cfd95..bf3eae44b 100644 --- a/assets/images/video-calling/video-call-logic-flutter.svg +++ b/assets/images/video-calling/video-call-logic-flutter.svg @@ -1,464 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK engine:agoraEngine = RtcEngine.createSetup the local video stream:agoraEngine.enableVideoWidget = RtcLocalView.SurfaceView()UserJoin a callRetrieve authentication token to join channelJoin the channel:agoraEngine.joinChannelRetrieve streaming from the other user:Widget = RtcRemoteView.SurfaceView()Receive and send data streamLeave callLeave the channelagoraEngine.leaveChannelClose appClean up local resources:agoraEngine.destroy() \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK 
engine:agoraEngine = RtcEngine.createSetup the local video stream:agoraEngine.enableVideoWidget = RtcLocalView.SurfaceView()UserJoin a callRetrieve authentication token to join channelJoin the channel:agoraEngine.joinChannelRetrieve streaming from the other user:Widget = RtcRemoteView.SurfaceView()Receive and send data streamLeave callLeave the channelagoraEngine.leaveChannelClose appClean up local resources:agoraEngine.destroy() \ No newline at end of file diff --git a/assets/images/video-calling/video-call-logic-ios.svg b/assets/images/video-calling/video-call-logic-ios.svg index 24a90881b..939684c80 100644 --- a/assets/images/video-calling/video-call-logic-ios.svg +++ b/assets/images/video-calling/video-call-logic-ios.svg @@ -1,468 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine:agoraEngine = AgoraRtcEngineKit.sharedEngineSetup the local video stream:agoraEngine.enableVideo()HostStart a callIn a call, all users send to the channel:agoraEngine.setClientRole(.broadcaster)Start local video:agoraEngine.setupLocalVideo(videoCanvas)Join the channel:agoraEngine?.joinChannelRetrieve streaming from the other user:agoraEngine.setupRemoteVideo(videoCanvas)Receive and send data streamsLeave the callStop local video:agoraEngine.stopPreview()Leave the channel:agoraEngine.leaveChannel(nil)Close appClean up local resources:AgoraRtcEngineKit.destroy() \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine:agoraEngine = AgoraRtcEngineKit.sharedEngineSetup the local video stream:agoraEngine.enableVideo()HostStart a callIn a call, all users send to the channel:agoraEngine.setClientRole(.broadcaster)Start local video:agoraEngine.setupLocalVideo(videoCanvas)Join the channel:agoraEngine?.joinChannelRetrieve streaming from the other user:agoraEngine.setupRemoteVideo(videoCanvas)Receive and send data streamsLeave the callStop local video:agoraEngine.stopPreview()Leave the 
channel:agoraEngine.leaveChannel(nil)Close appClean up local resources:AgoraRtcEngineKit.destroy() \ No newline at end of file diff --git a/assets/images/video-calling/video-call-logic-template.svg b/assets/images/video-calling/video-call-logic-template.svg index 571701706..163ea25d6 100644 --- a/assets/images/video-calling/video-call-logic-template.svg +++ b/assets/images/video-calling/video-call-logic-template.svg @@ -1,468 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine.Start video in the engine.HostStart callIn a call, all users broadcast to the channel.Start local video.Join the channel.Retrieve streaming from the other user.Receive and send data streamsLeave callStop local video.Leave the channel.Close appClean up local resources. \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine.Start video in the engine.HostStart callIn a call, all users broadcast to the channel.Start local video.Join the channel.Retrieve streaming from the other user.Receive and send data streamsLeave callStop local video.Leave the channel.Close appClean up local resources. 
\ No newline at end of file diff --git a/assets/images/video-calling/video-call-logic-unity.svg b/assets/images/video-calling/video-call-logic-unity.svg index ee8bb32ad..79330c265 100644 --- a/assets/images/video-calling/video-call-logic-unity.svg +++ b/assets/images/video-calling/video-call-logic-unity.svg @@ -1,472 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen gameInitiate the Agora Video SDK engine:agoraEngine=IRtcEngine.GetEngine()Setup the local video stream:agoraEngine.EnableVideo()agoraEngine.EnableVideoObserver()UserStart callJoin the channel:agoraEngine.JoinChannelByKey()Join callA callback to start remote video:OnUserJoined()Retrieve streaming from the other user:RemoteView.SetForUser(uid)Receive and send data streamLeave callLeave the channel:agoraEngine.leaveChannel()Stop local video stream:agoraEngine.DisableVideo()Disable the video observer:agoraEngine.DisableVideoObserver()Close gameClean up local resources:agoraEngine.destroy() \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen gameInitiate the Agora Video SDK engine:agoraEngine=IRtcEngine.GetEngine()Setup the local video stream:agoraEngine.EnableVideo()agoraEngine.EnableVideoObserver()UserStart callJoin the channel:agoraEngine.JoinChannelByKey()Join callA callback to start remote video:OnUserJoined()Retrieve streaming from the other user:RemoteView.SetForUser(uid)Receive and send data streamLeave callLeave the channel:agoraEngine.leaveChannel()Stop local video stream:agoraEngine.DisableVideo()Disable the video observer:agoraEngine.DisableVideoObserver()Close gameClean up local resources:agoraEngine.destroy() \ No newline at end of file diff --git a/assets/images/video-calling/video-call-logic-web.svg b/assets/images/video-calling/video-call-logic-web.svg index 233a4544d..a8ac6ff5f 100644 --- a/assets/images/video-calling/video-call-logic-web.svg +++ b/assets/images/video-calling/video-call-logic-web.svg @@ -1,464 +1 @@ -Your appAgoraUserUserVideo 
SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Video SDK engine:agoraEngine = AgoraRTC.createClientSet the required event listners:agoraEngine.on("user-published")agoraEngine.on("user-unpublished")UserStart callRetrieve authentication token to join channelJoin a channel:agoraEngine.joinJoin acceptedCreate local media tracks :AgoraRTC.createMicrophoneAudioTrackAgoraRTC.createCameraVideoTrackPush local media tracks to the channel:agoraEngine.publishRetrieve streaming from the other user:agoraEngine.on("user-published")Play remote media tracks: remoteVideoTrack.playremoteAudioTrack.playReceive and send data streamsLeave callleave the channel:agoraEngine.leave \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Video SDK engine:agoraEngine = AgoraRTC.createClientSet the required event listeners:agoraEngine.on("user-published")agoraEngine.on("user-unpublished")UserStart callRetrieve authentication token to join channelJoin a channel:agoraEngine.joinJoin acceptedCreate local media tracks :AgoraRTC.createMicrophoneAudioTrackAgoraRTC.createCameraVideoTrackPush local media tracks to the channel:agoraEngine.publishRetrieve streaming from the other user:agoraEngine.on("user-published")Play remote media tracks: remoteVideoTrack.playremoteAudioTrack.playReceive and send data streamsLeave callleave the channel:agoraEngine.leave \ No newline at end of file diff --git a/assets/images/video-calling/video_call_workflow.svg b/assets/images/video-calling/video_call_workflow.svg new file mode 100644 index 000000000..83aa472a0 --- /dev/null +++ b/assets/images/video-calling/video_call_workflow.svg @@ -0,0 +1 @@ +Call implementerAgoraClientClientToken ServerToken ServerClientClientAgora PlatformAgora PlatformSetup clientUser login to implementor securityAuthenticate and retrieve tokenUser login to implementor securityAuthenticate and retrieve tokenCreate client instanceSet client role: hostCreate client instanceSet client role: audienceRun
callConnect to local audio and video resourcesJoin channelSend audio and video to channelJoin channelSend audio and video to channelCommunicateEnd callClose audio and videoLeave callClose audio and videoLeave call \ No newline at end of file diff --git a/assets/images/video-calling/video_call_workflow_run_end.svg b/assets/images/video-calling/video_call_workflow_run_end.svg new file mode 100644 index 000000000..ae2b2a5a8 --- /dev/null +++ b/assets/images/video-calling/video_call_workflow_run_end.svg @@ -0,0 +1 @@ +Call implementerAgoraClientClientClientClientAgora PlatformAgora PlatformRun callConnect to local audio and video resourcesJoin channelSend audio and video to channelConnect to local audio and video resourcesJoin channelSend audio and video to channelCommunicateEnd callClose audio and videoLeave channelClose audio and videoLeave channel \ No newline at end of file diff --git a/assets/images/video-sdk/audio-and-voice-effects-web.puml b/assets/images/video-sdk/audio-and-voice-effects-web.puml new file mode 100644 index 000000000..6412f5b39 --- /dev/null +++ b/assets/images/video-sdk/audio-and-voice-effects-web.puml @@ -0,0 +1,41 @@ +@startuml audio-and-voice-effects +!include agora_skin.iuml + +actor "User" as USR + +box "Your app" +participant "Video SDK" as APP +end box + +box "Agora" +participant "SD-RTN" as API +end box + +USR -> APP: Open App +APP -> APP: Create an Agora Engine instance using Video SDK +APP -> APP: Enable audio and video in Agora Engine + +group Join +USR -> APP: Join a channel +APP -> APP: Retrieve authentication token to join a channel +APP -> API: Join the channel +end + +group Audio mixing +USR -> APP: Select an audio file +APP -> API: Process the audio file +USR -> APP: Start audio mixing +APP -> API: Play the audio file +USR -> APP: Stop mixing +APP -> API: Stop the audio file +end + +group Change audio route +APP -> API: Set default audio route +APP -> API: Change the audio route temporarily +end + +USR -> APP: Leave the
channel +APP -> API: Leave the channel + +@enduml diff --git a/assets/images/video-sdk/audio-and-voice-effects-web.svg b/assets/images/video-sdk/audio-and-voice-effects-web.svg new file mode 100644 index 000000000..cb72696ec --- /dev/null +++ b/assets/images/video-sdk/audio-and-voice-effects-web.svg @@ -0,0 +1 @@ +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppCreate an Agora Engine instance using Video SDKEnable audio and video in Agora EngineJoinJoin a channelRetrieve authentication token to join a channelJoin the channelAudio mixingSelect an audio fileProcess the audio fileStart audio mixingPlay the audio fileStop mixingStop the audio fileChange audio routeSet default audio routeChange the audio route temporarilyLeave the channelLeave the channel \ No newline at end of file diff --git a/assets/images/video-sdk/audio-and-voice-effects.svg b/assets/images/video-sdk/audio-and-voice-effects.svg index f98dfbd3c..e4acdf7ad 100644 --- a/assets/images/video-sdk/audio-and-voice-effects.svg +++ b/assets/images/video-sdk/audio-and-voice-effects.svg @@ -1,496 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppCreate an Agora Engine instance using Video SDKEnable audio and video in Agora EngineJoinJoin a channelRetrieve authentication token to join a channelJoin the channelPlay audio effectTrigger sound effectPlay effectPause and resume effectSet effect positionSet effect volumeOn audio effect finishedAudio mixingControl audio mixingStart audio mixingStop audio mixingVoice effectsApply a voice effectSet preset voice effectDisable the voice effectChange audio routeSet default audio routeChange the audio route temporarilyLeave the channelLeave the channel \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppCreate an Agora Engine instance using Video SDKEnable audio and video in Agora EngineJoinJoin a channelRetrieve authentication token to join a channelJoin the channelPlay audio effectTrigger sound effectPlay
effectPause and resume effectSet effect positionSet effect volumeOn audio effect finishedAudio mixingControl audio mixingStart audio mixingStop audio mixingVoice effectsApply a voice effectSet preset voice effectDisable the voice effectLeave the channelLeave the channel \ No newline at end of file diff --git a/assets/images/video-sdk/authentication-logic.svg b/assets/images/video-sdk/authentication-logic.svg index 7e964a030..71c783045 100644 --- a/assets/images/video-sdk/authentication-logic.svg +++ b/assets/images/video-sdk/authentication-logic.svg @@ -1,450 +1 @@ -Implemented by youProvided by AgoraToken serverToken serverAppAppSD-RTNSD-RTNJoin a channel with authenticationRequest a token using channel name,role, token type, and user IDValidate user againstinternal securityGenerate a token and return it to the clientJoin a channel with uid,channel name, and tokenValidatethe tokenTrigger the callback afteradding user to the channelRenew tokenTrigger event: Token Privilege will ExpireRequest a fresh token using channel name,role, token type, and user IDValidate user request against internal logicGenerate a fresh token and return it to the clientSend the fresh tokenwith a RenewToken request \ No newline at end of file +Implemented by youProvided by AgoraToken serverToken serverAppAppSD-RTNSD-RTNJoin a channel with authenticationRequest a token using channel name,role, token type, and user IDValidate user againstinternal securityGenerate a token and return it to the clientJoin a channel with uid,channel name, and tokenValidatethe tokenTrigger the callback afteradding user to the channelRenew tokenTrigger event: Token Privilege will ExpireRequest a fresh token using channel name,role, token type, and user IDValidate user request against internal logicGenerate a fresh token and return it to the clientSend the fresh tokenwith a RenewToken request \ No newline at end of file diff --git a/assets/images/video-sdk/cloud-proxy.svg b/assets/images/video-sdk/cloud-proxy.svg 
index 4b3bbbb34..b5328a643 100644 --- a/assets/images/video-sdk/cloud-proxy.svg +++ b/assets/images/video-sdk/cloud-proxy.svg @@ -1 +1 @@ -Implemented by youProvided by AgoraUserUserAdminAdminAppAppEnterprise FirewallEnterprise FirewallCloud ProxyCloud ProxySD-RTNSD-RTNWhitelist IP addresses and portsfor Cloud Proxy in the firewall.Open the appInitialize the Agora VideoSDK engineJoin a channelalt[Join a channel directly]Join a ChannelJoin SuccessSend and receive data[Join Channel failed]Video SDK automatically attempts to connect securely on TLS 443Join SuccessSend and receive data[Connection attempt on TLS 443 failed: Enable cloud proxy]Call the method to enablea Cloud Proxy connectionRequest access toCloud ProxyCheck whitelist to grantaccessRequest access toCloud ProxyProxy informationJoin a channelRequest to join a channelJoin successJoin successSend data streamSend and receive data streamReceive data stream \ No newline at end of file +Implemented by youProvided by AgoraUserUserAdminAdminAppAppEnterprise FirewallEnterprise FirewallCloud ProxyCloud ProxySD-RTNSD-RTNWhitelist IP addresses and portsfor Cloud Proxy in the firewall.Open the appInitialize the Agora VideoSDK engineJoin a channelalt[Join a channel directly]Join a ChannelJoin SuccessSend and receive data[Join Channel failed]Video SDK automatically attempts to connect securely on TLS 443Join SuccessSend and receive data[Connection attempt on TLS 443 failed: Enable cloud proxy]Call the method to enablea Cloud Proxy connectionRequest access toCloud ProxyCheck whitelist to grantaccessRequest access toCloud ProxyProxy informationJoin a channelRequest to join a channelJoin successJoin successSend data streamSend and receive data streamReceive data stream \ No newline at end of file diff --git a/assets/images/video-sdk/custom-source-video-audio.svg b/assets/images/video-sdk/custom-source-video-audio.svg index 12305b496..be688a7ed 100644 --- a/assets/images/video-sdk/custom-source-video-audio.svg +++ 
b/assets/images/video-sdk/custom-source-video-audio.svg @@ -1,470 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppCreate an instance of the Video SDK engineEnable audio and video in the engineSetup external sourceCheck the external source for compatibilitySet external video or audio sourceJoinJoin a channelRetrieve authentication token to join a channelJoin the channelProcess dataManage capturing and processingusing external methodsStream dataPush external video or audio frameLeave the channelLeave the channel \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppCreate an instance of the Video SDK engineEnable audio and video in the engineSetup external sourceCheck the external source for compatibilitySet external video or audio sourceJoinJoin a channelRetrieve authentication token to join a channelJoin the channelProcess dataManage capturing and processingusing external methodsStream dataPush external video or audio frameLeave the channelLeave the channel \ No newline at end of file diff --git a/assets/images/video-sdk/ensure-channel-quality.svg b/assets/images/video-sdk/ensure-channel-quality.svg index cf56f4d12..af0a3cfc9 100644 --- a/assets/images/video-sdk/ensure-channel-quality.svg +++ b/assets/images/video-sdk/ensure-channel-quality.svg @@ -1,480 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppCreate Agora engineSet log file path, log level and file sizeCreate Agora EngineBest practice for app initiationEnable dual-stream modeto allow remote users to choose a stream typeSet local publish and remote subscribe fallback optionsSettings checkSpecify audio profile and scenariobased on the nature of the appSet video encoder configurationCall the method to start the network probe testDeliver network quality scoreand network statisticsJoin channelEnable videoJoin channelIn-call quality monitoringEnable the quality statisticsReceive network, call, audio and video quality statisticsReceive state 
change notificationsNotify the userTake corrective action \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppCreate Agora engineSet log file path, log level and file sizeCreate Agora EngineBest practice for app initiationEnable dual-stream modeto allow remote users to choose a stream typeSet local publish and remote subscribe fallback optionsSettings checkSpecify audio profile and scenariobased on the nature of the appSet video encoder configurationCall the method to start the network probe testDeliver network quality scoreand network statisticsJoin channelEnable videoJoin channelIn-call quality monitoringEnable the quality statisticsReceive network, call, audio and video quality statisticsReceive state change notificationsNotify the userTake corrective action \ No newline at end of file diff --git a/assets/images/video-sdk/ils-call-logic-android.svg b/assets/images/video-sdk/ils-call-logic-android.svg index 5a79b96f7..92170cfb3 100644 --- a/assets/images/video-sdk/ils-call-logic-android.svg +++ b/assets/images/video-sdk/ils-call-logic-android.svg @@ -1,490 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK engine:agoraEngine = RtcEngine.createEnable the video module:agoraEngine.enableVideo()Set the channel profile:agoraEngine.setChannelProfile(CHANNEL_PROFILE_LIVE_BROADCASTING)HostStart a live streaming eventRetrieve authentication token to join channelSet the role as host:agoraEngine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER)Setup local video:agoraEngine.setupLocalVideo(VideoCanvas)Start local preview:agoraEngine.startPreview()Join a channel as host:agoraEngine.joinChannelSend data streamAudienceJoin a live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine.setClientRole(CLIENT_ROLE_AUDIENCE)Join the channel:agoraEngine.joinChannelRetrieve streaming from the hosts:agoraEngine.setupRemoteVideoReceive data streamLeave the live 
streaming eventLeave the channel:agoraEngine.leaveChannel()Close appClean up local resourcesagoraEngine.destroy() \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK engine:agoraEngine = RtcEngine.createEnable the video module:agoraEngine.enableVideo()Set the channel profile:agoraEngine.setChannelProfile(CHANNEL_PROFILE_LIVE_BROADCASTING)HostStart a live streaming eventRetrieve authentication token to join channelSet the role as host:agoraEngine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER)Setup local video:agoraEngine.setupLocalVideo(VideoCanvas)Start local preview:agoraEngine.startPreview()Join a channel as host:agoraEngine.joinChannelSend data streamAudienceJoin a live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine.setClientRole(CLIENT_ROLE_AUDIENCE)Join the channel:agoraEngine.joinChannelRetrieve streaming from the hosts:agoraEngine.setupRemoteVideoReceive data streamLeave the live streaming eventLeave the channel:agoraEngine.leaveChannel()Close appClean up local resourcesagoraEngine.destroy() \ No newline at end of file diff --git a/assets/images/video-sdk/ils-call-logic-electron.svg b/assets/images/video-sdk/ils-call-logic-electron.svg index 109593b4b..7edcfac84 100644 --- a/assets/images/video-sdk/ils-call-logic-electron.svg +++ b/assets/images/video-sdk/ils-call-logic-electron.svg @@ -1,482 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppCreate an instance of the Video SDK engine:agoraEngine = agoraEngine.createAgoraRtcEngineInitialize the created instance:agoraEngine.initializeSetup the callback functions:agoraEngine.registerEventHandlerSet the channel profile:agoraEngine.setChannelProfileHostStart live streaming eventSetup local video:agoraEngine.setupLocalVideoEnable local video capturer:agoraEngine.enableVideoStart local preview:agoraEngine.startPreviewSet the user role as 
host:agoraEngine.setChannelProfile(ChannelProfileType.ChannelProfileLiveBroadcasting)Retrieve authentication token to join channelJoin a channel as host:agoraEngine.joinChannelSend data streamAudienceJoin live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine.setChannelProfile(ClientRoleType.ClientRoleAudience)Join the live streaming event:agoraEngine.joinChannelRetrieve streaming from the other user:agoraEngine.setupRemoteVideoLeave live streaming eventagoraEngine.leaveChannel \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppCreate an instance of the Video SDK engine:agoraEngine = agoraEngine.createAgoraRtcEngineInitialize the created instance:agoraEngine.initializeSetup the callback functions:agoraEngine.registerEventHandlerSet the channel profile:agoraEngine.setChannelProfileHostStart live streaming eventSetup local video:agoraEngine.setupLocalVideoEnable local video capturer:agoraEngine.enableVideoStart local preview:agoraEngine.startPreviewSet the user role as host:agoraEngine.setChannelProfile(ChannelProfileType.ChannelProfileLiveBroadcasting)Retrieve authentication token to join channelJoin a channel as host:agoraEngine.joinChannelSend data streamAudienceJoin live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine.setChannelProfile(ClientRoleType.ClientRoleAudience)Join the live streaming event:agoraEngine.joinChannelRetrieve streaming from the other user:agoraEngine.setupRemoteVideoLeave live streaming eventagoraEngine.leaveChannel \ No newline at end of file diff --git a/assets/images/video-sdk/ils-call-logic-flutter.svg b/assets/images/video-sdk/ils-call-logic-flutter.svg index c2b7f6661..926fd8899 100644 --- a/assets/images/video-sdk/ils-call-logic-flutter.svg +++ b/assets/images/video-sdk/ils-call-logic-flutter.svg @@ -1,486 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitialize the Agora 
Video SDK engine:agoraEngine = createAgoraRtcEngine()Enable the video module:agoraEngine.enableVideoRegister the event handler:agoraEngine.registerEventHandlerSetup AgoraVideoView widgetfor local or remote videoHostStart a live streaming eventSet the client role as Host:agoraEngine.setClientRole(role: ClientRoleType.clientRoleBroadcaster);Set a channel profile:.setChannelProfile(ChannelProfileType.channelProfileLiveBroadcasting);Retrieve authentication tokenJoin a channel using the token:agoraEngine.joinChannelSend data streamsRemote user joined:RtcEngineEventHandler onUserJoined:Start local peview:agoraEngine.startPreview()Display local video using AgoraVideoViewAudienceJoin a live streaming eventSet the client role as Audience:agoraEngine.setClientRole(role: ClientRoleType.clientRoleAudience);Set a channel profile:.setChannelProfile(ChannelProfileType.channelProfileLiveBroadcasting);Retrieve authentication tokenJoin a channel using the token:agoraEngine.joinChannelRemote user joined:RtcEngineEventHandler onUserJoined:Receive data streamsRender remote video using AgoraVideoViewLeave broadcastagoraEngine.leaveChannel()Close app \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitialize the Agora Video SDK engine:agoraEngine = createAgoraRtcEngine()Enable the video module:agoraEngine.enableVideoRegister the event handler:agoraEngine.registerEventHandlerSetup AgoraVideoView widgetfor local or remote videoHostStart a live streaming eventSet the client role as Host:agoraEngine.setClientRole(role: ClientRoleType.clientRoleBroadcaster);Set a channel profile:.setChannelProfile(ChannelProfileType.channelProfileLiveBroadcasting);Retrieve authentication tokenJoin a channel using the token:agoraEngine.joinChannelSend data streamsRemote user joined:RtcEngineEventHandler onUserJoined:Start local preview:agoraEngine.startPreview()Display local video using AgoraVideoViewAudienceJoin a live streaming eventSet the client role as
Audience:agoraEngine.setClientRole(role: ClientRoleType.clientRoleAudience);Set a channel profile:.setChannelProfile(ChannelProfileType.channelProfileLiveBroadcasting);Retrieve authentication tokenJoin a channel using the token:agoraEngine.joinChannelRemote user joined:RtcEngineEventHandler onUserJoined:Receive data streamsRender remote video using AgoraVideoViewLeave broadcastagoraEngine.leaveChannel()Close app \ No newline at end of file diff --git a/assets/images/video-sdk/ils-call-logic-ios.svg b/assets/images/video-sdk/ils-call-logic-ios.svg index b925b4213..1c32a1c60 100644 --- a/assets/images/video-sdk/ils-call-logic-ios.svg +++ b/assets/images/video-sdk/ils-call-logic-ios.svg @@ -1,484 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine:agoraEngine = AgoraRtcEngineKit.sharedEngineStart video in the engine:agoraEngine.enableVideo()HostStart a live streaming eventIn an live streaming event, only the hosts broadcast to the channel:agoraEngine.setClientRole(.broadcaster)Start local video:agoraEngine.setupLocalVideo(videoCanvas)Join the channel:agoraEngine?.joinChannelSend data streamAudienceJoin live streaming eventIn an live streaming event, the audience views the stream sent by channel hosts:agoraEngine.setClientRole(.audience)Join the channel:agoraEngine?.joinChannelRetrieve streaming from the hosts:agoraEngine.setupRemoteVideo(videoCanvas)Receive data streamsLeave live streaming eventStop local video:agoraEngine.stopPreview()Leave the channel:agoraEngine.leaveChannel(nil)Close appClean up local resources:AgoraRtcEngineKit.destroy() \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine:agoraEngine = AgoraRtcEngineKit.sharedEngineStart video in the engine:agoraEngine.enableVideo()HostStart a live streaming eventIn a live streaming event, only the hosts broadcast to the channel:agoraEngine.setClientRole(.broadcaster)Start local
video:agoraEngine.setupLocalVideo(videoCanvas)Join the channel:agoraEngine?.joinChannelSend data streamAudienceJoin live streaming eventIn a live streaming event, the audience views the stream sent by channel hosts:agoraEngine.setClientRole(.audience)Join the channel:agoraEngine?.joinChannelRetrieve streaming from the hosts:agoraEngine.setupRemoteVideo(videoCanvas)Receive data streamsLeave live streaming eventStop local video:agoraEngine.stopPreview()Leave the channel:agoraEngine.leaveChannel(nil)Close appClean up local resources:AgoraRtcEngineKit.destroy() \ No newline at end of file diff --git a/assets/images/video-sdk/ils-call-logic-template.svg b/assets/images/video-sdk/ils-call-logic-template.svg index da5be5cdf..947f772d7 100644 --- a/assets/images/video-sdk/ils-call-logic-template.svg +++ b/assets/images/video-sdk/ils-call-logic-template.svg @@ -1,484 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine.Start video in the engine.HostStart ILS eventIn an ILS event, only the hosts broadcast to the channel.Start local video.Join the channel.Send data stream.AudienceJoin ILS eventIn an ILS event, the audience views the broadcast made by channel hosts.Join the channel.Retrieve streaming from the other user.Receive data streamsLeave ILS eventStop local video.Leave the channel.Close appClean up local resources. \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine.Start video in the engine.HostStart ILS eventIn an ILS event, only the hosts broadcast to the channel.Start local video.Join the channel.Send data stream.AudienceJoin ILS eventIn an ILS event, the audience views the broadcast made by channel hosts.Join the channel.Retrieve streaming from the other user.Receive data streamsLeave ILS eventStop local video.Leave the channel.Close appClean up local resources.
\ No newline at end of file diff --git a/assets/images/video-sdk/ils-call-logic-unity.svg b/assets/images/video-sdk/ils-call-logic-unity.svg index 4d2bcc8a1..2f188fe80 100644 --- a/assets/images/video-sdk/ils-call-logic-unity.svg +++ b/assets/images/video-sdk/ils-call-logic-unity.svg @@ -1,492 +1 @@ -Your gameAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen gameInitiate the Agora Video SDK engine:agoraEngine=IRtcEngine.GetEngine()Setup the local video stream:agoraEngine.EnableVideo()agoraEngine.EnableVideoObserver()HostStart a live streaming eventRetrieve authentication token to join channelEnable live streaming in the channel:agoraEngine.SetChannelProfile(CHANNEL_PROFILE.CHANNEL_PROFILE_LIVE_BROADCASTING)Set the user role as host:agoraEngine.SetClientRole(CLIENT_ROLE_TYPE.CLIENT_ROLE_BROADCASTER)Join a channel as host:agoraEngine.JoinChannelByKey()Send data streamAudienceJoin the live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine.SetClientRole(CLIENT_ROLE_TYPE.CLIENT_ROLE_AUDIENCE)Join the channel:agoraEngine.JoinChannelByKey()A callback to start remote video:onUserJoined()Retrieve streaming from the hosts:RemoteView.SetForUser(uid)Receive data StreamLeave the live streaming eventLeave the channel:agoraEngine.leaveChannel()Stop local video stream:agoraEngine.DisableVideo()Disable the video observer:agoraEngine.DisableVideoObserver()Close gameClean up local resources:agoraEngine.destroy() \ No newline at end of file +PlantUML 1.2023.12[From ils-call-logic-unity.puml (line 22) ] @startuml ils-call-logic-unity scale max 1000 widthskinparam linetype orthohide stereotype...... ( skipping 375 lines )... 
actor "User" as USR box "Your game" participant "Video SDK" as APP end box box "Agora" participant "SD-RTN" as API end box USR -> APP: Open gameAPP -> APP: Create an RtcEngine instance: \n RtcEngine = Agora.Rtc.RtcEngine.CreateAgoraRtcEngine()APP -> API: Set channel profile: \n RtcEngine.SetChannelProfile(CHANNEL_PROFILE_TYPE.CHANNEL_PROFILE_LIVE_BROADCASTING)APP -> API: Set the context: \n RtcEngineContext context = new RtcEngineContext(_appID, 0, true,CHANNEL_PROFILE_TYPE.CHANNEL_PROFILE_LIVE_BROADCASTING,Syntax Error? \ No newline at end of file diff --git a/assets/images/video-sdk/ils-call-logic-unreal.svg b/assets/images/video-sdk/ils-call-logic-unreal.svg index 5df33060e..3f357f912 100644 --- a/assets/images/video-sdk/ils-call-logic-unreal.svg +++ b/assets/images/video-sdk/ils-call-logic-unreal.svg @@ -1 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK engine:agora::rtc::ue::createAgoraRtcEngine()Enable the audio and video modules:agoraEngine.enableVideo()agoraEngine.enableAudio()Set the channel profile:RtcEngineContext contextcontext.channelProfile = CHANNEL_PROFILE_COMMUNICATION;HostStart a live streaming eventRetrieve authentication token to join channelSet the role as host:agoraEngine->setClientRole(CLIENT_ROLE_BROADCASTER)Setup local video:agoraEngine.setupLocalVideo(videoCanvas)Join a channel as host:agoraEngine->joinChannelSend data streamAudienceJoin a live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine->setClientRole(CLIENT_ROLE_AUDIENCE)Join the channel:agoraEngine->joinChannelRetrieve streaming from the hosts:agoraEngine->setupRemoteVideoReceive data streamLeave the live streaming eventLeave the channel:agoraEngine->leaveChannel()Close appClean up local resourcesagoraEngine->release() \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK 
engine:agora::rtc::ue::createAgoraRtcEngine()Enable the audio and video modules:agoraEngine.enableVideo()agoraEngine.enableAudio()Set the channel profile:RtcEngineContext contextcontext.channelProfile = CHANNEL_PROFILE_COMMUNICATION;HostStart a live streaming eventRetrieve authentication token to join channelSet the role as host:agoraEngine->setClientRole(CLIENT_ROLE_BROADCASTER)Setup local video:agoraEngine.setupLocalVideo(videoCanvas)Join a channel as host:agoraEngine->joinChannelSend data streamAudienceJoin a live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine->setClientRole(CLIENT_ROLE_AUDIENCE)Join the channel:agoraEngine->joinChannelRetrieve streaming from the hosts:agoraEngine->setupRemoteVideoReceive data streamLeave the live streaming eventLeave the channel:agoraEngine->leaveChannel()Close appClean up local resourcesagoraEngine->release() \ No newline at end of file diff --git a/assets/images/video-sdk/ils-call-logic-web.svg b/assets/images/video-sdk/ils-call-logic-web.svg index 3bbb766c5..46e2f8dc1 100644 --- a/assets/images/video-sdk/ils-call-logic-web.svg +++ b/assets/images/video-sdk/ils-call-logic-web.svg @@ -1,484 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Video SDK engine:agoraEngine = AgoraRTC.createClientSet the required event listners:agoraEngine.on("user-published")agoraEngine.on("user-unpublished")HostStart live streaming eventRetrieve authentication token to join channelSet the user role as host:agoraEngine.setClientRole("host")Join a channel as host:agoraEngine.joinCreate local media tracks :AgoraRTC.createMicrophoneAudioTrackAgoraRTC.createCameraVideoTrackPush local media tracks to the channel:agoraEngine.publishStop the remote video and play the local video:rtc.localVideoTrack.playrtc.remoteVideoTrack.stopRetrieve streaming from the other user:agoraEngine.on("user-published")AudienceJoin live streaming eventRetrieve authentication token to join 
channelSet the user role as audience:agoraEngine.setClientRole("audience")Join the live streaming event:agoraEngine.joinRetrieve streaming from the other user:agoraEngine.on("user-published")agoraEngine.subscribeStop the local video and play the remote video:rtc.localVideoTrack.stoprtc.remoteVideoTrack.playReceive data streamLeave live streaming eventagoraEngine.leave \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Video SDK engine:agoraEngine = AgoraRTC.createClientStart video in the engine:App.initStart local media:stream = AgoraRTC.createStreamstream.initstream.playHostStart live streaming eventRetrieve authentication token to join channelSet the user role as host:agoraEngine.setClientRole("host")Join a channel as host:agoraEngine.joinPush local media to the channel:agoraEngine.publishAudienceJoin live streaming eventRetrieve authentication token to join channelSet the user role as audience:agoraEngine.setClientRole("audience")Join the live streaming event:agoraEngine.joinRetrieve streaming from the other user:agoraEngine.on("stream-added")agoraEngine.subscribeagoraEngine.on("stream-subscribed")Receive data streamLeave live streaming eventagoraEngine.leave \ No newline at end of file diff --git a/assets/images/video-sdk/integrated-token-generation.svg b/assets/images/video-sdk/integrated-token-generation.svg index ba59d35ec..674b682f6 100644 --- a/assets/images/video-sdk/integrated-token-generation.svg +++ b/assets/images/video-sdk/integrated-token-generation.svg @@ -1,448 +1 @@ -Implemented by youProvided byAgoraUserUserAppAppDeveloper'sAuthenticationSystemDeveloper'sAuthenticationSystemSD-RTNSD-RTNJoin a Channel with AuthenticationStart the appLogin to youridentity management system.Select a channelRequest an Agora authentication token usingchannel name, role, token type and user IdValidate user requestagainst internal securityUse integrated Agora libraryto generate a tokenReturn the token to the clientJoin 
a channel with user Id, channel name, and tokenValidatethe tokenTrigger the callback after adding user to the channel \ No newline at end of file +Implemented by youProvided byAgoraUserUserAppAppDeveloper'sAuthenticationSystemDeveloper'sAuthenticationSystemSD-RTNSD-RTNJoin a Channel with AuthenticationStart the appLogin to youridentity management system.Select a channelRequest an Agora authentication token usingchannel name, role, token type and user IdValidate user requestagainst internal securityUse integrated Agora libraryto generate a tokenReturn the token to the clientJoin a channel with user Id, channel name, and tokenValidatethe tokenTrigger the callback after adding user to the channel \ No newline at end of file diff --git a/assets/images/video-sdk/media-stream-encryption.svg b/assets/images/video-sdk/media-stream-encryption.svg index 6879ac748..36a71688f 100644 --- a/assets/images/video-sdk/media-stream-encryption.svg +++ b/assets/images/video-sdk/media-stream-encryption.svg @@ -1,456 +1 @@ -Implemented by youProvided by AgoraUserUserAuthentication systemAuthentication systemAppAppSD-RTNSD-RTNStart the appInitiate the Video SDK engineSetup media stream encryptionLogin to theauthentication systemRetrieve a 32-byte keyRetrieve a 32-byte salt inBase64 formatCreate a encryption configuration usingthe key and saltSet the encryption configurationSelect a channel to joinRetrieve an access token.Join a channelCommunicate over anencrypted media stream \ No newline at end of file +Implemented by youProvided by AgoraUserUserAuthentication systemAuthentication systemAppAppSD-RTNSD-RTNStart the appInitiate the Video SDK engineSetup media stream encryptionLogin to theauthentication systemRetrieve a 32-byte keyRetrieve a 32-byte salt inBase64 formatCreate an encryption configuration usingthe key and saltSet the encryption configurationSelect a channel to joinRetrieve an access token.Join a channelCommunicate over anencrypted media stream \ No newline at end of file diff
--git a/assets/images/video-sdk/play-drm-music.svg b/assets/images/video-sdk/play-drm-music.svg index e3e54f928..7f763970a 100644 --- a/assets/images/video-sdk/play-drm-music.svg +++ b/assets/images/video-sdk/play-drm-music.svg @@ -1,462 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNSet up the music content centerLoad the DRM extension framework.Create an instance of the music content center.Initialize the music content center.Find music from the content centerFind music.Call the method to search for music.Receive search results through the callback.Call the method to download music charts.Receive music charts data through the callback.Play DRM musicPress playPreload selected music.Create an instance of music player.Open and play music files. \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNSet up the music content centerLoad the DRM extension framework.Create an instance of the music content center.Initialize the music content center.Find music from the content centerFind music.Call the method to search for music.Receive search results through the callback.Call the method to download music charts.Receive music charts data through the callback.Play DRM musicPress playPreload selected music.Create an instance of music player.Open and play music files. 
\ No newline at end of file diff --git a/assets/images/video-sdk/product-workflow-web.svg b/assets/images/video-sdk/product-workflow-web.svg index d48c32f92..feecc4649 100644 --- a/assets/images/video-sdk/product-workflow-web.svg +++ b/assets/images/video-sdk/product-workflow-web.svg @@ -1,482 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppUse Video SDK to create an Agora Engine instanceCreate and play the local audio/video tracksBypass autoplay block whenonAutoplayFailed event occursJoinJoin a channelRetrieve authentication token to join a channelJoin the channelPublish and SubscribePublish camera and microphone tracks to the channelSubscribe to tracks from other usersManage local and remote audio/video tracksCommon workflowsBypass autoplay blockingStart screen sharingCreate a screen trackUnpublish the local video trackPublish the screen trackAdjust volumeCall API methods to adjust or mutethe local or remote audio trackMute/Unmute videoCall the API method to mute or unmutethe local video trackLeave the channelLeave the channel \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppUse Video SDK to create an Agora Engine instanceCreate and play the local audio/video tracksBypass autoplay block whenonAutoplayFailed event occursJoinJoin a channelRetrieve authentication token to join a channelJoin the channelPublish and SubscribePublish camera and microphone tracks to the channelSubscribe to tracks from other usersManage local and remote audio/video tracksCommon workflowsBypass autoplay blockingStart screen sharingCreate a screen trackUnpublish the local video trackPublish the screen trackAdjust volumeCall API methods to adjust or mutethe local or remote audio trackMute/Unmute videoCall the API method to mute or unmutethe local video trackLeave the channelLeave the channel \ No newline at end of file diff --git a/assets/images/video-sdk/product-workflow.svg b/assets/images/video-sdk/product-workflow.svg new file mode 
100644 index 000000000..865d963b3 --- /dev/null +++ b/assets/images/video-sdk/product-workflow.svg @@ -0,0 +1 @@ +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppUse Video SDK to create an Agora Engine instanceEnable audio and video in the engineJoinJoin a channelRetrieve authentication token to join a channelJoin the channelPublish and SubscribePublish camera and microphone streams to the channelSubscribe to streams from other usersManage local and remote streamsCommon workflowsStart screen sharingCapture and publish your screen to the channelAdjust volumeCall API methods to adjust or mute volumeLeave the channelLeave the channel \ No newline at end of file diff --git a/assets/images/video-sdk/spatial-audio-web.svg b/assets/images/video-sdk/spatial-audio-web.svg index c41e50796..a7a700ea4 100644 --- a/assets/images/video-sdk/spatial-audio-web.svg +++ b/assets/images/video-sdk/spatial-audio-web.svg @@ -1 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNSpatial AudioExtensionSpatial AudioExtensionAgoraRTC.createClientSetup the spatial audio extensionAgoraRTC.registerExtensionspatialAudioExtension.updateSelfPositionJoin ChannelAgoraRTCClient.joinRealize remote user's spatial soundclient.on("user-published")AgoraRTCClient.subscribespatialAudioExtension.createProcessorremoteTrack.pipe(processor).pipe(track.processorDestination)remoteTrack.Playprocessor.updateRemotePositionSpatial audio effect for media playerPlay media fileAgoraRTC.createBufferSourceAudioTrackspatialAudioExtension.createProcessortrack.pipe(processor).pipe(track.processorDestination)track.Playprocessor.updatePlayerPositionInfoCleanupclient.on("user-unpublished")processor.removeRemotePositionAgoraRTC.leaveprocessor.clearRemotePosition \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNSpatial AudioExtensionSpatial AudioExtensionAgoraRTC.createClientSetup the spatial audio extensionAgoraRTC.registerExtensionspatialAudioExtension.updateSelfPositionJoin 
ChannelAgoraRTCClient.joinRealize remote user's spatial soundclient.on("user-published")AgoraRTCClient.subscribespatialAudioExtension.createProcessorremoteTrack.pipe(processor).pipe(track.processorDestination)remoteTrack.Playprocessor.updateRemotePositionSpatial audio effect for media playerPlay media fileAgoraRTC.createBufferSourceAudioTrackspatialAudioExtension.createProcessortrack.pipe(processor).pipe(track.processorDestination)track.Playprocessor.updatePlayerPositionInfoCleanupclient.on("user-unpublished")processor.removeRemotePositionAgoraRTC.leaveprocessor.clearRemotePosition \ No newline at end of file diff --git a/assets/images/video-sdk/spatial-audio.svg b/assets/images/video-sdk/spatial-audio.svg index ed780fad6..cb247ab90 100644 --- a/assets/images/video-sdk/spatial-audio.svg +++ b/assets/images/video-sdk/spatial-audio.svg @@ -1,478 +1 @@ -Your client and serverAgoraUserUserAgora SDKAgora SDKYour ServerYour ServerSD-RTNSD-RTNLocal SpatialAudio EngineLocal SpatialAudio EngineEnable spatial audioInitialize Local Spatial Audio EngineCreate local spatial audio engineInitialize the engineSpatial audio effects for usersSend local spatial positionReceive spatial position of remote user(s)Call update self positionCall update remote positionSpatial audio effects for media playerSet spatial position of the userSet spatial position of the media playerSend and receive spatial audioClean upClear remote positionsDestroy the spatial engine \ No newline at end of file +Your client and serverAgoraUserUserAgora SDKAgora SDKYour ServerYour ServerSD-RTNSD-RTNLocal SpatialAudio EngineLocal SpatialAudio EngineEnable spatial audioInitialize Local Spatial Audio EngineCreate local spatial audio engineInitialize the engineSpatial audio effects for usersSend local spatial positionReceive spatial position of remote user(s)Call update self positionCall update remote positionSpatial audio effects for media playerSet spatial position of the userSet spatial position of the media 
playerSend and receive spatial audioClean upClear remote positionsDestroy the spatial engine \ No newline at end of file diff --git a/assets/images/video-sdk/video-call-logic-android.svg b/assets/images/video-sdk/video-call-logic-android.svg index 24a916ae2..9b41e5e4f 100644 --- a/assets/images/video-sdk/video-call-logic-android.svg +++ b/assets/images/video-sdk/video-call-logic-android.svg @@ -1 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK engine:agoraEngine = RtcEngine.createEnable the video module:agoraEngine.enableVideo()UserJoin a callSetup local video:agoraEngine.setupLocalVideo(VideoCanvas)Start local preview:agoraEngine.startPreview()Retrieve authentication token to join channelJoin the channel:agoraEngine.joinChannel()Remote user joined:onUserJoined()Retrieve streaming from the remote user:agoraEngine.setupRemoteVideo(VideoCanvas)Receive and send data streamsLeave the callLeave the channel:agoraEngine.leaveChannel()Close appClean up local resources:agoraEngine.destroy() \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK engine:agoraEngine = RtcEngine.createEnable the video module:agoraEngine.enableVideo()UserJoin a callSetup local video:agoraEngine.setupLocalVideo(VideoCanvas)Start local preview:agoraEngine.startPreview()Retrieve authentication token to join channelJoin the channel:agoraEngine.joinChannel()Remote user joined:onUserJoined()Retrieve streaming from the remote user:agoraEngine.setupRemoteVideo(VideoCanvas)Receive and send data streamsLeave the callLeave the channel:agoraEngine.leaveChannel()Close appClean up local resources:agoraEngine.destroy() \ No newline at end of file diff --git a/assets/images/video-sdk/video-call-logic-electron.svg b/assets/images/video-sdk/video-call-logic-electron.svg index 6f190f53d..81faa4b28 100644 --- a/assets/images/video-sdk/video-call-logic-electron.svg +++ 
b/assets/images/video-sdk/video-call-logic-electron.svg @@ -1 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppCreate an instance of the Video SDK engine:agoraEngine = agoraEngine.createAgoraRtcEngineInitialize the created instance:agoraEngine.initializeSet the role as host:agoraEngine.setClientRole(ClientRoleType.ClientRoleBroadcaster)Setup the callback functions:agoraEngine.registerEventHandlerUserStart callRetrieve authentication token to join channelSetup local video: agoraEngine.setupLocalVideoEnable the local video capturer:agoraEngine.enableVideoStart local preview :agoraEngine.startPreviewJoin a channel:agoraEngine.joinChannelJoin acceptedRetrieve streaming from the other user:agoraEngine.setupRemoteVideoReceive and send data streamsLeave callStop the local preview:stopPreviewleave the channel:agoraEngine.leaveChannel \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppCreate an instance of the Video SDK engine:agoraEngine = agoraEngine.createAgoraRtcEngineInitialize the created instance:agoraEngine.initializeSet the role as host:agoraEngine.setClientRole(ClientRoleType.ClientRoleBroadcaster)Setup the callback functions:agoraEngine.registerEventHandlerUserStart callRetrieve authentication token to join channelSetup local video: agoraEngine.setupLocalVideoEnable the local video capturer:agoraEngine.enableVideoStart local preview :agoraEngine.startPreviewJoin a channel:agoraEngine.joinChannelJoin acceptedRetrieve streaming from the other user:agoraEngine.setupRemoteVideoReceive and send data streamsLeave callStop the local preview:stopPreviewleave the channel:agoraEngine.leaveChannel \ No newline at end of file diff --git a/assets/images/video-sdk/video-call-logic-flutter.svg b/assets/images/video-sdk/video-call-logic-flutter.svg index b8f1f139b..63a348e8f 100644 --- a/assets/images/video-sdk/video-call-logic-flutter.svg +++ b/assets/images/video-sdk/video-call-logic-flutter.svg @@ -1 +1 @@ -Your 
appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitialize the Agora Video SDK engine:agoraEngine = createAgoraRtcEngine()Enable the video module:agoraEngine.enableVideo()Register the event handler:agoraEngine.registerEventHandlerSetup AgoraVideoView widgetsfor local and remote videosVideo CallJoin a callSet a client role:agoraEngine.setClientRoleSet a channel profile:agoraEngine.setChannelProfileRetrieve authentication tokenJoin a channel using the token:agoraEngine.joinChannelStart local peview:agoraEngine.startPreview()Remote user joined:RtcEngineEventHandler onUserJoined:Send and receive data streamsDisplay remote video using AgoraVideoViewLeave callLeave the channelagoraEngine.leaveChannel()Close app \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitialize the Agora Video SDK engine:agoraEngine = createAgoraRtcEngine()Enable the video module:agoraEngine.enableVideo()Register the event handler:agoraEngine.registerEventHandlerSetup AgoraVideoView widgetsfor local and remote videosVideo CallJoin a callSet a client role:agoraEngine.setClientRoleSet a channel profile:agoraEngine.setChannelProfileRetrieve authentication tokenJoin a channel using the token:agoraEngine.joinChannelStart local preview:agoraEngine.startPreview()Remote user joined:RtcEngineEventHandler onUserJoined:Send and receive data streamsDisplay remote video using AgoraVideoViewLeave callLeave the channelagoraEngine.leaveChannel()Close app \ No newline at end of file diff --git a/assets/images/video-sdk/video-call-logic-ios.svg b/assets/images/video-sdk/video-call-logic-ios.svg index e47741b0a..939684c80 100644 --- a/assets/images/video-sdk/video-call-logic-ios.svg +++ b/assets/images/video-sdk/video-call-logic-ios.svg @@ -1 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine:agoraEngine = AgoraRtcEngineKit.sharedEngineSetup the local video stream:agoraEngine.enableVideo()HostStart a callIn a call, all users
send to the channel:agoraEngine.setClientRole(.broadcaster)Start local video:agoraEngine.setupLocalVideo(videoCanvas)Join the channel:agoraEngine?.joinChannelRetrieve streaming from the other user:agoraEngine.setupRemoteVideo(videoCanvas)Receive and send data streamsLeave the callStop local video:agoraEngine.stopPreview()Leave the channel:agoraEngine.leaveChannel(nil)Close appClean up local resources:AgoraRtcEngineKit.destroy() \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine:agoraEngine = AgoraRtcEngineKit.sharedEngineSetup the local video stream:agoraEngine.enableVideo()HostStart a callIn a call, all users send to the channel:agoraEngine.setClientRole(.broadcaster)Start local video:agoraEngine.setupLocalVideo(videoCanvas)Join the channel:agoraEngine?.joinChannelRetrieve streaming from the other user:agoraEngine.setupRemoteVideo(videoCanvas)Receive and send data streamsLeave the callStop local video:agoraEngine.stopPreview()Leave the channel:agoraEngine.leaveChannel(nil)Close appClean up local resources:AgoraRtcEngineKit.destroy() \ No newline at end of file diff --git a/assets/images/video-sdk/video-call-logic-reactjs.puml b/assets/images/video-sdk/video-call-logic-reactjs.puml new file mode 100644 index 000000000..8757e6e33 --- /dev/null +++ b/assets/images/video-sdk/video-call-logic-reactjs.puml @@ -0,0 +1,35 @@ +@startuml video-call-logic-web +!include agora_skin.iuml + +actor "User" as USR + +box "Your app" + +participant "Video SDK" as APP + +end box + +box "Agora" + +participant "SD-RTN™" as API + +end box + +USR -> APP: Open App +APP -> APP: Setup app to handle local hardware and streaming. 
+group User +USR -> APP: Start call +APP -> APP: Create the agoraEngine\nconst agoraEngine = useRTCClient(AgoraRTC.createClient +APP -> APP: Retrieve authentication token to join channel +APP -> API: Join a channel:\n useJoin +API -> APP : Join accepted +APP -> APP: Create local media tracks :\nconst { isLoading: isLoadingCam, localCameraTrack } = useLocalCameraTrack();\nconst { isLoading: isLoadingMic, localMicrophoneTrack } = useLocalMicroph +APP -> API: Push local media tracks to the channel:\n usePublish([localMicrophoneTrack, localCameraTrack]); +API -> APP: Retrieve streaming from the other user: \n +API <-> APP: Receive and send data streams +end +USR -> APP: Leave call +APP -> API: leave the channel:\n \n useJoin + + +@enduml diff --git a/assets/images/video-sdk/video-call-logic-reactjs.svg b/assets/images/video-sdk/video-call-logic-reactjs.svg new file mode 100644 index 000000000..89d111cde --- /dev/null +++ b/assets/images/video-sdk/video-call-logic-reactjs.svg @@ -0,0 +1 @@ +Your appAgoraUserUserVideo SDKVideo SDKSD-RTN™SD-RTN™Open AppSetup app to handle local hardware and streaming.UserStart callCreate the agoraEngineconst agoraEngine = useRTCClient(AgoraRTC.createClientRetrieve authentication token to join channelJoin a channel:useJoinJoin acceptedCreate local media tracks :const { isLoading: isLoadingCam, localCameraTrack } = useLocalCameraTrack();const { isLoading: isLoadingMic, localMicrophoneTrack } = useLocalMicrophPush local media tracks to the channel:usePublish([localMicrophoneTrack, localCameraTrack]);Retrieve streaming from the other user:<RemoteUser user={remoteUser} playVideo={true} playAudio={true} />Receive and send data streamsLeave callleave the channel: useJoin \ No newline at end of file diff --git a/assets/images/video-sdk/video-call-logic-template.svg b/assets/images/video-sdk/video-call-logic-template.svg index 571701706..163ea25d6 100644 --- a/assets/images/video-sdk/video-call-logic-template.svg +++ 
b/assets/images/video-sdk/video-call-logic-template.svg @@ -1,468 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine.Start video in the engine.HostStart callIn a call, all users broadcast to the channel.Start local video.Join the channel.Retrieve streaming from the other user.Receive and send data streamsLeave callStop local video.Leave the channel.Close appClean up local resources. \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen appInitiate the Video SDK engine.Start video in the engine.HostStart callIn a call, all users broadcast to the channel.Start local video.Join the channel.Retrieve streaming from the other user.Receive and send data streamsLeave callStop local video.Leave the channel.Close appClean up local resources. \ No newline at end of file diff --git a/assets/images/video-sdk/video-call-logic-unity.svg b/assets/images/video-sdk/video-call-logic-unity.svg index cf9c1aa56..f3abb23fa 100644 --- a/assets/images/video-sdk/video-call-logic-unity.svg +++ b/assets/images/video-sdk/video-call-logic-unity.svg @@ -1 +1 @@ -Your gameAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen gameCreate an RtcEngine instance:RtcEngine = Agora.Rtc.RtcEngine.CreateAgoraRtcEngine()Set the context:RtcEngineContext context = new RtcEngineContext(_appID, 0, true,CHANNEL_PROFILE_TYPE.CHANNEL_PROFILE_LIVE_BROADCASTING,AUDIO_SCENARIO_TYPE.AUDIO_SCENARIO_DEFAULT)Initialize RtcEngine:RtcEngine.Initialize(context)Video CallStart callEnable the video module:RtcEngine.EnableVideo()Set the user role as broadcaster:RtcEngine.SetClientRole(CLIENT_ROLE_TYPE.CLIENT_ROLE_BROADCASTER)Join call:RtcEngine.JoinChannel()Receive and send data streamLeave callLeave the channel:RtcEngine.LeaveChannel()Disable the video modules:RtcEngine.DisableVideo()Close gameClean up local resources:RtcEngine.Dispose() \ No newline at end of file +Your gameAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen gameCreate an RtcEngine 
instance:RtcEngine = Agora.Rtc.RtcEngine.CreateAgoraRtcEngine()Set the context:RtcEngineContext context = new RtcEngineContext(_appID, 0, true,CHANNEL_PROFILE_TYPE.CHANNEL_PROFILE_LIVE_BROADCASTING,AUDIO_SCENARIO_TYPE.AUDIO_SCENARIO_DEFAULT)Initialize RtcEngine:RtcEngine.Initialize(context)Video CallStart callEnable the video module:RtcEngine.EnableVideo()Set the user role as broadcaster:RtcEngine.SetClientRole(CLIENT_ROLE_TYPE.CLIENT_ROLE_BROADCASTER)Join call:RtcEngine.JoinChannel()Receive and send data streamLeave callLeave the channel:RtcEngine.LeaveChannel()Disable the video modules:RtcEngine.DisableVideo()Close gameClean up local resources:RtcEngine.Dispose() \ No newline at end of file diff --git a/assets/images/video-sdk/video-call-logic-unreal.svg b/assets/images/video-sdk/video-call-logic-unreal.svg index 8315cc592..6b670fd49 100644 --- a/assets/images/video-sdk/video-call-logic-unreal.svg +++ b/assets/images/video-sdk/video-call-logic-unreal.svg @@ -1 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK engine:agoraEngine = agora::rtc::ue::createAgoraRtcEngine()Enable the audio and video modules:agoraEngine->enableVideo()agoraEngine->enableAudio();UserJoin a callSetup local video:agoraEngine->setupLocalVideo(videoCanvas)Retrieve authentication token to join channelJoin the channel:agoraEngine->joinChannel()Remote user joined:onUserJoined()Retrieve streaming from the remote user:agoraEngine->setupRemoteVideo(videoCanvas)Receive and send data streamsLeave the callLeave the channel:agoraEngine->leaveChannel()Close appClean up local resources:agoraEngine->release() \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Agora Video SDK engine:agoraEngine = agora::rtc::ue::createAgoraRtcEngine()Enable the audio and video modules:agoraEngine->enableVideo()agoraEngine->enableAudio();UserJoin a callSetup local video:agoraEngine->setupLocalVideo(videoCanvas)Retrieve 
authentication token to join channelJoin the channel:agoraEngine->joinChannel()Remote user joined:onUserJoined()Retrieve streaming from the remote user:agoraEngine->setupRemoteVideo(videoCanvas)Receive and send data streamsLeave the callLeave the channel:agoraEngine->leaveChannel()Close appClean up local resources:agoraEngine->release() \ No newline at end of file diff --git a/assets/images/video-sdk/video-call-logic-web.svg b/assets/images/video-sdk/video-call-logic-web.svg index b27d5d3c5..002a563f8 100644 --- a/assets/images/video-sdk/video-call-logic-web.svg +++ b/assets/images/video-sdk/video-call-logic-web.svg @@ -1 +1 @@ -Your appAgoraUserUserVideo SDKVideo SDKSD-RTNSD-RTNOpen AppInitiate the Video SDK engine:agoraEngine = AgoraRTC.createClientSet the required event listners:agoraEngine.on("user-published")agoraEngine.on("user-unpublished")UserStart callRetrieve authentication token to join channelJoin a channel:agoraEngine.joinJoin acceptedCreate local media tracks :AgoraRTC.createMicrophoneAudioTrackAgoraRTC.createCameraVideoTrackPush local media tracks to the channel:agoraEngine.publishRetrieve streaming from the other user:agoraEngine.on("user-published")Play remote media tracks: remoteVideoTrack.playremoteAudioTrack.playReceive and send data streamsLeave callleave the channel:agoraEngine.leave \ No newline at end of file +Your appAgoraUserUserVideo SDKVideo SDKSD-RTN™SD-RTN™Open AppSetup app to handle local hardware and streaming.UserStart callCreate the agoraEngineconst agoraEngine = useRTCClient(AgoraRTC.createClientRetrieve authentication token to join channelJoin a channel:useJoinJoin acceptedCreate local media tracks :const { isLoading: isLoadingCam, localCameraTrack } = useLocalCameraTrack();const { isLoading: isLoadingMic, localMicrophoneTrack } = useLocalMicrophPush local media tracks to the channel:usePublish([localMicrophoneTrack, localCameraTrack]);Retrieve streaming from the other user:<RemoteUser user={remoteUser} playVideo={true}
playAudio={true} />Receive and send data streamsLeave callleave the channel: useJoin \ No newline at end of file diff --git a/assets/images/video-sdk/video_call_workflow.svg b/assets/images/video-sdk/video_call_workflow.svg new file mode 100644 index 000000000..83aa472a0 --- /dev/null +++ b/assets/images/video-sdk/video_call_workflow.svg @@ -0,0 +1 @@ +Call implementerAgoraClientClientToken ServerToken ServerClientClientAgora PlatformAgora PlatformSetup clientUser login to implementor securityAuthenticate and retrieve tokenUser login to implementor securityAuthenticate and retrieve tokenCreate client instanceSet client role: hostCreate client instanceSet client role: audienceRun callConnect to local audio and video resourcesJoin channelSend audio and video to channelJoin channelSend audio and video to channelCommunicateEnd callClose audio and videoLeave callClose audio and videoLeave call \ No newline at end of file diff --git a/assets/images/video-sdk/video_call_workflow_run_end.svg b/assets/images/video-sdk/video_call_workflow_run_end.svg new file mode 100644 index 000000000..ae2b2a5a8 --- /dev/null +++ b/assets/images/video-sdk/video_call_workflow_run_end.svg @@ -0,0 +1 @@ +Call implementerAgoraClientClientClientClientAgora PlatformAgora PlatformRun callConnect to local audio and video resourcesJoin channelSend audio and video to channelConnect to local audio and video resourcesJoin channelSend audio and video to channelCommunicateEnd callClose audio and videoLeave channelClose audio and videoLeave channel \ No newline at end of file diff --git a/assets/images/voice-sdk/authentication-logic.svg b/assets/images/voice-sdk/authentication-logic.svg index 7e964a030..71c783045 100644 --- a/assets/images/voice-sdk/authentication-logic.svg +++ b/assets/images/voice-sdk/authentication-logic.svg @@ -1,450 +1 @@ -Implemented by youProvided by AgoraToken serverToken serverAppAppSD-RTNSD-RTNJoin a channel with authenticationRequest a token using channel name,role, token 
type, and user IDValidate user againstinternal securityGenerate a token and return it to the clientJoin a channel with uid,channel name, and tokenValidatethe tokenTrigger the callback afteradding user to the channelRenew tokenTrigger event: Token Privilege will ExpireRequest a fresh token using channel name,role, token type, and user IDValidate user request against internal logicGenerate a fresh token and return it to the clientSend the fresh tokenwith a RenewToken request \ No newline at end of file +Implemented by youProvided by AgoraToken serverToken serverAppAppSD-RTNSD-RTNJoin a channel with authenticationRequest a token using channel name,role, token type, and user IDValidate user againstinternal securityGenerate a token and return it to the clientJoin a channel with uid,channel name, and tokenValidatethe tokenTrigger the callback afteradding user to the channelRenew tokenTrigger event: Token Privilege will ExpireRequest a fresh token using channel name,role, token type, and user IDValidate user request against internal logicGenerate a fresh token and return it to the clientSend the fresh tokenwith a RenewToken request \ No newline at end of file diff --git a/assets/images/voice-sdk/ensure-voice-quality.svg b/assets/images/voice-sdk/ensure-voice-quality.svg index 61f224ae5..ccd3465a3 100644 --- a/assets/images/voice-sdk/ensure-voice-quality.svg +++ b/assets/images/voice-sdk/ensure-voice-quality.svg @@ -1,494 +1 @@ -Your appAgoraUserUserVoice SDKVoice SDKSD-RTNSD-RTNOpen AppSet log file configuration and create engineSet log file parametersUse Voice SK to create an instance of Agora EnginePre-call testsStart the echo testCall the method to start the echo testSend and receive backaudio after a delayto test hardware and network qualityStart the network probe testCall the method to start the network probe testDeliver network quality scoreand network statisticsSet the audio profileSpecify audio profile and scenariobased on the nature of the appCall the method to 
setthe audio profile and scenarioJoin channelJoin channelMonitor in-call qualityEnable the quality statisticsRecieve network, call, and audio quality statisticsRecieve state change notificationsNotify the userTake corrective actionEcho cancellationChoose the audio file to be playedand specify mixing optionsCall the method tostart audio mixingUses echo-cancellationfeatures to remove echo \ No newline at end of file +Your appAgoraUserUserVoice SDKVoice SDKSD-RTNSD-RTNOpen AppSet log file configuration and create engineSet log file parametersUse Voice SDK to create an instance of Agora EnginePre-call testsStart the echo testCall the method to start the echo testSend and receive backaudio after a delayto test hardware and network qualityStart the network probe testCall the method to start the network probe testDeliver network quality scoreand network statisticsSet the audio profileSpecify audio profile and scenariobased on the nature of the appCall the method to setthe audio profile and scenarioJoin channelJoin channelMonitor in-call qualityEnable the quality statisticsReceive network, call, and audio quality statisticsReceive state change notificationsNotify the userTake corrective actionEcho cancellationChoose the audio file to be playedand specify mixing optionsCall the method tostart audio mixingUses echo-cancellationfeatures to remove echo \ No newline at end of file diff --git a/assets/images/voice-sdk/geofencing.svg b/assets/images/voice-sdk/geofencing.svg index bf480faab..1ded373b7 100644 --- a/assets/images/voice-sdk/geofencing.svg +++ b/assets/images/voice-sdk/geofencing.svg @@ -1,444 +1 @@ -Implemented by youProvided by AgoraUserUserAppAppSD-RTNSD-RTNStart the appGeofencingSet SD-RTN region in the Agoraengine configurationInitiate the Agora engineConnect to SD-RTN in aspecific regionSuccess responseSelect a channel to joinJoin a channel with userId, channel name, and tokenJoin accepted \ No newline at end of file +Implemented by youProvided by
AgoraUserUserAppAppSD-RTNSD-RTNStart the appGeofencingSet SD-RTN region in the Agoraengine configurationInitiate the Agora engineConnect to SD-RTN in aspecific regionSuccess responseSelect a channel to joinJoin a channel with userId, channel name, and tokenJoin accepted \ No newline at end of file diff --git a/assets/images/voice-sdk/integrated-token-generation.svg b/assets/images/voice-sdk/integrated-token-generation.svg index ba59d35ec..674b682f6 100644 --- a/assets/images/voice-sdk/integrated-token-generation.svg +++ b/assets/images/voice-sdk/integrated-token-generation.svg @@ -1,448 +1 @@ -Implemented by youProvided byAgoraUserUserAppAppDeveloper'sAuthenticationSystemDeveloper'sAuthenticationSystemSD-RTNSD-RTNJoin a Channel with AuthenticationStart the appLogin to youridentity management system.Select a channelRequest an Agora authentication token usingchannel name, role, token type and user IdValidate user requestagainst internal securityUse integrated Agora libraryto generate a tokenReturn the token to the clientJoin a channel with user Id, channel name, and tokenValidatethe tokenTrigger the callback after adding user to the channel \ No newline at end of file +Implemented by youProvided byAgoraUserUserAppAppDeveloper'sAuthenticationSystemDeveloper'sAuthenticationSystemSD-RTNSD-RTNJoin a Channel with AuthenticationStart the appLogin to youridentity management system.Select a channelRequest an Agora authentication token usingchannel name, role, token type and user IdValidate user requestagainst internal securityUse integrated Agora libraryto generate a tokenReturn the token to the clientJoin a channel with user Id, channel name, and tokenValidatethe tokenTrigger the callback after adding user to the channel \ No newline at end of file diff --git a/assets/images/voice-sdk/process-raw-audio.svg b/assets/images/voice-sdk/process-raw-audio.svg new file mode 100644 index 000000000..c86fe78cf --- /dev/null +++ b/assets/images/voice-sdk/process-raw-audio.svg @@ -0,0 
+1 @@ +Your appAgoraUserUserVoice SDKVoice SDKSD-RTNSD-RTNOpen AppCreate an instance of Agora Engine using Voice SDKSetup raw data processingSetup the audio frame observerJoinJoin a channelRegister the audio frame observerSet audio frame parametersRetrieve authentication token to join a channelJoin the channelProcess raw audio dataGet the raw data in the callbacksSend the processed data back with the callbacksLeaveLeave the channelUnregister the audio frame observerLeave the channel \ No newline at end of file diff --git a/assets/images/voice-sdk/product-workflow-voice-web.svg b/assets/images/voice-sdk/product-workflow-voice-web.svg index 96435defa..de3db9e7d 100644 --- a/assets/images/voice-sdk/product-workflow-voice-web.svg +++ b/assets/images/voice-sdk/product-workflow-voice-web.svg @@ -1,476 +1 @@ -Your appAgoraUserUserVoice SDKVoice SDKSD-RTNSD-RTNOpen AppUse Voice SDK to create an Agora Engine instanceCreate and play the local audio trackBypass autoplay block whenonAutoplayFailed event occursJoinJoin a channelRetrieve authentication token to join a channelJoin the channelPublish and SubscribePublish the microphone track to the channelSubscribe to tracks from other usersManage local and remote audio tracksCommon workflowsBypass autoplay blockingUnpublish the local audio trackAdjust volumeCall API methods to adjust or mutethe local or remote audio trackMute/Unmute audioCall the API method to mute or unmutethe local audio trackLeave the channelLeave the channel \ No newline at end of file +Your appAgoraUserUserVoice SDKVoice SDKSD-RTNSD-RTNOpen AppUse Voice SDK to create an Agora Engine instanceCreate and play the local audio trackBypass autoplay block whenonAutoplayFailed event occursJoinJoin a channelRetrieve authentication token to join a channelJoin the channelPublish and SubscribePublish the microphone track to the channelSubscribe to tracks from other usersManage local and remote audio tracksCommon workflowsBypass autoplay blockingUnpublish the local audio
trackAdjust volumeCall API methods to adjust or mutethe local or remote audio trackMute/Unmute audioCall the API method to mute or unmutethe local audio trackLeave the channelLeave the channel \ No newline at end of file diff --git a/assets/images/voice-sdk/product-workflow-voice.svg b/assets/images/voice-sdk/product-workflow-voice.svg new file mode 100644 index 000000000..992da5a98 --- /dev/null +++ b/assets/images/voice-sdk/product-workflow-voice.svg @@ -0,0 +1 @@ +Your appAgoraUserUserVoice SDKVoice SDKSD-RTNSD-RTNOpen AppUse Voice SDK to create an Agora Engine instanceJoinJoin a channelRetrieve authentication token to join a channelJoin the channelPublish and SubscribePublish microphone stream to the channelSubscribe to streams from other usersManage local and remote streamsCommon workflowsAdjust volumeCall API methods to adjust or mute volumeLeave the channelLeave the channel \ No newline at end of file diff --git a/assets/images/voice-sdk/voice-call-logic-electron.svg b/assets/images/voice-sdk/voice-call-logic-electron.svg index bc6d279cd..437257c0f 100644 --- a/assets/images/voice-sdk/voice-call-logic-electron.svg +++ b/assets/images/voice-sdk/voice-call-logic-electron.svg @@ -1,454 +1 @@ -Your appAgoraUserUserVoice SDKVoice SDKSD-RTNSD-RTNOpen AppCreate an instance of the Voice SDK engine:agoraEngine = agoraEngine.createAgoraRtcEngineInitialize the created instance:agoraEngine.initializeUserStart callRetrieve authentication token to join channelJoin a channel:agoraEngine.joinChannelJoin acceptedReceive and send audio streamLeave callleave the channel:agoraEngine.leaveChannel \ No newline at end of file +Your appAgoraUserUserVoice SDKVoice SDKSD-RTNSD-RTNOpen AppCreate an instance of the Voice SDK engine:agoraEngine = agoraEngine.createAgoraRtcEngineInitialize the created instance:agoraEngine.initializeUserStart callRetrieve authentication token to join channelJoin a channel:agoraEngine.joinChannelJoin acceptedReceive and send audio streamLeave callleave the 
channel:agoraEngine.leaveChannel \ No newline at end of file diff --git a/assets/images/voice-sdk/voice-call-logic-flutter.svg b/assets/images/voice-sdk/voice-call-logic-flutter.svg index 72bfacb02..e8ba605b8 100644 --- a/assets/images/voice-sdk/voice-call-logic-flutter.svg +++ b/assets/images/voice-sdk/voice-call-logic-flutter.svg @@ -1,454 +1 @@ -Your appAgoraUserUserVoice SDKVoice SDKSD-RTNSD-RTNOpen appInitialize the Agora Voice SDK engine:agoraEngine = createAgoraRtcEngine()Register the event handler:agoraEngine.registerEventHandlerVoice CallJoin a callSet a client role:agoraEngine.setClientRoleSet a channel profile:agoraEngine.setChannelProfileRetrieve authentication tokenJoin a channel using the token:agoraEngine.joinChannelRemote user joined:RtcEngineEventHandler onUserJoined:Send and receive data streamsLeave callLeave the channelagoraEngine.leaveChannel()Close app \ No newline at end of file +Your appAgoraUserUserVoice SDKVoice SDKSD-RTNSD-RTNOpen appInitialize the Agora Voice SDK engine:agoraEngine = createAgoraRtcEngine()Register the event handler:agoraEngine.registerEventHandlerVoice CallJoin a callSet a client role:agoraEngine.setClientRoleSet a channel profile:agoraEngine.setChannelProfileRetrieve authentication tokenJoin a channel using the token:agoraEngine.joinChannelRemote user joined:RtcEngineEventHandler onUserJoined:Send and receive data streamsLeave callLeave the channelagoraEngine.leaveChannel()Close app \ No newline at end of file diff --git a/assets/images/voice-sdk/voice-call-logic-unity.svg b/assets/images/voice-sdk/voice-call-logic-unity.svg index 4a36623c8..a2fa676d0 100644 --- a/assets/images/voice-sdk/voice-call-logic-unity.svg +++ b/assets/images/voice-sdk/voice-call-logic-unity.svg @@ -1,470 +1 @@ -Your appAgoraUserUserVoice SDKVoice SDKSD-RTNSD-RTNOpen AppInitiate the Agora Voice SDK engine:agoraEngine = RtcEngine.createUserJoin a callRetrieve authentication token to join channelJoin the channel:agoraEngine.joinChannel()Remote 
user joined:onUserJoined()Receive and send data streamsLeave the callLeave the channel:agoraEngine.leaveChannel()Close appClean up local resources:agoraEngine.destroy() \ No newline at end of file +PlantUML 1.2023.12[From voice-call-logic-unity.puml (line 21) ] @startuml scale max 1000 widthskinparam linetype orthohide stereotype...... ( skipping 374 lines )...  actor "User" as USR box "Your app" participant "Voice SDK" as APP end box box "Agora" participant "SD-RTN" as API end box USR -> APP: Open appAPP -> APP: Create an Agora Voice SDK engine instance: \n RtcEngine = Agora.Rtc.RtcEngine.CreateAgoraRtcEngine()APP -> API: Set the context: \n RtcEngineContext context = new RtcEngineContext(_appID, 0, true,CHANNEL_PROFILE_TYPE.CHANNEL_PROFILE_LIVE_BROADCASTING,Syntax Error? \ No newline at end of file diff --git a/broadcast-streaming/get-started/get-started-sdk.mdx b/broadcast-streaming/get-started/get-started-sdk.mdx index 1ca6d22cf..930fd966a 100644 --- a/broadcast-streaming/get-started/get-started-sdk.mdx +++ b/broadcast-streaming/get-started/get-started-sdk.mdx @@ -6,7 +6,7 @@ description: > Rapidly develop and easily enhance your social, work, education and IoT apps with face-to-face interaction. 
--- -import GetStartedSDK from '@docs/shared/video-sdk/_get-started-sdk.mdx'; +import GetStartedSDK from '@docs/shared/video-sdk/get-started/get-started-sdk/index.mdx'; export const toc = [{}]; diff --git a/broadcast-streaming/reference/pricing.mdx b/broadcast-streaming/overview/pricing.mdx similarity index 94% rename from broadcast-streaming/reference/pricing.mdx rename to broadcast-streaming/overview/pricing.mdx index 02231738e..a7fe219e5 100644 --- a/broadcast-streaming/reference/pricing.mdx +++ b/broadcast-streaming/overview/pricing.mdx @@ -1,6 +1,6 @@ --- title: Pricing -sidebar_position: 1 +sidebar_position: 3 description: > Provides you with information on billing, fee deductions, free-of-charge policy, and any suspension to your account based on the account type. --- diff --git a/broadcast-streaming/reference/release-notes.mdx b/broadcast-streaming/overview/release-notes.mdx similarity index 94% rename from broadcast-streaming/reference/release-notes.mdx rename to broadcast-streaming/overview/release-notes.mdx index c346b7709..b38c29e4a 100644 --- a/broadcast-streaming/reference/release-notes.mdx +++ b/broadcast-streaming/overview/release-notes.mdx @@ -1,6 +1,6 @@ --- title: 'Release notes' -sidebar_position: 2 +sidebar_position: 4 type: docs description: > Information about changes in each release of Video Calling. diff --git a/broadcast-streaming/reference/supported-platforms.mdx b/broadcast-streaming/overview/supported-platforms.mdx similarity index 100% rename from broadcast-streaming/reference/supported-platforms.mdx rename to broadcast-streaming/overview/supported-platforms.mdx diff --git a/broadcast-streaming/reference/error-codes.mdx b/broadcast-streaming/reference/error-codes.mdx new file mode 100644 index 000000000..5a43fd283 --- /dev/null +++ b/broadcast-streaming/reference/error-codes.mdx @@ -0,0 +1,13 @@ +--- +title: 'Error codes' +sidebar_position: 5 +type: docs +description: > + List of commonly encountered API errors and their causes. 
+--- + +import ErrorCodes from '@docs/shared/video-sdk/reference/_error-codes.mdx'; + +export const toc = [{}]; + + \ No newline at end of file diff --git a/broadcast-streaming/reference/known-issues.mdx b/broadcast-streaming/reference/known-issues.mdx deleted file mode 100644 index 8fbe59683..000000000 --- a/broadcast-streaming/reference/known-issues.mdx +++ /dev/null @@ -1,15 +0,0 @@ ---- -title: 'Known issues' -sidebar_position: 2 -type: docs -description: > - Known issues and limitations of using the Web SDK. ---- - - -import KnownIssues from '@docs/shared/video-sdk/reference/_known-issues.mdx'; - -export const toc = [{}]; - - - diff --git a/cloud-recording/develop/individual-mode.md b/cloud-recording/develop/individual-mode.md index 617233f08..b3f3794d6 100644 --- a/cloud-recording/develop/individual-mode.md +++ b/cloud-recording/develop/individual-mode.md @@ -40,16 +40,16 @@ Before recording, call the [`acquire`](../reference/rest-api/acquire) method to - Request URL: - ``` json + ```json https://api.agora.io/v1/apps//cloud_recording/acquire - ``` + ``` - `Content-type`: `application/json;charset=utf-8` - `Authorization`: Basic authorization. For more information, see [How to pass the basic HTTP authentication](../reference/restful-authentication). 
- Request body: - ``` json + ```json { "cname": "https://xxxxx", "uid": "527841", @@ -79,7 +79,7 @@ In individual recording mode, you can configure the following parameters in `cli #### An HTTP request example of `start` - Request URL: - ``` json + ```json https://api.agora.io/v1/apps//cloud_recording/resourceid//mode/individual/start ``` - `Content-type`: `application/json;charset=utf-8` @@ -90,7 +90,7 @@ In individual recording mode, you can configure the following parameters in `cli **Real-time recording for standard mode** -``` json +```json { "uid": "527841", "cname": "httpClient463224", @@ -133,7 +133,7 @@ When a recording finishes, call [`stop`](../reference/rest-api/stop) to leave th #### An HTTP request example of `stop` - The request URL is: - ``` json + ```json http://api.agora.io/v1/apps//cloud_recording/resourceid//sid//mode/individual/stop ``` - `Content-type`: `application/json;charset=utf-8` @@ -143,7 +143,7 @@ When a recording finishes, call [`stop`](../reference/rest-api/stop) to leave th - Request body: - ``` json + ```json { "cname": "httpClient463224", "uid": "527841", diff --git a/cloud-recording/develop/integration-best-practices.md b/cloud-recording/develop/integration-best-practices.md index ad28156ef..3102578e0 100644 --- a/cloud-recording/develop/integration-best-practices.md +++ b/cloud-recording/develop/integration-best-practices.md @@ -95,7 +95,7 @@ To guarantee high availability of important scenes with a large audience, best p 1. Use Notifications to [Handle notifications for specific events](/en/cloud-recording/develop/receive-notifications#cloud-recording-callback-events). After starting the recording, if you don't receive event `13` `High availability register success` within 10 seconds, create a new recording task with a different UID. -These fault recovery methods may result in multiple recording tasks. You are charged separately for each task. For more information, see [Pricing](../reference/pricing). 
+These fault recovery methods may result in multiple recording tasks. You are charged separately for each task. For more information, see [Pricing](../overview/pricing). diff --git a/cloud-recording/develop/recording-video-profile.md b/cloud-recording/develop/recording-video-profile.md index 0d94aa98f..c342f5ed1 100644 --- a/cloud-recording/develop/recording-video-profile.md +++ b/cloud-recording/develop/recording-video-profile.md @@ -13,7 +13,7 @@ In individual recording mode, the recorded video keeps the original video profil ## Basic guidelines -- Agora recommends setting the recording resolution lower than the [aggregate resolution](../reference/pricing#resolution-calibration) of the original video streams, otherwise the recorded video may be blurry. +- Agora recommends setting the recording resolution lower than the [aggregate resolution](../overview/pricing#resolution-calibration) of the original video streams, otherwise the recorded video may be blurry. - The resolution you set in the video profile is that of the video canvas, and its aspect ratio does not need to be identical to any source video stream. The aspect ratio of each user region in the output video depends on the aspect ratio of the canvas and the video layout. See [Related articles](#related-articles). - Agora only supports the following frame rates: 1 fps, 7 fps, 10 fps, 15 fps, 24 fps, 30 fps, and 60 fps. The default value is 15 fps. If you set other frame rates, the SDK uses the default value. - The base bitrate in the video profile table applies to the communication profile. The live-broadcast profile generally requires a higher bitrate to ensure better video quality. Set the bitrate of the live-broadcast profile as twice the base bitrate. 
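The individual-mode hunks above show the `acquire`, `start`, and `stop` requests, each sent with `Content-type: application/json;charset=utf-8` and a Basic authorization header. As a minimal sketch — the customer key and secret below are placeholders, not values from this patch — the header and an `acquire` body can be assembled like this:

```python
import base64
import json

# Hypothetical credentials -- replace with your own Agora Customer ID and Customer Secret.
customer_key = "my-customer-key"
customer_secret = "my-customer-secret"

# Basic HTTP authentication: base64("key:secret"), sent in the Authorization header.
credentials = f"{customer_key}:{customer_secret}"
auth_header = "Basic " + base64.b64encode(credentials.encode("utf-8")).decode("utf-8")

# Request body for `acquire`, mirroring the example values in the docs above.
acquire_body = json.dumps({
    "cname": "httpClient463224",  # channel name (placeholder)
    "uid": "527841",              # recording client UID, passed as a string
    "clientRequest": {},
})

print(auth_header)
print(acquire_body)
```

The same header is reused for the subsequent `start` and `stop` calls; only the URL path and `clientRequest` contents change.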
diff --git a/cloud-recording/develop/screen-capture.md b/cloud-recording/develop/screen-capture.md index 79161b2b6..a829a521d 100644 --- a/cloud-recording/develop/screen-capture.md +++ b/cloud-recording/develop/screen-capture.md @@ -15,7 +15,7 @@ The following two screenshot methods are supported: - Take screenshots only. - Capture screenshots and recording during a recording process. Agora only charges recording fees. -For pricing details, see [Pricing](../reference/pricing). +For pricing details, see [Pricing](../overview/pricing). To implement client-side screen capture, see [Screenshot Upload](../../video-calling/enable-features/screenshot-upload). diff --git a/cloud-recording/develop/webpage-mode.md b/cloud-recording/develop/webpage-mode.md index d027fe728..6a5583f26 100644 --- a/cloud-recording/develop/webpage-mode.md +++ b/cloud-recording/develop/webpage-mode.md @@ -302,7 +302,7 @@ A web page recording session generates one M3U8 file and multiple TS files. Depe ## Pricing -Web page recording mode is free to use by November 1, 2021. See [Pricing for Web Page Recording](../reference/pricing-webpage-recording) for details. +Web page recording mode is free to use until November 1, 2021. See [Pricing for Web Page Recording](../overview/pricing-webpage-recording) for details.
## Considerations diff --git a/cloud-recording/reference/pricing-webpage-recording.md b/cloud-recording/overview/pricing-webpage-recording.md similarity index 94% rename from cloud-recording/reference/pricing-webpage-recording.md rename to cloud-recording/overview/pricing-webpage-recording.md index fa8ef898d..d21b94e50 100644 --- a/cloud-recording/reference/pricing-webpage-recording.md +++ b/cloud-recording/overview/pricing-webpage-recording.md @@ -1,6 +1,6 @@ --- title: "Pricing for Web Page Recording" -sidebar_position: 2 +sidebar_position: 4 type: docs platform_selector: false description: > @@ -67,7 +67,7 @@ following categories: ## Preferential billing policies If Video SDK is used in the web page being recorded to implement real-time communications, and the user is -subscribed to a channel with a high-definition (HD) [aggregate video resolution](../reference/pricing#aggregate), Agora waives the cost of the video usage during the web page recording; only the web page recording fees apply. Real-time communication at higher aggregate resolutions does not receive this discount. +subscribed to a channel with a high-definition (HD) [aggregate video resolution](pricing#aggregate), Agora waives the cost of the video usage during the web page recording; only the web page recording fees apply. Real-time communication at higher aggregate resolutions does not receive this discount. ## Examples @@ -110,4 +110,4 @@ actual business scenario or actively stop the web page recording. 
- [Agora's free-of-charge policy for the first 10,000 minutes](../reference/billing-policies#agoras-free-of-charge-policy-for-the-first-10000-minutes) - [Billing, fee deductions, and account suspension](../reference/billing-policies#billing-fee-deductions-and-account-suspension-policies) -- [Cloud Recording pricing](../reference/pricing) \ No newline at end of file +- [Cloud Recording pricing](pricing) \ No newline at end of file diff --git a/cloud-recording/reference/pricing.md b/cloud-recording/overview/pricing.md similarity index 99% rename from cloud-recording/reference/pricing.md rename to cloud-recording/overview/pricing.md index 17a590baa..458068025 100644 --- a/cloud-recording/reference/pricing.md +++ b/cloud-recording/overview/pricing.md @@ -1,6 +1,6 @@ --- -title: "Pricing" -sidebar_position: 1 +title: "Pricing for Cloud Recording" +sidebar_position: 3 type: docs platform_selector: false description: > @@ -307,4 +307,4 @@ When calculating the aggregate resolution, Agora counts the resolution of 225,28 - [Agora's free-of-charge policy for the first 10,000 minutes](../reference/billing-policies#agoras-free-of-charge-policy-for-the-first-10000-minutes) - [Billing, fee deductions, and account suspension](../reference/billing-policies#billing-fee-deductions-and-account-suspension-policies) -- [Web Page Recording pricing](../reference/pricing-webpage-recording) \ No newline at end of file +- [Web Page Recording pricing](pricing-webpage-recording) \ No newline at end of file diff --git a/cloud-recording/reference/release-notes.mdx b/cloud-recording/overview/release-notes.mdx similarity index 93% rename from cloud-recording/reference/release-notes.mdx rename to cloud-recording/overview/release-notes.mdx index 7e111ed54..adffb5543 100644 --- a/cloud-recording/reference/release-notes.mdx +++ b/cloud-recording/overview/release-notes.mdx @@ -1,6 +1,6 @@ --- title: "Release notes" -sidebar_position: 3 +sidebar_position: 5 type: docs platform_selector: false
description: > diff --git a/cloud-recording/reference/supported-platforms.mdx b/cloud-recording/overview/supported-platforms.mdx similarity index 93% rename from cloud-recording/reference/supported-platforms.mdx rename to cloud-recording/overview/supported-platforms.mdx index 47f841026..c1ebc9f81 100644 --- a/cloud-recording/reference/supported-platforms.mdx +++ b/cloud-recording/overview/supported-platforms.mdx @@ -1,6 +1,6 @@ --- title: 'Supported platforms' -sidebar_position: 7 +sidebar_position: 6 type: docs platform_selector: false description: > diff --git a/extensions-marketplace/develop/integrate/banuba.mdx b/extensions-marketplace/develop/integrate/banuba.mdx index c211fdcde..c21e76c6d 100644 --- a/extensions-marketplace/develop/integrate/banuba.mdx +++ b/extensions-marketplace/develop/integrate/banuba.mdx @@ -51,7 +51,7 @@ To receive a trial token or a full commercial licence from Banuba - please fill 4. Copy and paste your Banuba client token into the appropriate section of `/BanubaAgoraFilters/Token.swift` within “ ” symbols. For example: - ```` swift + ````swift let banubaClientToken = "Banuba Token" ```` @@ -59,7 +59,7 @@ To receive a trial token or a full commercial licence from Banuba - please fill 6. Copy and paste your Agora token, App ID, and channel ID into the appropriate section of `/BanubaAgoraFilters/Token.swift` within “ ” symbols.
For example: - ```` swift + ````swift internal let agoraAppID = "Agora App ID" internal let agoraClientToken = "Agora Token" internal let agoraChannelId = "Agora Channel ID" diff --git a/extensions-marketplace/reference/release-notes.mdx b/extensions-marketplace/overview/release-notes.mdx similarity index 94% rename from extensions-marketplace/reference/release-notes.mdx rename to extensions-marketplace/overview/release-notes.mdx index 62af2a52d..8844b98ac 100644 --- a/extensions-marketplace/reference/release-notes.mdx +++ b/extensions-marketplace/overview/release-notes.mdx @@ -1,6 +1,6 @@ --- title: 'Release notes' -sidebar_position: 2 +sidebar_position: 3 type: docs description: > Information about changes in the releases of different extensions. diff --git a/extensions-marketplace/reference/supported-platforms.mdx b/extensions-marketplace/overview/supported-platforms.mdx similarity index 99% rename from extensions-marketplace/reference/supported-platforms.mdx rename to extensions-marketplace/overview/supported-platforms.mdx index 576a2513d..d27ff17cb 100644 --- a/extensions-marketplace/reference/supported-platforms.mdx +++ b/extensions-marketplace/overview/supported-platforms.mdx @@ -1,6 +1,6 @@ --- title: 'Supported platforms' -sidebar_position: 5 +sidebar_position: 4 type: docs description: > The platforms supported by this product. 
diff --git a/flexible-classroom/develop/authentication-workflow.mdx b/flexible-classroom/develop/authentication-workflow.mdx index 090f837e1..62ca6e095 100644 --- a/flexible-classroom/develop/authentication-workflow.mdx +++ b/flexible-classroom/develop/authentication-workflow.mdx @@ -84,7 +84,7 @@ In order to show the authentication workflow, this section shows how to build an - **AccessToken2** - ``` go + ```go package main import ( @@ -190,7 +190,7 @@ In order to show the authentication workflow, this section shows how to build an ``` - **AccessToken** - ``` go + ```go package main import ( @@ -296,19 +296,19 @@ In order to show the authentication workflow, this section shows how to build an 2. A `go.mod` file defines this module’s import path and dependency requirements. To create the `go.mod` for your token server, run the following command: - ``` shell + ```shell $ go mod init sampleServer ``` 3. Get dependencies by running the following command: - ``` shell + ```shell $ go get ``` 4. Start the server by running the following command: - ``` shell + ```shell $ go run server.go ``` @@ -330,7 +330,7 @@ In order to show the authentication workflow, this section shows how to build an 3. In `index.html`, add the following code to include the app logic in the UI, then replace `` with the path of the JS file you saved in step 2. - ``` html + ```html Signaling token demo @@ -346,7 +346,7 @@ In order to show the authentication workflow, this section shows how to build an 4. Create the app logic by editing `client.js` with the following content. Then replace `` with your App ID. The App ID must match the one in the server. You also need to replace `` with the host URL and port of the local Golang server you have just deployed, such as `10.53.3.234:8082`. - ``` js + ```js // Parameters for the login method let options = { token: "", @@ -499,7 +499,7 @@ This section introduces the method to generate a token. 
Take C++ - **AccessToken2** - ``` go + ```go func BuildToken(appId string, appCertificate string, userId string, expire uint32) (string, error) { token := accesstoken.NewAccessToken(appId, appCertificate, expire) serviceRtm := accesstoken.NewServiceRtm(userId) @@ -518,7 +518,7 @@ This section introduces the method to generate a token. Take C++ - **AccessToken** - ``` cpp + ```cpp static std::string buildToken(const std::string& appId, const std::string& appCertificate, const std::string& userAccount, @@ -547,7 +547,7 @@ This section introduces how to upgrade from AccessToken to AccessToken2 by examp 1. Replace the `rtmtokenbuilder` import statement: -``` go +```go // Replace "github.com/AgoraIO/Tools/DynamicKey/AgoraDynamicKey/go/src/RtmTokenBuilder" // with "github.com/AgoraIO/Tools/DynamicKey/AgoraDynamicKey/go/src/rtmtokenbuilder2". import ( @@ -564,7 +564,7 @@ import ( 2. Update the `BuildToken` function: -``` go +```go // Previously, it is `result, err := rtmtokenbuilder.BuildToken(appID, appCertificate, rtm_uid, rtmtokenbuilder.RoleRtmUser, expireTimestamp)`. // Now, remove `rtmtokenbuilder.RoleRtmUser`. result, err := rtmtokenbuilder.BuildToken(appID, appCertificate, rtm_uid, expireTimestamp) diff --git a/flexible-classroom/reference/release-notes.mdx b/flexible-classroom/overview/release-notes.mdx similarity index 94% rename from flexible-classroom/reference/release-notes.mdx rename to flexible-classroom/overview/release-notes.mdx index d72171e84..f52c16cd6 100644 --- a/flexible-classroom/reference/release-notes.mdx +++ b/flexible-classroom/overview/release-notes.mdx @@ -1,6 +1,6 @@ --- title: 'Release notes' -sidebar_position: 1 +sidebar_position: 4 type: docs description: > Information about changes in each release of Flexible Classroom. 
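The authentication-workflow hunks above pair a Go token server (`BuildToken` takes an `expire` argument in seconds) with a client that fetches a fresh token before joining. A hedged sketch of the client-side caching logic follows; the `fetch_token` stub and its response shape are assumptions for illustration, not the actual server contract:

```python
import time

# Stand-in for an HTTP GET to your token server (e.g. http://<host>:8082/...).
# A real implementation would parse the server's JSON response instead.
def fetch_token(user_id: str) -> dict:
    expire = 600  # seconds, mirroring the `expire` argument of BuildToken
    return {"token": f"<token-for-{user_id}>", "expires_at": int(time.time()) + expire}

class TokenCache:
    """Caches a token and renews it shortly before it expires."""

    def __init__(self, user_id: str, margin: int = 30):
        self.user_id = user_id
        self.margin = margin  # renew this many seconds before expiry
        self._cached = None

    def get(self) -> str:
        now = int(time.time())
        if self._cached is None or now >= self._cached["expires_at"] - self.margin:
            # Renew, as you would on a token-privilege-will-expire callback.
            self._cached = fetch_token(self.user_id)
        return self._cached["token"]

cache = TokenCache("test_user")
print(cache.get())
```

The same pattern applies whether the token is used for joining a channel or for logging in to Signaling; only the server-side builder call differs.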
diff --git a/flexible-classroom/reference/supported-platforms.md b/flexible-classroom/overview/supported-platforms.md similarity index 99% rename from flexible-classroom/reference/supported-platforms.md rename to flexible-classroom/overview/supported-platforms.md index c3556a872..7f43df9ef 100644 --- a/flexible-classroom/reference/supported-platforms.md +++ b/flexible-classroom/overview/supported-platforms.md @@ -1,6 +1,6 @@ --- title: 'Platform support' -sidebar_position: 7 +sidebar_position: 5 type: docs description: > The platforms Flexible Classroom works with. diff --git a/interactive-live-streaming/get-started/get-started-sdk.mdx b/interactive-live-streaming/get-started/get-started-sdk.mdx index 3f17c340e..4132c5ba2 100644 --- a/interactive-live-streaming/get-started/get-started-sdk.mdx +++ b/interactive-live-streaming/get-started/get-started-sdk.mdx @@ -6,7 +6,7 @@ description: > Rapidly develop and easily enhance your social, work, education and IoT apps with face-to-face interaction. --- -import GetStartedSDK from '@docs/shared/video-sdk/_get-started-sdk.mdx'; +import GetStartedSDK from '@docs/shared/video-sdk/get-started/get-started-sdk/index.mdx'; export const toc = [{}]; diff --git a/interactive-live-streaming/reference/pricing.mdx b/interactive-live-streaming/overview/pricing.mdx similarity index 95% rename from interactive-live-streaming/reference/pricing.mdx rename to interactive-live-streaming/overview/pricing.mdx index 790fffa28..77782addc 100644 --- a/interactive-live-streaming/reference/pricing.mdx +++ b/interactive-live-streaming/overview/pricing.mdx @@ -1,6 +1,6 @@ --- title: Pricing -sidebar_position: 1 +sidebar_position: 3 description: > Provides you with information on billing, fee deductions, free-of-charge policy, and any suspension to your account based on the account type. 
--- diff --git a/interactive-live-streaming/reference/release-notes.mdx b/interactive-live-streaming/overview/release-notes.mdx similarity index 94% rename from interactive-live-streaming/reference/release-notes.mdx rename to interactive-live-streaming/overview/release-notes.mdx index 3e6e2f902..954c36367 100644 --- a/interactive-live-streaming/reference/release-notes.mdx +++ b/interactive-live-streaming/overview/release-notes.mdx @@ -1,6 +1,6 @@ --- title: 'Release notes' -sidebar_position: 2 +sidebar_position: 4 type: docs description: > Information about changes in each release of Video Calling. diff --git a/interactive-live-streaming/reference/supported-platforms.mdx b/interactive-live-streaming/overview/supported-platforms.mdx similarity index 100% rename from interactive-live-streaming/reference/supported-platforms.mdx rename to interactive-live-streaming/overview/supported-platforms.mdx diff --git a/interactive-live-streaming/reference/error-codes.mdx b/interactive-live-streaming/reference/error-codes.mdx new file mode 100644 index 000000000..5a43fd283 --- /dev/null +++ b/interactive-live-streaming/reference/error-codes.mdx @@ -0,0 +1,13 @@ +--- +title: 'Error codes' +sidebar_position: 5 +type: docs +description: > + List of commonly encountered API errors and their causes. +--- + +import ErrorCodes from '@docs/shared/video-sdk/reference/_error-codes.mdx'; + +export const toc = [{}]; + + \ No newline at end of file diff --git a/interactive-live-streaming/reference/known-issues.mdx b/interactive-live-streaming/reference/known-issues.mdx deleted file mode 100644 index 9447e4156..000000000 --- a/interactive-live-streaming/reference/known-issues.mdx +++ /dev/null @@ -1,15 +0,0 @@ ---- -title: 'Known issues' -sidebar_position: 2 -type: docs -description: > - Known issues and limitations of using the Web SDK. 
---- - - -import KnownIssues from '@docs/shared/video-sdk/reference/_known-issues.mdx'; - -export const toc = [{}]; - - - diff --git a/interactive-whiteboard/develop/enable-whiteboard.md b/interactive-whiteboard/develop/enable-whiteboard.md index 49f6d285b..fae6bbdf3 100644 --- a/interactive-whiteboard/develop/enable-whiteboard.md +++ b/interactive-whiteboard/develop/enable-whiteboard.md @@ -61,7 +61,7 @@ Unexpected exposure of the security credentials can cause severe security proble - File conversion, including **Docs to Picture** and **Docs to web**. After enabling the file conversion feature, you can call the [RESTful APIs](../reference/whiteboard-api/file-conversion) to launch a file conversion task or query the conversion progress. -Agora charges for the file-conversion feature. See [Pricing](../reference/pricing). +Agora charges for the file-conversion feature. See [Pricing](../overview/pricing). - **Screenshot**. After enabling the screenshot feature, you can call the [RESTful APIs](../reference/whiteboard-api/screenshots) to take screenshots. Follow these steps to enable one or more features and configure the storage settings: diff --git a/interactive-whiteboard/reference/pricing.md b/interactive-whiteboard/overview/pricing.md similarity index 99% rename from interactive-whiteboard/reference/pricing.md rename to interactive-whiteboard/overview/pricing.md index d98b88aae..f6fb95ebd 100644 --- a/interactive-whiteboard/reference/pricing.md +++ b/interactive-whiteboard/overview/pricing.md @@ -1,6 +1,6 @@ --- title: Pricing -sidebar_position: 1 +sidebar_position: 3 description: > Provides you with information on billing, fee deductions, free-of-charge policy, and any suspension to your account based on the account type. 
--- diff --git a/interactive-whiteboard/reference/release-notes-uikit.mdx b/interactive-whiteboard/overview/release-notes-uikit.mdx similarity index 94% rename from interactive-whiteboard/reference/release-notes-uikit.mdx rename to interactive-whiteboard/overview/release-notes-uikit.mdx index 6037dfd26..1f2b5e09a 100644 --- a/interactive-whiteboard/reference/release-notes-uikit.mdx +++ b/interactive-whiteboard/overview/release-notes-uikit.mdx @@ -1,6 +1,6 @@ --- title: 'Release notes (Fastboard)' -sidebar_position: 4 +sidebar_position: 5 type: docs description: > Information about changes in each release of Fastboard. diff --git a/interactive-whiteboard/reference/release-notes.mdx b/interactive-whiteboard/overview/release-notes.mdx similarity index 94% rename from interactive-whiteboard/reference/release-notes.mdx rename to interactive-whiteboard/overview/release-notes.mdx index e5aff390c..cadae64f6 100644 --- a/interactive-whiteboard/reference/release-notes.mdx +++ b/interactive-whiteboard/overview/release-notes.mdx @@ -1,6 +1,6 @@ --- title: 'Release notes (Whiteboard)' -sidebar_position: 3 +sidebar_position: 4 type: docs description: > Information about changes in each release of Interactive whiteboard. diff --git a/interactive-whiteboard/reference/supported-platforms.mdx b/interactive-whiteboard/overview/supported-platforms.mdx similarity index 93% rename from interactive-whiteboard/reference/supported-platforms.mdx rename to interactive-whiteboard/overview/supported-platforms.mdx index 4bc2ca949..9426b2ea0 100644 --- a/interactive-whiteboard/reference/supported-platforms.mdx +++ b/interactive-whiteboard/overview/supported-platforms.mdx @@ -1,6 +1,6 @@ --- title: 'Supported platforms' -sidebar_position: 9 +sidebar_position: 6 type: docs description: > The platforms supported by this product. 
diff --git a/iot/reference/pricing.mdx b/iot/overview/pricing.mdx similarity index 94% rename from iot/reference/pricing.mdx rename to iot/overview/pricing.mdx index 93de702d2..780e84bec 100644 --- a/iot/reference/pricing.mdx +++ b/iot/overview/pricing.mdx @@ -1,6 +1,6 @@ --- title: Pricing -sidebar_position: 1 +sidebar_position: 3 description: > Provides you with information on billing, fee deductions, free-of-charge policy, and any suspension to your account based on the account type. --- diff --git a/iot/reference/release-notes.mdx b/iot/overview/release-notes.mdx similarity index 93% rename from iot/reference/release-notes.mdx rename to iot/overview/release-notes.mdx index 6499b2af4..fb910cdb3 100644 --- a/iot/reference/release-notes.mdx +++ b/iot/overview/release-notes.mdx @@ -1,6 +1,6 @@ --- title: 'Release notes' -sidebar_position: 1 +sidebar_position: 4 type: docs description: > Information about changes in each release of IoT SDK. diff --git a/iot/reference/supported-platforms.mdx b/iot/overview/supported-platforms.mdx similarity index 100% rename from iot/reference/supported-platforms.mdx rename to iot/overview/supported-platforms.mdx diff --git a/media-pull/develop/integration-best-practices.mdx b/media-pull/develop/integration-best-practices.mdx index ecb7f0feb..f23a46b04 100644 --- a/media-pull/develop/integration-best-practices.mdx +++ b/media-pull/develop/integration-best-practices.mdx @@ -66,7 +66,7 @@ In your app, subscribe audience members to the master stream and listen to the f When you receive these notifications, notify the apps where users are subscribed to the channel as audience members so that they switch to a backup stream. -When you create multiple tasks, you are charged separately for each of them. For details, see [Media Pull pricing](../reference/pricing). +When you create multiple tasks, you are charged separately for each of them. For details, see [Media Pull pricing](../overview/pricing).
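The media-pull best practice above — run backup tasks and move audience members to a backup stream when a failure notification arrives — can be sketched as follows. The event names and UIDs are illustrative placeholders, not Agora's actual notification payload:

```python
# Hypothetical failover logic for media-pull tasks.
MASTER_UID = 1001
BACKUP_UIDS = [1002, 1003]  # UIDs of backup media-pull tasks (placeholders)

def next_stream(current_uid: int) -> int:
    """Return the UID of the next stream to subscribe to after a failure."""
    candidates = [uid for uid in BACKUP_UIDS if uid != current_uid]
    return candidates[0] if candidates else current_uid

def on_notification(event: dict, current_uid: int) -> int:
    """Switch to a backup stream only when the failing task is the one in use."""
    if event.get("type") in ("task_failed", "stream_stopped") and event.get("uid") == current_uid:
        return next_stream(current_uid)
    return current_uid

print(on_notification({"type": "task_failed", "uid": MASTER_UID}, MASTER_UID))
```

Because each backup task is billed separately, the number of entries in `BACKUP_UIDS` is a direct cost trade-off against availability.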
diff --git a/media-pull/reference/pricing.mdx b/media-pull/overview/pricing.mdx similarity index 94% rename from media-pull/reference/pricing.mdx rename to media-pull/overview/pricing.mdx index 1d9dd030c..1993c706d 100644 --- a/media-pull/reference/pricing.mdx +++ b/media-pull/overview/pricing.mdx @@ -1,6 +1,6 @@ --- title: 'Pricing' -sidebar_position: 1 +sidebar_position: 2 type: docs platform_selector: false description: > diff --git a/media-pull/reference/release-notes.mdx b/media-pull/overview/release-notes.mdx similarity index 100% rename from media-pull/reference/release-notes.mdx rename to media-pull/overview/release-notes.mdx diff --git a/media-push/reference/pricing.mdx b/media-push/overview/pricing.mdx similarity index 94% rename from media-push/reference/pricing.mdx rename to media-push/overview/pricing.mdx index 68e1ded6e..8d4d2b896 100644 --- a/media-push/reference/pricing.mdx +++ b/media-push/overview/pricing.mdx @@ -1,6 +1,6 @@ --- title: 'Pricing' -sidebar_position: 1 +sidebar_position: 2 type: docs platform_selector: false description: > diff --git a/on-premise-recording/reference/billing.md b/on-premise-recording/overview/billing.md similarity index 99% rename from on-premise-recording/reference/billing.md rename to on-premise-recording/overview/billing.md index 0d9f257cb..4c66f1075 100644 --- a/on-premise-recording/reference/billing.md +++ b/on-premise-recording/overview/billing.md @@ -1,6 +1,6 @@ --- title: 'Pricing' -sidebar_position: 1 +sidebar_position: 2 type: docs description: > Pricing information for On-Premise Recording. 
diff --git a/on-premise-recording/reference/release-notes.mdx b/on-premise-recording/overview/release-notes.mdx similarity index 99% rename from on-premise-recording/reference/release-notes.mdx rename to on-premise-recording/overview/release-notes.mdx index 9c928c013..49e5b0aaf 100644 --- a/on-premise-recording/reference/release-notes.mdx +++ b/on-premise-recording/overview/release-notes.mdx @@ -1,6 +1,6 @@ --- -title: 'Release Notes' -sidebar_position: 2 +title: 'Release notes' +sidebar_position: 3 type: docs description: > The release notes for On-Premise Recording. diff --git a/on-premise-recording/reference/sunset.md b/on-premise-recording/reference/sunset.md index 4aa0b8d7c..39bb4c0f6 100644 --- a/on-premise-recording/reference/sunset.md +++ b/on-premise-recording/reference/sunset.md @@ -25,6 +25,6 @@ If you are using the On-Premise Recording SDK earlier than v3.0.0, upgrade as so The latest versions of the On-Premise Recording SDK have made significant improvements to user experience, service reliability, and security. To avoid service disruptions, upgrade the On-Premise Recording SDK that you are using as soon as possible by referring to the following information: - [SDK download links](https://docs.agora.io/en/Recording/downloads) -- [Release notes](../reference/release-notes) +- [Release notes](../overview/release-notes) If you encounter any problems, contact Agora for support. 
\ No newline at end of file diff --git a/on-premise-recording/reference/video-profile.md b/on-premise-recording/reference/video-profile.md index 96f51e80c..c5c6ae042 100644 --- a/on-premise-recording/reference/video-profile.md +++ b/on-premise-recording/reference/video-profile.md @@ -12,7 +12,7 @@ In composite recording mode, you can set the video profile (resolution, frame ra ## Basic guidelines -- Agora recommends setting the recording resolution lower than the [aggregate resolution](../reference/billing#aggregate-video-resolution) of the original video streams, otherwise the recorded video may be blurry. +- Agora recommends setting the recording resolution lower than the [aggregate resolution](../overview/billing#aggregate-video-resolution) of the original video streams; otherwise, the recorded video may be blurry. - The resolution you set in the video profile is that of the video canvas, and its aspect ratio does not need to be identical to any source video stream. The aspect ratio of each user region in the output video depends on the aspect ratio of the canvas and the video layout. See [Related articles](#relateddocs). - Agora only supports the following frame rates: 1 fps, 7 fps, 10 fps, 15 fps, 24 fps, 30 fps, and 60 fps. The default value is 15 fps. If you set other frame rates, the SDK uses the default value. - The base bitrate in the video profile table applies to the communication profile. The live-broadcast profile generally requires a higher bitrate to ensure better video quality. Set the bitrate of the live-broadcast profile as twice the base bitrate. 
diff --git a/server-gateway/reference/pricing.mdx b/server-gateway/overview/pricing.mdx similarity index 93% rename from server-gateway/reference/pricing.mdx rename to server-gateway/overview/pricing.mdx index c94ba5a89..ae7324928 100644 --- a/server-gateway/reference/pricing.mdx +++ b/server-gateway/overview/pricing.mdx @@ -1,6 +1,6 @@ --- title: 'Pricing' -sidebar_position: 1 +sidebar_position: 2 type: docs description: > Shows the pricing policy for Cloud Gateway. diff --git a/server-gateway/reference/release-notes.mdx b/server-gateway/overview/release-notes.mdx similarity index 94% rename from server-gateway/reference/release-notes.mdx rename to server-gateway/overview/release-notes.mdx index 9d623ba46..a49d1bff0 100644 --- a/server-gateway/reference/release-notes.mdx +++ b/server-gateway/overview/release-notes.mdx @@ -1,6 +1,6 @@ --- title: 'Release notes' -sidebar_position: 2 +sidebar_position: 3 type: docs description: > Shows Cloud Gateway's past releases. diff --git a/shared/agora-analytics/_alarm.mdx b/shared/agora-analytics/_alarm.mdx index 0f5bca610..76d7afe8e 100644 --- a/shared/agora-analytics/_alarm.mdx +++ b/shared/agora-analytics/_alarm.mdx @@ -16,7 +16,7 @@ Alert Notifications provides the following features: To access the Alert Notifications page, do the following: -1. Subscribe to a [pricing plan](/agora-analytics/reference/pricing) to enable the **Alert Notifications** service. +1. Subscribe to a [pricing plan](/agora-analytics/overview/pricing) to enable the **Alert Notifications** service. 2. Log in to [Agora Console](https://console.agora.io/), and click **** > **Alert Notifications** on the left navigation bar. @@ -207,7 +207,7 @@ Alert information is sent to you through HTTP POST methods in JSON format. 
The f If the alert granularity is set as **Channel**: -``` json +```json { "alertTime":"1631703720000", // The timestamp (seconds) when the alert is sent "timeZone":"UTC+8", // The timezone you set for the rule @@ -224,7 +224,7 @@ If the alert granularity is set as **Channel**: If the alert granularity is set as **User**: -``` json +```json { "alertTime":"1631780400000", // The timestamp (seconds) when the alert is sent "timeZone":"UTC+8", // The timezone you set for the rule @@ -242,7 +242,7 @@ If the alert granularity is set as **User**: **Event alert** -``` json +```json { "alertTime":"1631785450000", // The timestamp (seconds) when the alert is sent "timeZone":"UTC+8", // The timezone you set for the rule diff --git a/shared/agora-analytics/_api.mdx b/shared/agora-analytics/_api.mdx index 8cd85b3e9..4981aded4 100644 --- a/shared/agora-analytics/_api.mdx +++ b/shared/agora-analytics/_api.mdx @@ -7,7 +7,7 @@ Before working with the RESTful APIs, review the features in [Ag - [Real-time Monitoring](#real-time-monitoring) -
To use Agora Analytics RESTful APIs, subscribe to an Agora Analytics pricing plan.
+
To use Agora Analytics RESTful APIs, subscribe to an Agora Analytics pricing plan.
## Authentication @@ -26,7 +26,7 @@ With the Call Inspector RESTful APIs, you can search for calls with quality issu ### API limits -The limits of the Call Inspector RESTful APIs depend on the [pricing plan](/agora-analytics/reference/pricing) you subscribe to. +The limits of the Call Inspector RESTful APIs depend on the [pricing plan](/agora-analytics/overview/pricing) you subscribe to. The Starter, Standard, Premium, and Enterprise pricing plans have the following differences in terms of API limits: @@ -280,7 +280,7 @@ With the Data Insights RESTful APIs, you can query the usage and quality metrics ### API limits -The limits of the Data Insights RESTful APIs depend on the [pricing plan](/agora-analytics/reference/pricing) you subscribe to. +The limits of the Data Insights RESTful APIs depend on the [pricing plan](/agora-analytics/overview/pricing) you subscribe to. The Starter, Standard, Premium, and Enterprise pricing plans have the following differences in terms of API limits: @@ -781,7 +781,7 @@ The data is returned in regular 20-second time windows starting from 00:00:00. F ### API limits -The limits of the Real-time Monitoring RESTful APIs depend on the [pricing plan](/agora-analytics/reference/pricing) you subscribe to. +The limits of the Real-time Monitoring RESTful APIs depend on the [pricing plan](/agora-analytics/overview/pricing) you subscribe to. The Starter, Standard, Premium, and Enterprise pricing plans have the following differences in terms of API limits: diff --git a/shared/agora-analytics/_call-search.mdx b/shared/agora-analytics/_call-search.mdx index c42f805f7..e3b872ba5 100644 --- a/shared/agora-analytics/_call-search.mdx +++ b/shared/agora-analytics/_call-search.mdx @@ -24,7 +24,7 @@ The following workflow shows how to use the Call Inspector features together: ### Enable Call Inspector -To enable Call Inspector, subscribe to an pricing plan. For details, see [Pricing](../../reference/pricing). 
+To enable Call Inspector, subscribe to a pricing plan. For details, see [Pricing](../../overview/pricing). ## Use Call Search @@ -42,7 +42,7 @@ To search calls, follow these steps: - Advanced search: Click **Advanced** in the upper right corner, add one or more filters as needed, and then click **Search**. -① The accessible time range depends on the data retention policy for Call Inspector features in your [Pricing Plans](/agora-analytics/reference/pricing). +① The accessible time range depends on the data retention policy for Call Inspector features in your [Pricing Plans](/agora-analytics/overview/pricing). @@ -52,7 +52,7 @@ The Call Overview page is designed to help you quickly understand the overall si To enter the Call Overview page, follow these steps: -1. Subscribe to the Premium or Enterprise pricing plan. See [Subscribe to a plan](/agora-analytics/reference/pricing#subscribe-to-a-plan). Other pricing plans do not provide access to the Call Overview page. +1. Subscribe to the Premium or Enterprise pricing plan. See [Subscribe to a plan](/agora-analytics/overview/pricing#subscribe-to-a-plan). Other pricing plans do not provide access to the Call Overview page. 2. [Use Call Search](#search) to find the call you want to inspect, then click **Call Details** in the **Action** column. - If [the number of Accumulated Call Users (ACU)](/agora-analytics/reference/call-search-terms#acu-accumulated-call-users) is greater than or equal to 50, you enter the Call Overview page. - If the number of ACU is less than 50, you first enter the Call Details page. Click the **Call Overview** tab on the top to switch. diff --git a/shared/agora-analytics/_data-insight-plus.mdx b/shared/agora-analytics/_data-insight-plus.mdx index 3b6dc82e0..7974b38e1 100644 --- a/shared/agora-analytics/_data-insight-plus.mdx +++ b/shared/agora-analytics/_data-insight-plus.mdx @@ -5,7 +5,7 @@ The regular version of Data Studio currently supports query analysis of time ser queries. 
offers additional capabilities, including multi-dimensional cross analysis, sampling analysis, comparative analysis, and extended querying of service indicators for , , , and . For a detailed comparison of -Standard and Premium, see [pricing](../../reference/pricing). +Standard and Premium, see [pricing](../../overview/pricing). **Want to try out this functionality? There is a 30 day trial period just for you. Please [submit a ticket](https://agora-ticket.agora.io/) to enroll for the trial.** diff --git a/shared/agora-analytics/_data-insight.mdx b/shared/agora-analytics/_data-insight.mdx index b6861eadc..ba46a31ad 100644 --- a/shared/agora-analytics/_data-insight.mdx +++ b/shared/agora-analytics/_data-insight.mdx @@ -7,7 +7,7 @@ import * as data from '@site/data/variables.js'; ## Getting started -1. Subscribe to a [pricing plan](/agora-analytics/reference/pricing) to enable the **Data Insights** feature for your project. +1. Subscribe to a [pricing plan](/agora-analytics/overview/pricing) to enable the **Data Insights** feature for your project. 2. Log in to [Agora Console](https://console.agora.io) and click **** on the left navigation bar. diff --git a/shared/agora-analytics/_embedded.mdx b/shared/agora-analytics/_embedded.mdx index 4cf7ecd27..3719d41ff 100644 --- a/shared/agora-analytics/_embedded.mdx +++ b/shared/agora-analytics/_embedded.mdx @@ -6,7 +6,7 @@ To use the **Embed** option, ensure that the following requirements are met: 1. Your internal portal is secure and has a mechanism for managing user access. -2. You have subscribed to a [pricing plan](/agora-analytics/reference/pricing) that provides the **Embed** option for the page you want. +2. You have subscribed to a [pricing plan](/agora-analytics/overview/pricing) that provides the **Embed** option for the page you want. 
## Getting started @@ -55,7 +55,7 @@ The **Embedding Configuration** dialog shows a code sample for Node.js: The response is in JSON format and returns the URL to the feature page you request. For example, if your request specifies `feature` as `callSearch`, the response looks like this: -``` html +```html https://analytics-lab.agora.io/analytics/call/search?token=xxxxxxxxxxxxxxxxxxxxxx ``` @@ -151,7 +151,7 @@ If you only append `token` and `cname` to the URL, the embedded page displays th For example: -``` html +```html https://analytics-lab.agora.io/api/analytics/research?token=xxxxxxxxxxxxxxxxxxxxxx&cname=xxxxxxxxxxxxxxxxxxxxxxxx&fromUid=xxxxxx&toUid=xxxxxx ``` diff --git a/shared/agora-analytics/_monitor.mdx b/shared/agora-analytics/_monitor.mdx index 65b50c5e5..8684ecbd5 100644 --- a/shared/agora-analytics/_monitor.mdx +++ b/shared/agora-analytics/_monitor.mdx @@ -12,7 +12,7 @@ Real-time Monitoring provides the following features: To access the Real-time Monitoring page, do the following: -1. Subscribe to a [pricing plan](/agora-analytics/reference/pricing) to enable the **Real-time Monitoring** feature for your project. +1. Subscribe to a [pricing plan](/agora-analytics/overview/pricing) to enable the **Real-time Monitoring** feature for your project. 2. Log in to [Agora Console](https://console.agora.io/) and click **** > **Real-time Monitoring** on the left navigation bar. 
diff --git a/shared/agora-analytics/_pricing.mdx b/shared/agora-analytics/_pricing.mdx index c58befc29..d75aaedf2 100644 --- a/shared/agora-analytics/_pricing.mdx +++ b/shared/agora-analytics/_pricing.mdx @@ -225,7 +225,7 @@ Unsubscribing from a plan or switching to another plan takes effect on the first ## See also -- [Pricing for ](../../video-calling/reference/pricing) +- [Pricing for ](../../video-calling/overview/pricing) - [What are Agora’s policies on billing, fee deductions, and account suspension](../reference/billing-policies#billing-fee-deductions-and-account-suspension-policies) diff --git a/shared/chat-sdk/client-api/messages/_translate-messages.mdx b/shared/chat-sdk/client-api/messages/_translate-messages.mdx index e9ec614dd..f1d180f9b 100644 --- a/shared/chat-sdk/client-api/messages/_translate-messages.mdx +++ b/shared/chat-sdk/client-api/messages/_translate-messages.mdx @@ -18,7 +18,7 @@ Before proceeding, ensure that your development environment meets the following - Because this feature is enabled by the Microsoft Azure Translation API, ensure that you understand the supported target languages as described in [Language support](https://learn.microsoft.com/en-us/azure/ai-services/translator/language-support). - Translation is not enabled by default. To use this feature, you need to subscribe to the **Pro** or **Enterprise** [pricing plan](/agora-chat/reference/pricing-plan-details) and enable it in [Agora Console](https://console.agora.io/). -
Add-on fees are incurred if you use this feature. See [Pricing](/agora-chat/reference/pricing#optional-add-on-fee) for details.
+
Add-on fees are incurred if you use this feature. See [Pricing](/agora-chat/overview/pricing#optional-add-on-fee) for details.
## Understand the tech The Chat SDK provides the following methods for implementing translation functionalities: diff --git a/shared/chat-sdk/develop/_authentication.mdx b/shared/chat-sdk/develop/_authentication.mdx index aaac86990..c961331f3 100644 --- a/shared/chat-sdk/develop/_authentication.mdx +++ b/shared/chat-sdk/develop/_authentication.mdx @@ -125,7 +125,7 @@ The following figure shows the API call sequence of generating an Agora Chat tok 3. In `/src/main/resource`, create an `application.properties` file to store the information for generating tokens and update it with your project information and token validity period. For example, set `expire.second` as `6000`, which means the token is valid for 6000 seconds. - ``` shellscript + ```shellscript ## Server port. server.port=8090 ## Fill the App ID of your Agora project. diff --git a/shared/chat-sdk/develop/_content-moderation.mdx b/shared/chat-sdk/develop/_content-moderation.mdx index 08f6abc9c..90c8c9665 100644 --- a/shared/chat-sdk/develop/_content-moderation.mdx +++ b/shared/chat-sdk/develop/_content-moderation.mdx @@ -11,7 +11,7 @@ Delivering a safe and appropriate chat environment to your users is essential. C - You have created a valid [Agora developer account](../reference/manage-agora-account#create-an-agora-account). - Moderation is not enabled by default. To use this feature, you need to subscribe to the **Pro** or **Enterprise** [pricing plan](../reference/pricing-plan-details) and enable it in [Agora Console](https://console.agora.io/). -
Add-on fees are incurred if you use this feature. See Pricing for details.
+
Add-on fees are incurred if you use this feature. See Pricing for details.
## Enable the moderation feature diff --git a/shared/chat-sdk/develop/offline-push/project-implementation/web.mdx b/shared/chat-sdk/develop/offline-push/project-implementation/web.mdx index e65490d84..ff9befef0 100644 --- a/shared/chat-sdk/develop/offline-push/project-implementation/web.mdx +++ b/shared/chat-sdk/develop/offline-push/project-implementation/web.mdx @@ -47,7 +47,7 @@ Alternatively, assume that a DND time period is specified for a conversation, wh You can call `setSilentModeForAll` to set the push notifications at the app level and set the push notification mode and DND mode by specifying the `paramType` field, as shown in the following code sample: -```` javascript +````javascript options // The push notification options. options: { @@ -83,7 +83,7 @@ WebIM.conn.setSilentModeForAll(params) You can call `getSilentModeForAll` to retrieve the push notification settings at the app level, as shown in the following code sample: -```` javascript +````javascript WebIM.conn.getSilentModeForAll() ```` @@ -91,7 +91,7 @@ WebIM.conn.getSilentModeForAll() You can call `setSilentModeForConversation` to set the push notifications for the conversation specified by the `conversationId` and `type` fields, as shown in the following code sample: -``` javascript +```javascript const params = { conversationId: 'test', // The conversation ID. For one-to-one chats, sets to the ID of the peer user. For group chats, sets to the ID of the chat group or chat room. @@ -142,7 +142,7 @@ WebIM.conn.setSilentModeForConversation(params) You can call `getSilentModeForConversation` to retrieve the push notification settings of the specified conversation, as shown in the following code sample: -```` javascript +````javascript const params = { conversationId: 'test', // The conversation ID. For one-to-one chats, sets to the ID of the peer user. For group chats, sets to the ID of the chat group or chat room. type: 'singleChat', // The chat type. 
Sets the chat type to `singleChat`, `groupChat`, or `chatRoom`. @@ -179,7 +179,7 @@ You can call `clearRemindTypeForConversation` to clear the push notification mod The following code sample shows how to clear the push notification mode of a conversation: -``` javascript +```javascript const params = { conversationId: '12345', // The conversation ID. For one-to-one chats, sets to the ID of the peer user. For group chats, sets to the ID of the chat group or chat room. type: 'groupChat', // The chat type. Sets the chat type to `singleChat`, `groupChat`, or `chatRoom`. diff --git a/shared/chat-sdk/get-started/_enable.mdx b/shared/chat-sdk/get-started/_enable.mdx index bbf8117a8..d4404e4f2 100644 --- a/shared/chat-sdk/get-started/_enable.mdx +++ b/shared/chat-sdk/get-started/_enable.mdx @@ -9,7 +9,7 @@ To enable , make sure that you have the following: - A valid [ account](/agora-chat/reference/manage-agora-account#create-an-agora-account). - An [ project](/agora-chat/reference/manage-agora-account#create-an-agora-project) that uses **App ID** and **Token** for authentication. -- A pricing plan. For details on how to subscribe, see Subscribe to the pricing plan. +- A pricing plan. For details on how to subscribe, see Subscribe to the pricing plan. ## Enable diff --git a/shared/chat-sdk/hide/_token-server-new.mdx b/shared/chat-sdk/hide/_token-server-new.mdx index 00d9bd7fe..ee1131c81 100644 --- a/shared/chat-sdk/hide/_token-server-new.mdx +++ b/shared/chat-sdk/hide/_token-server-new.mdx @@ -179,7 +179,7 @@ The following figure shows the API call sequence of generating a Chat token with 3. In `/src/main/resource`, create an `application.properties` file to store the information for generating tokens and update it with your project information and token validity period. For example, set `expire.second` as `6000`, which means the token is valid for 6000 seconds. - ``` shellscript + ```shellscript ## Server port. server.port=8090 ## Fill the App ID of your Agora project. 
diff --git a/shared/chat-sdk/overview/_product-overview.mdx b/shared/chat-sdk/overview/_product-overview.mdx index d0dc20f67..9c0316293 100644 --- a/shared/chat-sdk/overview/_product-overview.mdx +++ b/shared/chat-sdk/overview/_product-overview.mdx @@ -88,7 +88,7 @@ With the engine and algorithms developed by , Chat offers four competitive pricing tiers and transparent billing. You enjoy a generous amount of cost-free usage every month; if your usage exceeds this threshold, you can pay as you go. The more you use, the larger your discount. See [Pricing](/agora-chat/reference/pricing) for details. + Chat offers four competitive pricing tiers and transparent billing. You enjoy a generous amount of cost-free usage every month; if your usage exceeds this threshold, you can pay as you go. The more you use, the larger your discount. See [Pricing](/agora-chat/overview/pricing) for details. ## Supported platforms diff --git a/shared/chat-sdk/reference/_http-status-codes.mdx b/shared/chat-sdk/reference/_http-status-codes.mdx index 9c8d9131b..44e27dfd4 100644 --- a/shared/chat-sdk/reference/_http-status-codes.mdx +++ b/shared/chat-sdk/reference/_http-status-codes.mdx @@ -102,8 +102,8 @@ This status code indicates that the API request surpasses the call limit. | Status code | Error code | Error message | Description | | :----- | :------------ | :----------------------------------------------------------- | :------------------------------------------------| -| `429` | `resource_limited` | "You have exceeded the limit of the `{pricing_plan} `edition. Please upgrade to higher edition." | The error message returned because the usage of Chat exceeds the limit of the current pricing plan. For details, see [Pricing](pricing). To upgrade the pricing plan, contact support@agora.io. | -| `429` | `reach_limit` | "This request has reached api limit." | The error message returned because the calling frequency of the Chat API surpasses the call limit. 
For details, see [Limitations](../reference/limitations). To upgrade the pricing plan, contact support@agora.io。 | +| `429` | `resource_limited` | "You have exceeded the limit of the `{pricing_plan} `edition. Please upgrade to higher edition." | The error message returned because the usage of Chat exceeds the limit of the current pricing plan. For details, see [Pricing](../overview/pricing). To upgrade the pricing plan, contact support@agora.io. | +| `429` | `reach_limit` | "This request has reached api limit." | The error message returned because the calling frequency of the Chat API surpasses the call limit. For details, see [Limitations](../reference/limitations). To upgrade the pricing plan, contact support@agora.io. | ## `5xx` - Server error diff --git a/shared/chat-sdk/reference/_pricing.mdx b/shared/chat-sdk/reference/_pricing.mdx index 6458941d8..4273063bb 100644 --- a/shared/chat-sdk/reference/_pricing.mdx +++ b/shared/chat-sdk/reference/_pricing.mdx @@ -6,7 +6,7 @@ Note that if you have already signed a contract with Agora, the billing terms an ## Overview -Each month, Agora calculates your bill for using Chat, [issues your bill, and deducts your fee](billing-policies). If you subscribe, cancel, or switch to another plan, your fee is prorated for the current month. If your account is [suspended](billing-policies), Agora stores your app data for six months. Agora suggests you top up your account in a timely fashion or export the data before it is deleted. +Each month, Agora calculates your bill for using Chat, [issues your bill, and deducts your fee](../reference/billing-policies). If you subscribe, cancel, or switch to another plan, your fee is prorated for the current month. If your account is [suspended](../reference/billing-policies), Agora stores your app data for six months. Agora suggests you top up your account in a timely fashion or export the data before it is deleted. 
## Composition @@ -80,7 +80,7 @@ Chat provides the translation and content moderation features to meet your busin Before using Chat, refer to the following steps to subscribe to a plan: 1. Log in to [Agora Console](https://console.agora.io/), on the left navigation bar, click **Package** > **Chat** > **Subscribe** on the left navigation bar. -2. Check [pricing plan details](pricing-plan-details), choose the plan you want to use, click **Subscribe**, and make the payment. +2. Check [pricing plan details](../reference/pricing-plan-details), choose the plan you want to use, click **Subscribe**, and make the payment. Subscription takes effect immediately. After subscribing to a plan, you can click **Package** > **Chat** > **Manage**, and on this page you can click **Show More** to view your subscription details. diff --git a/shared/cloud-gateway/develop/_media-stream-encryption.mdx b/shared/cloud-gateway/develop/_media-stream-encryption.mdx index 2d4e59954..c5f13d4fc 100644 --- a/shared/cloud-gateway/develop/_media-stream-encryption.mdx +++ b/shared/cloud-gateway/develop/_media-stream-encryption.mdx @@ -20,7 +20,7 @@ To generate and set the `key` and `salt` parameters, refer to the following step 1. Refer to the following command to randomly generate a 32-byte key in the string format through OpenSSL on your server. - ``` shell + ```shell # Randomly generate a 32-byte key in the string format, and pass the string key in the encryptionKey parameter of enableEncryption. openssl rand -hex 32 dba643c8ba6b6dc738df43d9fd624293b4b12d87a60f518253bd10ba98c48453 @@ -32,7 +32,7 @@ To generate and set the `key` and `salt` parameters, refer to the following step 1. Refer to the following command to randomly generate a Base64-encoded, 32-byte salt through OpenSSL on the server. 
You can also refer to the [C++ sample code](https://github.com/AgoraIO/Tools/blob/master/DynamicKey/AgoraDynamicKey/cpp/sample/RtcChannelEncryptionSaltSample.cpp) provided by Agora on GitHub to randomly generate a salt in the byte array format and convert it to Base64 on the server. - ``` shell + ```shell # Randomly generate a 32-byte salt in the Base64 format. Convert the salt from Base64 to uint8_t, and pass the uint8_t salt in the encryptionKdfSalt parameter of enableEncryption. openssl rand -base64 32 X5w9T+50kzxVOnkJKiY/lUk82/bES2kATOt3vBuGEDw= diff --git a/shared/common/manage-agora-account/_agora-console-restapi.mdx b/shared/common/manage-agora-account/_agora-console-restapi.mdx index d45bb0e52..d825218c5 100644 --- a/shared/common/manage-agora-account/_agora-console-restapi.mdx +++ b/shared/common/manage-agora-account/_agora-console-restapi.mdx @@ -51,7 +51,7 @@ Pass in the following parameters in the request body: **Request body** -``` json +```json { "name": "project1", "enable_sign_key": true @@ -75,7 +75,7 @@ If the status code is `201`, the request succeeds, and the response body include The following is a response example for a successful request: -``` json +```json { "project": { "id": "xxxx", @@ -132,7 +132,7 @@ If the status code is `201`, the request succeeds, and the response body include The following is a response example for a successful request: -``` json +```json { "projects": [ { @@ -180,7 +180,7 @@ If the status code is `201`, the request succeeds, and the response body include The following is a response example for a successful request: -``` json +```json { "projects": [ { @@ -230,7 +230,7 @@ Pass in the following parameters in the request body: **Request body** -``` json +```json { "id": "xxxx", "status": 0 @@ -253,7 +253,7 @@ If the status code is `201`, the request succeeds, and the response body include The following is a response example for a successful request: -``` json +```json { "project": { "id": "xxxx", @@ -292,7 +292,7 @@ 
Pass in the following parameters in the request body: **Request body** -``` json +```json { "id": "xxxx", "recording_server": "10.12.1.5:8080" @@ -315,7 +315,7 @@ If the status code is `201`, the request succeeds, and the response body include The following is a response example for a successful request: -``` json +```json { "project": { "id": "xxxx", @@ -354,7 +354,7 @@ The following parameters are required in the request body: **Request body** -``` json +```json { "id": "xxxx", "enable": true @@ -377,7 +377,7 @@ If the status code is `201`, the request succeeds, and the response body include The following is a response example for a successful request: -``` json +```json { "project": { "id": "xxxx", @@ -415,7 +415,7 @@ Pass in the following parameter in the request body: **Request body** -``` json +```json { "id": "xxxx" } @@ -437,7 +437,7 @@ If the status code is `201`, the request succeeds, and the response body include The following is a response example for a successful request: -``` json +```json { "project": { "id": "xxxx", @@ -497,7 +497,7 @@ If the status code is `200`, the request succeeds, and the response body include The following is a response example for a successful request: -``` json +```json { "meta": { "durationAudioAll": { diff --git a/shared/common/manage-agora-account/_get-appid-token.mdx b/shared/common/manage-agora-account/_get-appid-token.mdx index 7c1d8b57e..72e6d06ee 100644 --- a/shared/common/manage-agora-account/_get-appid-token.mdx +++ b/shared/common/manage-agora-account/_get-appid-token.mdx @@ -1,5 +1,3 @@ -import * as data from '@site/data/variables'; - ### Get Started with Agora @@ -57,28 +55,31 @@ To generate a set of Customer ID and Customer Secret, do the following: #### Generate a temporary token -When a user attempts to join a channel, your app passes an encrypted authentication token to , this token is unique for each channel. It is generated using the App ID of your project and the channel name. 
In a test or production environment, your app retrieves the token from a token server. However, for local testing, you can generate a temporary token in . - -1. In , open [Project Management](https://console.agora.io/projects), select your project and click **Config**. +To ensure communication security, best practice is to use tokens to authenticate the users who log in to + from your . -2. Under **Features**, click **Generate temp token**. +To generate a temporary token: -3. Type the **Channel Name**, then click **Generate**. +1. In [ > Project Management](https://console.agora.io/projects), select your project and click +**Configure**. - generates a token valid to join **Channel Name** only. +1. In your browser, navigate to the [ token builder](https://agora-token-generator-demo.vercel.app/). -4. Click **Copy**. +1. Choose the product your user wants to log in to. Fill in *App ID* and *App Certificate* with the +details of your project in . - **Temp Token** is added to the clipboard of your development machine. +1. Customize the token for each user. The required fields are visible in token builder. -#### Generate an token + + In , each authentication token you create is specific for a user ID in your . You create +a token for each user in the channel. When you call `login` using , ensure that the UID is the same +as you used to create the token. -Applicable products of tokens include and Flexible Classroom. + -To ensure communication security, Agora recommends using tokens to authenticate users logging in to an system. +1. Click **Generate Token** -For testing purposes, Agora Console supports generating tokens. To generate an token: + The token appears in Token Builder. -1. Visit [token builder](https://webdemo.agora.io/token-builder). +1. Add the token to your . -2. Fill in the App ID, App certificate, and user ID to log in to the system. You need to specify the user ID yourself (for example, "test"). The generated token is showed on the screen. 
When calling the `login` method later, ensure that the user ID is the same with the one that you use to generate the token. diff --git a/shared/common/no-uikit.mdx b/shared/common/no-uikit.mdx index 8eafe9e7f..c09b182f3 100644 --- a/shared/common/no-uikit.mdx +++ b/shared/common/no-uikit.mdx @@ -1,5 +1,5 @@ import * as data from '@site/data/variables'; - + **Currently, there is no for .** diff --git a/shared/common/prerequities-get-started.mdx b/shared/common/prerequities-get-started.mdx new file mode 100644 index 000000000..bf681ea76 --- /dev/null +++ b/shared/common/prerequities-get-started.mdx @@ -0,0 +1,77 @@ + +- [Android Studio](https://developer.android.com/studio) 4.1 or higher. +- Android SDK API Level 24 or higher. +- A mobile device that runs Android 4.1 or higher. + + +- Xcode 12.0 or higher. +- A device running iOS 9.0 or higher. + + +- Xcode 12.0 or higher. +- A device running macOS 10.11 or higher. +- An Apple developer account. + + +- A device running Windows 7 or higher. +- Microsoft Visual Studio 2017 or higher with [Desktop development with C++](https://devblogs.microsoft.com/cppblog/windows-desktop-development-with-c-in-visual-studio/) support. + + +- A [supported browser](../reference/supported-platforms#browsers). +- Physical media input devices, such as a camera and a microphone. +- A JavaScript package manager such as [npm](https://www.npmjs.com/package/npm). + + +- [Flutter](https://docs.flutter.dev/get-started/install) 2.0.0 or higher +- Dart 2.15.1 or higher +- [Android Studio](https://developer.android.com/studio), IntelliJ, VS Code, or any other IDE that supports Flutter; see [Set up an editor](https://docs.flutter.dev/get-started/editor). + +- If your target platform is iOS: + - Xcode on macOS (latest version recommended) + - A physical iOS device + - iOS version 12.0 or later + +- If your target platform is Android: + - Android Studio on macOS or Windows (latest version recommended) + - An Android emulator or a physical Android device.
+ +- If you are developing a desktop application for Windows, macOS or Linux, make sure your development device meets the [Flutter desktop development requirements](https://docs.flutter.dev/development/platform-integration/desktop). + + + +- A [supported browser](../reference/supported-platforms#browsers). +- Physical media input devices, such as a camera and a microphone. +- A JavaScript package manager such as [npm](https://www.npmjs.com/package/npm). + + +- React Native 0.60 or later. For more information, see [Setting up the development environment](https://reactnative.dev/docs/environment-setup). +- Node 10 or later +- For iOS + - A machine running macOS + - Xcode 10 or later + - CocoaPods + - A physical or virtual mobile device running iOS 9.0 or later. If you use React Native 0.63 or later, ensure your iOS version is 10.0 or later. +- For Android + - A machine running macOS, Windows, or Linux + - [Java Development Kit (JDK) 11](https://openjdk.org/projects/jdk/11/) or later + - Android Studio + - A physical or virtual mobile device running Android 5.0 or later + + +- [Unity Hub](https://unity.com/download) +- [Unity Editor 2017.X LTS or higher](https://unity.com/releases/editor/archive) +- Microsoft Visual Studio 2017 or higher + + +- Physical media input devices, such as a camera and a microphone. +- A JavaScript package manager such as [npm](https://www.npmjs.com/package/npm). + + +* A device running Linux Ubuntu 14.04 or above; 18.04+ is recommended. +* At least 2 GB of memory. +* `cmake` 3.6.0 or above. + +- An [account](../reference/manage-agora-account#create-an-agora-account) and [project](../reference/manage-agora-account#create-an-agora-project). +- A computer with Internet access. + + Ensure that no firewall is blocking your network communication. 
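The prerequisites above list several version minimums (Flutter 2.0.0, Dart 2.15.1, `cmake` 3.6.0, Node 10, Ubuntu 18.04+). As a convenience when checking a development machine, a dotted version string can be compared with `sort -V`. This helper is a sketch only and is not part of the reference apps; it assumes GNU `sort` (coreutils or BusyBox) is available:

```shell
# Hypothetical helper, not part of the reference apps:
# returns success (exit 0) when $1 >= $2 for dotted version strings.
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Example: check the installed cmake against the 3.6.0 minimum listed above.
if version_ge "$(cmake --version 2>/dev/null | awk 'NR==1{print $3}')" "3.6.0"; then
  echo "cmake OK"
fi
```

The same pattern works for `node --version`, `flutter --version`, and the other tools listed, after stripping any leading `v`.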
diff --git a/shared/common/prerequities.mdx b/shared/common/prerequities.mdx index 81237c7dd..cdbe5d10c 100644 --- a/shared/common/prerequities.mdx +++ b/shared/common/prerequities.mdx @@ -1,20 +1,17 @@ - -- [Android Studio](https://developer.android.com/studio) 4.1 or higher. -- Android SDK API Level 24 or higher. -- A mobile device that runs Android 4.1 or higher. +To test the code used in this page, you need to have: + +* Implemented either of the following: + - [ quickstart](../get-started/get-started-uikit) + - [SDK quickstart](../get-started/get-started-sdk) - -- Xcode 12.0 or higher. -- A device running iOS 9.0 or higher. + +* Set up the [ reference app](../get-started/get-started-sdk#prerequisites) for . - -- Xcode 12.0 or higher. -- A device running macOs 10.11 or higher. -- An Apple developer account + +* Same setup as the [ quickstart prerequisites](../get-started/get-started-sdk#prerequisites) for . - -- A device running Windows 7 or higher. -- Microsoft Visual Studio 2017 or higher with [Desktop development with C++](https://devblogs.microsoft.com/cppblog/windows-desktop-development-with-c-in-visual-studio/) support. + +* Implemented the [SDK quickstart](../get-started/get-started-sdk) - [Visual Studio 2019](https://visualstudio.microsoft.com/downloads/) or higher with C++ and desktop development support. diff --git a/shared/common/project-setup/android.mdx b/shared/common/project-setup/android.mdx new file mode 100644 index 000000000..031fca645 --- /dev/null +++ b/shared/common/project-setup/android.mdx @@ -0,0 +1,16 @@ + + +1. **Clone the [ reference app](https://github.com/AgoraIO/video-sdk-samples-android) repository**. + + Navigate to your `` folder and run the following command: + + ```bash + git clone https://github.com/AgoraIO/video-sdk-samples-android.git + ``` + +1. **Open the reference app in Android Studio**: + + From the **File** menu, select **Open...** then navigate to `/video-sdk-samples-android/android-reference-app` and click **OK**.
Android Studio loads the project and Gradle sync downloads the dependencies. + + + diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/electron.mdx b/shared/common/project-setup/electron.mdx similarity index 95% rename from shared/video-sdk/get-started/get-started-sdk/project-setup/electron.mdx rename to shared/common/project-setup/electron.mdx index 23cb0aff0..c6bd811f2 100644 --- a/shared/video-sdk/get-started/get-started-sdk/project-setup/electron.mdx +++ b/shared/common/project-setup/electron.mdx @@ -6,7 +6,7 @@ 2. Execute the following command in the terminal: - ``` bash + ```bash git clone https://github.com/electron/electron-quick-start ``` This command clones the Electron quick-start project that you use to implement . @@ -15,7 +15,7 @@ Open a terminal window in your project folder and execute the following command to download and install the . - ``` bash + ```bash npm i agora-electron-sdk ``` Make sure the path to your project folder does not contain any spaces. This might cause error during the installation. diff --git a/shared/common/project-setup/flutter.mdx b/shared/common/project-setup/flutter.mdx new file mode 100644 index 000000000..0cd874004 --- /dev/null +++ b/shared/common/project-setup/flutter.mdx @@ -0,0 +1,33 @@ + + +1. **Set up your Flutter environment** + + Follow the procedure to [Install Flutter](https://docs.flutter.dev/get-started/install) and [Set up an editor](https://docs.flutter.dev/get-started/editor). To check your installation status, run the following command: + + ```bash + flutter doctor + ``` + + Flutter helps diagnose any issues with your development environment. Make sure that your system passes all the checks. + +1. **Clone the [ reference app](https://github.com/AgoraIO/video-sdk-samples-flutter) repository** + + Navigate to your `` folder and run the following command: + + ```bash + git clone https://github.com/AgoraIO/video-sdk-samples-flutter.git + ``` + +1. 
**Open the reference app in your IDE** + + Launch Android Studio or the IDE of your choice. From the **File** menu, select **Open...** then navigate to `/video-sdk-samples-flutter/flutter-reference-app` and click **OK**. + +1. **Install project dependencies** + + Run the following command in `video-sdk-samples-flutter/flutter-reference-app`: + + ```bash + flutter pub get + ``` + + \ No newline at end of file diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/index.mdx b/shared/common/project-setup/index.mdx similarity index 90% rename from shared/video-sdk/get-started/get-started-sdk/project-setup/index.mdx rename to shared/common/project-setup/index.mdx index 067705e45..e9b0b466e 100644 --- a/shared/video-sdk/get-started/get-started-sdk/project-setup/index.mdx +++ b/shared/common/project-setup/index.mdx @@ -3,6 +3,7 @@ import Ios from './ios.mdx'; import MacOS from './macos.mdx'; import Web from './web.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from './react-js.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import Unity from './unity.mdx'; @@ -15,6 +16,7 @@ import Unreal from './unreal.mdx'; + diff --git a/shared/common/project-setup/ios.mdx b/shared/common/project-setup/ios.mdx new file mode 100644 index 000000000..da9a5e468 --- /dev/null +++ b/shared/common/project-setup/ios.mdx @@ -0,0 +1,17 @@ + + +1. **Clone the [ reference app](https://github.com/AgoraIO/video-sdk-samples-ios) to +`` on your development machine**: + + ```bash + git clone https://github.com/AgoraIO/video-sdk-samples-ios.git + ``` + +1. **Open the sample project in Xcode**: + + Select **File** > **Open...** then navigate to `/video-sdk-samples-ios/Example-App/Docs-Examples.xcodeproj` and click **Open**. Xcode loads the project. + +1. **Connect a physical or virtual device to your development environment**.
+ + diff --git a/shared/common/project-setup/macos.mdx b/shared/common/project-setup/macos.mdx new file mode 100644 index 000000000..085d44b99 --- /dev/null +++ b/shared/common/project-setup/macos.mdx @@ -0,0 +1,15 @@ + + + 1. Clone the [ sample project](https://github.com/AgoraIO/video-sdk-samples-macos) to `` on + your + development machine: + + ```bash + git clone https://github.com/AgoraIO/video-sdk-samples-macos.git + ``` + + 1. Open the sample project in Xcode. + + Select **File** > **Open...** then navigate to `/video-sdk-samples-macos/Docs-Examples.xcodeproj` and click **Open**. Xcode loads the project. + + \ No newline at end of file diff --git a/shared/common/project-setup/react-js.mdx b/shared/common/project-setup/react-js.mdx new file mode 100644 index 000000000..3d2939281 --- /dev/null +++ b/shared/common/project-setup/react-js.mdx @@ -0,0 +1,22 @@ + + + +1. **Clone the [ reference app](https://github.com/AgoraIO/video-sdk-samples-reactjs) to your + development environment**: + + ```bash + git clone https://github.com/AgoraIO/video-sdk-samples-reactjs + ``` + +1. **Install the dependencies**: + + In Terminal, navigate to `video-sdk-samples-reactjs`, and execute the following command. + + ```bash + npm install + ``` + is installed automatically. However, you can also [Install manually](../reference/downloads#through-the-agora-website). 
+ + + + \ No newline at end of file diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/react-native.mdx b/shared/common/project-setup/react-native.mdx similarity index 100% rename from shared/video-sdk/get-started/get-started-sdk/project-setup/react-native.mdx rename to shared/common/project-setup/react-native.mdx diff --git a/shared/common/project-setup/swift.mdx b/shared/common/project-setup/swift.mdx new file mode 100644 index 000000000..e69de29bb diff --git a/shared/common/project-setup/unity.mdx b/shared/common/project-setup/unity.mdx new file mode 100644 index 000000000..8c4abc9ed --- /dev/null +++ b/shared/common/project-setup/unity.mdx @@ -0,0 +1,27 @@ + +1. **Clone the repository** + + To clone the repository to your local machine, open Terminal, navigate to the directory where you want to clone it, and run the following command: + + ```bash + git clone https://github.com/AgoraIO/video-sdk-samples-unity.git + ``` + +1. **Open the project** + + 1. In Unity Hub, open `video-sdk-samples-unity`. Unity Editor opens the project. + + Unity Editor warns of compile errors. Don't worry, you fix them when you import Video SDK for Unity. + + 1. Unzip [the latest version of the Agora Video SDK](https://docs.agora.io/en/sdks?platform=unity) to a local folder. + + 1. In **Unity**, click **Assets** > **Import Package** > **Custom Package**. + + 1. Navigate to the Video SDK package and click **Open**. + + 1. In **Import Unity Package**, click **Import**. + + Unity recompiles the Video SDK samples for Unity and the warnings disappear.
+ + + \ No newline at end of file diff --git a/shared/common/project-setup/unreal.mdx b/shared/common/project-setup/unreal.mdx new file mode 100644 index 000000000..c5e82c270 --- /dev/null +++ b/shared/common/project-setup/unreal.mdx @@ -0,0 +1,3 @@ + + + \ No newline at end of file diff --git a/shared/common/project-setup/web.mdx b/shared/common/project-setup/web.mdx new file mode 100644 index 000000000..2a69bf074 --- /dev/null +++ b/shared/common/project-setup/web.mdx @@ -0,0 +1,26 @@ + + +1. **Clone the [ reference app](https://github.com/AgoraIO/video-sdk-samples-js) repository**. + + Navigate to your `` folder and run the following command: + + + ```bash + git clone https://github.com/AgoraIO/video-sdk-samples-js.git + ``` + + + ```bash + git clone https://github.com/AgoraIO/signaling-sdk-samples-web + ``` + + +1. **Install dependencies**: + + Navigate to the root directory of the cloned repository and run the following command: + + ```bash + pnpm install + ``` + + diff --git a/shared/common/project-setup/windows.mdx b/shared/common/project-setup/windows.mdx new file mode 100644 index 000000000..f100579f9 --- /dev/null +++ b/shared/common/project-setup/windows.mdx @@ -0,0 +1,28 @@ + + + +1. Clone the Git repository of sample code for Windows by executing the following command in a terminal window: ```bash + git clone https://github.com/AgoraIO/video-sdk-samples-windows + ``` +2. Replace `video-sdk-samples-windows/agora_manager/SDK` with the latest Agora Video SDK that you downloaded and unzipped to a local folder. +3. In your solution directory (`video-sdk-samples-windows/agora_manager`), create a folder named `/`. For this sample, create the nested folder `x64/Debug`: first create a folder named `x64`, and then within that folder, create another folder named `Debug`. +4. Copy the contents of `video-sdk-samples-windows/agora_manager/SDK/x86_64` to `video-sdk-samples-windows/agora_manager/x64/Debug`. +5.
**Install the required third-party libraries through vcpkg:** + + 1. Install `vcpkg`: + + Open a command prompt, navigate to `video-sdk-samples-windows`, and run the following commands: + ```bash + git clone https://github.com/Microsoft/vcpkg.git + cd vcpkg + .\bootstrap-vcpkg.bat + ``` + 1. Install the required packages: + + Install the vcpkg packages that the project needs. Note that you install the x64-windows version of each library, because this sample targets 64-bit Windows. + ```bash + .\vcpkg.exe install jsoncpp:x64-windows + .\vcpkg.exe install curl:x64-windows + .\vcpkg.exe install opencv:x64-windows + ``` + \ No newline at end of file diff --git a/shared/common/project-test/android.mdx b/shared/common/project-test/android.mdx new file mode 100644 index 000000000..21d0343ff --- /dev/null +++ b/shared/common/project-test/android.mdx @@ -0,0 +1,28 @@ + +4. **Set the APP ID** + + In `agora-manager/res/raw/config.json`, set `appId` to the [AppID](../reference/manage-agora-account?platform=android#get-the-app-id) of your project. + +1. **Set the authentication method** + + Choose one of the following authentication methods: + + - **Temporary token**: + 1. Set `rtcToken` to the value of your [temporary token](../reference/manage-agora-account#generate-a-temporary-token). + 1. Set `channelName` to the name of a channel you used to create the token. + - **Authentication server**: + 1. Set up an [Authentication server](../get-started/authentication-workflow?platform#create-and-run-a-token-server) + 1. In `config.json`, set: + - `channelName` to the name of a channel you want to join. + - `rtcToken` to an empty string. + - `serverUrl` to the base URL of your authentication server. For example, `https://agora-token-service-production-1234.up.railway.app`. + +1. **Start the reference app** + + 1. In Android Studio, connect a physical Android device to your development machine. + + 1. Click **Run** to start the app.
+ + A moment later you see the project installed on your device. If this is the first time you run the project, you need to grant microphone and camera access to your . + + diff --git a/shared/common/project-test/clone-project.mdx b/shared/common/project-test/clone-project.mdx new file mode 100644 index 000000000..aa7cd3ea8 --- /dev/null +++ b/shared/common/project-test/clone-project.mdx @@ -0,0 +1,28 @@ + + - documentation app + + + - documentation app + + + - documentation app + + + - documentation app + + + - documentation app + + + - documentation app + + + - documentation app + + + - documentation app + + + - documentation app + + diff --git a/shared/common/project-test/electron.mdx b/shared/common/project-test/electron.mdx new file mode 100644 index 000000000..c6bd811f2 --- /dev/null +++ b/shared/common/project-test/electron.mdx @@ -0,0 +1,22 @@ + + + 1. Take the following steps to setup a new Electron project: + + 1. Open a terminal window and navigate to the directory where you want to create the project. + + 2. Execute the following command in the terminal: + + ```bash + git clone https://github.com/electron/electron-quick-start + ``` + This command clones the Electron quick-start project that you use to implement . + + 2. Install the + + Open a terminal window in your project folder and execute the following command to download and install the . + + ```bash + npm i agora-electron-sdk + ``` + Make sure the path to your project folder does not contain any spaces. This might cause error during the installation. + diff --git a/shared/common/project-test/flutter.mdx b/shared/common/project-test/flutter.mdx new file mode 100644 index 000000000..75bf41d81 --- /dev/null +++ b/shared/common/project-test/flutter.mdx @@ -0,0 +1,28 @@ + + +3. **Set the APP ID** + In `flutter-reference-app/assets/config/config.json`, set `appId` to the [AppID](../reference/manage-agora-account?platform=android#get-the-app-id) of your project. + +1. 
**Set the authentication method** + + Choose one of the following authentication methods: + + - **Temporary token**: + 1. Set `rtcToken` to the value of your [temporary token](../reference/manage-agora-account#generate-a-temporary-token). + 1. Set `channelName` to the name of a channel you used to create the token. + - **Authentication server**: + 1. Set up an [Authentication server](../get-started/authentication-workflow?platform#create-and-run-a-token-server) + 1. In `config.json`, set: + - `channelName` to the name of a channel you want to join. + - `rtcToken` to an empty string. + - `serverUrl` to the base URL of your authentication server. For example, `https://agora-token-service-production-1234.up.railway.app`. + +1. **Start the reference app** + + 1. Connect a physical mobile device to your development machine. + + 1. In your IDE, click **Run** to start the app. + + A moment later you see the project installed on your device. If this is the first time you run the project, you need to grant microphone and camera access to your . + + diff --git a/shared/common/project-test/generate-temp-rtc-token.mdx b/shared/common/project-test/generate-temp-rtc-token.mdx new file mode 100644 index 000000000..b5f02da59 --- /dev/null +++ b/shared/common/project-test/generate-temp-rtc-token.mdx @@ -0,0 +1 @@ +[Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in [](https://console.agora.io/).
\ No newline at end of file diff --git a/shared/common/project-test/index.mdx b/shared/common/project-test/index.mdx index 47d8fe822..70939fe59 100644 --- a/shared/common/project-test/index.mdx +++ b/shared/common/project-test/index.mdx @@ -1,3 +1,25 @@ +import Android from './android.mdx'; +import Ios from './ios.mdx'; +import MacOS from './macos.mdx'; import Web from './web.mdx'; +import ReactNative from './react-native.mdx'; +import ReactJS from './react-js.mdx'; +import Electron from './electron.mdx'; +import Flutter from './flutter.mdx'; +import Unity from './unity.mdx'; +import Windows from './windows.mdx'; +import CloneProj from './clone-project.mdx' + + + + + + + + + + + + diff --git a/shared/video-sdk/develop/spatial-audio/project-implementation/ios.mdx b/shared/common/project-test/ios.mdx similarity index 100% rename from shared/video-sdk/develop/spatial-audio/project-implementation/ios.mdx rename to shared/common/project-test/ios.mdx diff --git a/shared/common/project-test/load-web-demo.mdx b/shared/common/project-test/load-web-demo.mdx new file mode 100644 index 000000000..5f3b3217a --- /dev/null +++ b/shared/common/project-test/load-web-demo.mdx @@ -0,0 +1 @@ +In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. 
diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/macos.mdx b/shared/common/project-test/macos.mdx similarity index 99% rename from shared/video-sdk/get-started/get-started-sdk/project-setup/macos.mdx rename to shared/common/project-test/macos.mdx index 74e608cdd..38ff7300d 100644 --- a/shared/video-sdk/get-started/get-started-sdk/project-setup/macos.mdx +++ b/shared/common/project-test/macos.mdx @@ -1,6 +1,5 @@ import Source from './swift.mdx'; - diff --git a/shared/common/project-test/open-config-file.mdx b/shared/common/project-test/open-config-file.mdx new file mode 100644 index 000000000..6e27f3cd4 --- /dev/null +++ b/shared/common/project-test/open-config-file.mdx @@ -0,0 +1,36 @@ + + + Open the file `/signaling-manager/src/main/res/raw/config.json` + + + + Open the file `/signaling-manager/config.json` + + + + Open the file `/src/signaling_manager/config.json` + + + + + + Open the file `/agora-manager/res/raw/config.json` + + + + Open the file `DocsAppConfig.swift` + + + + Open the file `/src/agora_manager/config.json` + + + + Open the file `/Assets/agora-manager/config.json` + + + + Open the file `/src/agora-manager/config.json` + + + \ No newline at end of file diff --git a/shared/common/project-test/react-js.mdx b/shared/common/project-test/react-js.mdx new file mode 100644 index 000000000..aa1e6edad --- /dev/null +++ b/shared/common/project-test/react-js.mdx @@ -0,0 +1,41 @@ + + + +3. **Set the APP ID** + + In `src/agora-manager/config.json` set `appId` to the [AppID](../reference/manage-agora-account?platform=android#get-the-app-id) of your project. + +1. **Set the authentication method**: + + - **Temporary token**: + 1. Set `rtcToken` with the value of your [temporary token](../reference/manage-agora-account#generate-a-temporary-token). + 1. Set `channelName` - with the name of a channel you used to create the token. + - **Authentication server**: + 1. 
Set up an [Authentication server](../get-started/authentication-workflow?platform#create-and-run-a-token-server) + 1. In `config.json`: + - Set `rtcToken` to an empty string. + - Set `serverUrl` to the base URL of your authentication server. For example, `https://agora-token-service-production-1234.up.railway.app`. + 1. Start a proxy server so this web app can make HTTP calls to fetch a token. In a Terminal instance in the reference app root, run the following command: + + ```bash + node ./utils/proxy.js + ``` + +1. **Start this reference app**: + + In Terminal, run the following command: + + ```bash + yarn dev + ``` +1. **Open the project in your browser**: + + The default URL is http://localhost:5173/. + +1. **Test **: + + In **Choose a product**, select . + + + + \ No newline at end of file diff --git a/shared/common/project-test/react-native.mdx b/shared/common/project-test/react-native.mdx new file mode 100644 index 000000000..e45dcb562 --- /dev/null +++ b/shared/common/project-test/react-native.mdx @@ -0,0 +1,61 @@ + + +1. **Set up a React Native environment for your project** + + In the terminal, run the following command: + + ```bash + npx react-native init ProjectName --template react-native-template-typescript + ``` + + npx creates a new boilerplate project in the `ProjectName` folder. + + For Android projects, enable the project to use the Android SDK. In the `android` folder of your project, set the `sdk.dir` in the `local.properties` file. For example: + + + ```bash + sdk.dir=C:\\PATH\\TO\\ANDROID\\SDK + ``` + +1. **Test the setup** + + Launch your Android or iOS simulator and run your project by executing the following commands: + + 1. Run `npx react-native start` in the root of your project to start Metro. + 1. Open another terminal in the root of your project and run `npx react-native run-android` to start the Android app, or run `npx react-native run-ios` to start the iOS app. + + You see your new app running in your Android or iOS simulator.
You can also run your project on a physical Android or iOS device. For detailed instructions, see [Running on device](https://reactnative.dev/docs/running-on-device). + +1. **Integrate and configure ** + + To integrate on React Native 0.60.0 or later: + 1. Navigate to the root folder of your project in the terminal and integrate with either: + - npm + + ```bash + npm i --save react-native-agora + ``` + + - yarn + + ```bash + # Install yarn. + npm install -g yarn + # Download the Agora React Native SDK using yarn. + yarn add react-native-agora + ``` + + Do not link native modules manually; React Native 0.60.0 and later support [Autolinking](https://github.com/react-native-community/cli/blob/main/docs/autolinking.md). + + 1. If your target platform is iOS, use CocoaPods to install : + + ```bash + npx pod-install + ``` + + 1. Because uses Swift in native modules, your project must support compiling Swift. To create `File.swift`: + + 1. In Xcode, open `ios/ProjectName.xcworkspace`. + 1. Click **File > New > File**, select **iOS > Swift File**, then click **Next > Create**. + + \ No newline at end of file diff --git a/shared/common/project-test/rtc-first-steps.mdx b/shared/common/project-test/rtc-first-steps.mdx new file mode 100644 index 000000000..074984bbd --- /dev/null +++ b/shared/common/project-test/rtc-first-steps.mdx @@ -0,0 +1,26 @@ +import OpenConfig from '@docs/shared/common/project-test/open-config-file.mdx'; +import SetAppId from '@docs/shared/common/project-test/set-app-id.mdx'; +import SetAuthenticationMethod from '@docs/shared/common/project-test/set-authentication-rtc.mdx'; +import RunApp from '@docs/shared/common/project-test/run-reference-app.mdx'; +import GetTempToken from '@docs/shared/common/project-test/generate-temp-rtc-token.mdx'; +import LoadWebDemo from '@docs/shared/common/project-test/load-web-demo.mdx'; +import CloneProj from './clone-project.mdx' + +1. **Load the web demo** + + 1. + 1. + +1. **Clone the documentation reference app** + + + +1.
**Configure the project** + + 1. + 1. + 1. + +1. **Run the reference app** + + diff --git a/shared/common/project-test/run-reference-app.mdx b/shared/common/project-test/run-reference-app.mdx new file mode 100644 index 000000000..1d8a804ad --- /dev/null +++ b/shared/common/project-test/run-reference-app.mdx @@ -0,0 +1,39 @@ + + + 1. In Android Studio, connect a physical Android device to your development machine. + 1. Click **Run** to launch the app. + 1. A moment later you see the project installed on your device. + + + + + + + + In Terminal, navigate to ``, then run the following command: + + ```bash + pnpm dev + ``` + + Use the URL displayed in the terminal to open the in your browser. + + + + In Terminal, run the following command: + + ```bash + yarn dev + ``` + + + + In Unity, click **Play**. You see the game running on your device. + + + + If this is the first time you run the project, grant microphone and camera access to the app. + + + If this is the first time you run the project, grant microphone access to the app. + \ No newline at end of file diff --git a/shared/common/project-test/set-app-id.mdx b/shared/common/project-test/set-app-id.mdx new file mode 100644 index 000000000..a28ab19d5 --- /dev/null +++ b/shared/common/project-test/set-app-id.mdx @@ -0,0 +1 @@ + Set `appId` to the [AppID](../reference/manage-agora-account#get-the-app-id) of your project. \ No newline at end of file diff --git a/shared/common/project-test/set-authentication-rtc.mdx b/shared/common/project-test/set-authentication-rtc.mdx new file mode 100644 index 000000000..575cbba57 --- /dev/null +++ b/shared/common/project-test/set-authentication-rtc.mdx @@ -0,0 +1,11 @@ + Choose one of the following authentication methods: + + - **Temporary token** + 1. [Generate an RTC token](https://agora-token-generator-demo.vercel.app/) using your `uid` and `channelName` and set `rtcToken` to this value in `config.json`. + 1. 
Set `channelName` to the name of the channel you used to create the `rtcToken`. + - **Authentication server** + 1. Set up an [Authentication server](../get-started/authentication-workflow?platform#create-and-run-a-token-server) + 1. In `config.json`, set: + - `channelName` to the name of a channel you want to join. + - `token` and `rtcToken` to empty strings. + - `serverUrl` to the base URL for your token server. For example: `https://agora-token-service-production-yay.up.railway.app`. \ No newline at end of file diff --git a/shared/common/project-test/swift.mdx b/shared/common/project-test/swift.mdx new file mode 100644 index 000000000..d9f9fa4ef --- /dev/null +++ b/shared/common/project-test/swift.mdx @@ -0,0 +1,24 @@ + + +4. **Set the APP ID** + + In `DocsAppConfig.swift`, set `appId` to the [AppID](../reference/manage-agora-account#get-the-app-id) of your project. + +1. **Set the authentication method**: + + - **Temporary token**: + 1. Set `rtcToken` to the value of your [temporary token](../reference/manage-agora-account#generate-a-temporary-token). + 1. Set `channelName` to the name of a channel you used to create the token. + - **Authentication server**: + 1. Set up an [Authentication server](../get-started/authentication-workflow#create-and-run-a-token-server) + 1. In `DocsAppConfig.swift`: + + 1. Set `rtcToken` to an empty string. + 1. Set `serverUrl` to the base URL of your authentication server. For example, `https://agora-token-service-production-1234.up.railway.app`. + +1. **Run the reference **: + + If this is the first time you run the project, grant microphone and camera access to your . + + + \ No newline at end of file diff --git a/shared/common/project-test/unity.mdx b/shared/common/project-test/unity.mdx new file mode 100644 index 000000000..fdfb4e54d --- /dev/null +++ b/shared/common/project-test/unity.mdx @@ -0,0 +1,25 @@ + + +3.
**Set the APP ID** + + In `Assets/agora-manager/config.json`, set `appId` to the [AppID](../reference/manage-agora-account?platform=android#get-the-app-id) of your project. + +1. **Set the authentication method** + + Choose one of the following authentication methods: + + - **Temporary token**: + 1. Set `rtcToken` to the value of your [temporary token](../reference/manage-agora-account#generate-a-temporary-token). + 1. Set `channelName` to the name of a channel you used to create the token. + - **Authentication server**: + 1. Set up an [Authentication server](../get-started/authentication-workflow?platform#create-and-run-a-token-server) + 1. In `config.json`, set: + - `channelName` to the name of a channel you want to join. + - `rtcToken` to an empty string. + - `serverUrl` to the base URL of your authentication server. For example, `https://agora-token-service-production-1234.up.railway.app`. + +1. **Run the reference app**: + + In Unity, click **Play**. You see the running on your device. + + \ No newline at end of file diff --git a/shared/common/project-test/windows.mdx b/shared/common/project-test/windows.mdx new file mode 100644 index 000000000..fe9c3fa65 --- /dev/null +++ b/shared/common/project-test/windows.mdx @@ -0,0 +1,32 @@ + + + +3. **Set the APP ID** + + In `video-sdk-samples-windows/agora-manager/config.json`, set `appId` to the [AppID](../reference/manage-agora-account?platform=android#get-the-app-id) of your project. + +4. **Set the authentication method** + + Choose one of the following authentication methods: + + - **Temporary token**: + 1. Set `rtcToken` to the value of your [temporary token](../reference/manage-agora-account#generate-a-temporary-token). + 1. Set `channelName` to the name of a channel you used to create the token. + - **Authentication server**: + 1. Set up an [Authentication server](../get-started/authentication-workflow?platform#create-and-run-a-token-server) + 1.
In `video-sdk-samples-windows/agora-manager/config.json`, set: + - `channelName` to the name of a channel you want to join. + - `rtcToken` to an empty string. + - `serverUrl` to the base URL of your authentication server. For example, `https://agora-token-service-production-1234.up.railway.app`. + +5. **Run the sample** + + 1. Double-click `video-sdk-samples-windows/agora_manager/agora_manager.sln` to open the solution in Visual Studio. + + 1. In **Solution Explorer**, right-click the project you want to run and select **Set as Startup Project**. + + 1. Build and run the project. + + A moment later, you see the project running on your device. If this is the first time you run the project, you need to grant microphone and camera access to your . + + \ No newline at end of file diff --git a/shared/extensions-marketplace/_develop-an-audio-filter.mdx b/shared/extensions-marketplace/_develop-an-audio-filter.mdx index b17b0f724..6dd4738cb 100644 --- a/shared/extensions-marketplace/_develop-an-audio-filter.mdx +++ b/shared/extensions-marketplace/_develop-an-audio-filter.mdx @@ -11,8 +11,7 @@ import ProjectTest from '@docs/shared/extensions-marketplace/common/_project-tes The audio filters you created are easily integrated into apps to supply your voice effects and noise cancellation. - - + ## Understand the tech @@ -64,5 +63,4 @@ To ensure that you have integrated the extension in your : This section contains information that completes the information in this page, or points you to documentation that explains other aspects to this product.
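The per-platform steps above all converge on the same `config.json` shape: `appId` always set, and either a temporary `rtcToken` or an empty `rtcToken` plus a `serverUrl`. The sketch below checks that shape programmatically; `validateConfig` is a hypothetical helper for illustration, not part of any Agora SDK, and the field names are taken from the steps above:

```javascript
// Hypothetical helper: checks a config object shaped like the
// config.json described in the steps above. Not an Agora SDK API.
function validateConfig(config) {
  const errors = [];
  if (!config.appId) {
    errors.push("appId must be set to your project's App ID");
  }
  if (!config.channelName) {
    errors.push("channelName must name the channel to join");
  }
  const usesTokenServer = Boolean(config.serverUrl);
  if (usesTokenServer && config.rtcToken !== "") {
    // With an authentication server, rtcToken starts out empty and
    // is fetched at run time from serverUrl.
    errors.push("rtcToken must be an empty string when serverUrl is set");
  }
  if (!usesTokenServer && !config.rtcToken) {
    errors.push("set rtcToken to a temporary token, or set serverUrl");
  }
  return errors;
}

// Example: authentication-server configuration (placeholder values).
const config = {
  appId: "<your app ID>",
  channelName: "demo",
  rtcToken: "",
  serverUrl: "https://agora-token-service-production-1234.up.railway.app",
};
console.log(validateConfig(config)); // []
```

Running a check like this at startup turns a silent join failure into an actionable error message.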
- \ No newline at end of file diff --git a/shared/extensions-marketplace/_use-an-extension.mdx b/shared/extensions-marketplace/_use-an-extension.mdx index 667ba6fd3..7f9f1b9c8 100644 --- a/shared/extensions-marketplace/_use-an-extension.mdx +++ b/shared/extensions-marketplace/_use-an-extension.mdx @@ -25,18 +25,16 @@ An extension accesses voice and video data when it is captured from the user's l A typical transmission pipeline consists of a chain of procedures, including capture, pre-processing, encoding, transmitting, decoding, post-processing, and play. Audio or video extensions are inserted into either the pre-processing or post-processing procedure, in order to modify the voice or video data in the transmission pipeline. - -**This functionality is not supported for Unreal Engine.** + +**Not yet available for .** - + ## Prerequisites -In order to follow this procedure you must have: -* Implemented the [](../get-started/get-started-sdk) project for . diff --git a/shared/extensions-marketplace/ai-noise-suppression.mdx b/shared/extensions-marketplace/ai-noise-suppression.mdx index 9efd5fbbf..8537f751a 100644 --- a/shared/extensions-marketplace/ai-noise-suppression.mdx +++ b/shared/extensions-marketplace/ai-noise-suppression.mdx @@ -1,7 +1,9 @@ -import { ProductWrapper } from '../../../src/mdx-components/ProductWrapper'; +import {PlatformWrapper} from "../../../src/mdx-components/PlatformWrapper"; +import Prerequisites from '@docs/shared/common/prerequities.mdx'; import ProjectImplement from '@docs/shared/extensions-marketplace/ai-noise-suppression/project-implementation/index.mdx'; import Reference from '@docs/shared/extensions-marketplace/ai-noise-suppression/reference/index.mdx'; +import ProjectTest from '@docs/shared/extensions-marketplace/ai-noise-suppression/project-test/poc3.mdx'; Agora's , enables you to suppress hundreds of types of noise and reduce distortion in human voices when multiple people speak at the same time. 
In scenarios such as online meetings, online chat rooms, video consultations with doctors, and online gaming, makes virtual communication as smooth as face-to-face interaction. @@ -30,24 +32,18 @@ In the pre-processing stage, uses deep learning noise reductio ![](/images/extensions-marketplace/ai-noise-suppression.png) - - **This functionality is not supported for Unreal Engine.** - ## Prerequisites -To use the extension, you must meet the following requirements: - -- Implemented the [](../get-started/get-started-sdk) project for . + - [Noise type](#type) matches your business scenario. For example, if you want the microphone to collect background music, the extension is not applicable because it categorizes such background music as noise. ## Implementation -This section explains how to use the latest version of the extension. Implementation for previous versions might be different. For details, see the [release notes](/extensions-marketplace/reference/release-notes#ai-noise-suppression). +This section explains how to use the latest version of the extension. Implementation for previous versions might be different. For details, see the [release notes](/extensions-marketplace/overview/release-notes#ai-noise-suppression). - - + To activate in your and set the noise reduction mode, call: @@ -108,29 +104,28 @@ To activate in your and set the noise redu - + + + + When is enabled, if detects that the device performance is not sufficient, it: - Disables - Enables traditional noise reduction -- Throws the -1054(WARN_APM_AINS_CLOSED) error code. - +- Throws the `WARN_APM_AINS_CLOSED` (-1054) error code. - - -## Reference + -This section contains in-depth technical information about . +This section completes the information on this page, or points you to documentation that explains other aspects of this product.
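When the engine reports the `WARN_APM_AINS_CLOSED` (-1054) fallback described above, an app can keep its UI consistent with what the engine is actually doing. A minimal sketch, assuming a generic warning callback that delivers numeric codes — `onWarning` and `updateUi` are illustrative names, not Agora APIs:

```javascript
// Warning code reported when AI noise suppression is disabled because
// the device cannot keep up (traditional noise reduction takes over).
const WARN_APM_AINS_CLOSED = -1054;

// Illustrative handler: wire this to whatever warning callback your
// SDK integration exposes. Returns true when the code was handled.
function onWarning(code, updateUi) {
  if (code === WARN_APM_AINS_CLOSED) {
    // Reflect the automatic fallback in the UI so the noise-suppression
    // toggle matches the engine's actual state.
    updateUi({ aiNoiseSuppression: false, traditionalNoiseReduction: true });
    return true;
  }
  return false;
}
```

Surfacing the fallback this way avoids a toggle that claims AI noise suppression is on when the engine has silently switched it off.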
-### API reference - diff --git a/shared/extensions-marketplace/ai-noise-suppression/project-implementation/index.mdx b/shared/extensions-marketplace/ai-noise-suppression/project-implementation/index.mdx index 066f07a63..9475fb8a9 100644 --- a/shared/extensions-marketplace/ai-noise-suppression/project-implementation/index.mdx +++ b/shared/extensions-marketplace/ai-noise-suppression/project-implementation/index.mdx @@ -1,18 +1,12 @@ -import Android from './android.mdx'; -import Ios from './ios.mdx'; -import Web from './web.mdx'; +import Poc3 from './poc3.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import Unity from './unity.mdx'; -import MacOs from './macos.mdx' import Windows from './windows.mdx' import ReactNative from './react-native.mdx' - + - - - diff --git a/shared/extensions-marketplace/ai-noise-suppression/project-implementation/poc3.mdx b/shared/extensions-marketplace/ai-noise-suppression/project-implementation/poc3.mdx new file mode 100644 index 000000000..604ec1e79 --- /dev/null +++ b/shared/extensions-marketplace/ai-noise-suppression/project-implementation/poc3.mdx @@ -0,0 +1,44 @@ +import ImportLibrary from '@docs/assets/code/video-sdk/ai-noise-suppression/import-library.mdx'; +import EnableDenoiser from '@docs/assets/code/video-sdk/ai-noise-suppression/enable-denoiser.mdx'; +import SetupLogging from '@docs/assets/code/video-sdk/ai-noise-suppression/setup-logging.mdx'; +import ImportPlugin from '@docs/assets/code/video-sdk/ai-noise-suppression/import-plugin.mdx'; +import ConfigureExtension from '@docs/assets/code/video-sdk/ai-noise-suppression/configure-extension.mdx'; +import SetMode from '@docs/assets/code/video-sdk/ai-noise-suppression/set-noise-reduction-mode.mdx'; +import SetLevel from '@docs/assets/code/video-sdk/ai-noise-suppression/set-reduction-level.mdx'; + + + +### Import the library + + +### Enable the denoiser + + + +### Set up logging + + + + + + + +### Import the plugin + + + + +### Enable AI noise
suppression + + + + +### Add the required imports + +### Configure the AI noise suppression extension + +### Set the noise reduction mode + +### Set the noise reduction level + + diff --git a/shared/extensions-marketplace/ai-noise-suppression/project-implementation/web.mdx b/shared/extensions-marketplace/ai-noise-suppression/project-implementation/web.mdx index 42f029150..91966c426 100644 --- a/shared/extensions-marketplace/ai-noise-suppression/project-implementation/web.mdx +++ b/shared/extensions-marketplace/ai-noise-suppression/project-implementation/web.mdx @@ -28,7 +28,7 @@ 2. If you have enabled the Content Security Policy (CSP), because Wasm files are not allowed to load in Chrome and Edge by default, you need to configure the CSP as follows: - For versions later than Chrome 97 and Edge 97 (Chrome 97 and Edge 97 included): Add `'wasm-unsafe-eval'` and `blob:` in the [`script-src`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy/script-src) options. For example: - ``` xml + ```xml ``` diff --git a/shared/extensions-marketplace/ai-noise-suppression/project-test/poc3.mdx b/shared/extensions-marketplace/ai-noise-suppression/project-test/poc3.mdx new file mode 100644 index 000000000..bfd3848ca --- /dev/null +++ b/shared/extensions-marketplace/ai-noise-suppression/project-test/poc3.mdx @@ -0,0 +1,22 @@ +import TestFirstSteps from '@docs/shared/common/project-test/rtc-first-steps.mdx'; +import ReactJS from './react-js.mdx'; + + + +## Test AI noise suppression + + + + + +5. **Join a channel** + +6. **Test noise suppression** + + Talk to the remote user connected to the web demo. Turn noise suppression on or off to see the effect of this feature. 
+ + + + + \ No newline at end of file diff --git a/shared/extensions-marketplace/ai-noise-suppression/project-test/react-js.mdx b/shared/extensions-marketplace/ai-noise-suppression/project-test/react-js.mdx new file mode 100644 index 000000000..2cded4a0c --- /dev/null +++ b/shared/extensions-marketplace/ai-noise-suppression/project-test/react-js.mdx @@ -0,0 +1,23 @@ + + +5. **Choose this sample in the reference app** + + In **Choose a sample code**, select **AI noise suppression**. + +1. **Join a channel** + + + Click **Join** to start a session. When you select **Host**, the local video is published and played in the . When you select **Audience**, the remote stream is subscribed and played. + + + + Press **Join** to connect to the same channel as your web demo. + + +1. **Test the AI noise suppression extension** + + 1. In **Noise reduction mode**, select a noise reduction mode. + 1. In **Noise reduction level**, select a noise reduction level. + + Now, observe the impact of the chosen noise reduction mode and level on the noise in your app.
+ \ No newline at end of file diff --git a/shared/extensions-marketplace/ai-noise-suppression/reference/index.mdx b/shared/extensions-marketplace/ai-noise-suppression/reference/index.mdx index 066f07a63..b6492a81a 100644 --- a/shared/extensions-marketplace/ai-noise-suppression/reference/index.mdx +++ b/shared/extensions-marketplace/ai-noise-suppression/reference/index.mdx @@ -8,6 +8,7 @@ import MacOs from './macos.mdx' import Windows from './windows.mdx' import ReactNative from './react-native.mdx' + diff --git a/shared/extensions-marketplace/ai-noise-suppression/reference/ios.mdx b/shared/extensions-marketplace/ai-noise-suppression/reference/ios.mdx index 72231ddd6..52ad025f4 100644 --- a/shared/extensions-marketplace/ai-noise-suppression/reference/ios.mdx +++ b/shared/extensions-marketplace/ai-noise-suppression/reference/ios.mdx @@ -1,5 +1,3 @@ -setParameters - diff --git a/shared/extensions-marketplace/ai-noise-suppression/reference/macos.mdx b/shared/extensions-marketplace/ai-noise-suppression/reference/macos.mdx index bfa28c097..5b1a67bc4 100644 --- a/shared/extensions-marketplace/ai-noise-suppression/reference/macos.mdx +++ b/shared/extensions-marketplace/ai-noise-suppression/reference/macos.mdx @@ -1,5 +1,3 @@ -setParameters - diff --git a/shared/extensions-marketplace/ai-noise-suppression/reference/web.mdx b/shared/extensions-marketplace/ai-noise-suppression/reference/web.mdx index e2e907d54..b9c89c0cb 100644 --- a/shared/extensions-marketplace/ai-noise-suppression/reference/web.mdx +++ b/shared/extensions-marketplace/ai-noise-suppression/reference/web.mdx @@ -1,4 +1,9 @@ + +- For a working example, check out the [AI Denoiser web demo](https://webdemo.agora.io/aiDenoiser/index.html) and the associated [source code](https://github.com/AgoraIO/API-Examples-Web/tree/main/Demo/aiDenoiser). 
+ +### API reference + #### IAIDenoiserExtension ##### checkCompatibility diff --git a/shared/extensions-marketplace/common/_prerequities.mdx b/shared/extensions-marketplace/common/_prerequities.mdx index ab2989784..a4c370d77 100644 --- a/shared/extensions-marketplace/common/_prerequities.mdx +++ b/shared/extensions-marketplace/common/_prerequities.mdx @@ -18,7 +18,7 @@ - C# -- A [supported browser](../reference/supported-platforms#browsers). +- A [supported browser](../overview/supported-platforms#browsers). - Physical media input devices, such as a camera and a microphone. - A JavaScript package manager such as [npm](https://www.npmjs.com/package/npm). diff --git a/shared/extensions-marketplace/common/project-test/poc3.mdx b/shared/extensions-marketplace/common/project-test/poc3.mdx new file mode 100644 index 000000000..cf0b2892f --- /dev/null +++ b/shared/extensions-marketplace/common/project-test/poc3.mdx @@ -0,0 +1,14 @@ +import CloneReference from '@docs/shared/common/project-test/clone-project.mdx'; + + +1. [Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in if required. + +1. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. + + + +4. In the reference , update `appId` and `rtcToken` in `config.json` with the values from . + +5. Follow the steps in the project's README for setup. 
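The token-server flow above ultimately issues an HTTP request against `serverUrl`. The sketch below builds that request URL; it assumes the route layout used by the community `agora-token-service` project (`/rtc/:channel/:role/:tokentype/:uid`), so check your own server's README for the actual route before relying on it:

```javascript
// Hypothetical helper: builds a token-request URL for a server that
// follows the community agora-token-service route layout
// (GET /rtc/:channel/:role/:tokentype/:uid). Verify the route against
// your server's documentation.
function buildTokenUrl(serverUrl, channelName, role, uid) {
  const base = serverUrl.replace(/\/+$/, ""); // drop trailing slashes
  return `${base}/rtc/${encodeURIComponent(channelName)}/${role}/uid/${uid}/`;
}

// Example: request a publisher token for uid 0 (server assigns any uid).
const url = buildTokenUrl(
  "https://agora-token-service-production-1234.up.railway.app/",
  "demo",
  "publisher",
  0
);
```

The app then fetches this URL and copies the returned token into the `rtcToken` field before joining.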
+ + \ No newline at end of file diff --git a/shared/extensions-marketplace/develop-a-video-filter/project-implementation/cpp.mdx b/shared/extensions-marketplace/develop-a-video-filter/project-implementation/cpp.mdx index 9b6ac2f22..d2cc1d848 100644 --- a/shared/extensions-marketplace/develop-a-video-filter/project-implementation/cpp.mdx +++ b/shared/extensions-marketplace/develop-a-video-filter/project-implementation/cpp.mdx @@ -21,7 +21,7 @@ Methods include: The following code sample shows how to use these APIs together to implement a video filter: -``` cpp +```cpp #include "ExtensionVideoFilter.h" #include "../logutils.h" #include @@ -150,7 +150,7 @@ To encapsulate the video filter into an extension, you need to implement the `IE The following code sample shows how to use these APIs to encapsulate the video filter: -``` cpp +```cpp #include "ExtensionProvider.h" #include "../logutils.h" #include "VideoProcessor.h" diff --git a/shared/extensions-marketplace/develop-an-audio-filter/_web.mdx b/shared/extensions-marketplace/develop-an-audio-filter/_web.mdx index d711ba0d9..ae018f3b3 100644 --- a/shared/extensions-marketplace/develop-an-audio-filter/_web.mdx +++ b/shared/extensions-marketplace/develop-an-audio-filter/_web.mdx @@ -19,7 +19,7 @@ Agora provides the following abstract classes for developing an audio extension: Before proceeding, ensure that your development environment meets the following requirements: - A Windows or macOS computer that meets the following criteria: - - A browser that matches the [supported browser list](../reference/supported-platforms). Agora highly recommends using [the latest stable version]( https://www.google.com/chrome/) of Google Chrome. + - A browser that matches the [supported browser list](../overview/supported-platforms). Agora highly recommends using [the latest stable version]( https://www.google.com/chrome/) of Google Chrome. - Physical media input devices, such as a built-in camera and a built-in microphone. 
- Access to the Internet. If your network has a firewall, follow the instructions in [Firewall Requirements]( ../reference/firewall) to access Agora services. - An Intel 2.2GHz Core i3/i5/i7 processor (2nd generation) or equivalent diff --git a/shared/extensions-marketplace/develop-an-audio-filter/project-implementation/cpp.mdx b/shared/extensions-marketplace/develop-an-audio-filter/project-implementation/cpp.mdx index 90d6618b9..264ca5d62 100644 --- a/shared/extensions-marketplace/develop-an-audio-filter/project-implementation/cpp.mdx +++ b/shared/extensions-marketplace/develop-an-audio-filter/project-implementation/cpp.mdx @@ -19,7 +19,7 @@ Use the `IAudioFilter` interface to implement an audio filter. You can find the The following code sample shows how to use these methods together to implement an audio filter extension: -``` cpp +```cpp // After receiving the audio frames to be processed, call adaptAudioFrame to process the audio frames. bool ExtensionAudioFilter::adaptAudioFrame(const media::base::AudioPcmFrame &inAudioPcmFrame, media::base::AudioPcmFrame &adaptedPcmFrame) { @@ -66,7 +66,7 @@ To encapsulate the audio filter into an extension, you need to implement the `IE The following code sample shows how to use these methods to encapsulate the audio filter: -``` cpp +```cpp void ExtensionProvider::enumerateExtensions(ExtensionMetaInfo* extension_list, int& extension_count) { extension_count = 2; diff --git a/shared/extensions-marketplace/drm-play/project-implementation/swift.mdx b/shared/extensions-marketplace/drm-play/project-implementation/swift.mdx index 2234539a0..8f6c971f0 100644 --- a/shared/extensions-marketplace/drm-play/project-implementation/swift.mdx +++ b/shared/extensions-marketplace/drm-play/project-implementation/swift.mdx @@ -10,13 +10,13 @@ To create these buttons, in the `ViewController` class: Add the following lines along with the other declarations at the top: - ``` swift + ```swift var SearchMusic: UIButton! var PlayMusic: UIButton! 
``` - ``` swift + ```swift var SearchMusic: NSButton! var PlayMusic: NSButton! ``` @@ -27,7 +27,7 @@ To create these buttons, in the `ViewController` class: Paste the following lines inside the `initViews` function: - ``` swift + ```swift SearchMusic = UIButton(type: .system) SearchMusic.frame = CGRect(x: 100, y: 550, width: 200, height: 50) SearchMusic.setTitle("Search Music", for: .normal) @@ -44,7 +44,7 @@ To create these buttons, in the `ViewController` class: ``` - ``` swift + ```swift SearchMusic = NSButton() SearchMusic.frame = CGRect(x: 255, y: 10, width: 150, height: 20) SearchMusic.title = "Start Channel Media Relay" diff --git a/shared/extensions-marketplace/faceunity/project-implementation/ios.mdx b/shared/extensions-marketplace/faceunity/project-implementation/ios.mdx index b12b97576..8294c2cea 100644 --- a/shared/extensions-marketplace/faceunity/project-implementation/ios.mdx +++ b/shared/extensions-marketplace/faceunity/project-implementation/ios.mdx @@ -16,7 +16,7 @@ For details of files provided in the resource pack, see [Resource package structure](#resource-package-structure). 1. Import the required header files. Add the following statements to your code: - ``` objective-c + ```objective-c #import #import "authpack.h" ``` diff --git a/shared/extensions-marketplace/image-enhancement.mdx b/shared/extensions-marketplace/image-enhancement.mdx index 56b6dae08..1dd25e774 100644 --- a/shared/extensions-marketplace/image-enhancement.mdx +++ b/shared/extensions-marketplace/image-enhancement.mdx @@ -53,7 +53,7 @@ To integrate and implement the image enhancement extension, follow these steps: Method two: Use the Script tag in the HTML file. Once imported, the `BeautyExtension` instance can be used directly in JavaScript files. 
- ``` xml + ```xml ``` diff --git a/shared/extensions-marketplace/reference/_ains.mdx b/shared/extensions-marketplace/reference/_ains.mdx index d0fa3a044..f29c182ce 100644 --- a/shared/extensions-marketplace/reference/_ains.mdx +++ b/shared/extensions-marketplace/reference/_ains.mdx @@ -1,4 +1,4 @@ -**Agora charges additionally for this extension. See [Pricing](/video-calling/reference/pricing#ai-noise-suppression-pricing).** +**Agora charges additionally for this extension. See [Pricing](/video-calling/overview/pricing#ai-noise-suppression-pricing).** ### v1.1.0 diff --git a/shared/extensions-marketplace/use-an-extension/project-implementation/android.mdx b/shared/extensions-marketplace/use-an-extension/project-implementation/android.mdx index debd89227..1da4a40dc 100644 --- a/shared/extensions-marketplace/use-an-extension/project-implementation/android.mdx +++ b/shared/extensions-marketplace/use-an-extension/project-implementation/android.mdx @@ -11,14 +11,14 @@ in your Agora project: 1. Add the following lines to import the Android classes used by the extension: - ``` java + ```java import org.json.JSONException; import org.json.JSONObject; ``` 2. Add the following lines to import the Agora classes used by the extension: - ``` java + ```java // ExtensionManager is used to pass in basic info about the extension import io.agora.extension.ExtensionManager; import io.agora.rtc2.IMediaExtensionObserver; diff --git a/shared/extensions-marketplace/use-an-extension/project-implementation/electron.mdx b/shared/extensions-marketplace/use-an-extension/project-implementation/electron.mdx index d54f99a8b..bfaace777 100644 --- a/shared/extensions-marketplace/use-an-extension/project-implementation/electron.mdx +++ b/shared/extensions-marketplace/use-an-extension/project-implementation/electron.mdx @@ -17,7 +17,7 @@ This section shows you how to implement the video filter extension in your , call `loadExtensionProvider` and pass the extension path. 
To enable the extension, call `enableExtension` and pass the provider name and extension name. To implement this workflow, in `preload.js`, add the following method before `window.onload = () =>`: - ``` javascript + ```javascript function enableExtension() { if (!path) { diff --git a/shared/extensions-marketplace/use-an-extension/project-implementation/flutter.mdx b/shared/extensions-marketplace/use-an-extension/project-implementation/flutter.mdx index acb67aef8..c0eabe063 100644 --- a/shared/extensions-marketplace/use-an-extension/project-implementation/flutter.mdx +++ b/shared/extensions-marketplace/use-an-extension/project-implementation/flutter.mdx @@ -6,7 +6,7 @@ This section presents the framework code you add to your projec You call `loadExtensionProvider` during initialization to specify the extension library path. To do this, add the following code to the `setupVideoSDKEngine` method after you initialize the engine with `agoraEngine.initialize`: - ``` dart + ```dart agoraEngine.loadExtensionProvider(""); ```` @@ -14,7 +14,7 @@ This section presents the framework code you add to your projec To enable the extension, add the following code to the `setupVideoSDKEngine` method before `await agoraEngine.enableVideo();`: - ``` dart + ```dart // Extensions marketplace hosts both third-party extensions as well as those developed by Agora. To use an Agora extensions, you do not need to call addExtension or enableExtension. agoraEngine.enableExtension( provider: "", @@ -31,7 +31,7 @@ This section presents the framework code you add to your projec To customize the extension for your particular , set suitable values for the extension properties. Refer to the extension documentation for a list of available property names and allowable values. 
To set a property, add the following code to the `setupVideoSDKEngine` method after `agoraEngine.enableExtension`: - ``` dart + ```dart agoraEngine.setExtensionProperty( provider: "", extension: "", diff --git a/shared/extensions-marketplace/use-an-extension/project-implementation/react-native.mdx b/shared/extensions-marketplace/use-an-extension/project-implementation/react-native.mdx index 93a8dd27e..30e023abf 100644 --- a/shared/extensions-marketplace/use-an-extension/project-implementation/react-native.mdx +++ b/shared/extensions-marketplace/use-an-extension/project-implementation/react-native.mdx @@ -78,7 +78,7 @@ You can use a `switch` to enable and disable the extension you wish to integrate To get notified of important events, add the following callbacks to `agoraEngine.registerEventHandler({`: - ``` ts + ```ts onExtensionErrored: ( provider: string, extName: string, diff --git a/shared/extensions-marketplace/use-an-extension/project-implementation/unity.mdx b/shared/extensions-marketplace/use-an-extension/project-implementation/unity.mdx index 03c5b72b4..a0f44998e 100644 --- a/shared/extensions-marketplace/use-an-extension/project-implementation/unity.mdx +++ b/shared/extensions-marketplace/use-an-extension/project-implementation/unity.mdx @@ -6,7 +6,7 @@ This section presents the framework code you add to your projec You call `loadExtensionProvider` during initialization to specify the extension library path. 
To do this, add the following code to the `SetupVideoSDKEngine` method after you initialize the engine with `RtcEngine.Initialize`: - ``` csharp + ```csharp RtcEngine.LoadExtensionProvider(""); ``` @@ -14,7 +14,7 @@ This section presents the framework code you add to your projec To enable the extension, add the following code to the `Join` method before `RtcEngine.EnableVideo();`: - ``` csharp + ```csharp RtcEngine.EnableExtension( provider: "", extension: "", @@ -31,7 +31,7 @@ This section presents the framework code you add to your projec To customize the extension for your particular , set suitable values for the extension properties. Refer to the extension documentation for a list of available property names and allowable values. To set a property, add the following code to the `Join` method after `RtcEngine.EnableExtension`: - ``` csharp + ```csharp RtcEngine.SetExtensionProperty( provider: "", extension: "", diff --git a/shared/extensions-marketplace/use-an-extension/project-setup/android.mdx b/shared/extensions-marketplace/use-an-extension/project-setup/android.mdx index e62208cd3..57036238c 100644 --- a/shared/extensions-marketplace/use-an-extension/project-setup/android.mdx +++ b/shared/extensions-marketplace/use-an-extension/project-setup/android.mdx @@ -11,7 +11,7 @@ 2. In `/Gradle Scripts/build.gradle(Module: app)`, add the following line under `dependencies`: - ``` java + ```java implementation fileTree(include: ['*.jar', '*.aar'], dir: 'libs') ``` diff --git a/shared/extensions-marketplace/use-an-extension/project-setup/web.mdx b/shared/extensions-marketplace/use-an-extension/project-setup/web.mdx index 1fb4d8f21..2b7c2a73b 100644 --- a/shared/extensions-marketplace/use-an-extension/project-setup/web.mdx +++ b/shared/extensions-marketplace/use-an-extension/project-setup/web.mdx @@ -9,7 +9,7 @@ 1. In the root of your project, install the extension package. 
For example, for : - ``` bash + ```bash npm i agora-extension-ai-denoiser ``` diff --git a/shared/extensions-marketplace/virtual-background.mdx b/shared/extensions-marketplace/virtual-background.mdx index 92bb3b935..e8a0c5b38 100644 --- a/shared/extensions-marketplace/virtual-background.mdx +++ b/shared/extensions-marketplace/virtual-background.mdx @@ -1,9 +1,7 @@ import ProjectSetup from '@docs/shared/extensions-marketplace/virtual-background/project-setup/index.mdx'; import ProjectImplement from '@docs/shared/extensions-marketplace/virtual-background/project-implementation/index.mdx'; import ProjectTest from '@docs/shared/extensions-marketplace/virtual-background/project-test/index.mdx'; -import Reference from '@docs/shared/extensions-marketplace/virtual-background/reference/index.mdx'; import { - PlatformWrapper -} from '../../../src/mdx-components/PlatformWrapper'; +import Reference from '@docs/shared/extensions-marketplace/virtual-background/reference/index.mdx'; Virtual Background enables users to blur their background or replace it with a solid color or an image. This feature is applicable to scenarios such as online conferences, online classes, and live streaming. It helps protect personal privacy and reduces audience distraction. @@ -44,7 +42,7 @@ A typical transmission pipeline in the Agora Web SDK consists of a chain of proc -## Test your implementation +## Test virtual background @@ -53,3 +51,4 @@ A typical transmission pipeline in the Agora Web SDK consists of a chain of proc This section contains information that completes the information in this page, or points you to documentation that explains other aspects to this product. This section contains information that completes the information in this page, or points you to documentation that explains other aspects to this product. 
+ diff --git a/shared/extensions-marketplace/virtual-background/project-implementation/index.mdx b/shared/extensions-marketplace/virtual-background/project-implementation/index.mdx index 1baaafc47..fe3f01309 100644 --- a/shared/extensions-marketplace/virtual-background/project-implementation/index.mdx +++ b/shared/extensions-marketplace/virtual-background/project-implementation/index.mdx @@ -1,22 +1,14 @@ -import Android from './android.mdx'; -import Web from './web.mdx'; import Windows from './windows.mdx'; -import Ios from './ios.mdx'; -import MacOs from './macos.mdx'; +import Poc3 from './poc3.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import ReactNative from './react-native.mdx'; -import Unity from './unity.mdx'; import Unreal from './unreal.mdx'; + - - - - - - \ No newline at end of file + diff --git a/shared/extensions-marketplace/virtual-background/project-implementation/poc3.mdx b/shared/extensions-marketplace/virtual-background/project-implementation/poc3.mdx new file mode 100644 index 000000000..ba932a791 --- /dev/null +++ b/shared/extensions-marketplace/virtual-background/project-implementation/poc3.mdx @@ -0,0 +1,77 @@ +import ImportLibrary from '@docs/assets/code/video-sdk/virtual-background/import-library.mdx'; +import DeviceCompatibility from '@docs/assets/code/video-sdk/virtual-background/device-compatibility.mdx'; +import BlurBackground from '@docs/assets/code/video-sdk/virtual-background/blur-background.mdx'; +import ColorBackground from '@docs/assets/code/video-sdk/virtual-background/color-background.mdx'; +import ImageBackground from '@docs/assets/code/video-sdk/virtual-background/image-background.mdx'; +import ResetBackground from '@docs/assets/code/video-sdk/virtual-background/reset-background.mdx'; +import SetVirtualBackground from '@docs/assets/code/video-sdk/virtual-background/set-virtual-background.mdx'; +import ConfigureEngine from 
'@docs/assets/code/video-sdk/virtual-background/configure-engine.mdx'; + + + + +### Add the required imports + + + + +### Configure the + + + +### Check device compatibility + + To avoid performance degradation or unavailable features when enabling virtual background on low-end devices, check whether the device supports the feature. + + + +### Configure the virtual background extension + + +### Set a blurred background + + + +### Set a color background + + + +### Set an image background + + + +### Reset the background + + + + +### Check device compatibility + + Not all devices have the capability of using Agora's background segmentation. + Check whether the device supports the specified advanced feature. + + + +### Set a blurred background + + + +### Set a color background + + + +### Set an image background + + + +### Reset the background + + + + + + +### Enable virtual background + + + diff --git a/shared/extensions-marketplace/virtual-background/project-implementation/swift.mdx b/shared/extensions-marketplace/virtual-background/project-implementation/swift.mdx index 2b63e193d..1cf9dc819 100644 --- a/shared/extensions-marketplace/virtual-background/project-implementation/swift.mdx +++ b/shared/extensions-marketplace/virtual-background/project-implementation/swift.mdx @@ -26,7 +26,7 @@ To enable and change virtual backgrounds, you add a button to the user interface Paste the following lines inside the `initViews` function: - ``` swift + ```swift // Button to change virtual background BackgroundButton = NSButton() BackgroundButton.frame = CGRect(x: 230, y:240, width:80, height:20) @@ -37,7 +37,7 @@ To enable and change virtual backgrounds, you add a button to the user interface ``` - ``` swift + ```swift // Button to change virtual background BackgroundButton = UIButton(type: .system) BackgroundButton.frame = CGRect(x: 60, y:500, width:250, height:50) diff --git a/shared/extensions-marketplace/virtual-background/project-implementation/unity.mdx 
b/shared/extensions-marketplace/virtual-background/project-implementation/unity.mdx index 4a3b6ce4e..46fdf9e2d 100644 --- a/shared/extensions-marketplace/virtual-background/project-implementation/unity.mdx +++ b/shared/extensions-marketplace/virtual-background/project-implementation/unity.mdx @@ -1,94 +1,36 @@ - -This section explains how to enable your users to choose a virtual background. - -### Implement the user interface - -To enable and change virtual backgrounds, you add a button to the user interface. To implement this user interface, take the following steps: - - 1. Right-click **Sample Scene**, then click **Game Object** > **UI** > **Button - TextMeshPro**. A button appears in the **Scene** Canvas. - - 2. In **Inspector**, rename **Button** to **virtualBackground**. - - 3. In **SampleScene**, click **Canvas** > **virtualBackground**, and then in **Inspector**, change the following coordinates: - - * **Pos X** - 350 - * **Pos Y** - 172 - -### Set a virtual background - -1. **Define variables to keep track of the virtual background state** - - In your script file, add the following declarations to `NewBehaviourScript`: - - ```csharp - int counter = 0; // to cycle through the different types of backgrounds - bool isVirtualBackGroundEnabled = false; - ``` - -2. **Enable virtual background** - - When a user presses the button, you check if the user's device supports the virtual background feature. If `IsFeatureAvailableOnDevice` returns true, you call `EnableVirtualBackground` to enable background blur. When the user presses the button again, you change the virtual background to a solid color. On the next button press, you set a `.jpg` or `.png` image as the virtual background. To specify these background effects, you configure `VirtualBackgroundSource` and `SegmentationProperty`. 
To implement this workflow, in your script file, add the following method to the `NewBehaviourScript` class: + When a user presses the button, you call `SetVirtualBackground` to enable background blur. When the user presses the button again, you change the virtual background to a solid color. On the next button press, you set a `.jpg` or `.png` image as the virtual background. To specify these background effects, you configure `VirtualBackgroundSource` and `SegmentationProperty`. ```csharp - public void setVirtualBackground() + public void SetVirtualBackground() { - if(!RtcEngine.IsFeatureAvailableOnDevice(FeatureType.VIDEO_VIRTUAL_BACKGROUND)) - { - Debug.Log("Your device does not support virtual background"); - return; - } + TMP_Text BtnText = virtualBackgroundGo.GetComponentInChildren(true); + // Options for virtual background + string[] options = { "Color", "Blur", "Image" }; - counter++; - if (counter > 3) + if (counter >= 3) { - counter = 0; isVirtualBackGroundEnabled = false; + BtnText.text = "Enable Virtual Background"; + counter = 0; Debug.Log("Virtual background turned off"); } else { isVirtualBackGroundEnabled = true; + BtnText.text = "Background :" + options[counter]; } - VirtualBackgroundSource virtualBackgroundSource = new VirtualBackgroundSource(); - // Set the type of virtual background - if (counter == 1) - { // Set background blur - virtualBackgroundSource.background_source_type = BACKGROUND_SOURCE_TYPE.BACKGROUND_BLUR; - virtualBackgroundSource.blur_degree = BACKGROUND_BLUR_DEGREE.BLUR_DEGREE_HIGH; - Debug.Log("Blur background enabled"); - } - else if (counter == 2) - { // Set a solid background color - virtualBackgroundSource.background_source_type = BACKGROUND_SOURCE_TYPE.BACKGROUND_COLOR; - virtualBackgroundSource.color = 0x0000FF; - Debug.Log("Color background enabled"); - } - else if (counter == 3) - { // Set a background image - virtualBackgroundSource.background_source_type = BACKGROUND_SOURCE_TYPE.BACKGROUND_IMG; - 
virtualBackgroundSource.source = ""; - Debug.Log("Image background enabled"); - } - - // Set processing properties for background - SegmentationProperty segmentationProperty = new SegmentationProperty(); - segmentationProperty.modelType = SEG_MODEL_TYPE.SEG_MODEL_AI; // Use SEG_MODEL_GREEN if you have a green background - segmentationProperty.greenCapacity = 0.5F; // Accuracy for identifying green colors (range 0-1) - - // Enable or disable virtual background - RtcEngine.EnableVirtualBackground( - isVirtualBackGroundEnabled, - virtualBackgroundSource, segmentationProperty); + // Set the virtual background + virtualBackgroundManager.setVirtualBackground(isVirtualBackGroundEnabled, options[counter]); + counter++; } ``` + For more details, see the following: + + - enableVirtualBackground -3. **Setup an event listener for the virtual background button** + - VirtualBackgroundSource - Call `setVirtualBackground` when the user presses **Virtual Background**. In your script file, add the followig at the end of `SetupUI`: + - SegmentationProperty - ```csharp - go = GameObject.Find("virtualBackground"); - go.GetComponent` -``` html +```html ``` @@ -29,7 +29,7 @@ In your project, import the relevant libraries and declare the required variable To implement media relay, import the corresponding modules. In `preload.js`, add the following before `createAgoraRtcEngine,`: - ``` javascript + ```javascript ChannelMediaRelayEvent, ChannelMediaRelayState, LogLevel @@ -39,7 +39,7 @@ In your project, import the relevant libraries and declare the required variable To store source and destination channel settings and manage channel relay, in `preload.js`, add the following variables to declarations: - ``` javascript + ```javascript var destChannelName = ""; var destChannelToken = ""; var destUid = 100; // User ID that the user uses in the destination channel. 
@@ -54,7 +54,7 @@ To enable users to relay channel media to a destination chann When a user presses the button, the starts relaying media from the source channel to the destination channel. If channel media relay is already running, the stops it. To integrate this workflow, in `preload.js` add the following method before `document.getElementById("join").onclick = async function ()`: - ``` javascript + ```javascript document.getElementById("coHost").onclick = async function () { if (mediaRelaying) @@ -95,7 +95,7 @@ To enable users to relay channel media to a destination chann To receive the state change notifications sent during media relay, you add a callback to `EventHandles`. Your responds to connection and failure events in the `onChannelMediaRelayStateChanged` event handler. In `preload.js`, add the following method after `const EventHandles = {`: - ``` javascript + ```javascript onChannelMediaRelayStateChanged: (state, code) => { // This example shows toast messages when the relay state changes, @@ -122,7 +122,7 @@ To enable users to relay channel media to a destination chann To receive notifications of important channel relay events such as network disconnection, reconnection, and users joining channels, you add the `onChannelMediaRelayEvent` callback to `EventHandles`. In `preload.js`, add the following method after `const EventHandles = {`: - ``` javascript + ```javascript onChannelMediaRelayEvent: (code) => { switch (code) { @@ -146,7 +146,7 @@ The alternate approach to multi-channel live streaming is joining multiple chann In this example, you need a button to join and leave a second channel. To add a button, in `preload.js`, add the following code after ``: -``` html +```html ``` @@ -159,7 +159,7 @@ In your project, import the relevant libraries and declare the required variable To connect to multiple channels, import the corresponding module.
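The start/stop button handler in the relay hunks above reduces to a toggle around the `mediaRelaying` flag. A sketch under that assumption, with hypothetical `startRelay`/`stopRelay` callbacks standing in for the SDK's `startChannelMediaRelay`/`stopChannelMediaRelay`:

```typescript
// Illustrative toggle (not SDK code): start relay when idle, stop when
// already relaying, and return the new mediaRelaying state.
function toggleRelay(
  mediaRelaying: boolean,
  startRelay: () => void, // stand-in for startChannelMediaRelay(config)
  stopRelay: () => void   // stand-in for stopChannelMediaRelay()
): boolean {
  if (mediaRelaying) {
    stopRelay();
    return false;
  }
  startRelay();
  return true;
}
```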
In `preload.js`, add the following before `createAgoraRtcEngine,`: - ``` javascript + ```javascript ChannelMediaOptions, IRtcEngineEx, ChannelMediaInfo, @@ -169,7 +169,7 @@ In your project, import the relevant libraries and declare the required variable To join and manage a second channel, in `preload.js`, add the following to declarations: - ``` javascript + ```javascript var secondChannelName = ""; var secondChannelUid = 100; // Uid for the second channel var secondChannelToken = ""; @@ -187,7 +187,7 @@ To add multi-channel functionality to your , take the followin 1. In `preload.js`, add the following line before `agoraEngine = createAgoraRtcEngine();`: - ``` javascript + ```javascript agoraEngine = new IRtcEngineEx(); ``` @@ -195,7 +195,7 @@ To add multi-channel functionality to your , take the followin When a user presses **Join Second Channel**, the joins a second channel. If the is already connected to a second channel, it leaves the channel. To do this, add the following code before `document.getElementById("join").onclick = async function ()`: - ``` javascript + ```javascript document.getElementById("multiple-channels").onclick = async function () { if (isSecondChannelJoined) @@ -237,7 +237,7 @@ To add multi-channel functionality to your , take the followin The `onUserJoined` callback returns a parameter called `channelId`. Use this parameter to set up the remote view for the remote user in the second channel.
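The `channelId`-based view routing described above can be sketched as a small lookup; `containerForRemoteUser` and the panel names are illustrative, not part of the SDK:

```typescript
// Illustrative routing: pick a view container for a remote user based on
// the channelId reported by onUserJoined. Panel ids are hypothetical.
function containerForRemoteUser(
  channelId: string,
  remoteUid: number,
  panels: Record<string, string> // channelId -> panel id
): string | undefined {
  const panel = panels[channelId];
  return panel === undefined ? undefined : `${panel}/${remoteUid}`;
}
```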
To implement this logic, update the `onUserJoined` callback with the following code: - ``` javascript + ```javascript onUserJoined :(connection, remoteUid, elapsed) => { diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/flutter.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/flutter.mdx index d4d64a6ca..1199893fb 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/flutter.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/flutter.mdx @@ -15,7 +15,7 @@ In this example, you use a single `Button` to start and stop channel media relay #### Implement the user interface To enable your users to start and stop relaying to another channel, add a `Button` to the user interface. Open `/lib/main.dart` and add the following lines after `ListView(...children: [` in the `build` method: -``` dart +```dart ElevatedButton( child: relayState == ChannelMediaRelayState.relayStateRunning ? const Text("Stop Channel media relay") @@ -34,7 +34,7 @@ To enable users to relay channel media to a destination chann To store source and destination channel settings and manage channel relay, in `/lib/main.dart`, add the following variable declarations to the `_MyAppState` class: - ``` dart + ```dart String destChannelName = ""; String destChannelToken = ""; int destUid = 0; // Uid to identify the relay stream in the destination channel @@ -47,7 +47,7 @@ To enable users to relay channel media to a destination chann When a user presses the button, the starts relaying media from the source channel to the destination channel. If channel media relay is already running, the stops it. To integrate this workflow, add the following method to the `_MyAppState` class. 
- ``` dart + ```dart void channelRelay() async { if (mediaRelaying) { agoraEngine.stopChannelMediaRelay(); @@ -79,7 +79,7 @@ To enable users to relay channel media to a destination chann To receive notification of connection state changes during channel media relay, you add an event handler. Your responds to connection and failure events in the `onChannelMediaRelayStateChanged` event. In `setupVideoSDKEngine()`, add the following code after `RtcEngineEventHandler(`: - ``` dart + ```dart onChannelMediaRelayStateChanged: (ChannelMediaRelayState state, ChannelMediaRelayError error) { setState(() { relayState = state; @@ -99,7 +99,7 @@ To enable users to relay channel media to a destination chann To receive notifications of important channel relay events such as network disconnection, reconnection, and users joining channels, you add the `onChannelMediaRelayEvent` method to the event handler. In `setupVideoSDKEngine()`, add the following code after `RtcEngineEventHandler(`: - ``` dart + ```dart onChannelMediaRelayEvent: (ChannelMediaRelayEvent mediaRelayEvent) { // This example shows messages when relay events occur. // A production level app needs to handle these events properly. @@ -115,7 +115,7 @@ The alternate approach to multi-channel live streaming is joining multiple chann 1. In this example, you use a single button to join and leave a second channel. To add a button to your UI, in `/lib/main.dart`, add the following lines after `ListView(...children: [` in the `build` method: - ``` dart + ```dart ElevatedButton( child: !isSecondChannelJoined ? 
const Text("Join second channel") @@ -142,7 +142,7 @@ To add multi-channel functionality to your , take the followin To join and manage a second channel, in `/lib/main.dart`, add the following declarations to the `_MyAppState` class: - ``` dart + ```dart late RtcConnection rtcSecondConnection; // Connection object for the second channel String secondChannelName = ""; int secondChannelUid = 100; // User Id for the second channel @@ -157,13 +157,13 @@ To add multi-channel functionality to your , take the followin 1. In the `_MyAppState` class, replace the declaration `late RtcEngine agoraEngine;` with: - ``` dart + ```dart late RtcEngineEx agoraEngine; // Agora multi-channel engine instance ``` 1. In the `setupVideoSDKEngine` method, replace the line `agoraEngine = createAgoraRtcEngine();` with the following: - ``` dart + ```dart agoraEngine = createAgoraRtcEngineEx(); ``` @@ -172,7 +172,7 @@ To add multi-channel functionality to your , take the followin When a user presses the button, the joins a second channel. If the is already connected to a second channel, it leaves the channel. To do this, add the following method to the `_MyAppState` class. - ``` dart + ```dart void joinSecondChannel() async { if (isSecondChannelJoined) { agoraEngine.leaveChannelEx(rtcSecondConnection); @@ -216,7 +216,7 @@ To add multi-channel functionality to your , take the followin The `RtcEngineEventHandler` registered with your instance of receives events from all channels that a user is connected to. To identify the channel from which a callback originates, you look at the `channelId` of the `connection` object in the callback. 
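The channel-identification step just described, one shared event handler serving every joined channel, can be sketched as a dispatcher keyed on the connection's `channelId`. `ConnectionInfo` and `onJoinChannelSuccess` below are hypothetical stand-ins, not SDK types:

```typescript
// Illustrative dispatcher: the connection's channelId selects the
// per-channel logic inside a single shared event handler.
interface ConnectionInfo { channelId: string; localUid: number; }

function onJoinChannelSuccess(
  connection: ConnectionInfo,
  handlers: Map<string, (uid: number) => void>
): boolean {
  const handle = handlers.get(connection.channelId);
  if (!handle) return false; // unknown channel: ignore the event
  handle(connection.localUid);
  return true;
}
```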
To enable your app to handle callbacks from both channels, in `setupVideoSDKEngine` **replace** the code block `agoraEngine.registerEventHandler(RtcEngineEventHandler(...))` with the following - ``` dart + ```dart agoraEngine.registerEventHandler( RtcEngineEventHandler( onJoinChannelSuccess: (RtcConnection connection, int elapsed) { @@ -266,7 +266,7 @@ To add multi-channel functionality to your , take the followin In this example, you display two remote videos from two different channels. To create the widget for displaying the second video, add the following method to the `_MyAppState` class: - ``` dart + ```dart Widget _secondVideoPanel() { if (!isSecondChannelJoined) { return const Text( diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/index.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/index.mdx index 593ff9366..879deeb9d 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/index.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/index.mdx @@ -1,22 +1,14 @@ -import Android from './android.mdx'; -import Ios from './ios.mdx'; -import Web from './web.mdx'; import ReactNative from './react-native.mdx' import Electron from './electron.mdx'; -import Unity from './unity.mdx'; -import MacOS from './macos.mdx'; import Windows from './windows.mdx'; import Flutter from './flutter.mdx' +import Poc3 from './poc3.mdx'; import Unreal from './unreal.mdx' + - - - - - diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/poc3.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/poc3.mdx new file mode 100644 index 000000000..d61d2e69e --- /dev/null +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/poc3.mdx @@ -0,0 +1,40 @@ +import ImportLibrary from 
'@docs/assets/code/video-sdk/live-streaming-multiple-channels/import-library.mdx'; +import SetVariables from '@docs/assets/code/video-sdk/live-streaming-multiple-channels/set-variables.mdx'; +import StartStopChannelMediaRelay from '@docs/assets/code/video-sdk/live-streaming-multiple-channels/start-stop-channel-media-relay.mdx'; +import MonitorChannelMediaRelayState from '@docs/assets/code/video-sdk/live-streaming-multiple-channels/monitor-channel-media-relay-state.mdx'; +import JoinSecondChannel from '@docs/assets/code/video-sdk/live-streaming-multiple-channels/join-a-second-channel.mdx'; +import LeaveSecondChannel from '@docs/assets/code/video-sdk/live-streaming-multiple-channels/leave-second-channel.mdx'; +import ReceiveCallbacksFromSecondChannel from '@docs/assets/code/video-sdk/live-streaming-multiple-channels/receive-callbacks-from-second-channel.mdx'; + + +### Add the required imports + + + + +### Add the required variables + + + +### Start or stop channel media relay + + +### Monitor the channel media relay state + + + +### Join the second channel + + + + +### Leave the second channel + + + + +### Receive callbacks from the second channel + + + + diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/react-native.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/react-native.mdx index 7994f2702..7bb4e4125 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/react-native.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/react-native.mdx @@ -16,7 +16,7 @@ In this example, you use a single `Button` to start and stop channel media relay To enable your users to start and stop relaying to another channel, add a button to the user interface. 
In ``, add the following code after `Leave`: -``` html +```html Start Channel Media Relay @@ -30,7 +30,7 @@ In your project, import the relevant libraries and declare the required variable To implement media relay, import the corresponding modules. In `App.tsx`, add the following imports before `createAgoraRtcEngine,`: - ``` ts + ```ts ChannelMediaRelayEvent, ChannelMediaRelayState, ``` @@ -39,7 +39,7 @@ In your project, import the relevant libraries and declare the required variable To store source and destination channel settings and manage channel relay, in `App.tsx`, add the following variables to the declarations: - ``` ts + ```ts const destChannelName = ''; const destChannelToken = ''; const destUid = 100; // User ID that the user uses in the destination channel. @@ -54,7 +54,7 @@ To enable users to relay channel media to a destination chann When a user presses the button, the starts relaying media from the source channel to the destination channel. If channel media relay is already running, the stops it. To integrate this workflow, in `App.tsx` add the following method after `const leave = () => {}`: - ``` ts + ```ts const coHost = () => { if (mediaRelaying) { agoraEngineRef.current?.stopChannelMediaRelay(); @@ -87,7 +87,7 @@ To enable users to relay channel media to a destination chann To receive the state change notifications sent during media relay, you add a callback to `EventHandles`. Your responds to connection and failure events in the `onChannelMediaRelayStateChanged` event handler. In `App.tsx`, add the following method after `onUserOffline: (_connection, remoteUid) => {}`: - ``` ts + ```ts onChannelMediaRelayStateChanged: (state, code) => { // This example shows toast messages when the relay state changes, // a production level app needs to handle state change properly.
@@ -110,7 +110,7 @@ To enable users to relay channel media to a destination chann To receive notifications of important channel relay events such as network disconnection, reconnection, and users joining channels, you add the `onChannelMediaRelayEvent` callback to `EventHandles`. In `App.tsx`, add the following method after `onChannelMediaRelayStateChanged: (state, code) => {}`: - ``` ts + ```ts onChannelMediaRelayEvent: code => { switch (code) { case ChannelMediaRelayEvent.RelayEventNetworkDisconnected: // RELAY_EVENT_NETWORK_DISCONNECTED @@ -134,7 +134,7 @@ The alternate approach to multi-channel live streaming is joining multiple chann #### Implement the user interface In this example, you need a button to join and leave a second channel. In ``, add the following code after `Leave`: -``` html +```html Join Second Channel @@ -148,7 +148,7 @@ In your project, import the relevant libraries and declare the required variable 1. To implement media relay, in `App.tsx`, add the following imports before `createAgoraRtcEngine,`: - ``` ts + ```ts ChannelMediaOptions, IRtcEngineEx, RtcConnection, @@ -160,7 +160,7 @@ In your project, import the relevant libraries and declare the required variable To join and manage a second channel, in `App.tsx`, add the following variables to the declarations: - ``` ts + ```ts var rtcSecondConnection: RtcConnection; const secondChannelName = '<-----Insert second channel name------>'; const secondChannelUid = 100; // Uid for the second channel @@ -179,7 +179,7 @@ To add multi-channel functionality to your , take the followin 1. Replace `const agoraEngineRef = useRef(); // Agora engine instance` with: - ``` ts + ```ts const agoraEngineRef = useRef(); // Agora engine instance ``` @@ -193,7 +193,7 @@ To add multi-channel functionality to your , take the followin When a user presses the button, the joins a second channel. If the is already connected to a second channel, it leaves the channel. 
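The join/leave behavior just described is a toggle over the second-channel state, updating the button label as it flips. A hedged sketch, with hypothetical `joinEx`/`leaveEx` callbacks standing in for `joinChannelEx`/`leaveChannelEx`:

```typescript
// Illustrative state transition for the second-channel button (not SDK code).
interface SecondChannelState { joined: boolean; label: string; }

function toggleSecondChannel(
  state: SecondChannelState,
  joinEx: () => void,  // stand-in for joinChannelEx(token, connection, options)
  leaveEx: () => void  // stand-in for leaveChannelEx(connection)
): SecondChannelState {
  if (state.joined) {
    leaveEx();
    return { joined: false, label: "Join Second Channel" };
  }
  joinEx();
  return { joined: true, label: "Leave Second Channel" };
}
```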
To do this, add the following code after the `const leave = () => {}` function: - ``` ts + ```ts const multipleChannels = () => { if (isSecondChannelJoined) { agoraEngineRef.current?.leaveChannelEx({ @@ -234,7 +234,7 @@ To add multi-channel functionality to your , take the followin To see if you have successfully joined the second channel, update the `onJoinChannelSuccess` callback to print the channel ID for the connection. - ``` ts + ```ts onJoinChannelSuccess: (connection, _Uid) => { showMessage( 'Successfully joined the channel ' + connection.channelId, diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/swift.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/swift.mdx index 030315b33..e2e8c54be 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/swift.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/swift.mdx @@ -49,7 +49,7 @@ In your project, declare the required variables. relay, add the following declaration to the top of the `ViewController` class: - ``` swift + ```swift var destChannelName = "<#name of the destination channel#>" var destChannelToken = "<#access token for the destination channel#>" var destUid: UInt = 100 // User ID that the user uses in the destination channel. @@ -100,7 +100,7 @@ To enable users to relay channel media to a destination chann To receive notifications of important channel relay events such as network disconnection, reconnection, and users joining channels, you add a `didReceiveChannelMediaRelayEvent` function to the event handler. 
Add the following function inside `extension ViewController: AgoraRtcEngineDelegate {`: - ``` swift + ```swift func rtcEngine(_ engine: AgoraRtcEngineKit, didReceive event: AgoraChannelMediaRelayEvent) { switch event { case .disconnect: @@ -156,7 +156,7 @@ In your project, declare the necessary variables, and setup access to the UI ele To join and manage a second channel, add the following declarations before `class ViewController: UIViewController {`: - ``` swift + ```swift var secondChannelName = "<#name of the second channel#>" var secondChannelUid: UInt = 100 // Uid for the second channel var secondChannelToken = "<#access token for the second channel#>" @@ -169,7 +169,7 @@ In your project, declare the necessary variables, and setup access to the UI ele `agoraEngine = AgoraRtcEngineKit.sharedEngine(with: config, delegate: self)` in the `initializeAgoraEngine` function: - ``` swift + ```swift secondChannelBtn.isEnabled = false ``` @@ -177,7 +177,7 @@ In your project, declare the necessary variables, and setup access to the UI ele the following line to the `joinChannel` function after `joined = true`: - ``` swift + ```swift secondChannelBtn.isEnabled = true ``` @@ -199,7 +199,7 @@ the following steps: 1. In the `ViewController`, add the following line along with the other declarations at the top: - ``` swift + ```swift var secondChannelDelegate: SecondChannelDelegate = SecondChannelDelegate() ``` @@ -213,13 +213,13 @@ the following steps: In this example, you need both `ViewController` and `SecondChannelDelegate` to be able to set `remoteView` as its `UIView` when calling `setupRemoteVideo` or `setupRemoteVideoEx`, you also need to see the second connection instance. To do this, remove the `var remoteView: UIView!` declaration from `ViewController` and add the following before `class ViewController: UIViewController {`: - ``` swift + ```swift var rtcSecondConnection: AgoraRtcConnection! var remoteView: UIView! 
``` - ``` swift + ```swift var rtcSecondConnection: AgoraRtcConnection! var remoteView: NSView! ``` diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/unity.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/unity.mdx index 74d7f7467..d4b24d8c3 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/unity.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/unity.mdx @@ -10,292 +10,213 @@ Choose the method that best suits your scenario and follow the step by step proc ### Channel media relay -In this example, you use a single `Button` to start and stop channel media relay. +This section explains the logic to relay channel media to a destination channel: -#### Implement the user interface - -To enable your users to start and stop relaying to another channel, add a `Button` to the user interface by taking the following steps: - - 1. Right-click **Sample Scene**, then click **GameObject** > **UI** > **Button - TextMeshPro**. A button appears in the **Scene** Canvas. - - 2. In **Inspector**, rename **Button** to **StartChannelMediaRelay**, then change the following coordinates: - - * **Pos X** - 350 - * **Pos Y** - 172 - -#### Handle the system logic - -In your project, import the relevant libraries and declare the required variables. - -1. **Declare the variables you need** - - To store source and destination channel settings and manage channel relay, in your script file, add the following declarations to `NewBehaviourScript`: - ``` csharp - private string destChannelName = ""; - private string destChannelToken = ""; - private uint destUid = 100; // User ID that the user uses in the destination channel. - private string sourceChannelToken = ""; // Generate with the _channelName and remoteUid = 0. - private bool mediaRelaying = false; - private TMP_Text channelRelayBtnText; - ``` - -2. 
**Add the required UI namespace** - - To import the required UI namespace, in your script file, add the following to the list of namespace declarations: - - ``` csharp - using TMPro; - ``` - -3. **Access the channel relay button** - - To programmatically access the channel media relay button, add the following at the end of `SetupUI`: - - ``` csharp - // Access the channel relay button. - go = GameObject.Find("StartChannelMediaRelay"); - go.GetComponent` - -``` html - -``` - -#### Handle the system logic - -For these implementation workflows, you use variables to track the -channels your users are hosting or joining. - -1. **Add the required variables** - - In `main.js`, add the following variables to the declarations: - - ``` javascript - // A variable to track the co-hosting state. - var isCoHost = false; - // The destination channel name you want to join. - var destChannelName = ''; - //In a production app, the user adds the channel name and you retrieve the - // authentication token from a token server. - var destChannelToken = ''; - ``` - -#### Implement channel media relay - -To enable users to relay channel media to a destination channel, take the following steps: - -1. **Start or stop channel media relay** - - When the user presses **Start Channel Media Relay**, implement the following in your : - - - Create an `IChannelMediaRelayConfiguration` object that you use to specify the media relaying configuration. - - - Set the source and destination channel info in the `IChannelMediaRelayConfiguration` instance. - - - Call `startChannelMediaRelay` and pass the configuration object to start media relaying. - - When the user presses **Stop Channel Media Relay**, stop media relay with a call to `stopChannelMediaRelay`. - - To implement this logic, add the following code before `document.getElementById('leave').onclick = async function () {`: - - ``` javascript - document.getElementById('coHost').onclick = async function () - { - //Keep the same UID for this user. 
- var destUID = options.uid ; - if (!isCoHost) - { - const channelMediaConfig = AgoraRTC.createChannelMediaRelayConfiguration(); - // Set the source channel information. - // Set channelName as the source channel name. Set uid as the ID of the host whose stream is relayed. - // The token is generated with the source channel name. - // Assign the token you generated for the source channel. - channelMediaConfig.setSrcChannelInfo({ - channelName: options.channel, - token: options.token, - uid: options.uid - }) - // Set the destination channel information. You can set a maximum of four destination channels. - // Set channelName as the destination channel name. Set uid as 0 or a 32-bit unsigned integer. - // To avoid UID conflicts, the uid must be different from any other user IDs in the destination channel. - // Assign the token you generated for the destination channel. - channelMediaConfig.addDestChannelInfo({ - channelName: destChannelName, - token: destChannelToken, - uid: destUID - }) - // Start media relaying. - agoraEngine.startChannelMediaRelay(channelMediaConfig).then(() => { - // Update the co-hosting state. - isCoHost = true; - // Update the button text. - document.getElementById(`coHost`).innerHTML = 'Start Channel Media Relay'; - console.log(`startChannelMediaRelay success`); - }).catch(e => { - console.log(`startChannelMediaRelay failed`, e); - }) - } - else - { - // Remove a destination channel. - channelMediaConfig.removeDestChannelInfo(destChannelName) - // Update the configurations of the media stream relay. - agoraEngine.updateChannelMediaRelay(channelMediaConfig).then(() => { - console.log("updateChannelMediaRelay success"); - }).catch(e => - { - console.log("updateChannelMediaRelay failed", e); - }) - //Stop the relay. - agoraEngine.stopChannelMediaRelay().then(() => { - console.log("stop media relay success"); - isCoHost = false; - }).catch(e => - { - console.log("stop media relay failed", e); - }) - // Update the button text. 
- document.getElementById(`coHost`).innerHTML = 'Start Channel Media Relay*'; - } - // Refresh the page for reuse. - window.location.reload(); - } - ``` - -2. **Monitor the media relay state** - - The supplies `channel-media-relay-state` callback that you use to learn about the current state of channel media - relay. To implement this callback in your , in `main.js`, add the following code before `window.onload = function ()`: - - ``` javascript - agoraEngine.on("channel-media-relay-state", state => - { - console.log("The current state is : "+ state); - }); - ``` - -### Join multiple channels - -The alternate approach to multi-channel live streaming is joining multiple channels. In this section, you learn how to implement joining a second channel in your . - -#### Implement the user interface - -In this example, you use a button to join and leave the second channel. - -To add a button, in `main.js`, add the following code after ``: - -``` html - -``` - -#### Handle the system logic - -To join a second channel, add the required variables in your code. - -1. **Declare the required variables** - - In `main.js`, add the following to the declarations: - - ``` javascript - // A variable to create a second instance of Agora engine. - var agoraEngineSubscriber; - var isMultipleChannel = false; - // The second channel name you want to join. - var secondChannelName = ''; - //In a production app, the user adds the channel name and you retrieve the - // authentication token from a token server. - var secondChannelToken = ''; - ``` - -#### Implement joining multiple channels - -When the user presses **Join second channel**, implement the following in your : - -- Create an instance of that is used to join a new channel. - -- Set the client role to host for live streaming. - -- Publish the local audio and video tracks to the new channel. - -- Listen to the `user-published` event so the audience can subscribe to the new channel. 
- -When the user presses **Leave Second Channel**, leave the new channel with a call to the `leave` method. - -To implement this logic, in `main.js`, add the following code before `document.getElementById('leave').onclick = async function () {`: - -``` javascript -document.getElementById('multiple-channels').onclick = async function () -{ - // Check to see if the user has already joined a channel. - if(isMultipleChannel == false) - { - // Create an Agora engine instance. - agoraEngineSubscriber = AgoraRTC.createClient({ mode: "live", codec: "vp9" }); - // Setup event handlers to subscribe and unsubscribe to the second channel users. - agoraEngineSubscriber.on("user-published", async (user, mediaType) => - { - // Subscribe to the remote user when the SDK triggers the "user-published" event. - await agoraEngineSubscriber.subscribe(user, mediaType); - console.log("Subscribe success!"); - if(options.role == '') - { - window.alert("Select a user role first!"); - return; - } - // You only play the video when you join the channel as a host. - else if(options.role == 'audience' && mediaType == "video") - { - // Dynamically create a container in the form of a DIV element to play the second channel remote video track. - const container = document.createElement("div"); - // Set the container size. - container.style.width = "640px"; - container.style.height = "480px"; - container.style.padding = "15px 5px 5px 5px"; - // Specify the container id and text. - container.id = user.uid.toString(); - container.textContent = "Remote user from the second channel" + user.uid.toString(); - // Append the container to page body. - document.body.append(container); - // Play the remote video in the container. - user.videoTrack.play(container); - } - // Listen for the "user-unpublished" event. - agoraEngineSubscriber.on("user-unpublished", user => - { - console.log(user.uid+ "has left the channel"); - }); - }); - // Set the user role. 
- agoraEngineSubscriber.setClientRole(options.role); - // Join the new channel. - await agoraEngineSubscriber.join(options.appId, secondChannelName, secondChannelToken, options.uid); - // An audience can not publish audio and video tracks in the channel. - if(options.role != 'audience') - { - await agoraEngineSubscriber.publish([channelParameters.localAudioTrack, channelParameters.localVideoTrack]); - } - isMultipleChannel = true; - // Update the button text. - document.getElementById('multiple-channels').innerHTML = 'Leave Second Channel'; - } - else - { - isMultipleChannel = false; - // Leave the channel. - await agoraEngineSubscriber.leave(); - } -} -``` - diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/windows.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/windows.mdx index 801585b69..c8a53d7a9 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/windows.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/windows.mdx @@ -38,7 +38,7 @@ In your project, declare the required variables and reference the channel media To store source and destination channel settings and manage channel relay, in `AgoraImplementationDlg.cpp`, add the following declarations after the list of header includes: - ``` cpp + ```cpp CHAR destChannelName[] = ""; CHAR destSourceChannelToken[] = ""; int destUid = 100; // User ID that the user uses in the destination channel. 
@@ -50,7 +50,7 @@ In your project, declare the required variables and reference the channel media To access the channel relay button, in `AgoraImplementationDlg.cpp`, add the following to `OnInitDialog` before `return true;`: - ``` cpp + ```cpp channelMediaButton = (CButton*)GetDlgItem(IDC_BUTTON3); ``` @@ -62,7 +62,7 @@ To enable users to relay channel media to a destination chann When a user presses the button, the starts relaying media from the source channel to the destination channel. To setup an event list on the channel relay button, in **Dialog Editor**, double-click **Start Channel Media Relay**. **Dialog Editor** automatically creates and opens an event listener for you. To start and stop channel media relay on the button click event, add the following code to the event listener method you just created: - ``` cpp + ```cpp if (mediaRelaying) { m_rtcEngine->stopChannelMediaRelay(); @@ -160,7 +160,7 @@ In your project, import the relevant libraries and declare the required variable To join and manage a second channel, in `AgoraImplementationDlg.cpp`, add the following declarations after the list of header includes: - ``` cpp + ```cpp RtcConnection rtcSecondConnection; CButton secondChannelButton; CHAR secondChannelName[] = ""; @@ -179,18 +179,18 @@ To add multi-channel functionality to your , take the followin 1. In `AgoraImplementationDlg.h`, add the following to the list of includes: - ``` cpp + ```cpp #include ``` 2. In `AgoraImplementationDlg.h`, replace the line `IRtcEngine* m_rtcEngine = nullptr;` with the following: - ``` cpp + ```cpp IRtcEngineEx* agoraEngine = nullptr; ``` 3. In `AgoraImplementationDlg.cpp`, replace the line `agoraEngine = createAgoraRtcEngine();` with the following code: - ``` cpp + ```cpp agoraEngine = static_cast(createAgoraRtcEngine()); ``` @@ -198,7 +198,7 @@ To add multi-channel functionality to your , take the followin When a user presses the button, the joins a second channel. 
If the is already connected to a second channel, it leaves the channel. To implement this workflow, in **Dialog Editor**, double-click **Start Channel Media Relay**. **Dialog Editor** automatically creates and opens an event listener for you. To start and stop channel media relay on the button click event, add the following code to the event listener you just created: - ``` cpp + ```cpp if (isSecondChannelJoined) { agoraEngine->leaveChannelEx(rtcSecondConnection); } @@ -231,7 +231,7 @@ To add multi-channel functionality to your , take the followin 1. Setup an event handler class and declare the required callbacks. In `AgoraImplementation.h`, add the following code before the `AgoraEventHandler` class: - ``` cpp + ```cpp // Callbacks for the second channel. class SecondChannelEventHandler : public IRtcEngineEventHandler { @@ -249,7 +249,7 @@ To add multi-channel functionality to your , take the followin 2. Provide a definition for each callback. In `AgoraImplementation.cpp`, add the following callback definition at the end of the file: - ``` cpp + ```cpp void SecondChannelEventHandler::onJoinChannelSuccess(const char* channel, uid_t uid, int elapsed) { AfxMessageBox(L"You joined the second channel"); diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/android.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/android.mdx index 83276f1dd..38d9c67df 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/android.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/android.mdx @@ -1,59 +1,36 @@ -### Test channel media Relay +5. **Test channel media Relay** -1. Make sure co-host token authentication is enabled for your project in . + 1. Make sure co-host token authentication is enabled for your project in . -1. In Android Studio, open `app/java/com.example./MainActivity` and update `appId`, `channelName` and `destChannelName`. + 1. 
To configure channel media relay, in `config.json`, set: + + - `sourceChannelToken` to a valid token corresponding to `channelName` and `uid = 0`. + - `destinationChannelName` to the name of the channel to which you want to relay the stream. + - `destinationChannelUid` to an integer for use as the `uid` to join the destination channel. + - `destinationChannelToken` to a valid token corresponding to the `destinationChannelName` and `destinationChannelUid`. -1. [Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in using `appId` and `channelName`. Use it to update `token` in `MainActivity`. Use the same values to generate another token and update `sourceChannelToken`. + 1. On a second device, **Join** a channel using the `appId`, `destChannelName`, and `destChannelToken`. -1. Generate a third token in using `appId` and `destChannelName`. Use it to update `destChannelToken` in `MainActivity`. + 1. Press **Start Channel Media Relay**. You see the video from the source channel relayed to the destination channel. -1. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. + 1. Press **Stop Channel Media Relay**. The media relaying is stopped. -1. Repeat the previous step on a second device, but this time use `appId`, `destChannelName`, and `destChannelToken` to **Join** the channel. +6. **Test joining a second channel** -1. Connect an Android device to your development device. + 1. To configure joining a second channel, in `config.json`, set: -1. In Android Studio, click **Run app**. A moment later, you see the project installed on your device. + - `secondChannelName` to the name of the second channel. + - `secondChannelUid` to an integer for use as the `uid` to join a second channel. + - `secondChannelToken` to a valid token corresponding to the second channel name and uid.
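If your project reads all of these values from a single `config.json`, the file might look like the following sketch. Only the relay and second-channel keys come from the steps above; the `appId` and `channelName` keys and all placeholder values are illustrative assumptions — replace them with your own project values and freshly generated tokens:

```json
{
  "appId": "<your app ID>",
  "channelName": "demoChannel",
  "sourceChannelToken": "<token for channelName and uid 0>",
  "destinationChannelName": "demoChannel2",
  "destinationChannelUid": 100,
  "destinationChannelToken": "<token for destinationChannelName and destinationChannelUid>",
  "secondChannelName": "demoChannel3",
  "secondChannelUid": 101,
  "secondChannelToken": "<token for secondChannelName and secondChannelUid>"
}
```

Tokens are bound to a specific channel name and uid, so each channel/uid pair in this file needs its own token.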
- If this is the first time you run your app, grant camera and microphone permissions. + 1. In your browser, navigate to the web demo and use `appId`, `secondChannelName`, and `secondChannelToken` to **Join** the second channel. -1. Select **Host** and press **Join**. You see the video from the web browser demo app connected to `channelName` in the top frame of your . + 1. In your Android app, press **Join**. You see the video from the web browser demo app connected to `channelName`. -1. Press **Start Channel Media Relay**. You see the video from the web browser demo app connected to `channelName` relayed to the web browser demo app connected to `destChannelName`. + 1. Press **Join Second Channel**. You see another video from the web demo app connected to `secondChannelName` in your . -1. Press **Stop Channel Media Relay**. The media relaying is stopped. - -### Test joining multiple channels - -1. In Android Studio, open `app/java/com.example./MainActivity` and update `appId`, `channelName` and `secondChannelName`. - -1. [Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in using `appId` and `channelName`. Use it to update `token` in `MainActivity`. - -1. Generate a second token in using `appId` and `secondChannelName`. Use it to update `secondChannelToken` in `MainActivity`. - -1. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. - -1. Repeat the previous step on a second device, but this time use `appId`, `secondChannelName`, and `secondChannelToken` to **Join** the channel. - -1. Connect an Android device to your development device. - -1. In Android Studio, click **Run app**. A moment later, you see the project installed on your device. - - If this is the first time you run your app, grant camera and microphone permissions. - -1. Select **Audience** - - 1. Press **Join**. 
You see the video from the web browser demo app connected to `channelName` in the top frame of your . - 1. Press **Join Second Channel**. You see the video from the web browser demo app connected to `secondChannelName` in the bottom frame of your . 1. Press **Leave Second Channel** and then **Leave** to exit both channels. -1. Select **Host** - - 1. Press **Join**. You see the local video in the top frame of your . The web browser demo app connected to `channelName` shows the video from your . - 1. Press **Join Second Channel**. You see the video from the web browser demo app connected to `secondChannelName` in the bottom frame of your . - 1. Press **Leave Second Channel** to exit both channels. - diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/index.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/index.mdx index fc5ddb2ab..2fbab440d 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/index.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/index.mdx @@ -1,21 +1,13 @@ -import Android from './android.mdx'; -import Ios from './ios.mdx'; -import Web from './web.mdx'; -import ReactNative from './react-native.mdx' +import Poc3 from './poc3.mdx'; +import ReactNative from './react-native.mdx'; import Electron from './electron.mdx'; -import Unity from './unity.mdx'; -import MacOS from './macos.mdx'; +import Flutter from './flutter.mdx'; import Windows from './windows.mdx'; -import Flutter from './flutter.mdx' import Unreal from './unreal.mdx' - - - - - + - - + + \ No newline at end of file diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/ios.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/ios.mdx index bb04a60c7..207d4cfa6 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/ios.mdx +++ 
b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/ios.mdx @@ -1,61 +1,9 @@ -### Test channel media Relay +5. **Select "Live streaming over multiple channels".** -1. In Xcode, open `ViewController` and update `appID`, `channelName` and `destChannelName`. +1. **Join the primary channel.** -2. [Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in using `appID` and `channelName`. Use it to update `token` in `ViewController`. Use the same values to generate another token and update `sourceChannelToken`. +1. **Stream to the secondary channel.** -3. Generate a third token in using `appID` and `destChannelName`. Use it to update `destChannelToken` in `ViewController`. - -4. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. - -5. Repeat the previous step on a second device, but this time use `appID`, `destChannelName`, and `destChannelToken` to **Join** the channel. - -6. Run your using either a physical or a simulator iOS device. - - If this is the first time you run the project, grant microphone and camera access to your . - - If you use an iOS simulator, you see the remote video only. You cannot see the local video stream because of [Apple simulator hardware restrictions](https://help.apple.com/simulator/mac/current/#/devb0244142d). - -7. Select **Host** and press **Join**. You see the video from the web browser demo app connected to `channelName` in the top frame of your iOS device. - -8. Press **Start Channel Media Relay**. You see the video from the web browser demo app connected to `channelName` relayed to the web browser demo app connected to `destChannelName`. - -9. Press **Stop Channel Media Relay**. The media relaying is stopped. - -### Test joining multiple channels - -1. In Xcode, open `ViewController` and update `appID`, `channelName` and `secondChannelName`. - -2. 
[Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in using `appID` and `channelName`. Use it to update `token` in `ViewController`. - -3. Generate a second token in using `appID` and `secondChannelName`. Use it to update `secondChannelToken` in `ViewController`. - -4. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. - -5. Repeat the previous step on a second device, but this time use `appID`, `secondChannelName`, and `secondChannelToken` to **Join** the channel. - -6. Run your using either a physical or a simulator iOS device. - - If this is the first time you run the project, grant microphone and camera access to your . - - If you use an iOS simulator, you see the remote video only. You cannot see the local video stream because of [Apple simulator hardware restrictions](https://help.apple.com/simulator/mac/current/#/devb0244142d). - -7. Select **Audience** - - 1. Press **Join**. You see the video from the web browser demo app connected to `channelName` in the top frame of your iOS device. - - 2. Press **Join Second Channel**. You see the video from the web browser demo app connected to `secondChannelName` in the bottom frame of your iOS device. - - 3. Press **Leave Second Channel** and then *Leave* to exit both channels. - -8. Select **Host** - - 1. Press **Join**. You see the local video in the top frame of your iOS device. The web browser demo app connected to `channelName` shows the video from your iOS device. - - 2. Press **Join Second Channel**. You see the video from the web browser demo app connected to `secondChannelName` in the bottom frame of your iOS device. - - 3. Press **Leave Second Channel** and then **Leave** to exit both channels. 
- - + \ No newline at end of file diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/macos.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/macos.mdx index 38ff7300d..8d1b2892a 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/macos.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/macos.mdx @@ -1,7 +1,9 @@ -import Source from './swift.mdx'; - - +5. **Select "Live streaming over multiple channels".** + +1. **Join the primary channel.** + +1. **Stream to the secondary channel.** \ No newline at end of file diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/poc3.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/poc3.mdx new file mode 100644 index 000000000..40730d51d --- /dev/null +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/poc3.mdx @@ -0,0 +1,20 @@ +import TestFirstSteps from '@docs/shared/common/project-test/rtc-first-steps.mdx'; +import Android from './android.mdx'; +import Ios from './ios.mdx'; +import Unity from './unity.mdx'; +import ReactJS from './react-js.mdx'; +import MacOS from './macos.mdx'; +import Web from './web.mdx'; + + + + + + + + + + + + + \ No newline at end of file diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/react-js.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/react-js.mdx new file mode 100644 index 000000000..f141714d2 --- /dev/null +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/react-js.mdx @@ -0,0 +1,89 @@ + +5. **Test channel media relay** + + 1. In `src/agora-manager/config.ts`, update `appId`, `channelName` and `destChannelName`. + + 2. In `src/agora-manager/config.ts`, set `destUID` to a non-zero unsigned 32-bit integer. + + 3. 
[Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in using `appId` and `channelName`. Use it to update `rtcToken` in `config.ts`. + + 4. Generate a second token in using `appId` and `destChannelName`. Use it to update `destChannelToken` in `config.ts`. + + 5. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. + + 6. Repeat the previous step on a second device, but this time use `appId`, `destChannelName`, and `destChannelToken` to **Join** the channel. + + 1. Start the proxy server: + + In the terminal, run the following command: + + ```bash + node ./utils/proxy.js + ``` + 1. Start the dev server: + + Execute the following command in the terminal: + ```bash + yarn dev + ``` + Use the URL displayed in the terminal to open the in your browser. + + 1. Select **Host**. + + 1. To connect to a channel, click **Join**. + + If this is the first time you run the project, you need to grant microphone and camera access to your . You see that the web browser demo app connected to `channelName` shows the video from your development device. + + 1. Press **Start Media Relay**. You see the video from the web browser demo app connected to `channelName` relayed to the web browser demo app connected to `destChannelName`. + + 1. Press **Stop Media Relay**. The media relaying is stopped. + +6. **Test joining multiple channels** + + 1. In `src/agora-manager/config.ts`, update `appId`, `channelName`, and `secondChannel`. + + 2. [Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in using `appId` and `channelName`. Use it to update `rtcToken` in `config.ts`. + + 3. Generate a second token in using `appId` and `secondChannel`. Use it to update `secondChannelToken` in `config.ts`. + + 4.
In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. + + 5. Repeat the previous step on a second device, but this time use `appId`, `secondChannel`, and `secondChannelToken` to **Join** the channel. + + 1. Start the proxy server: + + In the terminal, run the following command: + ```bash + node ./utils/proxy.js + ``` + 1. Start the dev server: + + Execute the following command in the terminal: + ```bash + yarn dev + ``` + Use the URL displayed in the terminal to open the in your browser. + + 1. Select **Audience** + + 1. Press **Join**. + + If this is the first time you run the project, you need to grant microphone and camera access to your . You see the video from the web browser demo app connected to `channelName` in the remote view of your development device. + + 2. Press **Join Second Channel**. + + You see the video from the web browser demo app connected to `secondChannel` in the bottom container of your development device. + 3. Press **Leave Second Channel** and then **Leave** to exit both channels. + + 1. Select **Host** + + 1. Press **Join**. + + You see the local video in the local view of your development device. The web browser demo app connected to `channelName` shows the video from your development device. + + 2. Press **Join Second Channel**. + + You see that the web browser demo app connected to `secondChannel` shows the video from your development device. + + 3. Press **Leave Second Channel** and then **Leave** to exit both channels. + diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/swift.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/swift.mdx deleted file mode 100644 index f6af8a750..000000000 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/swift.mdx +++ /dev/null @@ -1,56 +0,0 @@ - -### Test channel media Relay - -1.
In Xcode, open `ViewController` and update `appID`, `channelName` and `destChannelName`. - -2. [Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in using `appID` and `channelName`. Use it to update `token` in `ViewController`. Use the same values to generate another token and update `sourceChannelToken`. - -3. Generate a third token in using `appID` and `destChannelName`. Use it to update `destChannelToken` in `ViewController`. - -4. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. - -5. Repeat the previous step on a second device, but this time use `appID`, `destChannelName`, and `destChannelToken` to **Join** the channel. - -6. Run your : - - If this is the first time you run the project, grant microphone and camera access to your . - - If you use an iOS simulator, you see the remote video only. You cannot see the local video stream because of [Apple simulator hardware restrictions](https://help.apple.com/simulator/mac/current/#/devb0244142d). - - -7. Select **Host** and press **Join**. You see the video from the web browser demo app connected to `channelName` in the top frame of your iOS device. - -8. Press **Start Channel Media Relay**. You see the video from the web browser demo app connected to `channelName` relayed to the web browser demo app connected to `destChannelName`. - -9. Press **Stop Channel Media Relay**. The media relaying is stopped. - -### Test joining multiple channels - -1. In Xcode, open `ViewController` and update `appID`, `channelName` and `secondChannelName`. - -2. [Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in using `appID` and `channelName`. Use it to update `token` in `ViewController`. - -3. Generate a second token in using `appID` and `secondChannelName`. Use it to update `secondChannelToken` in `ViewController`. - -4. 
In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. - -5. Repeat the previous step on a second device, but this time use `appID`, `secondChannelName`, and `secondChannelToken` to **Join** the channel. - -6. Run your using either a physical or a simulator iOS device. - -7. Select **Audience** - - 1. Press **Join**. You see the video from the web browser demo app connected to `channelName` in the top frame of your iOS device. - - 2. Press **Join Second Channel**. You see the video from the web browser demo app connected to `secondChannelName` in the bottom frame of your iOS device. - - 3. Press **Leave Second Channel** and then *Leave* to exit both channels. - -8. Select **Host** - - 1. Press **Join**. You see the local video in the top frame of your iOS device. The web browser demo app connected to `channelName` shows the video from your iOS device. - - 2. Press **Join Second Channel**. You see the video from the web browser demo app connected to `secondChannelName` in the bottom frame of your iOS device. - - 3. Press **Leave Second Channel** and then **Leave** to exit both channels. - diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/unity.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/unity.mdx index 3f7ed49bd..ea7a29599 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/unity.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/unity.mdx @@ -1,53 +1,57 @@ -### Test channel media Relay +5. **Choose this sample in the reference app** -1. In **Unity Editor**, in your script file, update `_appID`, `_channelName` and `destChannelName`. + From the main screen of the , choose **** from the dropdown and then select **Multi-channel live streaming**. -2. 
[Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in using `_appID` and `_channelName`. Use it to update `_token` in `NewBehaviourScript`. Use the same values to generate another token and update `sourceChannelToken`. +6. **Test channel media Relay** -3. Generate a third token in using `_appID` and `destChannelName`. Use it to update `destChannelToken` in `NewBehaviourScript`. + 1. In `Assets/agora-manager/config.json`, update `destChannelName` to the name of the destination channel. -4. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. + 1. Generate a second token in using `appID` and `destChannelName`. Use it to update `destChannelToken`. -5. Repeat the previous step on a second device, but this time use `_appID`, `destChannelName`, and `destChannelToken` to **Join** the channel. + 1. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. -6. In **Unity Editor**, click **Play**. A moment later you see the running on your development device. + 1. Repeat the previous step on a second device, but this time use `appID`, `destChannelName`, and `destChannelToken` to **Join** the channel. -7. Select **Host** and press **Join**. You see the video from the web browser demo app connected to `_channelName` in the local view of your device. + 1. **Join a channel** -8. Press **Start Channel Media Relay**. You see the video from the web browser demo app connected to `_channelName` relayed to the web browser demo app connected to `destChannelName`. + + Select an option and click **Join** to start a session. When you join as a **Host**, the local video is published and played in the . When you join as **Audience**, the remote stream is subscribed and played. + -9. Press **Stop Channel Media Relay**. The media relaying is stopped. 
+ + Press **Join** to connect to the same channel as your web demo. + -### Test joining multiple channels + 1. Press **Relay Media**. You see the video from the web browser demo app connected to `channelName` relayed to the web browser demo app connected to `destChannelName`. -1. In **Unity Editor**, in your script file, update `_appID`, `_channelName` and `destChannelName`. + 1. Press **Stop Relaying**. The media relaying is stopped. -2. [Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in using `_appID` and `_channelName`. Use it to update `_token` in `NewBehaviourScript`. +7. **Test joining multiple channels** -3. Generate a second token in using `_appID` and `secondChannelName`. Use it to update `secondChannelToken` in `NewBehaviourScript`. + 1. In `Assets/agora-manager/config.json`, update `appID`, `channelName`, and `secondChannelName`. -4. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. + 1. [Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in using `appID` and `channelName`. -5. Repeat the previous step on a second device, but this time use `_appID`, `secondChannelName`, and `secondChannelToken` to **Join** the channel. + 1. Generate a second token in using `appID` and `secondChannelName`. Use it to update `secondChannelToken` in `config.json`. -6. In **Unity Editor**, click **Play**. A moment later you see the running on your development device. + 1. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. -8. Select **Audience** + 1. Repeat the previous step on a second device, but this time use `appID`, `secondChannelName`, and `secondChannelToken` to **Join** the channel.
You see the video from the web browser demo app connected to `_channelName` in the remote view of your development device. - 2. Press **Join Second Channel**. Now, you see the video from the web browser demo app connected to `secondChannelName` in the remote view of your development device. - 3. Press **Leave Second Channel** and then **Leave** to exit both channels. + 1. Press **Join**. You see the video from the web browser demo app connected to `channelName` in the remote view of your development device, and the video from your development device in the remote view of the web demo app. -9. Select **Host** + 1. Select **Audience** - 1. Press **Join**. You see the local video in the local view of your development device. The web browser demo app connected to `channelName` shows the video from your development device. - 2. Press **Join Second Channel**. The web browser demo app connected to `secondChannelName` plays only the audio stream from your development device. This is because you can publish video stream only in one channel. - 3. Press **Leave Second Channel** and then **Leave** to exit both channels. - 4. Publish video stream in the second channel: + 1. The remote video in the web browser demo app connected to `channelName` is stopped. + 2. Click **Join Second Channel**. Now, the web browser demo app connected to `secondChannelName` displays the video stream from the development device. + 3. Click **Leave Second Channel** to exit the `secondChannelName` channel. - 1. Press **Join Second Channel**. The web browser demo app connected to `secondChannelName` shows the video from your development device. - 2. Press **Join**. You see, the web browser demo app connected to `channelName` plays only the audio stream from your development device. + 1. Select **Host** + + 1. The web browser demo app, connected to `channelName`, displays the video stream from the development device. + 2. Click **Join Second Channel**.
The web browser demo app connected to `secondChannelName` plays only the audio stream from your development device. This is because you can publish a video stream in only one channel. + 3. Click **Leave Second Channel** to exit the `secondChannelName` channel. \ No newline at end of file diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/web.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/web.mdx index 5fb119513..a68518c67 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/web.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/web.mdx @@ -1,79 +1,68 @@ -import * as data from '@site/data/variables'; - -### Test channel media Relay - -1. In `main.js`, update `appID`, `channel` and `destChannelName`. - -2. In `main.js`, set `uid` to a non-zero unsigned 32-bit integer. - -3. [Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in using `appID` and `channel`. Use it to update `token` in `main.js`. -4. Generate a third token in using `appID` and `destChannelName`. Use it to update `destChannelToken` in `main.js`. +5. **Test channel media relay** - -5. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. + 1. Generate a token in using `appID` and `destChannelName`. Use it to update `destChannelToken` in `main.js`. -6. Repeat the previous step on a second device, but this time use `appID`, `destChannelName`, and `destChannelToken` to **Join** the channel. + 1. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. -7. Start the dev server: + 1. Repeat the previous step on a second device, but this time use `appID`, `destChannelName`, and `destChannelToken` to **Join** the channel.
- Execute the following command in the terminal: - ```bash - npm run dev - ``` - Use the URL displayed in the terminal to open the in your browser. + 1. Start the dev server: -8 Select **Host**. + Execute the following command in the terminal: + ```bash + npm run dev + ``` + Use the URL displayed in the terminal to open the in your browser. -9. To connect to a channel, click **Join**. + 1. Select **Host**. - If this is the first time you run the project, you need to grant microphone and camera access to your . You see the web browser demo app connected to `channel` shows the video from your development device. + 1. To connect to a channel, click **Join**. -10. Press **Start Channel Media Relay**. You see the video from the web browser demo app connected to `channel` relayed to the web browser demo app connected to `destChannelName`. + If this is the first time you run the project, you need to grant microphone and camera access to your . The web browser demo app connected to `channel` shows the video from your development device. -11. Press **Stop Channel Media Relay**. The media relaying is stopped. + 1. Press **Start Channel Media Relay**. You see the video from the web browser demo app connected to `channel` relayed to the web browser demo app connected to `destChannelName`. + 1. Press **Stop Channel Media Relay**. The media relaying is stopped. -### Test joining multiple channels -1. In _main.js_, update `appID`, `channel`, and `secondChannelName`. +1. **Test joining multiple channels** -2. [Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in using `appID` and `channel`. Use it to update `token` in `main.js`. + 1. Generate a token in using `appID` and `secondChannelName`. Use it to update `secondChannelToken` in `main.js`. -3. Generate a second token in using `appID` and `secondChannelName`. Use it to update `secondChannelToken` in `main.js`. + 1.
In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. -4. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. + 1. Repeat the previous step on a second device, but this time use `appID`, `secondChannelName`, and `secondChannelToken` to **Join** the channel. -5. Repeat the previous step on a second device, but this time use `appID` `secondChannelName`, and `secondChannelToken` to **Join** the channel. + 1. Start the dev server: -6. Start the dev server: + Execute the following command in the terminal: + ```bash + npm run dev + ``` + Use the URL displayed in the terminal to open the in your browser. - Execute the following command in the terminal: - ```bash - npm run dev - ``` - Use the URL displayed in the terminal to open the in your browser. + 1. Select **Audience** -7. Select **Audience** - - 1. Press **Join**. - - If this is the first time you run the project, you need to grant microphone and camera access to your . You see the video from the web browser demo app connected to `channel` in the remote view of your development device. + 1. Press **Join**. + + If this is the first time you run the project, you need to grant microphone and camera access to your . You see the video from the web browser demo app connected to `channel` in the remote view of your development device. - 2. Press **Join Second Channel**. - - You see the video from the web browser demo app connected to `secondChannelName` in the bottom container of your development device. - 3. Press **Leave Second Channel** and then **Leave** to exit both channels. + 2. Press **Join Second Channel**. + + You see the video from the web browser demo app connected to `secondChannelName` in the bottom container of your development device. + 3. Press **Leave Second Channel** and then **Leave** to exit both channels. -8.
Select **Host** + 1. Select **Host** - 1. Press **Join**. + 1. Press **Join**. - You see the local video in the local view of your development device. The web browser demo app connected to `channel` shows the video from your development device. + You see the local video in the local view of your development device. The web browser demo app connected to `channel` shows the video from your development device. - 2. Press **Join Second Channel**. - - You see the web browser demo app connected to `secondChannelName` shows the video from your development device. + 2. Press **Join Second Channel**. + + The web browser demo app connected to `secondChannelName` shows the video from your development device. - 3. Press **Leave Second Channel** and then **Leave** to exit both channels. + 3. Press **Leave Second Channel** and then **Leave** to exit both channels. diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/android.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/android.mdx index bb447ed77..c56ca9b9b 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/android.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/android.mdx @@ -1,17 +1,3 @@ - -### API reference - -* RtcEngineEx -* joinChannelEx -* leaveChannelEx -* RtcConnection -* startChannelMediaRelay -* stopChannelMediaRelay -* updateChannelMediaRelay -* pauseAllChannelMediaRelay -* resumeAllChannelMediaRelay -* onChannelMediaRelayStateChanged -* onChannelMediaRelayEvent diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/ios.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/ios.mdx index 49be59be3..52ad025f4 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/ios.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/ios.mdx @@ -1,14 +1,3 @@ - -### API
reference - -* AgoraChannelMediaRelayInfo -* AgoraChannelMediaRelayConfiguration -* channelMediaRelayStateDidChange -* didReceiveChannelMediaRelayEvent -* AgoraRtcConnection -* AgoraRtcEngineDelegate -* setupRemoteVideoEx - diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/macos.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/macos.mdx index 6afa0e177..5b1a67bc4 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/macos.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/macos.mdx @@ -1,14 +1,3 @@ - -### API reference - -* AgoraChannelMediaRelayInfo -* AgoraChannelMediaRelayConfiguration -* channelMediaRelayStateDidChange -* didReceiveChannelMediaRelayEvent -* AgoraRtcConnection -* AgoraRtcEngineDelegate -* setupRemoteVideoEx - diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/unity.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/unity.mdx index 5b6522e4a..daf235c54 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/unity.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/unity.mdx @@ -1,16 +1,2 @@ - - -### API reference -* RtcEngineEx -* joinChannelEx -* leaveChannelEx -* RtcConnection -* startChannelMediaRelay -* stopChannelMediaRelay -* updateChannelMediaRelay -* pauseAllChannelMediaRelay -* resumeAllChannelMediaRelay -* onChannelMediaRelayStateChanged -* onChannelMediaRelayEvent \ No newline at end of file diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/web.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/web.mdx index c9f890969..8f737f379 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/web.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/reference/web.mdx @@ -1,16 +1,3 @@ 
-import * as data from '@site/data/variables'; - -### API reference - -* createChannelMediaRelayConfiguration -* startChannelMediaRelay -* stopChannelMediaRelay -* updateChannelMediaRelay -* createClient -* publish -* setClientRole -* leave - diff --git a/shared/video-sdk/develop/migration-guide/web.mdx b/shared/video-sdk/develop/migration-guide/web.mdx index d6adb4b7e..5a73f8c69 100644 --- a/shared/video-sdk/develop/migration-guide/web.mdx +++ b/shared/video-sdk/develop/migration-guide/web.mdx @@ -18,7 +18,7 @@ First, create a `Client` object and join a specified channel. - Use the v - ``` js + ```js const client = AgoraRTC.createClient({ mode: "live", codec: "vp9" }); client.init("APPID", () => { client.join("Token", "Channel", null, (uid) => { @@ -30,7 +30,7 @@ First, create a `Client` object and join a specified channel. ``` - Use the v - ``` js + ```js const client = AgoraRTC.createClient({ mode: "live", codec: "vp9" }); try { @@ -54,7 +54,7 @@ Second, create an audio track object from the audio sampled by a microphone and - Use the v - ``` js + ```js const localStream = AgoraRTC.createStream({ audio: true, video: true }); localStream.init(() => { console.log("init stream success"); @@ -66,7 +66,7 @@ Second, create an audio track object from the audio sampled by a microphone and - Use the v - ``` js + ```js const localAudio = await AgoraRTC.createMicrophoneAudioTrack(); const localVideo = await AgoraRTC.createCameraVideoTrack(); console.log("create local audio/video track success"); @@ -88,7 +88,7 @@ After creating the local audio and video tracks, publish these tracks to the cha - Use the v - ``` js + ```js client.publish(localStream, err => { console.log("publish failed", err); }); @@ -98,7 +98,7 @@ After creating the local audio and video tracks, publish these tracks to the cha ``` - Use the v - ``` js + ```js try { // Remove this line if the channel profile is not live broadcast. 
await client.setClientRole("host"); @@ -122,7 +122,7 @@ When a remote user in the channel publishes media tracks, we need to automatical - Use the v - ``` js + ```js client.on("stream-added", e => { client.subscribe(e.stream, { audio: true, video: true }, err => { console.log("subscribe failed", err); @@ -136,7 +136,7 @@ When a remote user in the channel publishes media tracks, we need to automatical ``` - Use the v - ``` js + ```js client.on("user-published", async (remoteUser, mediaType) => { await client.subscribe(remoteUser, mediaType); if (mediaType == "video") { @@ -191,14 +191,14 @@ The improved events are: - Use the v - ``` js + ```js client.on("connection-state-change", e => { console.log("current", e.curState, "prev", e.prevState); }); ``` - Use the v - ``` js + ```js client.on("connection-state-change", (curState, prevState) => { console.log("current", curState, "prev", prevState); }); diff --git a/shared/video-sdk/develop/play-media/project-implementation/android.mdx b/shared/video-sdk/develop/play-media/project-implementation/android.mdx index c94adc2d9..e9ca12bde 100644 --- a/shared/video-sdk/develop/play-media/project-implementation/android.mdx +++ b/shared/video-sdk/develop/play-media/project-implementation/android.mdx @@ -1,274 +1,199 @@ -### Implement the user interface - -In a real-word application, you provide several buttons to enable a user to open, play, pause and stop playing files in the media player. In this page, you use a single `Button` to demonstrate the basic media player functions. You also add a `ProgressBar` to display the play progress to the user. - -To add the UI elements, in `/app/res/layout/activity_main.xml`, add the following code before ``: - -``` xml -`: -``` html +```html
@@ -19,7 +19,7 @@ To setup your project to use the media player APIs, take the following steps: To import the required libraries, in `preload.js`, add the following before ` createAgoraRtcEngine,` statement: - ``` javascript + ```javascript MediaPlayerState, ChannelMediaOptions, ``` @@ -28,7 +28,7 @@ To setup your project to use the media player APIs, take the following steps: To create and manage an instance of the media player and access the UI elements, in `preload.js`, add the following to the list of declarations: - ``` javascript + ```javascript var mediaPlayer; // To hold an instance of the media player var isMediaPlaying = false; var mediaDuration = 0; @@ -49,7 +49,7 @@ To implement playing and publishing media files in your , take When a user presses the button for the first time, you create an instance of the media player, set its `mediaPlayerObserver` to receive the callbacks, and open the media file. When the user presses the button again, you play the media file. On subsequent button presses, you pause or resume playing the media file alternately. To do this, add the following method to `preload.js` before `document.getElementById("leave").onclick = async function ()`: - ``` javascript + ```javascript document.getElementById("mediaPlayer").onclick = async function () { mediaButton = document.getElementById("mediaPlayer"); @@ -106,7 +106,7 @@ To implement playing and publishing media files in your , take The `IMediaPlayerSourceObserver` implements media player callbacks. You create an instance of `onPlayerSourceStateChanged` and register it with the media player instance. When the player state changes, you take appropriate actions to update the UI in `onPlayerStateChanged`. You use the `onPositionChanged` callback to update the progress bar. 
To implement these callbacks, add the following code before `const EventHandles = `: - ``` javascript + ```javascript const mediaPlayerObserver = { onPlayerSourceStateChanged: (state,error) => @@ -148,7 +148,7 @@ To implement playing and publishing media files in your , take You use `ChannelMediaOptions` and the `updateChannelMediaOptions` method to specify the type of stream to publish. To switch between publishing media player and camera and microphone streams, in `preload.js`, add the following method before `window.onload = () => `: - ``` javascript + ```javascript function updateChannelPublishOptions(publishMediaPlayer) { let channelOptions = new ChannelMediaOptions(); @@ -165,7 +165,7 @@ To implement playing and publishing media files in your , take Setup a `VideoCanvas` and use it in the `setupLocalVideo` method of the to show the media player output locally. To switch between displaying media player output and the camera stream, in `preload.js`, add the following function before `window.onload = () => `: - ``` javascript + ```javascript function setupLocalVideo(forMediaPlayer) { if (forMediaPlayer) @@ -196,7 +196,7 @@ To implement playing and publishing media files in your , take To free up resources when you exit the , add the following lines to the `document.getElementById("leave").onclick` method before `window.location.reload();`: - ``` javascript + ```javascript // Destroy the media player mediaPlayer.stop(); mediaPlayer.unregisterPlayerSourceObserver(mediaPlayerObserver); diff --git a/shared/video-sdk/develop/play-media/project-implementation/flutter.mdx b/shared/video-sdk/develop/play-media/project-implementation/flutter.mdx index 52c9b6d86..416271ebf 100644 --- a/shared/video-sdk/develop/play-media/project-implementation/flutter.mdx +++ b/shared/video-sdk/develop/play-media/project-implementation/flutter.mdx @@ -8,7 +8,7 @@ To add the UI elements, in `/lib/main.dart`. 1. 
Add the following code to the `build` method after `ListView(...children: [`: - ``` dart + ```dart _mediaPLayerButton(), Slider( value: _seekPos.toDouble(), @@ -25,7 +25,7 @@ To add the UI elements, in `/lib/main.dart`. 1. The `_mediaPLayerButton` widget displays a suitable caption on a button depending on the state of the media player. To define this widget, add the following method to the `_MyAppState` class: - ``` dart + ```dart Widget _mediaPLayerButton() { String caption = ""; @@ -54,7 +54,7 @@ To setup your project to use the media player APIs, take the following steps: To create and manage an instance of the `MediaPlayerController` and configure the UI elements, add the following declarations to the `_MyAppState` class after `late RtcEngine agoraEngine;`: - ``` dart + ```dart late final MediaPlayerController _mediaPlayerController; String mediaLocation = "https://www.appsloveworld.com/wp-content/uploads/2018/10/640.mp4"; @@ -75,7 +75,7 @@ To implement playing and publishing media files in your , take When a user presses the button for the first time, you create an instance of the media player and open the media file. When the user presses the button again, you play the media file. On subsequent button presses, you pause or resume playing the media file alternately. To do this, add the following method to the `_MyAppState` class: - ``` dart + ```dart void playMedia() async { if (!_isUrlOpened) { await initializeMediaPlayer(); @@ -108,7 +108,7 @@ To implement playing and publishing media files in your , take During initialization, you create an instance of `MediaPlayerController`, initialize it using the `initialize()` method and register a `MediaPlayerSourceObserver` to receive media player callbacks. When the player state changes, you take appropriate actions to update the UI in `onPlayerStateChanged`. You use the `onPositionChanged` callback to update the progress on the Slider. 
To do this, add the following code to the `_MyAppState` class: - ``` dart + ```dart Future initializeMediaPlayer() async { _mediaPlayerController= MediaPlayerController( rtcEngine: agoraEngine, @@ -163,7 +163,7 @@ To implement playing and publishing media files in your , take You use `ChannelMediaOptions` and the `updateChannelMediaOptions` method to specify the type of stream to publish. To switch between publishing media player and camera and microphone streams, add the following method to the `MainActivity` class: - ``` dart + ```dart void updateChannelPublishOptions(bool publishMediaPlayer) { ChannelMediaOptions channelOptions = ChannelMediaOptions( publishMediaPlayerAudioTrack: publishMediaPlayer, @@ -180,7 +180,7 @@ To implement playing and publishing media files in your , take To show the media player output locally, you create an `AgoraVideoView` and set its `controller` to `_mediaPlayerController`. To enable switching between displaying media player output and the camera stream, **replace** the `_localPreview()` method in the `_MyAppState` class with the following: - ``` dart + ```dart Widget _localPreview() { if (_isJoined) { if (_isPlaying) { @@ -208,7 +208,7 @@ To implement playing and publishing media files in your , take To free up resources when you exit the channel, add the following lines to the `leave` method after `agoraEngine.leaveChannel();`: - ``` dart + ```dart // Dispose the media player _mediaPlayerController.dispose(); diff --git a/shared/video-sdk/develop/play-media/project-implementation/index.mdx b/shared/video-sdk/develop/play-media/project-implementation/index.mdx index adebfbc81..ee253ad47 100644 --- a/shared/video-sdk/develop/play-media/project-implementation/index.mdx +++ b/shared/video-sdk/develop/play-media/project-implementation/index.mdx @@ -1,22 +1,14 @@ -import Android from './android.mdx'; -import Ios from './ios.mdx'; -import Web from './web.mdx'; import ReactNative from './react-native.mdx'; import Electron from 
'./electron.mdx'; import Flutter from './flutter.mdx'; -import Unity from './unity.mdx'; -import MacOS from './macos.mdx'; import Unreal from './unreal.mdx'; import Windows from './windows.mdx'; +import Poc3 from './poc3.mdx'; - - - - + - - \ No newline at end of file + diff --git a/shared/video-sdk/develop/play-media/project-implementation/poc3.mdx b/shared/video-sdk/develop/play-media/project-implementation/poc3.mdx new file mode 100644 index 000000000..6cc519bd1 --- /dev/null +++ b/shared/video-sdk/develop/play-media/project-implementation/poc3.mdx @@ -0,0 +1,76 @@ +import ImportLibrary from '@docs/assets/code/video-sdk/play-media/import-library.mdx'; +import SetVariables from '@docs/assets/code/video-sdk/play-media/set-variables.mdx'; +import StartStreaming from '@docs/assets/code/video-sdk/play-media/start-streaming.mdx'; +import UpdateChannelPublishOptions from '@docs/assets/code/video-sdk/play-media/update-channel-publish-options.mdx'; +import PlayPauseResume from '@docs/assets/code/video-sdk/play-media/play-pause-resume.mdx'; +import HandleEvents from '@docs/assets/code/video-sdk/play-media/event-handler.mdx'; +import DestroyPlayer from '@docs/assets/code/video-sdk/play-media/destroy-media-player.mdx'; +import DisplayMedia from '@docs/assets/code/video-sdk/play-media/display-media.mdx'; +import ConfigureEngine from '@docs/assets/code/video-sdk/play-media/configure-engine.mdx'; + + + +### Add the required imports + + + +### Configure an instance of + + + +### Stream media to the channel + + + + + + +### Add the required variables + + + +### Start streaming a video from a URL + + + +### Play, pause, and resume the media file + + + +### Configure to publish the media player stream + + By setting appropriate parameters, you can publish the media player output, the user's local microphone and camera tracks, or both. 
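The publish switch described above follows one pattern on every platform: enable the media player tracks and disable the microphone and camera tracks, or the reverse. A framework-neutral JavaScript sketch of that toggle; the field names mirror the `ChannelMediaOptions` fields used in the Flutter and Electron snippets in this patch, and the plain-object form is illustrative rather than a real SDK call:

```javascript
// Build channel media options that publish either the media player
// stream or the local camera/microphone stream — never both here.
function buildChannelOptions(publishMediaPlayer) {
  return {
    publishMediaPlayerAudioTrack: publishMediaPlayer,
    publishMediaPlayerVideoTrack: publishMediaPlayer,
    publishMicrophoneTrack: !publishMediaPlayer,
    publishCameraTrack: !publishMediaPlayer,
  };
}

// Example: switch to publishing the media player stream.
// In a real app you would pass this object to updateChannelMediaOptions().
const options = buildChannelOptions(true);
```

Passing `false` restores the camera and microphone as the published sources.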
+ + + +### Display media player output locally + + + +### Manage media player callbacks + + + +### Clean up when you close the + + + + + + + +### Import the required library + + + +### Add the required variables + + + +### Start streaming a local media file + + Start streaming a local media file in a source URL: + + + + diff --git a/shared/video-sdk/develop/play-media/project-implementation/react-js.mdx b/shared/video-sdk/develop/play-media/project-implementation/react-js.mdx new file mode 100644 index 000000000..5497269d1 --- /dev/null +++ b/shared/video-sdk/develop/play-media/project-implementation/react-js.mdx @@ -0,0 +1,59 @@ + + +1. Import the components and hooks you need to manage a video call: + + ```typescript + import { usePublish, useConnectionState } from "agora-rtc-react"; + import AgoraRTC, { IBufferSourceAudioTrack } from "agora-rtc-sdk-ng"; + ``` + +1. Process an audio file + + ```typescript + const PlayAudioFile: React.FC<{ track: IBufferSourceAudioTrack }> = ({ track }) => { + usePublish([track]); + + useEffect(() => { + track.startProcessAudioBuffer(); + track.play(); // to play the track for the local user + return () => { + track.stopProcessAudioBuffer(); + track.stop(); + }; + }, [track]); + + return
<div>Audio file playing</div>
; + }; + ``` + +1. Play the audio to a channel + + ```typescript + try + { + AgoraRTC.createBufferSourceAudioTrack({ source: selectedFile }) + .then((track) => {setAudioFileTrack(track)}) + .catch((error) => {console.error(error);}) + } catch (error) { + console.error("Error creating buffer source audio track:", error); + } + ``` + +1. Put it all together in the UI + + ```typescript + +

+ +

{isMediaPlaying && audioFileTrack && <PlayAudioFile track={audioFileTrack} />} + ```
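The react-js snippet above creates the buffer source track from a `selectedFile` value that the excerpt never defines, and the file-input markup in the UI section was lost in extraction. A minimal, framework-agnostic sketch of the missing selection logic; the `handleFileChange` and `setSelectedFile` names are hypothetical, and the accepted-extension list follows the audio formats (MP3, AAC, and other browser-supported formats) mentioned elsewhere in this patch:

```javascript
// Hypothetical helper: accept only audio formats that are plausible
// sources for createBufferSourceAudioTrack (MP3, AAC, and similar).
function isSupportedAudioFile(fileName) {
  return /\.(mp3|aac|m4a|wav|ogg)$/i.test(fileName);
}

// Sketch of a change handler for an <input type="file"> element.
// In the React example above, setSelectedFile would be a useState setter.
function handleFileChange(event, setSelectedFile) {
  const file = event.target.files && event.target.files[0];
  if (file && isSupportedAudioFile(file.name)) {
    setSelectedFile(file);
  }
}
```

The stored file can then be passed as `source` to `AgoraRTC.createBufferSourceAudioTrack`, as shown in the "Play the audio to a channel" step.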
\ No newline at end of file diff --git a/shared/video-sdk/develop/play-media/project-implementation/react-native.mdx b/shared/video-sdk/develop/play-media/project-implementation/react-native.mdx index 15d36446f..419329de7 100644 --- a/shared/video-sdk/develop/play-media/project-implementation/react-native.mdx +++ b/shared/video-sdk/develop/play-media/project-implementation/react-native.mdx @@ -7,13 +7,13 @@ In a real-word application, you provide several buttons to enable a user to open To add the button, in `App.tsx`: 1. Add the following component to the `return` statement of the `App` component before ` ``` 1. The `MediaPlayerButton` component displays a suitable caption depending on the state of the media player. To define this component, add the following code to the `App` component: - ``` typescript + ```typescript const MediaPlayerButton = () => { var caption = ''; @@ -43,7 +43,7 @@ To setup your project to use the media player APIs, take the following steps: Add the following to the list of `import` statements in `App.tsx`: - ``` typescript + ```typescript import { IMediaPlayer, VideoSourceType, @@ -58,13 +58,13 @@ To setup your project to use the media player APIs, take the following steps: 1. To specify the path to the media file, add the following declaration to `App.tsx` after `const uid = 0;`: - ``` typescript + ```typescript const mediaLocation = 'https://webdemo.agora.io/agora-web-showcase/examples/Agora-Custom-VideoSource-Web/assets/sample.mp4'; ``` 1. 
To create and manage an instance of `IMediaPlayer` and configure the UI elements, add the following declarations to the `App` component after `const [message, setMessage] = useState('');`: - ``` typescript + ```typescript const mediaPlayerRef = useRef(); // Media player instance const [isUrlOpened, setIsUrlOpened] = useState(false); // Media file has been opened const [isPlaying, setIsPlaying] = useState(false); // Media file is playing @@ -80,7 +80,7 @@ To implement playing and publishing media files in your , take When a user presses the button for the first time, you create an instance of the media player and open the media file. When the user presses the button again, you play the media file. On subsequent button presses, you pause or resume playing the media file alternately. To do this, add the following method to the `App` component: - ``` typescript + ```typescript const playMedia = () => { if (!isJoined) { return; @@ -114,7 +114,7 @@ To implement playing and publishing media files in your , take During initialization, you create an instance of `IMediaPlayer` and register a `IMediaPlayerSourceObserver` to receive media player callbacks. When the player state changes, you take appropriate actions to update the UI in `onPlayerStateChanged`. You use the `onPositionChanged` callback to update the play progress in the UI. To do this, add the following code to the `App` component: - ``` typescript + ```typescript const initializeMediaPlayer = () => { mediaPlayerRef.current = agoraEngineRef.current?.createMediaPlayer(); @@ -148,7 +148,7 @@ To implement playing and publishing media files in your , take You use `ChannelMediaOptions` and the `updateChannelMediaOptions` method to specify the type of stream to publish. 
To switch between publishing media player and camera and microphone streams, add the following method to the `App` component: - ``` typescript + ```typescript const updateChannelPublishOptions = (publishMediaPlayer: boolean) => { var channelOptions = new ChannelMediaOptions(); channelOptions.publishMediaPlayerAudioTrack = publishMediaPlayer; @@ -170,12 +170,12 @@ To implement playing and publishing media files in your , take 1. **Replace** the first `RtcSurfaceView` component block in the `return` statement of `App` with the following: - ``` typescript + ```typescript ``` 1. To define the `LocalPreview` component, add the following method to `App`: - ``` typescript + ```typescript const LocalPreview = () => { if (!isPlaying) { return ; diff --git a/shared/video-sdk/develop/play-media/project-implementation/swift.mdx b/shared/video-sdk/develop/play-media/project-implementation/swift.mdx index 4387e1a1e..226566a41 100644 --- a/shared/video-sdk/develop/play-media/project-implementation/swift.mdx +++ b/shared/video-sdk/develop/play-media/project-implementation/swift.mdx @@ -28,7 +28,7 @@ To setup your project to use the media player APIs and access the UI elements, t To create and manage an instance of the media player and access the UI elements, add the following declarations to `ViewController`: - ``` swift + ```swift var mediaPlayer: AgoraRtcMediaPlayerProtocol? // Instance of the media player var isMediaPlaying: Bool = false var mediaDuration: Int = 0 @@ -52,7 +52,7 @@ To implement playing and publishing media files in your , take The `AgoraRtcMediaPlayerDelegate` implements media player callbacks. When the player state changes, update the UI using the `didChangedToState`, `didChangedToPosition` callbacks. 
To setup the `AgoraRtcMediaPlayerDelegate`, add the following extension to the `ViewController`: - ``` swift + ```swift extension ViewController: AgoraRtcMediaPlayerDelegate { func agoraRtcMediaPlayer(_ playerKit: AgoraRtcMediaPlayerProtocol, didChangedTo state: AgoraMediaPlayerState, error: AgoraMediaPlayerError) { if (state == .openCompleted) { @@ -101,7 +101,7 @@ To implement playing and publishing media files in your , take You use the `AgoraRtcChannelMediaOptions` class and the `updateChannelWithMediaOptions` method functions to specify the type of stream to publish. To switch between publishing media player and camera and microphone streams, add the following function to the `ViewController`: - ``` swift + ```swift func updateChannelPublishOptions(_ publishMediaPlayer: Bool) { let channelOptions: AgoraRtcChannelMediaOptions = AgoraRtcChannelMediaOptions() @@ -119,7 +119,7 @@ To implement playing and publishing media files in your , take Create an `AgoraRtcVideoCanvas` and use it in the `setupLocalVideo` method of the to show the media player output locally. To switch between displaying media player output and the camera stream, replace the `setupLocalVideo()` method in the `ViewController` with the following: - ``` swift + ```swift func setupLocalVideo(_ forMediaPlayer: Bool) { // Enable the video module agoraEngine.enableVideo() @@ -147,7 +147,7 @@ To implement playing and publishing media files in your , take When you join a channel, you set up the local video panel to initially display the camera output. 
In the `joinChannel()` function, replace `setupLocalVideo()` with a call to the updated function: - ``` swift + ```swift setupLocalVideo(false) ``` @@ -155,7 +155,7 @@ To implement playing and publishing media files in your , take To free up resources when you exit the channel, add the following lines at the end of the `leaveChannel` function: - ``` swift + ```swift // Destroy the media player agoraEngine.destroyMediaPlayer(mediaPlayer) mediaPlayer = nil diff --git a/shared/video-sdk/develop/play-media/project-implementation/unity.mdx b/shared/video-sdk/develop/play-media/project-implementation/unity.mdx index 087738fca..02575a172 100644 --- a/shared/video-sdk/develop/play-media/project-implementation/unity.mdx +++ b/shared/video-sdk/develop/play-media/project-implementation/unity.mdx @@ -1,90 +1,41 @@ - -### Implement the user interface - -In a real-word application, you provide several buttons to enable a user to open, play, pause and stop playing files in the media player. In this page, you use a single `Button` to demonstrate the basic media player functions. You also add a `Slider` to display the play progress to the user. To implement this user interface, take the following steps: - - 1. Right-click **Sample Scene**, then click **Game Object** > **UI** > **Button - TextMeshPro**. A button appears in the **Scene** Canvas. - - 2. In **Inspector**, rename **Button** to **playMedia**, and then change the following coordinates: - - * **Pos X** - 350 - * **Pos Y** - 172 - - 3. Right-click **Sample Scene**, then click **Game Object** > **UI** > **Slider**. A Slider appears in the **Scene** Canvas. - 4. In **Inspector**, rename **Slider** to **mediaProgressBar**, and then change the following coordinates: - - * **Pos X** - -250 - * **Pos Y** - 142 +The `PlayMediaManager` class manages media streaming logic within the channel: -### Handle the system logic +1. 
**Import the required namespace** -To declare the required variables and access the UI elements, take the following steps: - -1. **Add the required UI library** - - To import the required Uinty UI library, in your script file, add the following to the list of namespace declarations: - - ``` csharp - using TMPro; + ```csharp + using Agora.Rtc; ``` -2. **Declare the variables you need** - - To create and manage an instance of the media player and access the UI elements, in your script file, add the following declarations to `NewBehaviourScript`: +1. **Declare the variables you need** ``` csharp - private IMediaPlayer mediaPlayer; // Instance of the media player - private bool isMediaPlaying = false; - private long mediaDuration = 0; - public Slider ProgressBar; - // In a real world app, you declare the media location variable with an empty string - // and update it when a user chooses a media file from a local or remote source. - private string mediaLocation = - "https://webdemo.agora.io/agora-web-showcase/examples/Agora-Custom-VideoSource-Web/assets/sample.mp4"; - private TMP_Text mediaBtnText; - private Slider mediaProgressBar; - ``` - -### Implement media player functions - -To implement playing and publishing media files in your , take the following steps: - -1. **Access the UI elements programmatically** - - To setup the event listener for the media player button and access the slider, in your script file, add the following at the end of `SetupUI`: - - ```csharp - go = GameObject.Find("playMedia"); - go.GetComponent`: - - ```html - - ``` - -1. **Add an audio file to your project** - - Create a `resources` folder in your project and copy an audio file to the folder in one of the following formats: - - - MP3 - - AAC - - Other audio formats supported by the browser - -1. **Create an audio track from a source file** - - When a user presses the button, you create an audio track by calling the `createBufferSourceAudioTrack` method. 
To do this, add the following code to `main.js`: - - ```javascript - document.getElementById("playAudioFile").onclick = - async function localPlayerStart() { - - // Create an audio track from a source file - const track = await AgoraRTC.createBufferSourceAudioTrack({ - source: "./resources/", - }); - }; - ``` - -1. **Play the audio track** - - To publish and play the audio track, add the following code to `localPlayerStart()` after you create the audio track: - - ```javascript - // Play the track - track.startProcessAudioBuffer({loop: false}); - track.play(); - ``` - - diff --git a/shared/video-sdk/develop/play-media/project-implementation/windows.mdx b/shared/video-sdk/develop/play-media/project-implementation/windows.mdx index 11ba09576..0d4321e7a 100644 --- a/shared/video-sdk/develop/play-media/project-implementation/windows.mdx +++ b/shared/video-sdk/develop/play-media/project-implementation/windows.mdx @@ -38,7 +38,7 @@ To setup your project to use the media player APIs and access the UI elements, t To import the required libraries, in `AgoraImplementationDlg.h`, add the following header files at the start: - ``` cpp + ```cpp #include #include using namespace agora::base; @@ -49,7 +49,7 @@ To setup your project to use the media player APIs and access the UI elements, t To create and manage an instance of the media player and access the UI elements, in `AgoraImplementationDlg.h`, add the following to `CAgoraImplementationDlg`: - ``` cpp + ```cpp IMediaPlayer *mediaPlayer; // Instance of the media player BOOL isMediaPlaying = false; long mediaDuration = 0; @@ -66,7 +66,7 @@ To setup your project to use the media player APIs and access the UI elements, t In `AgoraImplementationDlg.h`, add the following at the end of `OnInitDialog`: - ``` cpp + ```cpp mediaButton = (CButton*)GetDlgItem(IDC_BUTTON3); mediaProgressBar = (CSliderCtrl*)GetDlgItem(IDC_SLIDER1); ``` @@ -79,7 +79,7 @@ To implement playing and publishing media files in your , take When a user presses the 
button for the first time, you create an instance of the media player, set its `mediaPlayerObserver` to receive the callbacks, and open the media file. When the user presses the button again, you play the media file. On subsequent button presses, you pause or resume playing the media file alternately. To implement this workflow, in **Dialog Editor**, double-click **Open Media File**. **Dialog Editor** automatically creates and opens an event listener for you. Add the following code to the event listener you just created: - ``` cpp + ```cpp // Initialize the mediaPlayer and open a media file if (mediaPlayer == NULL) { // Create an instance of the media player @@ -124,7 +124,7 @@ To implement playing and publishing media files in your , take 1. In `AgoraImplementationDlg.h`, add the following class before `AgoraEventHandler`: - ``` cpp + ```cpp class MediaPlayerSourceObserver : public IMediaPlayerSourceObserver { public: @@ -183,28 +183,28 @@ To implement playing and publishing media files in your , take 1. In `AgoraImplementationDlg.h`, add the following to `CAgoraImplementationDlg`: - ``` cpp + ```cpp afx_msg LRESULT OnEIDMediaPlayerStateChanged(WPARAM wParam, LPARAM lParam); afx_msg LRESULT OnEIDMediaPlayerPositionChanged(WPARAM wParam, LPARAM lParam); ``` 1. In `AgoraImplementationDlg.cpp`, add the following after the list of header files: - ``` cpp + ```cpp #define PLAYER_POSITION_CHANGED 0x00000005 #define PLAYER_MEDIA_COMPLETED 0x00000006 ``` 1. In `AgoraImplementationDlg.cpp`, add the following code: - ``` cpp + ```cpp ON_MESSAGE(WM_MSGID(PLAYER_POSITION_CHANGED), &CAgoraImplementationDlg::OnEIDMediaPlayerPositionChanged) ON_MESSAGE(WM_MSGID(PLAYER_STATE_CHANGED), &CAgoraImplementationDlg::OnEIDMediaPlayerStateChanged) ``` 1. 
In `AgoraImplementationDlg.cpp`, add the following methods before `OnInitDialog`: - ``` cpp + ```cpp LRESULT CAgoraImplementationDlg::OnEIDMediaPlayerStateChanged(WPARAM wParam, LPARAM lParam) { if (wParam == PLAYER_STATE_OPEN_COMPLETED) @@ -253,14 +253,14 @@ To implement playing and publishing media files in your , take 1. In `AgoraImplementationDlg.h`, add the following to `CAgoraImplementationDlg` before `afx_msg void OnClose();` - ``` cpp + ```cpp // Declare a method to publish and unpublish the local and media file streams. void updateChannelPublishOptions(BOOL value); ``` 2. In `AgoraImplementationDlg.cpp`, add the method after `OnInitDialog`: - ``` cpp + ```cpp void CAgoraImplementationDlg::updateChannelPublishOptions(BOOL publishMediaPlayer) { // You use ChannelMediaOptions to change channel media options. @@ -280,14 +280,14 @@ To implement playing and publishing media files in your , take 1. In `AgoraImplementationDlg.h`, add the following to `CAgoraImplementationDlg` before `afx_msg void OnClose();` - ``` cpp + ```cpp // Declare a method to switch between the local video and media file output. void setupLocalVideo(BOOL value); ``` 2. In `AgoraImplementationDlg.cpp`, add the method after `OnInitDialog`: - ``` cpp + ```cpp void CAgoraImplementationDlg::setupLocalVideo(BOOL forMediaPlayer) { // Pass the window handle to the engine so that it renders the local video.
@@ -310,7 +310,7 @@ To implement playing and publishing media files in your , take To free up resources when you exit the , add the following lines to the `onDestroy` method after `super.onDestroy();`: - ``` java + ```java // Destroy the media player mediaPlayer.stop(); mediaPlayer.unRegisterPlayerObserver(mediaPlayerObserver); diff --git a/shared/video-sdk/develop/play-media/project-test/android.mdx b/shared/video-sdk/develop/play-media/project-test/android.mdx index b45017f46..c963f1ed2 100644 --- a/shared/video-sdk/develop/play-media/project-test/android.mdx +++ b/shared/video-sdk/develop/play-media/project-test/android.mdx @@ -1,36 +1,36 @@ - -3. In Android Studio, open `app/java/com.example./MainActivity` and update `appId`, `channelName` and `token` with the values from Agora Console. -1. Connect an Android device to your development device. +5. **Choose this sample in the reference app** -1. In Android Studio, click **Run app**. A moment later, you see the project installed on your device. + From the main screen of the , choose **** from the dropdown and then select **Stream media to a channel**. - If this is the first time you run the , grant camera and microphone permissions. +6. **Join a channel** - -6. Click **Join** to start a call. - + + Click **Join** to start a call. + - -6. Select **Host** and press **Join** to start a call. - + + Select **Host** and press **Join** to start a call. + -7. Press **Open Media File**. +7. **Test the media player** - After a short while, you see a toast message confirming that the media file is opened successfully. + 1. Press **Open Media File**. -1. Press **Play Media File** + After a short while, you see a toast message confirming that the media file is opened successfully. - You see the media file played both locally and in the web demo app. The progress bar indicates the play progress. + 1. Press **Play Media File** -1. 
Press **Pause Playing Media** + You see the media file played both locally and in the web demo app. The progress bar indicates the play progress. - Media publishing is paused and the camera and microphone publishing is resumed. + 1. Press **Pause media** -1. Press **Resume Playing Media** + Media playing is paused. - You see that the media file resumes playing. + 1. Press **Resume** -1. Wait for the media file to finish playing. When the progress bar reaches the end, the media player publishing is stopped and camera and microphone publishing is resumed. + You see that the media file resumes playing. + + Wait for the media file to finish playing. When the progress bar reaches the end, the media player publishing is stopped and camera and microphone publishing is resumed. diff --git a/shared/video-sdk/develop/play-media/project-test/index.mdx b/shared/video-sdk/develop/play-media/project-test/index.mdx index 9699232bf..3cd0d12a6 100644 --- a/shared/video-sdk/develop/play-media/project-test/index.mdx +++ b/shared/video-sdk/develop/play-media/project-test/index.mdx @@ -1,22 +1,22 @@ -import Android from './android.mdx'; -import Ios from './ios.mdx'; -import Web from './web.mdx'; +import Poc3 from './poc3.mdx'; import ReactNative from './react-native.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; -import Unity from './unity.mdx'; -import MacOS from './macos.mdx'; -import Unreal from './unreal.mdx'; import Windows from './windows.mdx'; +import Unreal from './unreal.mdx' + + +1. **[Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in **. + +2. **Configure the web demo you use to connect to your **: + + In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. 
+ - - - - - + - + - \ No newline at end of file diff --git a/shared/video-sdk/develop/play-media/project-test/poc3.mdx b/shared/video-sdk/develop/play-media/project-test/poc3.mdx new file mode 100644 index 000000000..070d9833e --- /dev/null +++ b/shared/video-sdk/develop/play-media/project-test/poc3.mdx @@ -0,0 +1,20 @@ +import TestFirstSteps from '@docs/shared/common/project-test/rtc-first-steps.mdx'; +import Android from './android.mdx'; +import Ios from './ios.mdx'; +import Web from './web.mdx'; +import Unity from './unity.mdx'; +import ReactJS from './react-js.mdx'; +import MacOS from './macos.mdx'; + + + + + + + + + + + + + \ No newline at end of file diff --git a/shared/video-sdk/develop/play-media/project-test/react-js.mdx b/shared/video-sdk/develop/play-media/project-test/react-js.mdx new file mode 100644 index 000000000..247eefddf --- /dev/null +++ b/shared/video-sdk/develop/play-media/project-test/react-js.mdx @@ -0,0 +1,23 @@ + + +5. **Run the play media example**: + + In **Choose a sample code**, select **Stream media to a channel**. + +1. **Join a channel** + + + Click **Join** to start a session. When you select **Host**, the local video is published and played in the . When you select **Audience**, the remote stream is subscribed and played. + + + + Press **Join** to connect to the same channel as your web demo. + + +1. **Stream media to the channel** + + 1. Click **Choose File** and select an audio file you wish to stream in the channel. + 1. Click **Play audio file**. You see the media file played both locally and in the web demo app. + + + diff --git a/shared/video-sdk/develop/play-media/project-test/swift.mdx b/shared/video-sdk/develop/play-media/project-test/swift.mdx index c218c3b72..dabf97d67 100644 --- a/shared/video-sdk/develop/play-media/project-test/swift.mdx +++ b/shared/video-sdk/develop/play-media/project-test/swift.mdx @@ -1,36 +1,32 @@ -3. 
In the `ViewController`, update `appID`, `channelName`, and `token` with the values for your temporary token. +5. **Join a channel** -4. Run your using either a physical or a simulator iOS device. + + Select **Broadcaster** mode and press **Join**. + - If this is the first time you run the project, grant microphone and camera access to your . - - If you use an iOS simulator, you see the remote video only. You cannot see the local video stream because of [Apple simulator hardware restrictions](https://help.apple.com/simulator/mac/current/#/devb0244142d). - - -5. Select **Broadcaster** mode and press **Join**. - + + Press **Join** to connect to the same channel as your web demo. + - -5. Press **Join** to connect to the same channel as your web demo. - +7. **Test the media player** -6. Press **Open Media File**. + 1. Press **Open Media File**. - After a short while, you see a toast message confirming that the media file is opened successfully. + After a short while, you see a toast message confirming that the media file is opened successfully. -1. Press **Play Media File** + 1. Press **Play Media File** - You see the media file played both locally and in the web demo app. The progress bar indicates the play progress. + You see the media file played both locally and in the web demo app. The progress bar indicates the play progress. -1. Press **Pause Playing Media** + 1. Press **Pause Playing Media** - Media publishing is paused and the camera and microphone publishing is resumed. + Media publishing is paused and the camera and microphone publishing is resumed. -1. Press **Resume Playing Media** + 1. Press **Resume Playing Media** - You see that the media file resumes playing. + You see that the media file resumes playing. -1. Wait for the media file to finish playing. When the progress bar reaches the end, the media player publishing is stopped and camera and microphone publishing is resumed. + Wait for the media file to finish playing. 
When the progress bar reaches the end, the media player publishing is stopped and camera and microphone publishing is resumed. diff --git a/shared/video-sdk/develop/play-media/project-test/unity.mdx b/shared/video-sdk/develop/play-media/project-test/unity.mdx index 223dff199..1a73efdda 100644 --- a/shared/video-sdk/develop/play-media/project-test/unity.mdx +++ b/shared/video-sdk/develop/play-media/project-test/unity.mdx @@ -1,32 +1,36 @@ - -3. In **Unity Editor**, in your script file, update `_appID`, `_channelName` and `_token` with the values for your temporary token. -4. In **Unity Editor**, click **Play**. A moment later you see the application running on your device. +5. **Choose this sample in the reference app** - -5. Select a role using the toggle buttons and click **Join** to start an event. - + From the main screen of the , choose **** from the dropdown and then select **Stream Media to a Channel**. - -5. Click **Join** to start a call. - +1. **Join a channel** -6. Press **Open Media File**. + + Select an option and click **Join** to start a session. When you join as a **Host**, the local video is published and played in the . When you join as **Audience**, the remote stream is subscribed and played. + - After a short while, you see a message in the debug console confirming that the media file is opened successfully. + + Press **Join** to connect to the same channel on both devices. + -7. Press **Play Media File** +7. **Test the media player** - You see the media file played both locally and in the web demo app. The progress bar indicates the play progress. + 1. Press **Load Media File**. -8. Press **Pause Playing Media** + After a short while, you see a message in the debug console confirming that the media file is opened successfully. - Media publishing is paused and the camera and microphone publishing is resumed. + 1. Press **Play Media File** -9. Press **Resume Playing Media** + You see the media file played both locally and in the web demo app. 
The progress bar indicates the play progress. - You see that the media file resumes playing. + 1. Press **Pause Playing Media** -10. Wait for the media file to finish playing. When the progress bar reaches the end, the media player publishing is stopped and camera and microphone publishing is resumed. + Media publishing is paused and the camera and microphone publishing is resumed. + + 1. Press **Resume Playing Media** + + You see that the media file resumes playing. + + Wait for the media file to finish playing. When the progress bar reaches the end, the media player publishing is stopped and camera and microphone publishing is resumed. diff --git a/shared/video-sdk/develop/play-media/project-test/web.mdx b/shared/video-sdk/develop/play-media/project-test/web.mdx index bf3fe0e92..11a5c6b20 100644 --- a/shared/video-sdk/develop/play-media/project-test/web.mdx +++ b/shared/video-sdk/develop/play-media/project-test/web.mdx @@ -1,26 +1,16 @@ -3. In _main.js_, update `appID`, `channel` and `token` with your values. +5. **Join a channel** -4. Start the dev server + + Select *Host* and then click **Join**. + - Execute the following command in the terminal: + + To connect to a channel, click **Join**. + - ```bash - npm run dev - ``` - - Use the Url displayed in the terminal to open the in your browser. - - -6. Select *Host* and then click **Join**. - - - -6. To connect to a channel, click **Join**. - - -7. Press **Play audio file** +1. Press **Play audio file** You hear the audio file play locally and remotely on the web demo app. 
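For reference, the flow behind the **Play audio file** button in the web test above can be sketched as a small helper. This is a sketch only: the `playAudioFile` function name and the `source` path are illustrative, not part of the SDK; the documented calls are `createBufferSourceAudioTrack`, `startProcessAudioBuffer`, and `play`.

```javascript
// Sketch of the handler behind "Play audio file" (helper name is illustrative).
// Creates a buffer-source audio track from a local file, starts reading the
// buffer once (no loop), and plays the track locally.
async function playAudioFile(AgoraRTC, source) {
  // Create an audio track from a source file
  const track = await AgoraRTC.createBufferSourceAudioTrack({ source });
  // Begin processing the audio buffer, then play the track locally
  track.startProcessAudioBuffer({ loop: false });
  track.play();
  return track;
}
```

To hear the file remotely, as in this test, the returned track is also published to the channel with the client's `publish` method.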
diff --git a/shared/video-sdk/develop/play-media/reference/android.mdx b/shared/video-sdk/develop/play-media/reference/android.mdx index 1e87659bf..deb622789 100644 --- a/shared/video-sdk/develop/play-media/reference/android.mdx +++ b/shared/video-sdk/develop/play-media/reference/android.mdx @@ -1,12 +1,3 @@ -### API reference - -* createMediaPlayer -* registerPlayerObserver -* unregisterPlayerSourceObserver -* getMediaPlayerId -* IMediaPlayer -* IMediaPlayerObserver - diff --git a/shared/video-sdk/develop/play-media/reference/index.mdx b/shared/video-sdk/develop/play-media/reference/index.mdx index 95f43012c..a471fcc8b 100644 --- a/shared/video-sdk/develop/play-media/reference/index.mdx +++ b/shared/video-sdk/develop/play-media/reference/index.mdx @@ -1,6 +1,7 @@ import Android from './android.mdx'; import Ios from './ios.mdx'; import Web from './web.mdx'; +import ReactJS from './react-js.mdx'; import ReactNative from './react-native.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; @@ -13,6 +14,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/play-media/reference/react-js.mdx b/shared/video-sdk/develop/play-media/reference/react-js.mdx new file mode 100644 index 000000000..a77a88372 --- /dev/null +++ b/shared/video-sdk/develop/play-media/reference/react-js.mdx @@ -0,0 +1,3 @@ + + + \ No newline at end of file diff --git a/shared/video-sdk/develop/play-media/reference/swift.mdx b/shared/video-sdk/develop/play-media/reference/swift.mdx index cf2048085..e69de29bb 100644 --- a/shared/video-sdk/develop/play-media/reference/swift.mdx +++ b/shared/video-sdk/develop/play-media/reference/swift.mdx @@ -1,19 +0,0 @@ - -# API Reference - - -* AgoraRtcMediaPlayerProtocol -* AgoraRtcMediaPlayerDelegate -* createMediaPlayer -* destroyMediaPlayer -* AgoraMediaPlayerState -* AgoraRtcChannelMediaOptions - - -* AgoraRtcMediaPlayerProtocol -* AgoraRtcMediaPlayerDelegate -* createMediaPlayer -* destroyMediaPlayer -* 
AgoraMediaPlayerState -* AgoraRtcChannelMediaOptions - diff --git a/shared/video-sdk/develop/play-media/reference/unity.mdx b/shared/video-sdk/develop/play-media/reference/unity.mdx index c2ee96d8e..72baf9c00 100644 --- a/shared/video-sdk/develop/play-media/reference/unity.mdx +++ b/shared/video-sdk/develop/play-media/reference/unity.mdx @@ -1,11 +1,3 @@ -# API Reference - -* CreateMediaPlayer -* InitEventHandler -* GetId -* IMediaPlayer -* IMediaPlayerSourceObserver - diff --git a/shared/video-sdk/develop/play-media/reference/web.mdx b/shared/video-sdk/develop/play-media/reference/web.mdx index 0b2aba40c..c322d467d 100644 --- a/shared/video-sdk/develop/play-media/reference/web.mdx +++ b/shared/video-sdk/develop/play-media/reference/web.mdx @@ -1,9 +1,5 @@ -### API reference - -- createBufferSourceAudioTrack - -- BufferSourceAudioTrackInitConfig +- For pushing live streams to a CDN, check out the [Push Stream to CDN web demo](https://webdemo.agora.io/pushStreamToCDN/index.html) and the associated [source code](https://github.com/AgoraIO/API-Examples-Web/tree/main/Demo/pushStreamToCDN). 
diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/android.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/android.mdx index 05d5c363d..8eaea454e 100644 --- a/shared/video-sdk/develop/product-workflow/project-implementation/android.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/android.mdx @@ -187,7 +187,7 @@ To implement these features in your , take the following steps To show the screen sharing preview in the local video container, add the following method to the `MainActivity` class: - ``` java + ```java private void startScreenSharePreview() { // Create render view by RtcEngine FrameLayout container = findViewById(R.id.local_video_view_container); @@ -211,7 +211,7 @@ To implement these features in your , take the following steps Add the following method to the `MainActivity` class to update publishing options when starting or stopping screen sharing: - ``` java + ```java void updateMediaPublishOptions(boolean publishScreen) { ChannelMediaOptions mediaOptions = new ChannelMediaOptions(); mediaOptions.publishCameraTrack = !publishScreen; diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/electron.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/electron.mdx index c9d9d6ec8..e9db70d3e 100644 --- a/shared/video-sdk/develop/product-workflow/project-implementation/electron.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/electron.mdx @@ -82,7 +82,7 @@ To implement volume control and screen sharing in your , take Set up a `VideoCanvas` and use it in the `setupScreenSharing` method to show the screen track locally. The new `VideoCanvas` enables you to publish your screen track along with your local video. 
In `preload.js`, add the following method before `window.onload = () => `: - ``` javascript + ```javascript function setupScreenSharing(doScreenShare) { if (doScreenShare) diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/flutter.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/flutter.mdx index 93ee46b89..311f6d993 100644 --- a/shared/video-sdk/develop/product-workflow/project-implementation/flutter.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/flutter.mdx @@ -152,7 +152,7 @@ To implement these features in your , take the following steps dimensions: VideoDimensions(height: 1280, width: 720), frameRate: 15, bitrate: 600))); - ``` + ``` 1. **Show screen sharing preview** diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/index.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/index.mdx index da19e46b4..e2a8827f6 100644 --- a/shared/video-sdk/develop/product-workflow/project-implementation/index.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/index.mdx @@ -1,21 +1,13 @@ -import Android from './android.mdx'; -import Ios from './ios.mdx'; -import Web from './web.mdx'; +import Poc3 from './poc3.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import ReactNative from './react-native.mdx' -import Unity from './unity.mdx'; -import MacOS from './macos.mdx'; import Windows from './windows.mdx'; import Unreal from './unreal.mdx'; - - - - + - - \ No newline at end of file + diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/macos.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/macos.mdx index df8813b97..24837442a 100644 --- a/shared/video-sdk/develop/product-workflow/project-implementation/macos.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/macos.mdx @@ -144,7 +144,7 @@ To implement these features in your , take the 
following steps In `ViewController`, add the following function after `buttonAction`: - ``` swift + ```swift @objc func screenShareAction(sender: NSButton!) { if !joined { // Check if successfully joined the channel and set button title accordingly diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/poc3.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/poc3.mdx new file mode 100644 index 000000000..31d934e6f --- /dev/null +++ b/shared/video-sdk/develop/product-workflow/project-implementation/poc3.mdx @@ -0,0 +1,104 @@ +import ImportLibrary from '@docs/assets/code/video-sdk/product-workflow/import-library.mdx'; +import SetupVolume from '@docs/assets/code/video-sdk/product-workflow/setup-volume.mdx'; +import ExtensioniOS from '@docs/assets/code/video-sdk/product-workflow/ios-extension.mdx'; +import ScreenCaptureMacOS from '@docs/assets/code/video-sdk/product-workflow/macos-screencapture.mdx'; +import MuteRemoteUser from '@docs/assets/code/video-sdk/product-workflow/mute-remote-user.mdx'; +import PreviewScreenTrack from '@docs/assets/code/video-sdk/product-workflow/preview-screen-track.mdx'; +import PublishScreenTrack from '@docs/assets/code/video-sdk/product-workflow/publish-screen-track.mdx'; +import StartScreenSharing from '@docs/assets/code/video-sdk/product-workflow/start-sharing.mdx'; +import StopSharing from '@docs/assets/code/video-sdk/product-workflow/stop-sharing.mdx'; +import MicrophoneCameraChange from '@docs/assets/code/video-sdk/product-workflow/microphone-camera-change.mdx'; +import DeviceChanged from '@docs/assets/code/video-sdk/product-workflow/media-device-changed.mdx'; +import MuteLocalVideo from '@docs/assets/code/video-sdk/product-workflow/mute-local-video.mdx'; + + + + +### Add the required imports + + + +### Setup volume control + + Use the following function to set the remote playback or local recording volume: + + + + +### Mute and unmute the remote user + + + + +### Create an extension for 
screen sharing + + + + + +### Share a screen or window + + + + + +### Start screen sharing + + + +### Mute and unmute the local video + + + +### Change media input device + + + + + +### Start screen sharing + + + +### Configure to publish or un-publish the screen sharing track + + + +### Display the screen stream locally + + + +### Stop screen sharing + + + + + +### Add the required imports + + + +### Setup volume control + + Use the following function to set the remote playback or local recording volume: + + + +### Mute and unmute the local user + + + +### Share a screen or window + + + +### Stop screen sharing + + + +### Add event handlers for Microphone and Camera change + + + + + diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/react-native.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/react-native.mdx index 41ff2f2f9..e21bcadf7 100644 --- a/shared/video-sdk/develop/product-workflow/project-implementation/react-native.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/react-native.mdx @@ -219,5 +219,5 @@ To implement volume control in your , take the following steps } } }, - ``` + ``` \ No newline at end of file diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/swift.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/swift.mdx index dfea602ab..638835412 100644 --- a/shared/video-sdk/develop/product-workflow/project-implementation/swift.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/swift.mdx @@ -31,7 +31,7 @@ To create this user interface, in the `ViewController` class: In `ViewController`, add the following line after the last `import` statement: - ``` swift + ```swift import ReplayKit ``` diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/unity.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/unity.mdx index 02e5cc35b..91f0c811b 100644 --- 
a/shared/video-sdk/develop/product-workflow/project-implementation/unity.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/unity.mdx @@ -1,190 +1,180 @@ + +1. **Adjust the recorded audio volume** -### Implement the user interface - -In a real-world application, for each volume setting you want to control, you typically add a volume control UI element such as a Slider to the audio configuration panel. To enable the user to mute local or remote audio, you add Toggle to the interface for each user. In this example, you add a `Button` and a `Toggle` to the UI to test different volume settings. For screen sharing, you add a `Button` to start and stop sharing. To implement this UI, take the following steps: - - 1. Right-click **Sample Scene**, then click **Game Object** > **UI** > **Button - TextMeshPro**. A button appears in the **Scene** Canvas. - - 2. In **Inspector**, rename **Button** to `shareScreen`, and then change the following coordinates: - - * **Pos X** - 350 - * **Pos Y** - 172 - - 3. Right-click **Sample Scene**, then click **Game Object** > **UI** > **Slider**. A Slider appears in the **Scene** Canvas. - - 4. In **Inspector**, rename **Slider** to `mediaProgressBar`, and then change the following coordinates: - - * **Pos X** - 0 - * **Pos Y** - 142 - - 5. Right-click **Canvas**, then click **UI** > **Toggle**. A toggle button appears in the scene canvas. - - 6. In **Inspector**, rename **Toggle** to `Mute`, then change **Pos Y** to `30`. - -### Handle the system logic - -1. **Add the required library** - - To import the required Unity UI library, in your script file, add the following to the namespace declarations: - - ``` csharp - using TMPro; + ```csharp + public void ChangeVolume(int volume) + { + // Adjust the volume of the recorded signal. + agoraEngine.AdjustRecordingSignalVolume(volume); + } ``` + When using a device to capture audio, sets a default global volume value of 85 (range [0, 255]). 
automatically increases a capture device volume that's too low. You can adjust the capture volume as per your needs by adjusting the microphone or sound card's signal capture volume. -### Implement screen sharing, volume control and mute - -To implement these features in your , take the following steps: + For more details, see the following: -1. **Declare the variables you need** + - AdjustRecordingSignalVolume - To read, store and apply workflow settings, in your script file, add the following declarations to `NewBehaviourScript`: +1. **Implement the logic to mute and unmute the remote user** ```csharp - // Volume Control - private Slider volumeSlider; - // Screen sharing - private bool sharingScreen = false; - private TMP_Text shareScreenBtnText; + public void MuteRemoteAudio(bool value) + { + if (remoteUid > 0) + { + // Pass the uid of the remote user you want to mute. + agoraEngine.MuteRemoteAudioStream(Convert.ToUInt32(remoteUid), value); + } + else + { + Debug.Log("No remote user in the channel"); + } + } ``` + For more details, see the following: -1. **Access the UI elements** + - MuteRemoteAudioStream - To setup the event listener for the screen sharing button and access the slider, in your script file, add the following at the end of `SetupUI`: +1. **Get the list of shareable screen** ```csharp - // Access the button from the UI. - go = GameObject.Find("shareScreen"); - // Add a listener to the button and invokes shareScreen when the button is pressed. - go.GetComponent`: - -```html - - -
-
- - -``` - -### Implement autoplay blocking, screen sharing, volume control and mute - -To implement the workflow logic in your , take the following steps: - -1. **Bypass autoplay blocking** - - Web browsers use autoplay policy in order to improve the user experience and reduce data consumption on expensive networks. The autoplay policy may be to block the audio or video playback in your . To deal with autoplay blocking, supplies the `onAutoplayFailed` callback. Your listen for `onAutoplayFailed` and prompt the user to interact with the webpage to resume the playback. - - To bypass autoplay blocking, in `main.js`, add the following code before `agoraEngine.on("user-published", async (user, mediaType) =>`: - - ```javascript - AgoraRTC.onAutoplayFailed = () => { - // Create button for the user interaction. - const btn = document.createElement("button"); - // Set the button text. - btn.innerText = "Click me to resume the audio/video playback"; - // Remove the button when onClick event occurs. - btn.onClick = () => { - btn.remove(); - }; - // Append the button to the UI. - document.body.append(btn); - } - ``` - - Safari and WebView in iOS have a stricter autoplay policy. They only allow playback with sound that is triggered by user intervention and do not remove the autoplay block after user intervention. To bypass this strict policy, provide a button in the UI to stop and play the remote audio track. When the user stops the audio track and plays it again, the browser automatically removes the autoplay block. - -1. **Add the required variable** - - In `main.js`, add the following variables to the declarations: - - ```javascript - var isSharingEnabled = false; - var isMuteVideo = false; - ``` - -1. **Add the screen sharing logic** - - When the user presses **Share Screen**, your does the following: - - a. Creates a screen track. - b. Unpublishes the local video track. - c. Publishes the screen track. - d. Plays the screen track on the local video container. - e. 
Updates the button text and screen sharing state. - - When the user presses **Stop Sharing**, your does the following: - - a. Unpublishes the screen track. - b. Publishes the local video track. - c. Plays the local video on the local video container. - d. Updates the button text and screen sharing state. - - To implement this logic, in `main.js`, add the following code before `document.getElementById('leave').onclick = async function ()`: - - - ```javascript - document.getElementById('inItScreen').onclick = async function () { - - if(isSharingEnabled == false) { - // Create a screen track for screen sharing. - channelParameters.screenTrack = await AgoraRTC.createScreenVideoTrack(); - // Replace the video track with the screen track. - await channelParameters.localVideoTrack.replaceTrack(channelParameters.screenTrack, true); - // Update the button text. - document.getElementById(`inItScreen`).innerHTML = "Stop Sharing"; - // Update the screen sharing state. - isSharingEnabled = true; - } else { - // Replace the screen track with the local video track. - await channelParameters.screenTrack.replaceTrack(channelParameters.localVideoTrack, true); - // Update the button text. - document.getElementById(`inItScreen`).innerHTML = "Share Screen"; - // Update the screen sharing state. - isSharingEnabled = false; - } - } - ``` - - - - ```javascript - document.getElementById('inItScreen').onclick = async function () { - if(options.role == "audience" || options.role == "") { - // You cannot share screen in audience mode. - window.alert("Join as a host to start screen sharing!"); - return; - } - - if(isSharingEnabled == false) { - // Create a screen track for screen sharing. - channelParameters.screenTrack = await AgoraRTC.createScreenVideoTrack(); - // Replace the video track with the screen track. - await channelParameters.localVideoTrack.replaceTrack(channelParameters.screenTrack, true); - // Update the button text. 
- document.getElementById(`inItScreen`).innerHTML = "Stop Sharing"; - // Update the screen sharing state. - isSharingEnabled = true; - } else { - // Replace screen track with video track, and stop screen track. - await channelParameters.screenTrack.replaceTrack(channelParameters.localVideoTrack, true); - // Update the button text. - document.getElementById(`inItScreen`).innerHTML = "Share Screen"; - // Update the screen sharing state. - isSharingEnabled = false; - } - } - ``` - - -1. **Add the volume control logic** - - When the user moves a range slider, adjust the volume for the local or remote audio track. In `main.js`, add the following code before `agoraEngine.on("user-published", async (user, mediaType) =>`: - - ```javascript - // Set an event listener on the range slider. - document.getElementById("localAudioVolume").addEventListener("change", function(evt) { - console.log("Volume of local audio :" + evt.target.value); - // Set the local audio volume. - channelParameters.localAudioTrack.setVolume(parseInt(evt.target.value)); - }); - // Set an event listener on the range slider. - document.getElementById("remoteAudioVolume").addEventListener("change", function(evt) { - console.log("Volume of remote audio :" + evt.target.value); - // Set the remote audio volume. - channelParameters.remoteAudioTrack.setVolume(parseInt(evt.target.value)); - }); - ``` - - When using a device to capture audio, sets a default global volume value of 85 (range [0, 255]). automatically increases a capture device volume that's too low. You can adjust the capture volume as per your needs by adjusting the microphone or sound card's signal capture volume. - -1. **Add the logic to mute and unmute the local video track** - - To mute or unmute the local video track, call `setEnabled` and pass a `Boolean` value. 
In `main.js`, add the following code before `document.getElementById('leave').onclick = async function ()`: - - ```javascript - document.getElementById('muteVideo').onclick = async function () { - if(isMuteVideo == false) { - // Mute the local video. - channelParameters.localVideoTrack.setEnabled(false); - // Update the button text. - document.getElementById(`muteVideo`).innerHTML = "Unmute Video"; - isMuteVideo = true; - } else { - // Unmute the local video. - channelParameters.localVideoTrack.setEnabled(true); - // Update the button text. - document.getElementById(`muteVideo`).innerHTML = "Mute Video"; - isMuteVideo = false; - } - } - ``` - -1. **Switch media input device** - - To receive notification of media input device changes, in `main.js`, add the following code before `agoraEngine.on("user-published", async (user, mediaType) =>`: - - ```javascript - AgoraRTC.onMicrophoneChanged = async (changedDevice) => { - // When plugging in a device, switch to a device that is newly plugged in. - if (changedDevice.state === "ACTIVE") { - localAudioTrack.setDevice(changedDevice.device.deviceId); - // Switch to an existing device when the current device is unplugged. - } else if (changedDevice.device.label === localAudioTrack.getTrackLabel()) { - const oldMicrophones = await AgoraRTC.getMicrophones(); - oldMicrophones[0] && localAudioTrack.setDevice(oldMicrophones[0].deviceId); - } - } - - AgoraRTC.onCameraChanged = async (changedDevice) => { - // When plugging in a device, switch to a device that is newly plugged in. - if (changedDevice.state === "ACTIVE") { - localVideoTrack.setDevice(changedDevice.device.deviceId); - // Switch to an existing device when the current device is unplugged. - } else if (changedDevice.device.label === localVideoTrack.getTrackLabel()) { - const oldCameras = await AgoraRTC.getCameras(); - oldCameras[0] && localVideoTrack.setDevice(oldCameras[0].deviceId); - } - } - ``` - -
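As an aside on the stricter Safari/iOS autoplay policy described above: the recommended workaround — a visible button that stops and replays the remote audio track — can be sketched with a small toggle helper. This is an illustrative sketch only, not part of the SDK: `createAudioToggle` is a hypothetical helper name, and `track` is assumed to expose `play()` and `stop()` the way Agora remote audio tracks do.

```javascript
// Minimal sketch: wrap a remote audio track in a play/stop toggle so that
// playback is triggered from a user gesture (the button click), which
// satisfies Safari's strict autoplay policy.
// `track` is assumed to expose play() and stop(), as Agora remote tracks do.
function createAudioToggle(track) {
  let playing = false;
  return {
    // Call this from the button's click handler; returns the new state.
    toggle() {
      if (playing) {
        track.stop();
      } else {
        track.play();
      }
      playing = !playing;
      return playing;
    },
    // Suggested label for the toggle button in its current state.
    label() {
      return playing ? "Stop audio" : "Play audio";
    },
  };
}
```

Because `toggle()` runs inside a click handler, the browser treats the playback as user-initiated; the same helper can drive the button text through `label()`.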
diff --git a/shared/video-sdk/develop/product-workflow/project-setup/android.mdx b/shared/video-sdk/develop/product-workflow/project-setup/android.mdx index e1c2aba91..5c64dd157 100644 --- a/shared/video-sdk/develop/product-workflow/project-setup/android.mdx +++ b/shared/video-sdk/develop/product-workflow/project-setup/android.mdx @@ -1,17 +1,17 @@ -**Add the screen sharing module to your project** +* **Add the screen sharing module to your project** -1. Download the Latest Android and unzip. + 1. Download the Latest Android and unzip. -1. Navigate to the `Agora_Native_SDK_for_Android_FULL/rtc/sdk` folder. + 1. Navigate to the `Agora_Native_SDK_for_Android_FULL/rtc/sdk` folder. -1. Copy the file named `AgoraScreenShareExtension.aar` into the `/app/libs` folder. + 1. Copy the file named `AgoraScreenShareExtension.aar` into the `/app/libs` folder. -1. In `/Gradle Scripts/build.gradle (Module: .app)`, add the following line under `dependencies`: + 1. In `/Gradle Scripts/build.gradle (Module: .app)`, add the following line under `dependencies`: - ``` text - implementation files('libs/AgoraScreenShareExtension.aar') - ``` + ```text + implementation files('libs/AgoraScreenShareExtension.aar') + ``` diff --git a/shared/video-sdk/develop/product-workflow/project-setup/flutter.mdx b/shared/video-sdk/develop/product-workflow/project-setup/flutter.mdx index dbbad1cc4..0700cfc73 100644 --- a/shared/video-sdk/develop/product-workflow/project-setup/flutter.mdx +++ b/shared/video-sdk/develop/product-workflow/project-setup/flutter.mdx @@ -136,7 +136,7 @@ Depending on your target platform, follow these steps: } @end - ``` + ``` 1. 
Start screen sharing diff --git a/shared/video-sdk/develop/product-workflow/project-test/android.mdx b/shared/video-sdk/develop/product-workflow/project-test/android.mdx index 987552694..5d03bc583 100644 --- a/shared/video-sdk/develop/product-workflow/project-test/android.mdx +++ b/shared/video-sdk/develop/product-workflow/project-test/android.mdx @@ -1,52 +1,27 @@ -3. In Android Studio, open `app/java/com.example./MainActivity` and update `appId`, `channelName` and `token` with the values from Agora Console. +5. **Join a channel** -4. Connect an Android device to your development device. + + Select a user role. Press **Join** to connect to the same channel as your web demo. + -5. In Android Studio, click **Run app**. A moment later, you see the project installed on your device. + + Press **Join** to connect to the same channel as your web demo. + - If this is the first time you run your app, grant camera and microphone permissions. +6. **Test volume control** - -6. Select a user role. Press **Join** to connect to the same channel as your web demo. - - - -6. Press **Join** to connect to the same channel as your web demo. - - -7. **Test volume control** - - 1. Speak into your Android device as you move the slider on the `SeekBar` to the left and then right. You notice the volume decrease and then increase in the web demo app as the recording volume changes. + 1. Select a volume setting from the dropdown and move the slider to increase or decrease the selected volume. - 1. Tap the **Mute** `CheckBox` while you speak into the microphone connected to the web demo app. - You notice that the remote audio is muted on your Android device. + 1. Tap the **Mute** `CheckBox`. The local and all remote audio streams are muted. - 1. 
To test other volume control methods, in `onProgressChanged` replace the `adjustRecordingSignalVolume` call with one of the following: - - ```java - agoraEngine.adjustPlaybackSignalVolume(volume); - agoraEngine.adjustUserPlaybackSignalVolume(remoteUid,volume); - agoraEngine.adjustAudioMixingVolume(volume); - agoraEngine.adjustAudioMixingPlayoutVolume(volume); - agoraEngine.adjustAudioMixingPublishVolume(volume); - agoraEngine.setInEarMonitoringVolume(volume); - ``` - Run the app again and use the volume slider to test the change in the corresponding volume setting. - - 1. To test other mute methods, in `onCheckedChanged` replace the `muteRemoteAudioStream` call with one of the following: - - ```java - agoraEngine.muteAllRemoteAudioStreams(isChecked); - agoraEngine.muteLocalAudioStream(isChecked); - ``` - Run the app again and tap the `CheckBox` to test the effect of these mute methods. - -1. **Test Screen sharing** + +7. **Test Screen sharing** 1. Press **Start Screen Sharing**. You see your device screen shared in the web demo app and a preview on your Android device. 1. Press **Stop Screen Sharing**. Screen sharing is stopped and the camera stream is restored in the web demo app. + diff --git a/shared/video-sdk/develop/product-workflow/project-test/index.mdx b/shared/video-sdk/develop/product-workflow/project-test/index.mdx index 852041e5b..7f8e8f1af 100644 --- a/shared/video-sdk/develop/product-workflow/project-test/index.mdx +++ b/shared/video-sdk/develop/product-workflow/project-test/index.mdx @@ -1,22 +1,22 @@ -import Android from './android.mdx'; -import Ios from './ios.mdx'; -import Web from './web.mdx'; +import Poc3 from './poc3.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import ReactNative from './react-native.mdx'; -import MacOS from './macos.mdx'; -import Unity from './unity.mdx'; import Unreal from './unreal.mdx'; import Windows from './windows.mdx'; + + +1. 
**[Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in **. - - - - +2. **Configure the web demo you use to connect to your **: + + In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. + - + \ No newline at end of file diff --git a/shared/video-sdk/develop/product-workflow/project-test/ios.mdx b/shared/video-sdk/develop/product-workflow/project-test/ios.mdx index a0d1f3778..170d386ef 100644 --- a/shared/video-sdk/develop/product-workflow/project-test/ios.mdx +++ b/shared/video-sdk/develop/product-workflow/project-test/ios.mdx @@ -1,52 +1,8 @@ - - -3. In the `ViewController`, update `appID`, `channelName`, and `token` with the values from . - -4. Run your using either a physical or a simulator iOS device. - - If this is the first time you run the project, grant microphone and camera access to your . - - If you use an iOS simulator, you see the remote video only. You cannot see the local video stream because of Apple simulator hardware restrictions. - - -5. Select *Broadcaster* mode. Press **Join** to connect to the same channel as your web demo. - - - -5. Press **Join** to connect to the same channel as your web demo. - - -6. **Test volume control** +import Source from './swift.mdx'; - a. Speak into your iOS device as you move the slider to the left and then right. You notice the volume decrease and then increase in the web demo app as the recording volume changes. - b. Tap the **Mute** `UISwitch` while you speak into the microphone connected to the web demo app. - - You notice that the remote audio is muted on your iOS device. - - c. 
To test other volume control methods, in `volumeSliderValueChanged` replace the `adjustRecordingSignalVolume` call with one of the following: - - ```swift - agoraEngine.adjustPlaybackSignalVolume(volume) - agoraEngine.adjustUserPlaybackSignalVolume(remoteUid, volume: Int32(volume)) - agoraEngine.adjustAudioMixingVolume(volume) - agoraEngine.adjustAudioMixingPlayoutVolume(volume) - agoraEngine.adjustAudioMixingPublishVolume(volume) - agoraEngine.setInEarMonitoringVolume(volume) - ``` - Run the app again and use the volume slider to test the change in the corresponding volume setting. - - d. To test other mute methods, in `muteSwitchValueChanged` replace the `muteRemoteAudioStream` call with one of the following: - - ```swift - agoraEngine.muteAllRemoteAudioStreams(isMuted) - agoraEngine.muteLocalAudioStream(isMuted) - ``` - - Run the app again and tap **Mute** to test the effect of these mute methods. - -7. **Test Screen sharing** + - Press **Start Screen Sharing**. You see your device screen shared in the web demo app. 
+ - + \ No newline at end of file diff --git a/shared/video-sdk/develop/product-workflow/project-test/poc3.mdx b/shared/video-sdk/develop/product-workflow/project-test/poc3.mdx new file mode 100644 index 000000000..070d9833e --- /dev/null +++ b/shared/video-sdk/develop/product-workflow/project-test/poc3.mdx @@ -0,0 +1,20 @@ +import TestFirstSteps from '@docs/shared/common/project-test/rtc-first-steps.mdx'; +import Android from './android.mdx'; +import Ios from './ios.mdx'; +import Web from './web.mdx'; +import Unity from './unity.mdx'; +import ReactJS from './react-js.mdx'; +import MacOS from './macos.mdx'; + + + + + + + + + + + + + \ No newline at end of file diff --git a/shared/video-sdk/develop/product-workflow/project-test/react-js.mdx b/shared/video-sdk/develop/product-workflow/project-test/react-js.mdx new file mode 100644 index 000000000..9d5a224c7 --- /dev/null +++ b/shared/video-sdk/develop/product-workflow/project-test/react-js.mdx @@ -0,0 +1,35 @@ + + +5. **Run this example** + + Under **Choose a sample code**, select **Screen share and volume control**. + +1. **Join a channel** + + + Click **Join** to start a session. When you select **Host**, the local video is published and played in the . When you select **Audience**, the remote stream is subscribed and played. + + + + Press **Join** to connect to the same channel as your web demo. + + +1. **Test volume control features** + + 1. Move the **Local Audio Level** range slider to the right side. Your increases the volume of the local audio track and the remote users hear a louder sound. Move the slider to the left side to decrease the volume of the local audio track. + + 1. Move the **Remote Audio Level** range slider to the right side. Your increases the volume of the remote audio track. Move the slider to the left side to decrease the volume of the remote audio track. + + 1. Connect a new input audio or video device. Your checks the device state and switches to the newly added active device. 
Remove the added device. Your switches back to the previous device.
+
+    1. Click **Mute Video**. You see your disables the local video track. Press the same button again to enable the local video track.
+
+
+8. **Test screen sharing**
+
+    1. Click **Start Sharing**. Your browser opens the **Choose what to share** window.
+
+    1. In **Choose what to share**, select the screen to share. You see your screen in the local view of your browser and in the Web demo app. Press the same button to stop screen sharing.
+
+
+
diff --git a/shared/video-sdk/develop/product-workflow/project-test/swift.mdx b/shared/video-sdk/develop/product-workflow/project-test/swift.mdx
index ca8bec6b3..5bb002c64 100644
--- a/shared/video-sdk/develop/product-workflow/project-test/swift.mdx
+++ b/shared/video-sdk/develop/product-workflow/project-test/swift.mdx
@@ -1,53 +1,45 @@
+5. **Join a channel**
+
+    Select *Broadcaster* mode. Press **Join** to connect to the same channel as your web demo.
+
-3. In the `ViewController`, update `appID`, `channelName`, and `token` with the values from .
+
+    Press **Join** to connect to the same channel as your web demo.
+
-4. Run your .
+6. **Test volume control**
-    If this is the first time you run the project, grant microphone and camera access to your .
-
-    If you use an iOS simulator, you see the remote video only. You cannot see the local video stream because of Apple simulator hardware restrictions.
-
+    a. Speak into your iOS device as you move the slider to the left and then right. You notice the volume decrease and then increase in the web demo app as the recording volume changes.
-
-5. Select *Broadcaster* mode. Press **Join** to connect to the same channel as your web demo.
-
-
-5. Press **Join** to connect to the same channel as your web demo.
-
-6. Test volume control
-
-    a. Speak into your device as you move the slider to the left and then right.
You notice the volume decrease and then increase in the web demo app as the recording volume changes. - - b. Tap **Mute** `UISwitch` while you speak into the microphone connected to the web demo app. - - You notice that the remote audio is muted on your device. + b. Tap the **Mute** `UISwitch` while you speak into the microphone connected to the web demo app. + + You notice that the remote audio is muted on your iOS device. c. To test other volume control methods, in `volumeSliderValueChanged` replace the `adjustRecordingSignalVolume` call with one of the following: - ```swift - agoraEngine.adjustPlaybackSignalVolume(volume) - agoraEngine.adjustUserPlaybackSignalVolume(remoteUid, volume: Int32(volume)) - agoraEngine.adjustAudioMixingVolume(volume) - agoraEngine.adjustAudioMixingPlayoutVolume(volume) - agoraEngine.adjustAudioMixingPublishVolume(volume) - agoraEngine.setInEarMonitoringVolume(volume) - ``` - Run the app again and use the volume slider to test the change in the corresponding volume setting. + ```swift + agoraEngine.adjustPlaybackSignalVolume(volume) + agoraEngine.adjustUserPlaybackSignalVolume(remoteUid, volume: Int32(volume)) + agoraEngine.adjustAudioMixingVolume(volume) + agoraEngine.adjustAudioMixingPlayoutVolume(volume) + agoraEngine.adjustAudioMixingPublishVolume(volume) + agoraEngine.setInEarMonitoringVolume(volume) + ``` + Run the app again and use the volume slider to test the change in the corresponding volume setting. d. To test other mute methods, in `muteSwitchValueChanged` replace the `muteRemoteAudioStream` call with one of the following: - ```swift - agoraEngine.muteAllRemoteAudioStreams(isMuted) - agoraEngine.muteLocalAudioStream(isMuted) - ``` + ```swift + agoraEngine.muteAllRemoteAudioStreams(isMuted) + agoraEngine.muteLocalAudioStream(isMuted) + ``` - Run the app again and tap **Mute** to test the effect of these mute methods. + Run the app again and tap **Mute** to test the effect of these mute methods. -7. Test Screen sharing + +7. 
**Test Screen sharing** - Press **Share**. You see your device screen shared in the web demo app. + Press **Start Screen Sharing**. You see your device screen shared in the web demo app. + - If this is the first time you share the screen in the project, grant screen recording access to your . diff --git a/shared/video-sdk/develop/product-workflow/project-test/unity.mdx b/shared/video-sdk/develop/product-workflow/project-test/unity.mdx index ac38e049d..3c98847f9 100644 --- a/shared/video-sdk/develop/product-workflow/project-test/unity.mdx +++ b/shared/video-sdk/develop/product-workflow/project-test/unity.mdx @@ -1,49 +1,53 @@ -3. In **Unity Editor**, in your script file, update `_appID`, `_channelName` and `_token` with the values for your temporary token. +5. **Choose this sample in the reference app** -1. In **Unity Editor**, click **Play**. A moment later you see the running on your development device. + From the main screen of the , choose **** from the dropdown and then select **Product Workflow**. - -5. Select a user role. Press **Join** to connect to the same channel as your web demo. - +1. **Join a channel** - -5. Press **Join** to connect to the same channel as your web demo. - + + Select a role using the toggles. Click **Join** to start a session. + -6. **Test volume control** + + Click **Join** to start a call. + - a. Speak into your Android device as you move the slider to the left and then right. You notice the volume decrease and then increase in the web demo app as the recording volume changes. +7. **Test volume control** - b. Tap the **Mute** `Toggle` while you speak into the microphone connected to the web demo app. - You notice that the remote audio is muted on your Android device. + 1. Speak into your development device as you move the slider to the left and then right. You notice the volume decrease and then increase in the web demo app as the recording volume changes. - c. 
To test other volume control methods, in your script file, locate `changeVolume` and replace the `AdjustRecordingSignalVolume` call with one of the following:
+    1. Tap the **Mute** checkbox while you speak into the microphone connected to the web demo app.
+       You notice that the remote audio is muted on your development device.
+
+    1. To test other volume control methods, in the `ProductWorkflowManager.cs` file, locate `ChangeVolume` and replace the `AdjustRecordingSignalVolume` call with one of the following:
     ```csharp
-    RtcEngine.AdjustPlaybackSignalVolume(volume);
-    RtcEngine.AdjustUserPlaybackSignalVolume(remoteUid,volume);
-    RtcEngine.AdjustAudioMixingVolume(volume);
-    RtcEngine.AdjustAudioMixingPlayoutVolume(volume);
-    RtcEngine.AdjustAudioMixingPublishVolume(volume);
-    RtcEngine.SetInEarMonitoringVolume(volume);
+    agoraEngine.AdjustPlaybackSignalVolume(volume);
+    agoraEngine.AdjustUserPlaybackSignalVolume(remoteUid,volume);
+    agoraEngine.AdjustAudioMixingVolume(volume);
+    agoraEngine.AdjustAudioMixingPlayoutVolume(volume);
+    agoraEngine.AdjustAudioMixingPublishVolume(volume);
+    agoraEngine.SetInEarMonitoringVolume(volume);
     ```
     Run the app again and use the volume slider to test the change in the corresponding volume setting.
-    d. To test other mute methods, in your script file, locate `muteRemoteAudio` and replace the `MuteRemoteAudioStream` call with one of the following:
+    1. To test other mute methods, in the `ProductWorkflowManager.cs` file, locate `muteRemoteAudio` and replace the `MuteRemoteAudioStream` call with one of the following:
     ```csharp
-    RtcEngine.MuteAllRemoteAudioStreams(value);
-    RtcEngine.MuteLocalAudioStream(value);
+    agoraEngine.MuteAllRemoteAudioStreams(value);
+    agoraEngine.MuteLocalAudioStream(value);
     ```
-    Run the app again and tap the `CheckBox` to test the effect of these mute methods.
+    Run the app again and tap the `Mute` checkbox to test the effect of these mute methods.
-7. **Test Screen sharing**
+
+8. **Test Screen sharing**
-    a.
Press **Share Screen**. You see your device screen shared in the web demo app. + 1. Press **Share Screen**. You see your device screen shared in the web demo app. - b. Press **Stop Sharing**. Screen sharing is stopped and the camera stream is restored in the web demo app. + 1. Press **Stop Sharing**. Screen sharing is stopped and the camera stream is restored in the web demo app. + \ No newline at end of file diff --git a/shared/video-sdk/develop/product-workflow/project-test/web.mdx b/shared/video-sdk/develop/product-workflow/project-test/web.mdx index b8dd30bc0..88f11a96c 100644 --- a/shared/video-sdk/develop/product-workflow/project-test/web.mdx +++ b/shared/video-sdk/develop/product-workflow/project-test/web.mdx @@ -1,50 +1,43 @@ -3. In _main.js_, update `appID`, `channel` and `token` with your values. +5. **Join a channel** -4. Start the dev server + + Select a user role using the radio buttons and click **Join**. + - Execute the following command in the terminal: + + To connect to a channel, click **Join**. + - ```bash - npm run dev - ``` - Use the Url displayed in the terminal to open the in your browser. +1. **Test volume control** - -5. Select a user role using the radio buttons and click **Join**. + 1. Click **Mute Video** - If this is the first time you run the project, you need to grant microphone and camera access to your . - - - -5. To connect to a channel, click **Join**. + You see your disables the local video track. Press the same button again to enable the local video track. - If this is the first time you run the project, you need to grant microphone and camera access to your . - + 1. Adjust the local audio volume. -6. Click **Mute Video** + Move the **Local Audio Level** range slider to the right side. Your increases the volume of the local audio track and the remote users hear a louder sound. Move the slider to the left side to decrease the volume of the local audio track. - You see your disables the local video track. 
Press the same button again to enable the local video track.
+
+    1. Adjust the local audio volume.
-7. Click **Share Screen**.
+
+       Move the **Local Audio Level** range slider to the right side. Your increases the volume of the local audio track and the remote users hear a louder sound. Move the slider to the left side to decrease the volume of the local audio track.
-    Your browser opens the **Choose what to share** window.
+    1. Adjust the remote audio volume.
-8. In **Choose what to share**, select the screen to share.
+
+       Move the **Remote Audio Level** range slider to the right side. Your increases the volume of the remote audio track. Move the slider to the left side to decrease the volume of the remote audio track.
-    You see your screen in the local view of your browser and in the Web demo app. Press the the same button to stop screen sharing.
+    1. Test media input switching.
-9. Adjust the local audio volume.
+
+       Connect a new input audio or video device. Your checks the device state and switches to the newly added active device. Remove the added device. Your switches back to the previous device.
-    Move the **Local Audio Level** range slider to the right side. Your increases the volume of the local audio track and the remote users hear a louder sound. Move the slider to the left side to decrease the volume of the local audio track.
+
+7. **Test screen sharing**
-10. Adjust the remote audio volume.
+
+    1. Click **Share Screen**.
-    Move the **Remote Audio Level** range slider to the right side. Your increases the volume of the remote audio track. Move the slider to the left side to decrease the volume of the remote audio track.
-11. Test media input switching.
+
+       Your browser opens the **Choose what to share** window.
-    Connect a new input audio or video device. Your checks the device state and switches to the newly added active device. Remove the added device. Your switches back to the previous device.
+    1. In **Choose what to share**, select the screen to share.
+
+       You see your screen in the local view of your browser and in the Web demo app. Press the same button to stop screen sharing.
+ diff --git a/shared/video-sdk/develop/product-workflow/project-test/windows.mdx b/shared/video-sdk/develop/product-workflow/project-test/windows.mdx index fe43ae5cb..838e53515 100644 --- a/shared/video-sdk/develop/product-workflow/project-test/windows.mdx +++ b/shared/video-sdk/develop/product-workflow/project-test/windows.mdx @@ -1,4 +1,4 @@ - + 3. In **Solution Explorer**, open `AgoraImplementationDlg.cpp` and update `APP_ID`, `ChannelName` and `Token` with the values from . @@ -10,7 +10,7 @@ 5. Select a user role. Press **Join** to connect to the same channel as your web demo. - + 5. Press **Join** to connect to the same channel as your web demo. diff --git a/shared/video-sdk/develop/product-workflow/reference/android.mdx b/shared/video-sdk/develop/product-workflow/reference/android.mdx index 0929fe14d..c56ca9b9b 100644 --- a/shared/video-sdk/develop/product-workflow/reference/android.mdx +++ b/shared/video-sdk/develop/product-workflow/reference/android.mdx @@ -1,25 +1,3 @@ -### API reference - -- startScreenCapture - -- stopScreenCapture - -- updateChannelMediaOptions - -- adjustRecordingSignalVolume - -- adjustPlaybackSignalVolume - -- adjustUserPlaybackSignalVolume - -- adjustAudioMixingVolume - -- adjustAudioMixingPlayoutVolume - -- adjustAudioMixingPublishVolume - -- setInEarMonitoringVolume - diff --git a/shared/video-sdk/develop/product-workflow/reference/react-js.mdx b/shared/video-sdk/develop/product-workflow/reference/react-js.mdx new file mode 100644 index 000000000..a77a88372 --- /dev/null +++ b/shared/video-sdk/develop/product-workflow/reference/react-js.mdx @@ -0,0 +1,3 @@ + + + \ No newline at end of file diff --git a/shared/video-sdk/develop/product-workflow/reference/swift.mdx b/shared/video-sdk/develop/product-workflow/reference/swift.mdx index 382493359..4eeb6b004 100644 --- a/shared/video-sdk/develop/product-workflow/reference/swift.mdx +++ b/shared/video-sdk/develop/product-workflow/reference/swift.mdx @@ -1,37 +1,8 @@ -### API reference - 
-- adjustRecordingSignalVolume - -- adjustPlaybackSignalVolume - -- adjustUserPlaybackSignalVolume - -- adjustAudioMixingVolume - -- adjustAudioMixingPlayoutVolume - -- adjustAudioMixingPublishVolume -- setInEarMonitoringVolume - -- setExternalVideoSource -- adjustRecordingSignalVolume - -- adjustPlaybackSignalVolume - -- adjustUserPlaybackSignalVolume - -- adjustAudioMixingVolume - -- adjustAudioMixingPlayoutVolume - -- adjustAudioMixingPublishVolume -- setInEarMonitoringVolume -- setExternalVideoSource diff --git a/shared/video-sdk/develop/product-workflow/reference/unity.mdx b/shared/video-sdk/develop/product-workflow/reference/unity.mdx index 1df4ae800..72de34065 100644 --- a/shared/video-sdk/develop/product-workflow/reference/unity.mdx +++ b/shared/video-sdk/develop/product-workflow/reference/unity.mdx @@ -1,23 +1,3 @@ -### API reference - -- StartScreenCaptureByWindowId - -- StopScreenCapture - -- AdjustRecordingSignalVolume - -- AdjustPlaybackSignalVolume - -- AdjustUserPlaybackSignalVolume - -- AdjustAudioMixingVolume - -- AdjustAudioMixingPlayoutVolume - -- AdjustAudioMixingPublishVolume - -- SetInEarMonitoringVolume - - \ No newline at end of file + diff --git a/shared/video-sdk/develop/product-workflow/reference/web.mdx b/shared/video-sdk/develop/product-workflow/reference/web.mdx index af533fdb7..f2c794704 100644 --- a/shared/video-sdk/develop/product-workflow/reference/web.mdx +++ b/shared/video-sdk/develop/product-workflow/reference/web.mdx @@ -1,17 +1,16 @@ -### API reference - -* createScreenVideoTrack -* setVolume -* onAudioAutoplayFailed -* setEnabled -* onMicrophoneChanged -* onCameraChanged -* IMicrophoneAudioTrack.setDevice -* ICameraVideoTrack.setDevice - -### setEnabled and setMuted +### Web demos + +For more working examples, check out the following web demos: + +- [Share the Screen](https://webdemo.agora.io/shareTheScreen/index.html) with [source code](https://github.com/AgoraIO/API-Examples-Web/tree/main/Demo/shareTheScreen) +- [Mute using 
`setMuted` API](https://webdemo.agora.io/basicMute/index.html) with [source code](https://github.com/AgoraIO/API-Examples-Web/tree/main/Demo/basicMute) +- [Mute using `setEnabled` API](https://webdemo.agora.io/basicMuteSetEnabled/index.html) with [source code](https://github.com/AgoraIO/API-Examples-Web/tree/main/Demo/basicMuteSetEnabled) +- [Mute using `MediaStreamTrack.enabled` property](https://webdemo.agora.io/basicMuteMediaStreamTrackEnabled/index.html) with [source code](https://github.com/AgoraIO/API-Examples-Web/tree/main/Demo/basicMuteMediaStreamTrackEnabled) + + +### `setEnabled` and `setMuted` Both Web SDK 4.x and 3.x provide APIs for controlling the collection and sending of local audio and video. The differences between these APIs are detailed in the table below. You cannot call `setEnabled` and `setMuted` at the same time. diff --git a/shared/video-sdk/develop/spatial-audio/project-implementation/index.mdx b/shared/video-sdk/develop/spatial-audio/project-implementation/index.mdx index d0918e6f0..6e63ba582 100644 --- a/shared/video-sdk/develop/spatial-audio/project-implementation/index.mdx +++ b/shared/video-sdk/develop/spatial-audio/project-implementation/index.mdx @@ -1,21 +1,13 @@ -import Android from './android.mdx'; -import Ios from './ios.mdx'; +import Poc3 from './poc3.mdx'; import Electron from './electron.mdx'; -import Unity from './unity.mdx'; -import MacOS from './macos.mdx'; import Windows from './windows.mdx'; -import Web from './web.mdx'; import Unreal from './unreal.mdx'; import Flutter from './flutter.mdx'; import ReactNative from './react-native.mdx'; + - - - - - diff --git a/shared/video-sdk/develop/spatial-audio/project-implementation/poc3.mdx b/shared/video-sdk/develop/spatial-audio/project-implementation/poc3.mdx new file mode 100644 index 000000000..0db72c6cb --- /dev/null +++ b/shared/video-sdk/develop/spatial-audio/project-implementation/poc3.mdx @@ -0,0 +1,73 @@ +import ImportLibrary from 
'@docs/assets/code/video-sdk/spatial-audio/import-library.mdx'; +import SetVariables from '@docs/assets/code/video-sdk/spatial-audio/set-variables.mdx'; +import SetupSpatial from '@docs/assets/code/video-sdk/spatial-audio/setup-spatial.mdx'; +import SetupLocal from '@docs/assets/code/video-sdk/spatial-audio/setup-local.mdx'; +import SetupRemote from '@docs/assets/code/video-sdk/spatial-audio/setup-remote.mdx'; +import ClearPosition from '@docs/assets/code/video-sdk/spatial-audio/remove-spatial.mdx'; +import PlayMedia from '@docs/assets/code/video-sdk/spatial-audio/play-media.mdx'; + + + +### Add the required imports + + + Make sure to also add the Spatial Audio plugin. Check the [Agora plugin list](../reference/downloads#plugin-list) for more information. + + + +### Add the required variables + + + +### Setup spatial audio engine + + + + + + +### Set local position + + + + + +### Set remote position + + + + + +### Clear spatial positions before you leave the channel + + + + + + + +### Add the required imports + + + +### Add the required variables + + + +### Setup spatial audio engine + + + +### Set local position + + + +### Update remote positions + + + +### Play media files with spatial audio + + + + diff --git a/shared/video-sdk/develop/spatial-audio/project-implementation/swift.mdx b/shared/video-sdk/develop/spatial-audio/project-implementation/swift.mdx deleted file mode 100644 index 36a63bcd8..000000000 --- a/shared/video-sdk/develop/spatial-audio/project-implementation/swift.mdx +++ /dev/null @@ -1,112 +0,0 @@ -import AddUIElements from '@docs/assets/code/video-sdk/spatial-audio/swift/add-ui-elements.mdx'; -import ConfigureUIElements from '@docs/assets/code/video-sdk/spatial-audio/swift/configure-ui-elements.mdx'; - -### Implement a user interface - -In a real-world application, you report your local spatial position to a server in your environment and retrieve positions of remote users in the channel from your server. 
In this page, you use a single `Button` to set the spatial position of a remote user. - -To add the button to the UI, in the `ViewController` class: - -1. **Add the UI elements you need** - - Add the following code at the top of the class: - - - -1. **Configure the UI elements in your interface** - - Paste the following lines inside the `initViews` function: - - - -### Handle the system logic - -1. **Add the required variables** - - You create an instance of `AgoraLocalSpatialAudioKit` to configure spatial audio and set self and remote user positions. In your project, add the following declarations at the top of the `ViewController` class: - - ```swift - var localSpatial: AgoraLocalSpatialAudioKit! - var remoteUid: UInt = 0 // Stores the uid of the remote user - ``` - -### Implement spatial audio - -To implement these features in your , take the following steps: - -1. **Instantiate and configure the spatial audio engine** - - To create an instance of `AgoraLocalSpatialAudioKit` at startup take the following steps: - - 1. When a user launches the , you create an instance of `AgoraLocalSpatialAudioKit`, configure it and update the user's self position. To do this, add the following method to the `ViewController` class: - - ```swift - func configureSpatialAudioEngine() { - agoraEngine.enableAudio() - - agoraEngine.setAudioProfile(.speechStandard, scenario: .gameStreaming) - // The next line is only required if using bluetooth headphones from iOS/iPadOS - agoraEngine.setParameters(#"{"che.audio.force_bluetooth_a2dp":true}"#) - - agoraEngine.enableSpatialAudio(true) - let localSpatialAudioConfig = AgoraLocalSpatialAudioConfig() - localSpatialAudioConfig.rtcEngine = agoraEngine - localSpatial = AgoraLocalSpatialAudioKit.sharedLocalSpatialAudio(with: localSpatialAudioConfig) - - // By default Agora subscribes to the audio streams of all remote users. - // Unsubscribe all remote users; otherwise, the audio reception range you set - // is invalid. 
- localSpatial.muteLocalAudioStream(true) - localSpatial.muteAllRemoteAudioStreams(true) - - // Set the audio reception range, in meters, of the local user - localSpatial.setAudioRecvRange(50) - // Set the length, in meters, of unit distance - localSpatial.setDistanceUnit(1) - - // Update self position - let pos = [NSNumber(0.0), NSNumber(0.0), NSNumber(0.0)] - let forward = [NSNumber(1.0), NSNumber(0.0), NSNumber(0.0)] - let right = [NSNumber(0.0), NSNumber(1.0), NSNumber(0.0)] - let up = [NSNumber(0.0), NSNumber(0.0), NSNumber(1.0)] - - self.localSpatial.updateSelfPosition(pos, axisForward: forward, axisRight: right, axisUp: up) - } - ``` - - 1. To execute this method at startup, add the following line after `initializeAgoraEngine()` to the `viewDidLoad` method: - - ```swift - configureSpatialAudioEngine() - ``` - -1. **Set the spatial position of a remote user** - - To update the spatial position of a remote user: - - 1. Define the `AgoraRemoteVoicePositionInfo` and call `updateRemotePosition`. - - ```swift - @objc func updateSpatialAudioPosition() { - let positionInfo = AgoraRemoteVoicePositionInfo() - - // Set the coordinates in the world coordinate system. - // This parameter is an array of length 3 - // The three values represent the front, right, and top coordinates - positionInfo.position = [NSNumber(2.0), NSNumber(4.0), NSNumber(0.0)] - - // Set the unit vector of the x axis in the coordinate system. - // This parameter is an array of length 3, - // The three values represent the front, right, and top coordinates - positionInfo.forward = [NSNumber(1.0), NSNumber(0.0), NSNumber(0.0)] - - // Update the spatial position of the specified remote user - localSpatial.updateRemotePosition(remoteUid, positionInfo: positionInfo) - showMessage(title: "Update User Spatial Position", text: "Remote user spatial position updated.") - } - ``` - - 1. To update the spatial position of a specific remote user, you need the `uid` of that user. 
When a remote user joins the channel, the `didJoinedOfUid` event is fired. To store the `remoteUid`, in `ViewController`, add the following line inside `func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) {}` event handler function: - ```swift - remoteUid = uid - ``` diff --git a/shared/video-sdk/develop/spatial-audio/project-implementation/web.mdx b/shared/video-sdk/develop/spatial-audio/project-implementation/web.mdx deleted file mode 100644 index 89b58f37f..000000000 --- a/shared/video-sdk/develop/spatial-audio/project-implementation/web.mdx +++ /dev/null @@ -1,178 +0,0 @@ - - -### Update the user interface - -In a real-word application, you report your local spatial position to a server in your environment and receive positions of remote users in the channel from your server. In this simple example, you use two buttons to set the spatial position of a remote user or the media player. - -To add these buttons to the UI, in `index.html`, add the following code after ``: - -```html -

- -

- Distance: - - - -``` - -### Handle the system logic - -1. **Declare the variables you need** - - You create an instance of `spatialAudioExtension` to implement audio effects. To initialize the extension, you specify the `assetsPath` to the `Wasm` and `JS` files. To keep track of the spatial distance, media player state, and the `SpatialAudioProcessor`s that you create, add the following declarations to your `main.js` file after the `import` statements: - - ```javascript - var distance = 0; // Used to define and change the the spatial position - var isMediaPlaying = false; - const processors = new Map(); - const spatialAudioExtension = new SpatialAudioExtension({ - assetsPath: "./node_modules/agora-extension-spatial-audio/external/", - }); - ``` - - For details on how to [Dynamically load `wasm` and `js` files](#dynamically-load-wasm-and-js-file-dependencies) in a production environment, see the references section. - -1. **Update channel parameters** - - To reference the media player track you create, add the following under `let channelParameters = {` - - ```javascript - // A variable to hold the media file track. - mediaPlayerTrack: null, - ``` - -1. **Register the spatial audio extension** - - To register the `spatialAudioExtension` instance with , add the following function to `main.js`: - - ```javascript - async function setupSpatial() { - AgoraRTC.registerExtensions([spatialAudioExtension]); - } - ``` - - To execute this code at startup, add the following after `startBasicCall();` in `main.js`: - - ```javascript - setupSpatial(); - ``` - -### Implement spatial audio effects - -1. **Update position of the local user** - - To the update spatial position of the local user, add the following code to `setupSpatial()` after `AgoraRTC.registerExtensions(...)`: - - ```javascript - const mockLocalUserNewPosition = { - // In a production app, the position can be generated by - // dragging the local user's avatar in a 3D scene. 
- position: [1, 1, 1], // Coordinates in the world coordinate system - forward: [1, 0, 0], // The unit vector of the front axis - right: [0, 1, 0], // The unit vector of the right axis - up: [0, 0, 1], // The unit vector of the vertical axis - }; - - spatialAudioExtension.updateSelfPosition( - mockLocalUserNewPosition.position, - mockLocalUserNewPosition.forward, - mockLocalUserNewPosition.right, - mockLocalUserNewPosition.up - ); - ``` - -1. **Set up spatial audio processors for remote users** - - When a remote user joins the channel, you create a `SpatialAudioProcessor` and inject it into the user's audio track using the `pipe` method. To do this, in `agoraEngine.on("user-published"` replace the code inside the `if (mediaType == "audio") {...}` block with the following: - - ```javascript - // Create a new SpatialAudioProcessor for the remote user - const processor = spatialAudioExtension.createProcessor(); - // Add the processor to the Map for future use - processors.set(user.uid.toString(), processor); - - // Inject the SpatialAudioProcessor into the audio track - const track = user.audioTrack; - track.pipe(processor).pipe(track.processorDestination); - - // Play the remote audio track. - track.play(); - channelParameters.remoteAudioTrack = user.audioTrack; - ``` - -1. **Update position of a remote user or the media player** - - In a real world application, you receive notification of change in a remote user's position through your server. You then update the spatial position of the remote user in the corresponding `SpatialAudioProcessor`. In this simple example, you change the position of a remote user or the media player when the local user uses UI buttons to increase or decrease the distance. 
To do this, add the following code to `main.js`: - - ```javascript - document.getElementById("decreaseDistance").onclick = async function () { - distance -= 5; - updatePosition(); - }; - - document.getElementById("increaseDistance").onclick = async function () { - distance += 5; - updatePosition(); - }; - - function updatePosition(){ - document.getElementById("distanceLabel").textContent = distance; - - if (isMediaPlaying){ - const processor = processors.get("media-player"); - processor.updatePlayerPositionInfo({ - position: [distance, 0, 0], - forward: [1, 0, 0], - }); - } else { - const processor = processors.get(channelParameters.remoteUid); - processor.updateRemotePosition({ - position: [distance, 0, 0], - forward: [1, 0, 0], - }); - } - }; - ``` - -1. **Play an audio file** - - Create a `resources` folder in `agora_project` and copy a sample audio file to this folder. When a local user clicks **Play audio file**, you create an audio track from the this file, inject a `SpatialAudioProcessor` into the track, update the player position info, and play the track. To do this, add the following function to `main.js`: - - ```javascript - document.getElementById("playAudioFile").onclick = - async function localPlayerStart() { - if (isMediaPlaying) { - channelParameters.mediaPlayerTrack.setEnabled(false); - isMediaPlaying = false; - document.getElementById("playAudioFile").textContent = "Play audio file"; - return; - } - - const processor = spatialAudioExtension.createProcessor(); - processors.set("media-player", processor); - - const track = await AgoraRTC.createBufferSourceAudioTrack({ - source: "./resources/", - }); - - // Define the spatial position for the local audio player. - const mockLocalPlayerNewPosition = { - position: [0, 0, 0], - forward: [0, 0, 0], - }; - - // Update the spatial position for the local audio player. 
- processor.updatePlayerPositionInfo(mockLocalPlayerNewPosition); - - track.startProcessAudioBuffer({ loop: true }); - track.pipe(processor).pipe(track.processorDestination); - track.play(); - - isMediaPlaying = true; - document.getElementById("playAudioFile").textContent = "Stop playing audio"; - channelParameters.mediaPlayerTrack = track; - }; - ``` - - \ No newline at end of file diff --git a/shared/video-sdk/develop/spatial-audio/project-test/android.mdx b/shared/video-sdk/develop/spatial-audio/project-test/android.mdx index b38666cae..4a63040af 100644 --- a/shared/video-sdk/develop/spatial-audio/project-test/android.mdx +++ b/shared/video-sdk/develop/spatial-audio/project-test/android.mdx @@ -1,20 +1,13 @@ -3. In Android Studio, open `app/java/com.example./MainActivity` and update `appId`, `channelName` and `token` with the values from Agora Console. - -4. Connect an Android device to your development device. - -5. In Android Studio, click **Run app**. A moment later, you see the project installed on your device. - - If this is the first time you run your app, grant camera and microphone permissions. - - -6. Select a user role. Press **Join** to connect to the same channel as your web demo. - - - -6. Press **Join** to connect to the same channel as your web demo. - +5. **Join a channel** + + Select a user role. Press **Join** to connect to the same channel as your web demo. + + + + Press **Join** to connect to the same channel as your web demo. + 7. 
**Test spatial audio effects for remote users** diff --git a/shared/video-sdk/develop/spatial-audio/project-test/index.mdx b/shared/video-sdk/develop/spatial-audio/project-test/index.mdx index 865635f93..9c07d8ab4 100644 --- a/shared/video-sdk/develop/spatial-audio/project-test/index.mdx +++ b/shared/video-sdk/develop/spatial-audio/project-test/index.mdx @@ -1,21 +1,20 @@ -import Android from './android.mdx'; -import Ios from './ios.mdx'; -import MacOS from './macos.mdx'; +import Poc3 from './poc3.mdx'; import Electron from './electron.mdx'; -import Unity from './unity.mdx'; import Windows from './windows.mdx'; -import Web from './web.mdx'; import Flutter from './flutter.mdx'; import ReactNative from './react-native.mdx'; import Unreal from './unreal.mdx'; + + + +1. [Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in . + +2. In your browser, navigate to the web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. + + - - - - - diff --git a/shared/video-sdk/develop/spatial-audio/project-test/poc3.mdx b/shared/video-sdk/develop/spatial-audio/project-test/poc3.mdx new file mode 100644 index 000000000..070d9833e --- /dev/null +++ b/shared/video-sdk/develop/spatial-audio/project-test/poc3.mdx @@ -0,0 +1,20 @@ +import TestFirstSteps from '@docs/shared/common/project-test/rtc-first-steps.mdx'; +import Android from './android.mdx'; +import Ios from './ios.mdx'; +import Web from './web.mdx'; +import Unity from './unity.mdx'; +import ReactJS from './react-js.mdx'; +import MacOS from './macos.mdx'; + + + + + + + + + + + + + \ No newline at end of file diff --git a/shared/video-sdk/develop/spatial-audio/project-test/react-js.mdx b/shared/video-sdk/develop/spatial-audio/project-test/react-js.mdx new file mode 100644 index 000000000..6d582e875 --- /dev/null +++ b/shared/video-sdk/develop/spatial-audio/project-test/react-js.mdx @@ -0,0 +1,25 @@ + + +5. 
**Run the 3D spatial audio app**: + + In **Choose a sample code**, select **3D Spatial Audio**. + +1. **Join a channel** + + + Click **Join** to start a session. When you select **Host**, the local video is published and played in the . When you select **Audience**, the remote stream is subscribed and played. + + + + Press **Join** to connect to the same channel as your web demo. + + +1. **Test the spatial audio features** + + 1. Click **Enable spatial audio**. The extension is configured and some new UI controls appear on the page. + 2. Click **Play Audio File**. The audio is played. + 3. Use the `+` and `-` buttons to increase and decrease the spatial audio distance of the audio file. + 4. Click **Stop Audio File**. The audio file is stopped. + 5. Use the `+` and `-` buttons to update the spatial audio position of the remote user and note the resulting change in the remote audio. + + diff --git a/shared/video-sdk/develop/spatial-audio/project-test/swift.mdx b/shared/video-sdk/develop/spatial-audio/project-test/swift.mdx index e337da18e..a55228170 100644 --- a/shared/video-sdk/develop/spatial-audio/project-test/swift.mdx +++ b/shared/video-sdk/develop/spatial-audio/project-test/swift.mdx @@ -1,28 +1,20 @@ -3. In the `ViewController`, update `appID`, `channelName`, and `token` with the values from . +5. **Join a channel** -4. Run your . + + Select **Broadcaster** mode and press **Join** to connect to the same channel as your web demo. + - If this is the first time you run the project, grant microphone and camera access to your . + + Press **Join** to connect to the same channel as your web demo. + - - If you use an iOS simulator, you see the remote video only. You cannot see the local video stream because of Apple simulator hardware restrictions. - - - -5. Select **Broadcaster** mode and press **Join** to connect to the same channel as your web demo. - - - -5. Press **Join** to connect to the same channel as your web demo. - - -6. 
Test spatial audio effects for users +1. **Test spatial audio effects for users** 1. Press **Update Spatial Audio Position**. Your updates the position of the remote user in the spatial audio engine. 1. Listen to the audio of the remote user. You feel that the location of the remote user has shifted. -7. Test spatial audio effects for media player +1. **Test spatial audio effects for media player** 1. To setup spatial audio position of your media player, add [Media Playing](../../video-calling/develop/play-media) to your . diff --git a/shared/video-sdk/develop/spatial-audio/project-test/unity.mdx b/shared/video-sdk/develop/spatial-audio/project-test/unity.mdx index 1637e875d..b459bc5c3 100644 --- a/shared/video-sdk/develop/spatial-audio/project-test/unity.mdx +++ b/shared/video-sdk/develop/spatial-audio/project-test/unity.mdx @@ -1,20 +1,23 @@ - -3. In Unity Editor, in your script file, update `_appID`, `_channelName` and `_token` with the values for your temporary token. + -4. In Unity Editor, click **Play**. A moment later you see the running on your development device. +5. **Choose this sample in the reference app** + From the main screen of the , choose **** from the dropdown and then select **3D Spatial Audio**. -5. In Android Studio, click **Run app**. A moment later, you see the installed on your device. +1. **Join a channel** - -6. Select a user role. Press **Join** to connect to the same channel as your web demo. - + + 1. Click **Join** to start a session. + 2. Select a role using the toggles. + - When you join as a **Host**, the local video is published and played in the . + - When you join as **Audience**, the remote stream is subscribed and played. + - -6. Press **Join** to connect to the same channel as your web demo. - + + Click **Join** to start a call. Now, you can see yourself on the device screen and talk to the remote user using your . + -7. **Test spatial audio effects for remote users** +1. **Test spatial audio effects for remote users** 1. 
Put on earphones connected to the test device. diff --git a/shared/video-sdk/develop/spatial-audio/project-test/web.mdx b/shared/video-sdk/develop/spatial-audio/project-test/web.mdx index d441ff168..28892c73e 100644 --- a/shared/video-sdk/develop/spatial-audio/project-test/web.mdx +++ b/shared/video-sdk/develop/spatial-audio/project-test/web.mdx @@ -1,34 +1,25 @@ -3. On your separate development device, update `appID`, `channel` and `token` with your values in `main.js`. +5. **Join a channel** -4. To start the dev server, execute the following command in the terminal: + + Select a user role using the radio buttons and click **Join**. + - ```bash - npm run dev - ``` - Use the Url displayed in the terminal to open the in your browser. + + To connect to a channel, click **Join**. + - -5. Select a user role using the radio buttons and click **Join**. +1. **Test spatial audio** - If this is the first time you run the project, grant microphone and camera access to your . - + 1. Put on earphones connected to your development machine. Speak into the microphone connected to the web demo app. You hear the audio through your earphones. - -5. To connect to a channel, click **Join**. - - If this is the first time you run the project, you need to grant microphone and camera access to your . - + 1. Use the `-` and `+` buttons to change the distance. Speak into the microphone connected to the web demo app again. Note the change in quality of audio playing through the earphones. When you increase the distance, you feel that the remote user has moved away from you. When you decrease the distance you feel that the remote user is closer to you. -6. Put on earphones connected to your development machine. Speak into the microphone connected to the web demo app. You hear the audio through your earphones. + 1. Press **Play audio file**. -7. Use the `-` and `+` buttons to change the distance. Speak into the microphone connected to the web demo app again. 
Note the change in quality of audio playing through the earphones. When you increase the distance, you feel that the remote user has moved away from you. When you decrease the distance you feel that the remote user is closer to you. + You hear the audio file played through your earphones. -8. Press **Play audio file**. - - You hear the audio file played through your earphones. - -9. Use the `-` and `+` buttons to change the distance. When you increase the distance, you feel that the audio source has moved away from you. When you decrease the distance you feel that the audio source is closer to you. + 1. Use the `-` and `+` buttons to change the distance. When you increase the distance, you feel that the audio source has moved away from you. When you decrease the distance you feel that the audio source is closer to you. \ No newline at end of file diff --git a/shared/video-sdk/develop/spatial-audio/reference/android.mdx b/shared/video-sdk/develop/spatial-audio/reference/android.mdx index 81e79b348..c56ca9b9b 100644 --- a/shared/video-sdk/develop/spatial-audio/reference/android.mdx +++ b/shared/video-sdk/develop/spatial-audio/reference/android.mdx @@ -1,17 +1,3 @@ -### API Reference - -- ILocalSpatialAudioEngine - -- updateSelfPosition - -- updateRemotePosition - -- removeRemotePosition - -- clearRemotePositions - -- RemoteVoicePositionInfo - diff --git a/shared/video-sdk/develop/spatial-audio/reference/ios.mdx b/shared/video-sdk/develop/spatial-audio/reference/ios.mdx index 06f4955e0..52ad025f4 100644 --- a/shared/video-sdk/develop/spatial-audio/reference/ios.mdx +++ b/shared/video-sdk/develop/spatial-audio/reference/ios.mdx @@ -1,17 +1,3 @@ -### API Reference - -- AgoraLocalSpatialAudioKit - -- updateSelfPosition - -- updateRemotePosition - -- removeRemotePosition - -- clearRemotePositions - -- AgoraRemoteVoicePositionInfo - diff --git a/shared/video-sdk/develop/spatial-audio/reference/macos.mdx b/shared/video-sdk/develop/spatial-audio/reference/macos.mdx index 
6a3be0961..5b1a67bc4 100644 --- a/shared/video-sdk/develop/spatial-audio/reference/macos.mdx +++ b/shared/video-sdk/develop/spatial-audio/reference/macos.mdx @@ -1,17 +1,3 @@ -### API Reference - -- AgoraLocalSpatialAudioKit - -- updateSelfPosition - -- updateRemotePosition - -- removeRemotePosition - -- clearRemotePositions - -- AgoraRemoteVoicePositionInfo - diff --git a/shared/video-sdk/develop/spatial-audio/reference/unity.mdx b/shared/video-sdk/develop/spatial-audio/reference/unity.mdx index 56655d6a4..72de34065 100644 --- a/shared/video-sdk/develop/spatial-audio/reference/unity.mdx +++ b/shared/video-sdk/develop/spatial-audio/reference/unity.mdx @@ -1,31 +1,3 @@ -### API Reference - - - -- ILocalSpatialAudioEngine - -- UpdateSelfPosition - -- UpdateRemotePosition - -- RemoveRemotePosition - -- ClearRemotePositions - - - - -- ILocalSpatialAudioEngine - -- UpdateSelfPosition - -- UpdateRemotePosition - -- RemoveRemotePosition - -- ClearRemotePositions - - diff --git a/shared/video-sdk/develop/spatial-audio/reference/web.mdx b/shared/video-sdk/develop/spatial-audio/reference/web.mdx index d3b74e710..66046a761 100644 --- a/shared/video-sdk/develop/spatial-audio/reference/web.mdx +++ b/shared/video-sdk/develop/spatial-audio/reference/web.mdx @@ -1,5 +1,7 @@ +- For a working example, check out the [Spatial audio web demo](https://webdemo.agora.io/spatialAudioExtention/index.html) and the associated [source code](https://github.com/AgoraIO/API-Examples-Web/tree/main/Demo/spatialAudioExtention). + ### Supported audio routes The following table summarizes true two-channel playback support for different audio routes. @@ -298,4 +300,4 @@ Spatial audio related error codes: * `INVALID_PARMS`:Invalid argument. * `UNKNOWN_ERROR`:Unknown problem. 
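The `+`/`-` distance controls used throughout these spatial audio test steps reduce to a small amount of vector arithmetic. The following standalone sketch (plain JavaScript, no SDK calls; the function names are illustrative, not part of the Agora API) shows how a scalar distance maps onto the position and forward vectors that `updateRemotePosition`-style APIs expect:

```javascript
// Map a scalar distance along the forward axis onto the position info
// passed to updateRemotePosition / updatePlayerPositionInfo-style APIs.
// Pure illustration — the real SDK calls are shown in the implementation pages.
function positionFromDistance(distance) {
  return {
    position: [distance, 0, 0], // world coordinates: forward, right, up
    forward: [1, 0, 0],         // unit vector of the forward axis
  };
}

// Each press of the +/- button moves the audio source by a fixed step
// (5 meters in the web demo). direction is +1 (away) or -1 (closer).
function stepDistance(distance, direction, step = 5) {
  return distance + direction * step;
}
```

Feeding the stepped distance back into `positionFromDistance` produces the position info object for the next update, which is exactly what the demo's button handlers do.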
- \ No newline at end of file + diff --git a/shared/video-sdk/develop/stream-raw-audio-and-video/project-implementation/android.mdx b/shared/video-sdk/develop/stream-raw-audio-and-video/project-implementation/android.mdx index 0f6d790e3..b113ae559 100644 --- a/shared/video-sdk/develop/stream-raw-audio-and-video/project-implementation/android.mdx +++ b/shared/video-sdk/develop/stream-raw-audio-and-video/project-implementation/android.mdx @@ -5,7 +5,7 @@ To enable or disable processing of captured raw video data, add a button to the user interface. In `/app/res/layout/activity_main.xml` add the following lines before ``: -``` xml +```xml + ) : + ( + + + + + )} + ``` + + \ No newline at end of file diff --git a/shared/video-sdk/get-started/get-started-sdk/project-implementation/react-native.mdx b/shared/video-sdk/get-started/get-started-sdk/project-implementation/react-native.mdx index 40f81d649..54aa996f1 100644 --- a/shared/video-sdk/get-started/get-started-sdk/project-implementation/react-native.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/project-implementation/react-native.mdx @@ -478,7 +478,7 @@ To implement this logic, take the following steps: } }; ``` - For choose `AudienceLatencyLevelLowLatency`. Low latency is a feature of and its use is subject to special [pricing](../reference/pricing#unit-pricing). + For choose `AudienceLatencyLevelLowLatency`. Low latency is a feature of and its use is subject to special [pricing](../overview/pricing#unit-pricing). 
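The role-and-latency choice described above determines which channel media options the client passes when joining. The sketch below models that decision as a plain object; the field and enum names here are illustrative assumptions, not the exact `react-native-agora` identifiers — check the SDK's type definitions for the real constants:

```javascript
// Illustrative stand-ins for the SDK's client role and audience latency
// enums (names and values are assumptions for this sketch).
const ClientRole = { Broadcaster: 1, Audience: 2 };
const AudienceLatency = { LowLatency: 1, UltraLowLatency: 2 };

// Build join options for a given role. Hosts publish their tracks;
// audience members only subscribe. Because low latency is billed
// separately, the latency level is only set for audience members.
function buildChannelOptions(isHost) {
  const options = {
    clientRoleType: isHost ? ClientRole.Broadcaster : ClientRole.Audience,
    publishMicrophoneTrack: isHost,
    publishCameraTrack: isHost,
  };
  if (!isHost) {
    options.audienceLatencyLevel = AudienceLatency.LowLatency;
  }
  return options;
}
```

Keeping this decision in one helper makes it easy to audit where the separately priced low-latency mode is enabled.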
diff --git a/shared/video-sdk/get-started/get-started-sdk/project-implementation/swift.mdx b/shared/video-sdk/get-started/get-started-sdk/project-implementation/swift.mdx deleted file mode 100644 index 0763363dd..000000000 --- a/shared/video-sdk/get-started/get-started-sdk/project-implementation/swift.mdx +++ /dev/null @@ -1,221 +0,0 @@ - -import CreateUI from '@docs/assets/code/video-sdk/get-started-sdk/swift/create-ui.mdx'; -import ShowMessage from '@docs/assets/code/video-sdk/get-started-sdk/swift/show-message.mdx'; -import JoinAndLeave from '@docs/assets/code/video-sdk/get-started-sdk/swift/join-and-leave.mdx'; -import ViewDidDisappear from '@docs/assets/code/video-sdk/get-started-sdk/swift/view-did-disappear.mdx'; -import RoleAction from '@docs/assets/code/video-sdk/get-started-sdk/swift/role-action.mdx'; - -### Implement the user interface - -To implement the user interface, create code with: - -- Views for local and remote video. - -- A button for the user to **Join** or **Leave** the channel. - - -- A selector so the user can join a channel as host or audience. - - -To create this UI, in `ViewController`, replace the contents of the file with the following: - - - -### Handle the system logic - -When your launches, ensure that the permissions necessary to insert feature into the are granted. If the permissions are not granted, use the built-in feature to request them; if they are, return `true`. - -1. **Import ** - - In `ViewController`, add the following line after the last `import` statement: - - ``` swift - import AgoraRtcKit - ``` - - If Xcode does not recognize this import, click **File** > **Packages** > **Reset Package Caches**. - -2. 
**Handle hardware permissions on the device** - - In `ViewController`, add the following lines after the `buttonAction(sender: UIButton!)` function: - - ``` swift - func checkForPermissions() async -> Bool { - var hasPermissions = await self.avAuthorization(mediaType: .video) - // Break out, because camera permissions have been denied or restricted. - if !hasPermissions { return false } - hasPermissions = await self.avAuthorization(mediaType: .audio) - return hasPermissions - } - - func avAuthorization(mediaType: AVMediaType) async -> Bool { - let mediaAuthorizationStatus = AVCaptureDevice.authorizationStatus(for: mediaType) - switch mediaAuthorizationStatus { - case .denied, .restricted: return false - case .authorized: return true - case .notDetermined: - return await withCheckedContinuation { continuation in - AVCaptureDevice.requestAccess(for: mediaType) { granted in - continuation.resume(returning: granted) - } - } - @unknown default: return false - } - } - ``` -3. **Show status updates to your users** - - In `ViewController`, add the following method to the `ViewController` class: - - - - -2. **Show status updates to your users** - - In `ViewController`, add the following method to the `ViewController` class: - - - - - -### Implement the channel logic - -When a user opens this , you initialize the . When the user taps a button, the joins or leaves a channel. - -The following figure shows the call sequence of implementing . - - -![video call logic ios](/images/video-sdk/video-call-logic-ios.svg) - - -![ils call logic ios](/images/video-sdk/ils-call-logic-ios.svg) - - -To implement this logic, take the following steps: - -1. **Declare the variables that you use to integrate into your ** - - Add the following lines to the top of the `ViewController` class: - - ``` swift - // The main entry point for Video SDK - var agoraEngine: AgoraRtcEngineKit! - // By default, set the current user role to broadcaster to both send and receive streams. 
- var userRole: AgoraClientRole = .broadcaster - - // Update with the App ID of your project generated on Agora Console. - let appID = "<#Your app ID#>" - // Update with the temporary token generated in Agora Console. - var token = "<#Your temp access token#>" - // Update with the channel name you used to generate the token in Agora Console. - var channelName = "<#Your channel name#>" - ``` - -2. **Initialize the ** - - To implement , you use to create an instance. In `ViewController`, add the following lines after the `leaveChannel()` function: - - ``` swift - func initializeAgoraEngine() { - let config = AgoraRtcEngineConfig() - // Pass in your App ID here. - config.appId = appID - // Use AgoraRtcEngineDelegate for the following delegate parameter. - agoraEngine = AgoraRtcEngineKit.sharedEngine(with: config, delegate: self) - } - ``` - - Each `AgoraRtcEngineKit` object supports one profile only. If you want to switch to another profile, call `destroy` to release the current `AgoraRtcEngineKit` object and then create a new one by calling `sharedEngine(with: , delegate: )` again. - - You see a compilation error. Worry not, you fix this now by coding `ViewController` to delegate `AgoraRtcEngineDelegate`. - -3. **Enable your to display a remote video stream** - - In `ViewController`, add the following lines after the `ViewController` class: - - ``` swift - extension ViewController: AgoraRtcEngineDelegate { - // Callback called when a new host joins the channel - func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) { - let videoCanvas = AgoraRtcVideoCanvas() - videoCanvas.uid = uid - videoCanvas.renderMode = .hidden - videoCanvas.view = remoteView - agoraEngine.setupRemoteVideo(videoCanvas) - } - } - ``` - - The compilation error disappears. Yay. - -4. 
**Enable your to display a local video stream** - - In `ViewController`, add the following lines after the `initializeAgoraEngine` function: - - ``` swift - func setupLocalVideo() { - // Enable the video module - agoraEngine.enableVideo() - // Start the local video preview - agoraEngine.startPreview() - let videoCanvas = AgoraRtcVideoCanvas() - videoCanvas.uid = 0 - videoCanvas.renderMode = .hidden - videoCanvas.view = localView - // Set the local video view - agoraEngine.setupLocalVideo(videoCanvas) - } - ``` - - -
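The `renderMode` in the snippet above determines how the video frame is scaled inside the view. As a sketch only (this variant is not part of the sample project, and the `.fit` member is an assumption about the same render-mode enum), you could expose the choice as a parameter:

``` swift
// Sketch only, not part of the sample project: a variant of
// setupLocalVideo() that lets the caller choose how video is
// scaled inside the view. `.hidden` crops the frame to fill the
// view; `.fit` is assumed to show the whole frame, letterboxed.
func setupLocalVideo(fillView: Bool) {
    // Enable the video module
    agoraEngine.enableVideo()
    // Start the local video preview
    agoraEngine.startPreview()
    let videoCanvas = AgoraRtcVideoCanvas()
    videoCanvas.uid = 0
    videoCanvas.renderMode = fillView ? .hidden : .fit
    videoCanvas.view = localView
    // Set the local video view
    agoraEngine.setupLocalVideo(videoCanvas)
}
```

Cropping (`.hidden`) usually looks best for a small local preview tile, while letterboxing avoids cutting off the edges of the frame.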

You can enable capture from both cameras simultaneously using `enableMultiCamera`.
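A minimal sketch of turning on the second camera follows. Note that the `enableMultiCamera(_:config:)` call and the `AgoraCameraCapturerConfiguration` setup are assumptions about the 4.x iOS SDK surface; verify both against the API reference for your SDK release before relying on them.

``` swift
// Sketch only: enable simultaneous capture from a second camera.
// The enableMultiCamera(_:config:) signature and the configuration
// type are assumptions; check them against your SDK version.
func enableSecondCamera() {
    let config = AgoraCameraCapturerConfiguration()
    // Capture from the front camera on the second track.
    config.cameraDirection = .front
    agoraEngine.enableMultiCamera(true, config: config)
}
```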
-
- -5. **Join and leave a channel** - - - You assign all users in the channel the `.broadcaster` role. This role has rights to stream video and audio to a channel. For , set all users as `.broadcaster`. - - - You assign event hosts the `.broadcaster` role. This role has rights to stream video and audio to a channel, while the audience views content streamed to the channel by the broadcaster. For , set the role chosen by the user. - - - In `ViewController`, replace the existing `joinChannel()` and `leaveChannel()` functions with the following: - - - - -6. **Enable the user to join a channel as the host or the audience** - In `ViewController`, replace the existing `func roleAction(sender: UISegmentedControl!)` function with the following: - - - - - -### Start and stop your - -In this implementation, you initiate when you open the . The user joins and leaves a call using the `Join` button. - -To implement this feature: - -1. **Initialize and local video when the view is loaded** - - In `ViewController`, update `viewDidLoad` as follows: - - ``` swift - override func viewDidLoad() { - super.viewDidLoad() - // Do any additional setup after loading the view. - // Initializes the video view - initViews() - // The following functions are used when calling Agora APIs - initializeAgoraEngine() - } - ``` - -2.
**Leave the channel and clean up all the resources used by your ** - - In `ViewController`, add the following lines after the `viewDidLoad` function: - - - diff --git a/shared/video-sdk/get-started/get-started-sdk/project-implementation/unity.mdx b/shared/video-sdk/get-started/get-started-sdk/project-implementation/unity.mdx index 060161fb5..132d7e8ca 100644 --- a/shared/video-sdk/get-started/get-started-sdk/project-implementation/unity.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/project-implementation/unity.mdx @@ -1,492 +1,328 @@ - -### Implement the user interface - -For a basic , you need: - -* One view for local video -* One view for remote video - -* Buttons so the user can join or leave a channel - - -* One selector to choose to join as the host or the audience + +![image](/images/video-sdk/ils-call-logic-unity.svg) -To implement this user interface, in your Unity project: - -1. **Add the join and leave buttons** - - To create buttons that you access programmatically to implement the workflow: - - 1. Right-click **Sample Scene**, then click **Game Object** > **UI** > **Button - TextMeshPro**. A button appears in the **Scene** Canvas. - - If you can’t see the button clearly, in Unity, zoom out and rotate the scene. - - 2. In **Inspector**, rename **Button** to **Leave**, then change the following coordinates: - - * **Pos X** - 350 - * **Pos Y** - -172 - - 3. Select the **Text(TMP)** sub-item of **Leave**, and in **Inspector**, change **Text** to *Leave*. - - 4. Use the same procedure to create a button called **Join** with a **Text(TMP)** sub-item where its **Text** says *Join*. - - 5. To shift **Join** to the left of the Canvas, in **Inspector** change the following coordinates: - - * **Pos X** - -350 - * **Pos Y** - -172 - -2. **Add the local and remote video views** - - To create raw images using that you access programmatically to display video streams on: - - 1. For local video, right-click **Canvas**, then click **UI** > **Raw Image**. - - 2. 
In **Inspector**, rename **RawImage** to **LocalView**, and change the following coordinates: - - * **Pos X** - -250 - * **Pos Y** - 0 - * **Width** - 250 - * **Height** - 250 - - 3. For remote video, use the same procedure to create a **RawImage** called **RemoteView** with the following coordinates: - - * **Pos X** - 250 - * **Pos Y** - 0 - * **Width** - 250 - * **Height** - 250 - - -3. **Enable the user to join a channel as the host or the audience** - - 1. In your project, right-click **Canvas**, then click **UI** > **Toggle**. - - 2. In **Inspector**, rename **Toggle** to "Broadcaster", then change **Pos Y** to "-30". - - 3. Select the **Label** sub-item of **Broadcaster**, in **Inspector**, change the **Label** to "Host". - - 4. Use the same procedure to create a **Toggle** called **Audience** with a **Label** sub-item which says **Audience**. Don’t change **Pos Y** for **Audience**; this **Toggle** is best in the center of the **Canvas**. + +![image](/images/video-sdk/video-call-logic-unity.svg) -### Handle the system logic - -Import the necessary .NET libraries, set up your to run on Android, and request permissions for the camera and microphone. - -1. **Create a new script and import the Unity library** +Best practice suggests separating the workflows involving from your UI implementation. The sample project implements this logic in [`AgoraManager`](https://github.com/AgoraIO/video-sdk-samples-unity/blob/main/Assets/agora-manager/AgoraManager.cs); this class contains the fundamental Agora logic required for implementation. [`GetStartedManager`](https://github.com/AgoraIO/video-sdk-samples-unity/blob/main/Assets/get-started/GetStartedManager.cs) extends the functionality of `AgoraManager` and serves as an interface to the `GetStarted` classes. - 1. In **Project**, open **Assets** > **Agora-RTC-Plugin** > **Agora-Unity-RTC-SDK** > **Code**. Right-click **Code**, then click **Create** > **C# Script**.
In **Assets**, you see the `NewBehaviourScript` file that you use to implement in your . +`AgoraManager` encapsulates the `RTCEngine` instance and core functionality as illustrated by the excerpts below: - 1. In **Inspector**, click **Open**. `NewBehaviourScript.cs` opens in your default text editor. +1. **Import the classes and interfaces** - 1. In `NewBehaviourScript.cs`, add the following namespace to the list of namespace declaration: - - ``` csharp - using UnityEngine.UI; - ``` -3. **Manage Android permissions** + ``` csharp + using Agora.Rtc; + ``` - 1. Add the Unity Android libraries. In your script file, add the following to the list of namespace declarations: +1. **Declare variables to create an instance and join a channel** - ```csharp - #if (UNITY_2018_3_OR_NEWER && UNITY_ANDROID) - using UnityEngine.Android; - #endif - ``` + ``` csharp + // Define some variables to be used later. + internal string _appID; + internal string _channelName; + internal string _token; + internal uint remoteUid; + internal IRtcEngine agoraEngine; + internal VideoSurface LocalView; + internal VideoSurface RemoteView; + internal ConfigData configData; + internal AREA_CODE region = AREA_CODE.AREA_CODE_GLOB; + internal string userRole = ""; + ``` + +1. **Request camera and microphone permissions** - 2. Create a permissions list. In your script file, add the following lines before `Start`: + ```csharp + #if (UNITY_2018_3_OR_NEWER && UNITY_ANDROID) + // Define an ArrayList of permissions required for Android devices. + private ArrayList permissionList = new ArrayList() { Permission.Camera, Permission.Microphone }; + #endif - ``` csharp + // Define a private function called CheckPermissions() to check for required permissions. + public void CheckPermissions() + { #if (UNITY_2018_3_OR_NEWER && UNITY_ANDROID) - private ArrayList permissionList = new ArrayList() { Permission.Camera, Permission.Microphone }; - #endif - ``` - - 3. Check permissions are granted. 
In your script file, add the following code after `Update`: - - ``` csharp - private void CheckPermissions() { - #if (UNITY_2018_3_OR_NEWER && UNITY_ANDROID) - foreach (string permission in permissionList) + // Check for each permission in the permission list and request the user to grant it if necessary. + foreach (string permission in permissionList) + { + if (!Permission.HasUserAuthorizedPermission(permission)) { - if (!Permission.HasUserAuthorizedPermission(permission)) - { - Permission.RequestUserPermission(permission); - } + Permission.RequestUserPermission(permission); } - #endif } - ``` + #endif + } + ``` -4. **Bind your script to the canvas** +1. **Configure an instance and set up an event handler** - 1. In `Assets/Agora-RTC-Plugin/Agora-Unity-RTC-SDK/Code`, drag and drop `NewBehaviourScript.cs` to **Canvas** on the **Hierarchy**. + ``` csharp + // Define a public function called SetupAgoraEngine to setup the video SDK engine. + public virtual void SetupAgoraEngine() + { + if(_appID == "" || _token == "") + { + Debug.Log("Please set an app ID and a token in the config file."); + return; + } + // Create an instance of the video SDK engine. + agoraEngine = Agora.Rtc.RtcEngine.CreateAgoraRtcEngine(); - 1. Click **Inspector**, you see that your script is added to **Canvas**. + // Set context configuration based on the product type + CHANNEL_PROFILE_TYPE channelProfile = configData.product == "Video Calling" + ? CHANNEL_PROFILE_TYPE.CHANNEL_PROFILE_COMMUNICATION + : CHANNEL_PROFILE_TYPE.CHANNEL_PROFILE_LIVE_BROADCASTING; -### Implement the channel logic + RtcEngineContext context = new RtcEngineContext(_appID, 0, channelProfile, + AUDIO_SCENARIO_TYPE.AUDIO_SCENARIO_DEFAULT, region, null); -The following figure shows the API call sequence of implementing . + agoraEngine.Initialize(context); - - ![image](/images/video-sdk/video-call-logic-unity.svg) - - - - ![image](/images/video-sdk/ils-call-logic-unity.svg) - + // Enable the video module. 
+ agoraEngine.EnableVideo(); -To implement this logic, take the following steps: + // Set the user role as broadcaster. + agoraEngine.SetClientRole(CLIENT_ROLE_TYPE.CLIENT_ROLE_BROADCASTER); -1. **Import the library** + // Attach the eventHandler + InitEventHandler(); - In your script file, add the following library after `using UnityEngine.UI;`: - ``` csharp - using Agora.Rtc; + } ``` + For more details, see the following: -1. **Declare the variables that you use to create and join a channel** + - CreateAgoraRtcEngine - In your script file, add the following declarations to `NewBehaviourScript`: - - ``` csharp - // Fill in your app ID. - private string _appID = ""; - // Fill in your channel name. - private string _channelName = ""; - // Fill in the temporary token you obtained from Agora Console. - private string _token = ""; - // A variable to save the remote user uid. - private uint remoteUid; - internal VideoSurface LocalView; - internal VideoSurface RemoteView; - internal IRtcEngine RtcEngine; - ``` - + - RtcEngineContext - - ``` csharp - // Fill in your app ID. - private string _appID = ""; - // Fill in your channel name. - private string _channelName = ""; - // Fill in the temporary token you obtained from Agora Console. - private string _token = ""; - // A variable to hold the user role. - private string clientRole = ""; - // A variable to save the remote user uid. - private uint remoteUid; - private Toggle toggle1; - private Toggle toggle2; - internal VideoSurface LocalView; - internal VideoSurface RemoteView; - internal IRtcEngine RtcEngine; - ``` - + - Initialize -1. **Setup ** + - EnableVideo - To setup an instance of , in your script file, add the following after `CheckPermissions`: - - ``` csharp - private void SetupVideoSDKEngine() - { - // Create an instance of the video SDK. - RtcEngine = Agora.Rtc.RtcEngine.CreateAgoraRtcEngine(); - // Specify the context configuration to initialize the created instance. 
- RtcEngineContext context = new RtcEngineContext(_appID, 0, - CHANNEL_PROFILE_TYPE.CHANNEL_PROFILE_COMMUNICATION, - AUDIO_SCENARIO_TYPE.AUDIO_SCENARIO_DEFAULT,AREA_CODE.AREA_CODE_GLOB, null); - // Initialize the instance. - RtcEngine.Initialize(context); - } - ``` - - - ``` csharp - private void SetupVideoSDKEngine() - { - // Create an instance of the video SDK. - RtcEngine = Agora.Rtc.RtcEngine.CreateAgoraRtcEngine(); - // Specify the context configuration to initialize the created instance. - RtcEngineContext context = new RtcEngineContext(_appID, 0, - CHANNEL_PROFILE_TYPE.CHANNEL_PROFILE_LIVE_BROADCASTING, - AUDIO_SCENARIO_TYPE.AUDIO_SCENARIO_DEFAULT, AREA_CODE.AREA_CODE_GLOB, null); - // Initialize the created instance. - RtcEngine.Initialize(context); - } - ``` - + - InitEventHandler -1. **Handle and respond to events** + - SetClientRole - To register the callbacks, in your script file, add the following at the end of `NewBehaviourScript`: ``` csharp - private void InitEventHandler() - { - // Creates a UserEventHandler instance. - UserEventHandler handler = new UserEventHandler(this); - RtcEngine.InitEventHandler(handler); - } - + // An event handler class to deal with video SDK events internal class UserEventHandler : IRtcEngineEventHandler { - private readonly NewBehaviourScript _videoSample; - - internal UserEventHandler(NewBehaviourScript videoSample) + internal readonly AgoraManager agoraManager; + internal UserEventHandler(AgoraManager videoSample) { - _videoSample = videoSample; + agoraManager = videoSample; } // This callback is triggered when the local user joins the channel. public override void OnJoinChannelSuccess(RtcConnection connection, int elapsed) { Debug.Log("You joined channel: " +connection.channelId); } + // This callback is triggered when a remote user leaves the channel or drops offline. 
+ public override void OnUserOffline(RtcConnection connection, uint uid, USER_OFFLINE_REASON_TYPE reason) + { + agoraManager.DestroyVideoView(uid); + } + public override void OnUserJoined(RtcConnection connection, uint uid, int elapsed) + { + agoraManager.MakeVideoView(uid); + // Save the remote user ID in a variable. + agoraManager.remoteUid = uid; + } } ``` + For more details, see the following: -1. **Reference the UI elements from SampleScene** + - OnJoinChannelSuccess + + - OnUserOffline + + - OnUserJoined + +1. **Setup remote user video views** - In your script file, add the reference code before `SetupVideoSDKEngine`: - - ``` csharp - private void SetupUI() - { - GameObject go = GameObject.Find("LocalView"); - LocalView = go.AddComponent<VideoSurface>(); - go.transform.Rotate(0.0f, 0.0f, 180.0f); - go = GameObject.Find("RemoteView"); - RemoteView = go.AddComponent<VideoSurface>(); - go.transform.Rotate(0.0f, 0.0f, 180.0f); - go = GameObject.Find("Leave"); - go.GetComponent