Merge pull request #3522 from AgoraIO/translation/4.x-api-ref
New YiCAT updates
Cilla-luodan authored Feb 21, 2024
2 parents d872ab3 + 06a5a6a commit 2ba6d55
Showing 30 changed files with 69 additions and 83 deletions.
9 changes: 4 additions & 5 deletions en-US/dita/RTC-NG/API/api_imediaengine_pullaudioframe.dita
Original file line number Diff line number Diff line change
Expand Up @@ -25,17 +25,16 @@
</section>
<section id="detailed_desc">
<title>Details</title>
<p>Before calling this method, you need to call <xref keyref="setExternalAudioSink"/> to notify the app to enable and set the external rendering.</p>
<p>Before calling this method, call <xref keyref="setExternalAudioSink"/><codeph>(<parmname>enabled</parmname>: <ph keyref="true"/>)</codeph> to notify the app to enable and set the external audio rendering.</p>
<p>After a successful call of this method, the app pulls the decoded and mixed audio data for playback.</p>
<note type="attention">
<ul>
<li>This method only supports pulling data from a custom audio source. If you need to pull the data captured by the SDK, do not call this method.</li>
<li>Call this method after joining a channel.</li>
<li>Once you enable the external audio sink, the app will not retrieve any audio data from the <xref keyref="onPlaybackAudioFrame"/> callback.</li>
<li>The difference between this method and the <apiname keyref="onPlaybackAudioFrame"/> callback is as follows:<ul>
<li>Both this method and <xref keyref="onPlaybackAudioFrame"/> callback can be used to get audio data after remote mixing. Note that after calling <apiname keyref="setExternalAudioSink"/> to enable external audio rendering, the app no longer receives data from the <apiname keyref="onPlaybackAudioFrame"/> callback. Therefore, you should choose between this method and the <apiname keyref="onPlaybackAudioFrame"/> callback based on your actual business requirements. The specific distinctions between them are as follows:<ul>
<li>After calling this method, the app automatically pulls the audio data from the SDK. By setting the audio data parameters, the SDK adjusts the frame buffer to help the app handle latency, effectively avoiding audio playback jitter.</li>
<li>The SDK sends the audio data to the app through the <apiname keyref="onPlaybackAudioFrame"/> callback. Any delay in processing the audio frames may result in audio jitter.</li>
<li>After a successful method call, the app automatically pulls the audio data from the SDK. After setting the audio data parameters, the SDK adjusts the frame buffer and avoids problems caused by jitter in the external audio playback.</li>
</ul></li>
<li>This method is only used for retrieving audio data after remote mixing. If you need to get audio data from different audio processing stages such as capture and playback, you can register the corresponding callbacks by calling <xref keyref="registerAudioFrameObserver"/>.</li>
</ul> </note> </section>
<section id="parameters" props="native unreal bp unity flutter cs">
<title>Parameters</title>
Expand Down
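The pull model described above (the app drains decoded, mixed audio from an SDK-managed frame buffer, which absorbs arrival jitter) can be sketched with a toy buffer. The code below is illustrative Python only, not the Agora API; `AudioFrameBuffer`, `push_decoded`, and `pull_frame` are hypothetical names.

```python
from collections import deque

class AudioFrameBuffer:
    """Toy stand-in for the SDK-side buffer that a pull-style API drains."""
    def __init__(self, frame_samples=480):  # e.g. 10 ms at 48 kHz mono
        self.frame_samples = frame_samples
        self._pending = deque()

    def push_decoded(self, samples):
        # The SDK side appends decoded-and-mixed samples as they arrive.
        self._pending.extend(samples)

    def pull_frame(self):
        # The app pulls one fixed-size frame; short reads are zero-padded,
        # one simple way a frame buffer can mask arrival jitter.
        return [self._pending.popleft() if self._pending else 0
                for _ in range(self.frame_samples)]

buf = AudioFrameBuffer(frame_samples=4)
buf.push_decoded([1, 2, 3, 4, 5, 6])
print(buf.pull_frame())  # [1, 2, 3, 4]
print(buf.pull_frame())  # [5, 6, 0, 0]  (only 2 samples were pending)
```

This contrasts with the push-style `onPlaybackAudioFrame` callback, where any delay in the app's handler directly stalls playback.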
2 changes: 1 addition & 1 deletion en-US/dita/RTC-NG/API/api_irtcengine_querydevicescore.dita
Original file line number Diff line number Diff line change
Expand Up @@ -40,7 +40,7 @@
<title><ph keyref="return-section-title"/></title>
<p props="flutter">When the method call succeeds, it returns a value in the range of [0,100], indicating the current device's score. The larger the value, the stronger the device capability. Most devices are rated between 60 and 100. When the method call fails, the <xref keyref="AgoraRtcException"/> exception is thrown; you need to catch the exception and handle it accordingly.</p>
<ul>
<li props="native electron unity rn">>0: The method call succeeds; the value is the current device's score in the range of [0,100]. The larger the value, the stronger the device capability. Most devices are rated between 60 and 100.</li>
<li>&lt; 0: Failure.</li>
</ul> </section>
</refbody>
Expand Down
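The return-value contract above (negative means failure, otherwise a 0-100 score where most devices land between 60 and 100) can be handled like this. This is an illustrative sketch: `interpret_device_score` is a hypothetical helper, and the 60 cutoff is an assumption drawn only from the "most devices are rated between 60 and 100" note, not an SDK-defined threshold.

```python
def interpret_device_score(ret):
    """Map a queryDeviceScore-style return value to a rough verdict."""
    if ret < 0:
        return "failure"
    if not 0 <= ret <= 100:
        raise ValueError("score outside documented range [0, 100]")
    # 60 is an illustrative cutoff, not part of the documented contract.
    return "typical device (60-100)" if ret >= 60 else "low-capability device"

print(interpret_device_score(85))  # typical device (60-100)
print(interpret_device_score(-2))  # failure
```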
6 changes: 3 additions & 3 deletions en-US/dita/RTC-NG/API/api_irtcengine_setchannelprofile.dita
Original file line number Diff line number Diff line change
Expand Up @@ -42,14 +42,14 @@
<pd id="channelprofiletype">
<p props="ios mac cpp unreal bp electron unity flutter rn cs">The channel profile. See <xref keyref="CHANNEL_PROFILE_TYPE" />.</p>
<p props="android">The channel profile.<ul>
<li><ph keyref="CHANNEL_PROFILE_COMMUNICATION" />(0): Communication. Use this profile when there are only two users in the channel.</li>
<li><ph keyref="CHANNEL_PROFILE_LIVE_BROADCASTING" />(1): Live streaming. Use this profile when there are more than two users in the channel.</li>
<li><ph keyref="CHANNEL_PROFILE_COMMUNICATION" />(0): Communication. Agora recommends using the live streaming profile for a better audio and video experience.</li>
<li><ph keyref="CHANNEL_PROFILE_LIVE_BROADCASTING" />(1): (Default) Live streaming.</li>
<li><ph keyref="CHANNEL_PROFILE_GAME" />(2): Gaming.<dl outputclass="deprecated">
<dlentry>
<dt>Deprecated:</dt>
<dd>Use <ph keyref="CHANNEL_PROFILE_LIVE_BROADCASTING"/> instead.</dd>
</dlentry></dl>
</li>
<li><ph keyref="CHANNEL_PROFILE_CLOUD_GAMING" />(3): Interaction. The scenario is optimized for latency. Use this profile if the use case requires frequent interactions between users.<dl outputclass="deprecated">
<dlentry>
<dt>Deprecated:</dt>
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -29,7 +29,7 @@
<dd>v4.0.1</dd>
</dlentry>
</dl>
<p id="desc">The SDK defaults to enabling low-quality video stream adaptive mode (<apiname keyref="AUTO_SIMULCAST_STREAM"/>) on the sender side, which means the sender does not actively send low-quality video stream. The receiver can initiate a low-quality video stream request by calling <xref keyref="setRemoteVideoStreamType"/>, and the sender then automatically starts sending low-quality video stream upon receiving the request.<ul>
<p id="desc">The SDK defaults to enabling low-quality video stream adaptive mode (<apiname keyref="AUTO_SIMULCAST_STREAM"/>) on the sender side, which means the sender does not actively send the low-quality video stream. The receiving end with the role of the <b>host</b> can initiate a low-quality video stream request by calling <xref keyref="setRemoteVideoStreamType"/>, and upon receiving the request, the sending end automatically starts sending the low-quality video stream.<ul>
<li>If you want to modify this behavior, you can call this method and set <parmname>mode</parmname> to <apiname keyref="DISABLE_SIMULCAST_STREAM"/> (never send low-quality video streams) or <apiname keyref="ENABLE_SIMULCAST_STREAM"/> (always send low-quality video streams).</li>
<li>If you want to restore the default behavior after making changes, you can call this method again with <parmname>mode</parmname> set to <apiname keyref="AUTO_SIMULCAST_STREAM"/>.</li></ul></p>
<note id="note">The difference and connection between this method and <xref keyref="enableDualStreamMode"/> is as follows:<ul>
Expand Down
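The three simulcast modes described above reduce to a small decision rule: whether the sender emits the low-quality stream depends on the mode and, in adaptive mode, on whether a receiver has requested it. A minimal sketch, using illustrative Python names rather than the SDK's enum values:

```python
# Illustrative stand-ins for the SDK's SIMULCAST_STREAM_MODE values.
AUTO_SIMULCAST_STREAM = "auto"        # default: send low stream only on request
DISABLE_SIMULCAST_STREAM = "disable"  # never send the low-quality stream
ENABLE_SIMULCAST_STREAM = "enable"    # always send the low-quality stream

def sends_low_stream(mode, receiver_requested_low):
    """Whether the sender emits the low-quality stream under a given mode."""
    if mode == DISABLE_SIMULCAST_STREAM:
        return False
    if mode == ENABLE_SIMULCAST_STREAM:
        return True
    # AUTO: a receiver request (e.g. via setRemoteVideoStreamType) enables it.
    return receiver_requested_low

print(sends_low_stream(AUTO_SIMULCAST_STREAM, True))      # True
print(sends_low_stream(DISABLE_SIMULCAST_STREAM, True))   # False
```

Note the asymmetry this rule implies: in `DISABLE` mode a receiver's request is ignored, which is why the related notes say the request "will not take effect" until the sender changes modes.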
Original file line number Diff line number Diff line change
Expand Up @@ -28,7 +28,6 @@
<title>Details</title>
<p conkeyref="setRemoteVideoStreamType/desc1"/>
<p conkeyref="setRemoteVideoStreamType/desc2"/>
<p conkeyref="setRemoteVideoStreamType/desc3"/>
<note type="attention">
<ul>
<li>Call this method before joining a channel. The SDK does not support changing the default subscribed video stream type after joining a channel.</li>
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -34,10 +34,13 @@
</section>
<section id="detailed_desc">
<title>Details</title>
<p id="desc1">Under limited network conditions, if the publisher does not disable the dual-stream mode using <xref keyref="enableDualStreamMode3" /><codeph>(<ph keyref="false" />)</codeph>, the receiver can choose to receive either the high-quality video stream, or the low-quality video stream. The high-quality video stream has a higher resolution and bitrate, while the low-quality video stream has a lower resolution and bitrate.</p>
<p id="desc2">By default, users receive the high-quality video stream. Call this method if you want to switch to the low-quality video stream. The SDK will dynamically adjust the size of the corresponding video stream based on the size of the video window to save bandwidth and computing resources. The default aspect ratio of the low-quality video stream is the same as that of the high-quality video stream. According to the current aspect ratio of the high-quality video stream, the system will automatically allocate the resolution, frame rate, and bitrate of the low-quality video stream.</p>
<p id="desc3">The SDK defaults to enabling low-quality video stream adaptive mode (<apiname keyref="AUTO_SIMULCAST_STREAM"/>) on the sender side, which means the sender does not actively send low-quality video stream. The receiver can initiate a low-quality video stream request by calling this method, and the sender will automatically start sending low-quality video stream upon receiving the request.</p>
<note type="attention">You can call this method either before or after joining a channel. If you call both <apiname keyref="setRemoteVideoStreamType" /> and <xref keyref="setRemoteDefaultVideoStreamType" />, the setting of <apiname keyref="setRemoteVideoStreamType" /> takes effect.</note> </section>
<p id="desc1">The SDK defaults to enabling low-quality video stream adaptive mode (<apiname keyref="AUTO_SIMULCAST_STREAM"/>) on the sending end, which means the sender does not actively send low-quality video stream. The receiver with the role of the <b>host</b> can initiate a low-quality video stream request by calling this method, and upon receiving the request, the sending end automatically starts sending the low-quality video stream.</p>
<p id="desc2">The SDK will dynamically adjust the size of the corresponding video stream based on the size of the video window to save bandwidth and computing resources. The default aspect ratio of the low-quality video stream is the same as that of the high-quality video stream. According to the current aspect ratio of the high-quality video stream, the system will automatically allocate the resolution, frame rate, and bitrate of the low-quality video stream.</p>
<note type="attention"><ul>
<li>You can call this method either before or after joining a channel.</li>
<li>If the publisher has already called <xref keyref="setDualStreamMode2"/> and set <parmname>mode</parmname> to <apiname keyref="DISABLE_SIMULCAST_STREAM"/> (never send low-quality video stream), calling this method does not take effect; call <apiname keyref="setDualStreamMode2"/> again on the sending end and adjust the settings.</li>
<li>This method does not take effect when called on a receiving end in the <b>audience</b> role.</li>
<li>If you call both <apiname keyref="setRemoteVideoStreamType"/> and <xref keyref="setRemoteDefaultVideoStreamType"/>, the settings in <apiname keyref="setRemoteVideoStreamType"/> take effect.</li></ul></note> </section>
<section id="parameters">
<title>Parameters</title>
<parml>
Expand Down
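The allocation rule above (the low-quality stream keeps the high-quality stream's aspect ratio while the system picks its resolution) can be sketched as a downscale that preserves the ratio. The target long edge of 320 and the even-dimension rounding are illustrative assumptions, not the SDK's actual allocation logic:

```python
def low_stream_size(high_w, high_h, target_long_edge=320):
    """Scale a high-quality resolution down to a low-quality one with the
    same aspect ratio. target_long_edge is an illustrative choice."""
    scale = target_long_edge / max(high_w, high_h)

    def even(v):
        # Round to even dimensions, as video encoders commonly require.
        return max(2, int(round(v * scale / 2)) * 2)

    return even(high_w), even(high_h)

print(low_stream_size(1280, 720))  # (320, 180)
print(low_stream_size(720, 1280))  # (180, 320)
```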
2 changes: 1 addition & 1 deletion en-US/dita/RTC-NG/API/api_irtcengine_setuplocalvideo.dita
Original file line number Diff line number Diff line change
Expand Up @@ -31,7 +31,7 @@
<p>In real-time interactive scenarios, if you need to simultaneously view multiple preview frames in the local video preview, and each frame is at a different observation position along the video link, you can repeatedly call this method to set different <parmname>view</parmname>s and set different observation positions for each <parmname>view</parmname>. For example, by setting the video source to the camera and then configuring two <parmname>view</parmname>s with <parmname>position</parmname> setting to <ph keyref="POSITION_POST_CAPTURER_ORIGIN"/> and <ph keyref="POSITION_POST_CAPTURER"/>, you can simultaneously preview the raw, unprocessed video frame and the video frame that has undergone preprocessing (image enhancement effects, virtual background, watermark) in the local video preview.</p>
<note type="attention">
<ul>
<li props="unity cs">If you need to implement native window rendering, use this method; if you only need to render video images in your Unity project, use the methods in the <xref keyref="AgoraVideoSurface"/> class instead.</li>
<li props="unity cs">If you need to implement native window rendering, use this method; if you only need to render video images in your Unity project, use the methods in the <xref keyref="VideoSurface"/> class instead.</li>
<li>You can call this method either before or after joining a channel.</li>
<li props="native unreal bp unity rn flutter cs">To update the rendering or mirror mode of the local video view during a call, use the <xref keyref="setLocalRenderMode2" /> method.</li>
<li props="electron">If you want to stop rendering the view, set <parmname>view </parmname>to <ph keyref="NULL" /> and then call this method again to stop rendering and clear the rendering cache.</li>
Expand Down
2 changes: 1 addition & 1 deletion en-US/dita/RTC-NG/API/api_irtcengine_setupremotevideo.dita
Original file line number Diff line number Diff line change
Expand Up @@ -33,7 +33,7 @@
<p props="android ios cpp unity flutter rn">In the scenarios of custom layout for mixed videos on the mobile end, you can call this method and set a separate <parmname>view</parmname> for rendering each sub-video stream of the mixed video stream.</p>
<note type="attention">
<ul>
<li props="unity cs">If you need to implement native window rendering, use this method; if you only need to render video images in your Unity project, use the methods in the <xref keyref="AgoraVideoSurface"/> class instead.</li>
<li props="unity cs">If you need to implement native window rendering, use this method; if you only need to render video images in your Unity project, use the methods in the <xref keyref="VideoSurface"/> class instead.</li>
<li props="native unreal bp unity rn flutter cs">To update the rendering or mirror mode of the remote video view during a call, use the <xref keyref="setRemoteRenderMode2"/> method.</li>
<li>If you use the Agora recording function, the recording client joins the channel as a placeholder client, triggering the <xref keyref="onUserJoined"/> callback. Do not bind the placeholder client to the app view because the placeholder client does not send any video streams. If your app does not recognize the placeholder client, bind the remote user to the view when the SDK triggers the <xref keyref="onFirstRemoteVideoDecoded"/> callback.</li>
<li props="electron">If you want to stop rendering the view, set <parmname>view </parmname>to <ph keyref="NULL" /> and then call this method again to stop rendering and clear the rendering cache.</li>
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -47,6 +47,12 @@
<dd>v4.0.1</dd>
</dlentry>
</dl>
<dl outputclass="deprecated">
<dlentry>
<dt>Deprecated:</dt>
<dd>This method is deprecated as of v4.2.0. Use <xref keyref="setDualStreamModeEx"/> instead.</dd>
</dlentry>
</dl>
<p conkeyref="enableDualStreamMode2/desc1"/>
<p conkeyref="enableDualStreamMode2/desc2"/>
<note type="note">This method is applicable to all types of streams from the sender, including but not limited to video streams collected from cameras, screen sharing streams, and custom-collected video streams.</note></section>
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -48,7 +48,7 @@
<dd>v4.0.1</dd>
</dlentry>
</dl>
<p id="desc">The SDK defaults to enabling low-quality video stream adaptive mode (<apiname keyref="AUTO_SIMULCAST_STREAM"/>) on the sender side, which means the sender does not actively send low-quality video stream. The receiver can initiate a low-quality video stream request by calling <xref keyref="setRemoteVideoStreamTypeEx"/>, and the sender will automatically start sending low-quality video stream upon receiving the request.<ul>
<p id="desc">The SDK defaults to enabling low-quality video stream adaptive mode (<apiname keyref="AUTO_SIMULCAST_STREAM"/>) on the sending end, which means the sender does not actively send low-quality video stream. The receiver with the role of the <b>host</b> can initiate a low-quality video stream request by calling <xref keyref="setRemoteVideoStreamTypeEx"/>, and upon receiving the request, the sending end automatically starts sending the low-quality video stream.<ul>
<li>If you want to modify this behavior, you can call this method and set <parmname>mode</parmname> to <apiname keyref="DISABLE_SIMULCAST_STREAM"/> (never send low-quality video streams) or <apiname keyref="ENABLE_SIMULCAST_STREAM"/> (always send low-quality video streams).</li>
<li>If you want to restore the default behavior after making changes, you can call this method again with <parmname>mode</parmname> set to <apiname keyref="AUTO_SIMULCAST_STREAM"/>.</li></ul></p>
<note id="note">The difference and connection between this method and <xref keyref="enableDualStreamModeEx"/> is as follows:<ul>
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -39,9 +39,11 @@
</section>
<section id="detailed_desc">
<title>Details</title>
<p id="desc1">Under limited network conditions, if the publisher does not disable the dual-stream mode using <xref keyref="enableDualStreamModeEx"/><codeph>(<ph keyref="false"/>)</codeph>, the receiver can choose to receive either the high-quality video stream, or the low-quality video stream. The high-quality video stream has a higher resolution and bitrate, while the low-quality video stream has a lower resolution and bitrate.</p>
<p conkeyref="setRemoteVideoStreamType/desc1"/>
<p conkeyref="setRemoteVideoStreamType/desc2"/>
<p conkeyref="setRemoteVideoStreamType/desc3"/>
<note type="attention"><ul>
<li>If the publisher has already called <xref keyref="setDualStreamModeEx"/> and set <parmname>mode</parmname> to <apiname keyref="DISABLE_SIMULCAST_STREAM"/> (never send low-quality video stream), calling this method does not take effect; call <apiname keyref="setDualStreamModeEx"/> again on the sending end and adjust the settings.</li>
<li>This method does not take effect when called on a receiving end in the <b>audience</b> role.</li></ul></note>
</section>
<section id="parameters">
<title>Parameters</title>
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -50,7 +50,7 @@
<p>After you call this method, the SDK triggers the <xref keyref="onRtmpStreamingStateChanged"/> callback on the local client to report the state of the streaming.</p>
<note type="attention" id="note">
<ul>
<li>Ensure that you enable the Media Push service before using this function. <ph props="cn">For details, see the prerequisites in <xref keyref="guide-cdn-streaming">Media Push</xref>.</ph></li>
<li>Ensure that you enable the Media Push service before using this function. <ph props="hide">For details, see the prerequisites in <xref keyref="guide-cdn-streaming">Media Push</xref>.</ph></li>
<li>Call this method after joining a channel.</li>
<li>Only hosts in the LIVE_BROADCASTING profile can call this method.</li>
<li>If you want to retry pushing streams after a failed push, make sure to call <xref keyref="stopRtmpStreamEx"/> first, then call this method to retry pushing streams; otherwise, the SDK returns the same error code as the last failed push.</li>
Expand Down