New YiCAT updates #3656

Merged

40 commits merged on Jul 11, 2024

Changes from all commits

Commits (40)
4d6ded3
New translations
Cilla-luodan Jun 21, 2024
57ace1a
New translations
Cilla-luodan Jul 3, 2024
8a6d243
New translations
Cilla-luodan Jul 3, 2024
bfe9640
New translations
Cilla-luodan Jul 4, 2024
20f7956
New translations
Cilla-luodan Jul 4, 2024
b4682bb
New translations
Cilla-luodan Jul 4, 2024
acc9fce
New translations
Cilla-luodan Jul 4, 2024
f83cda0
New translations
Cilla-luodan Jul 4, 2024
7cc377a
New translations
Cilla-luodan Jul 4, 2024
2388dae
New translations
Cilla-luodan Jul 4, 2024
a78d065
New translations
Cilla-luodan Jul 4, 2024
d74dd5a
New translations
Cilla-luodan Jul 4, 2024
5dcf580
New translations
Cilla-luodan Jul 8, 2024
ec92c06
New translations
Cilla-luodan Jul 8, 2024
5a4e1a3
New translations
Cilla-luodan Jul 8, 2024
413578b
New translations
Cilla-luodan Jul 8, 2024
795f72d
New translations
Cilla-luodan Jul 8, 2024
8e687dd
New translations
Cilla-luodan Jul 9, 2024
b6e6c2c
New translations
Cilla-luodan Jul 9, 2024
e4336ef
New translations
Cilla-luodan Jul 9, 2024
7b479e3
New translations
Cilla-luodan Jul 9, 2024
b7e51f4
New translations
Cilla-luodan Jul 9, 2024
62f479a
New translations
Cilla-luodan Jul 9, 2024
c0a5e4f
New translations
Cilla-luodan Jul 9, 2024
7e1c71b
New translations
Cilla-luodan Jul 10, 2024
85651dd
New translations
Cilla-luodan Jul 10, 2024
e12a55f
New translations
Cilla-luodan Jul 10, 2024
048b633
New translations
Cilla-luodan Jul 10, 2024
ebfd4e0
New translations
Cilla-luodan Jul 10, 2024
2beca79
New translations
Cilla-luodan Jul 10, 2024
b6894c9
New translations
Cilla-luodan Jul 10, 2024
731172d
New translations
Cilla-luodan Jul 11, 2024
210e19b
New translations
Cilla-luodan Jul 11, 2024
11a8f27
New translations
Cilla-luodan Jul 11, 2024
a506af6
New translations
Cilla-luodan Jul 11, 2024
23202cd
New translations
Cilla-luodan Jul 11, 2024
c9582fa
New translations
Cilla-luodan Jul 11, 2024
4b4ee8a
New translations
Cilla-luodan Jul 11, 2024
ac13654
New translations
Cilla-luodan Jul 11, 2024
59b1392
Merge branch 'master' into translation/4.x-api-ref
Cilla-luodan Jul 11, 2024
Original file line number Diff line number Diff line change
@@ -36,7 +36,7 @@
<pd>The channel name of the target channel.</pd>
</plentry>
<plentry>
<pt props="android">destInfo</pt>
<pt props="android hmos">destInfo</pt>
<pt props="ios mac">destinationInfo</pt>
<pd>
<p>The information of the target channel. See <apiname keyref="ChannelMediaInfo"/>. It contains the following members:<ul>
17 changes: 11 additions & 6 deletions en-US/dita/RTC-NG/API/api_getmediaplayercachemanager.dita
Original file line number Diff line number Diff line change
@@ -27,12 +27,17 @@
<codeblock props="flutter" outputclass="language-dart">MediaPlayerCacheManager getMediaPlayerCacheManager()</codeblock>
<codeblock props="reserve" outputclass="language-cpp"></codeblock></p>
</section>
<section id="detailed_desc">
<title>Details</title>
<p>When you successfully call this method, the SDK returns a media player cache manager instance. The cache manager is a singleton pattern. Therefore, multiple calls to this method returns the same instance.</p>
<note type="attention">
<p>Make sure the <xref keyref="IRtcEngine" /> is initialized before you call this method.</p>
</note> </section>
<section id="detailed_desc" deliveryTarget="details" otherprops="no-title">
<p>Before calling any APIs in the <xref keyref="IMediaPlayerCacheManager"/> class, you need to call this method to get a cache manager instance of a media player.</p>
</section>
<section id="timing" deliveryTarget="details">
<title>Call timing</title>
<p>Make sure the <xref keyref="IRtcEngine" /> is initialized before you call this method.</p>
</section>
<section id="restriction" deliveryTarget="details">
<title>Restrictions</title>
<p>The cache manager is a singleton. Therefore, multiple calls to this method return the same instance.</p>
</section>
<section id="return_values">
<title>Returns</title>
<p>The <xref keyref="IMediaPlayerCacheManager" /> instance.</p>
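For context on the new Call timing/Restrictions split above, here is a minimal C++ sketch of the documented order: initialize IRtcEngine first, then fetch the singleton cache manager. The createAgoraRtcEngine/RtcEngineContext calls and the global getMediaPlayerCacheManager() entry point are assumptions based on the 4.x C++ headers, not part of this diff:

    #include "IAgoraRtcEngine.h"
    #include "IAgoraMediaPlayer.h"   // assumed header for IMediaPlayerCacheManager

    agora::rtc::IRtcEngine* engine = createAgoraRtcEngine();
    agora::rtc::RtcEngineContext ctx;
    ctx.appId = "<your app id>";
    ctx.eventHandler = &myEventHandler;   // an IRtcEngineEventHandler you implement (illustrative name)
    engine->initialize(ctx);              // the engine must be initialized before the call below

    // Singleton: repeated calls are expected to return the same instance.
    agora::rtc::IMediaPlayerCacheManager* cacheMgr = getMediaPlayerCacheManager();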
Original file line number Diff line number Diff line change
@@ -30,7 +30,7 @@
<p>This method tests whether the audio device for local playback works properly. Once a user starts the test, the SDK plays an audio file specified by the user. If the user can hear the audio, the playback device works properly.</p>
<p>After calling this method, the SDK triggers the <xref keyref="onAudioVolumeIndication"/> callback every 100 ms, reporting <parmname>uid</parmname> = 1 and the volume information of the playback device.</p>
<p>The difference between this method and the <xref keyref="startEchoTest3"/> method is that the former checks if the local audio playback device is working properly, while the latter can check the audio and video devices and network conditions.</p>
<note type="attention">Ensure that you call this method before joining a channel. After the test is completed, call <xref keyref="stopPlaybackDeviceTest"/> to stop the test before joining a channel.</note>
<note type="attention">Call this method before joining a channel. After the test is completed, call <xref keyref="stopPlaybackDeviceTest"/> to stop the test before joining a channel.</note>
</section>
<section id="parameters">
<title>Parameters</title>
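A hedged C++ sketch of the timing rule in this hunk (run the test, then stop it, all before joining a channel). Obtaining the device manager through AutoPtr/queryInterface and the AGORA_IID_AUDIO_DEVICE_MANAGER ID are assumptions from the native SDK; the engine pointer is the one from the earlier sketch and the file path is a placeholder:

    agora::util::AutoPtr<agora::rtc::IAudioDeviceManager> adm;
    adm.queryInterface(engine, agora::rtc::AGORA_IID_AUDIO_DEVICE_MANAGER);

    // Play a local file on the current playback device. While the test runs,
    // onAudioVolumeIndication reports uid = 1 and the playback volume every 100 ms.
    adm->startPlaybackDeviceTest("/path/to/test.mp3");

    // ...listen for the test audio...
    adm->stopPlaybackDeviceTest();   // stop the test before joining any channel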
Original file line number Diff line number Diff line change
@@ -29,7 +29,7 @@
<title>Details</title>
<p>This method tests whether the audio capturing device works properly. After calling this method, the SDK triggers the <xref keyref="onAudioVolumeIndication"/> callback at the time interval set in this method, which reports <parmname>uid</parmname> = 0 and the volume information of the capturing device.</p>
<p>The difference between this method and the <xref keyref="startEchoTest3"/> method is that the former checks if the local audio capturing device is working properly, while the latter can check the audio and video devices and network conditions.</p>
<note type="note">Ensure that you call this method before joining a channel. After the test is completed, call <xref keyref="stopRecordingDeviceTest"/> to stop the test before joining a channel.</note>
<note type="note">Call this method before joining a channel. After the test is completed, call <xref keyref="stopRecordingDeviceTest"/> to stop the test before joining a channel.</note>
</section>
<section id="parameters">
<title>Parameters</title>
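The capturing-device test follows the same pattern; the difference is the indication interval passed in and the uid = 0 volume reports. A short sketch, reusing the adm pointer from the previous example (the 200 ms interval is only illustrative):

    // Sample the capturing device and report its volume via onAudioVolumeIndication
    // (uid = 0) at the interval given in milliseconds.
    adm->startRecordingDeviceTest(200);

    // ...speak into the microphone and observe the volume callbacks...
    adm->stopRecordingDeviceTest();   // stop the test before joining any channel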
Original file line number Diff line number Diff line change
@@ -28,7 +28,7 @@
<section id="detailed_desc">
<title>Details</title>
<p>This method stops the audio playback device test. You must call this method to stop the test after calling the <xref keyref="startPlaybackDeviceTest" /> method.</p>
<note type="attention">Ensure that you call this method before joining a channel.</note>
<note type="attention">Call this method before joining a channel.</note>
</section>
<section id="return_values">
<title><ph keyref="return-section-title"/></title>
Original file line number Diff line number Diff line change
@@ -28,7 +28,7 @@
<section id="detailed_desc">
<title>Details</title>
<p>This method stops the audio capturing device test. You must call this method to stop the test after calling the <xref keyref="startRecordingDeviceTest"/> method.</p>
<note type="note">Ensure that you call this method before joining a channel.</note>
<note type="note">Call this method before joining a channel.</note>
</section>
<section id="return_values">
<title><ph keyref="return-section-title"/></title>
Original file line number Diff line number Diff line change
@@ -30,7 +30,7 @@
<note type="note">
<ul>
<li props="cpp unreal bp">Call this method after calling <xref keyref="queryInterface"/><codeph>(<ph keyref="AGORA_IID_LOCAL_SPATIAL_AUDIO"/>)</codeph>.</li>
<li props="android">Call this method after calling <xref keyref="create_ILocalSpatialAudioEngine"/>.</li>
<li props="android hmos">Call this method after calling <xref keyref="create_ILocalSpatialAudioEngine"/>.</li>
<li>Before calling other methods of the <apiname keyref="ILocalSpatialAudioEngine"/> class, you need to call this method to initialize <apiname keyref="ILocalSpatialAudioEngine"/>.</li>
<li>The SDK supports creating only one <apiname keyref="ILocalSpatialAudioEngine"/> instance for an app.</li>
</ul> </note> </section>
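A minimal C++ sketch of the ordering constraints in this note: query the interface, initialize it before any other ILocalSpatialAudioEngine call, and create only one instance per app. LocalSpatialAudioConfig and its rtcEngine member are assumptions based on the 4.x C++ reference:

    agora::util::AutoPtr<agora::rtc::ILocalSpatialAudioEngine> spatial;
    spatial.queryInterface(engine, agora::rtc::AGORA_IID_LOCAL_SPATIAL_AUDIO);

    agora::rtc::LocalSpatialAudioConfig config;
    config.rtcEngine = engine;      // assumed member: hands the initialized engine to the module
    spatial->initialize(config);    // must precede any other ILocalSpatialAudioEngine method
    // The SDK supports only one ILocalSpatialAudioEngine instance per app.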
Original file line number Diff line number Diff line change
@@ -40,7 +40,7 @@
<dd>v4.2.0</dd>
</dlentry>
</dl>
<note type="attention">Ensure that you call this method before joining a channel.</note>
<note type="attention">Call this method before joining a channel.</note>
<p>To publish a custom audio source, see the following steps:<ol>
<li>Call this method to create a custom audio track and get the audio track ID.</li>
<li>Call <xref keyref="joinChannel2"/> to join the channel. In <xref keyref="ChannelMediaOptions"/>, set <parmname>publishCustomAudioTrackId</parmname> to the audio track ID that you want to publish, and set <parmname>publishCustomAudioTrack</parmname> to <codeph><ph keyref="true"/></codeph>.</li>
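The two publishing steps listed above, as a hedged C++ sketch. The AUDIO_TRACK_MIXABLE track type, the AudioTrackConfig fields, and the CLIENT_ROLE_BROADCASTER role are assumptions from the 4.2.0 C++ reference; token and channel values are placeholders:

    // Step 1: create a custom audio track and keep the returned track ID.
    agora::rtc::AudioTrackConfig trackConfig;
    trackConfig.enableLocalPlayback = false;   // assumed field
    auto trackId = engine->createCustomAudioTrack(agora::rtc::AUDIO_TRACK_MIXABLE, trackConfig);

    // Step 2: join the channel and publish that track.
    agora::rtc::ChannelMediaOptions options;
    options.publishCustomAudioTrack = true;
    options.publishCustomAudioTrackId = trackId;
    options.clientRoleType = agora::rtc::CLIENT_ROLE_BROADCASTER;
    engine->joinChannel("<token>", "<channel>", 0, options);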
33 changes: 18 additions & 15 deletions en-US/dita/RTC-NG/API/api_imediaengine_pullaudioframe.dita
Original file line number Diff line number Diff line change
@@ -24,22 +24,25 @@
<codeblock props="flutter" outputclass="language-dart">Future&lt;void&gt; pullAudioFrame(AudioFrame frame);</codeblock>
<codeblock props="reserve" outputclass="language-cpp"></codeblock></p>
</section>
<section id="detailed_desc">
<title>Details</title>
<p>Before calling this method, call <xref keyref="setExternalAudioSink"/><codeph>(<parmname>enabled</parmname>: <ph keyref="true"/>)</codeph> to notify the app to enable and set the external audio rendering.</p>
<section id="detailed_desc" deliveryTarget="details" otherprops="no-title">
<p>After a successful call of this method, the app pulls the decoded and mixed audio data for playback.</p>
<note type="attention">
<ul>
<li>Call this method after joining a channel.</li>
<li>Both this method and <xref keyref="onPlaybackAudioFrame"/> callback can be used to get audio data after remote mixing. Note that after calling <apiname keyref="setExternalAudioSink"/> to enable external audio rendering, the app no longer receives data from the <apiname keyref="onPlaybackAudioFrame"/> callback. Therefore, you should choose between this method and the <apiname keyref="onPlaybackAudioFrame"/> callback based on your actual business requirements. The specific distinctions between them are as follows:<ul>
</section>
<section id="timing" deliveryTarget="details">
<title>Call timing</title>
<p>Call this method after joining a channel.</p>
<p>Before calling this method, call <xref keyref="setExternalAudioSink"/><codeph>(<parmname>enabled</parmname>: <ph keyref="true"/>)</codeph> to notify the app to enable and set the external audio rendering.</p>
</section>
<section id="restriction" deliveryTarget="details">
<title>Restrictions</title>
<p>Both this method and the <xref keyref="onPlaybackAudioFrame"/> callback can be used to get audio data after remote mixing. After calling <apiname keyref="setExternalAudioSink"/> to enable external audio rendering, the app will no longer be able to obtain data from the <apiname keyref="onPlaybackAudioFrame"/> callback. Therefore, you should choose between this method and the <apiname keyref="onPlaybackAudioFrame"/> callback based on your actual business requirements. The specific distinctions between them are as follows:<ul>
<li>After calling this method, the app automatically pulls the audio data from the SDK. By setting the audio data parameters, the SDK adjusts the frame buffer to help the app handle latency, effectively avoiding audio playback jitter.</li>
<li>The SDK sends the audio data to the app through the <apiname keyref="onPlaybackAudioFrame"/> callback. Any delay in processing the audio frames may result in audio jitter.</li>
</ul></li>
<li>This method is only used for retrieving audio data after remote mixing. If you need to get audio data from different audio processing stages such as capture and playback, you can register the corresponding callbacks by calling <xref keyref="registerAudioFrameObserver"/>.</li>
</ul> </note> </section>
<section id="parameters" props="native unreal bp unity flutter cs">
<title>Parameters</title>
<parml>
<li>After registering the <apiname keyref="onPlaybackAudioFrame"/> callback, the SDK sends the audio data to the app through the callback. Any delay in processing the audio frames may result in audio jitter.</li>
</ul></p>
<p>This method is only used for retrieving audio data after remote mixing. If you need to get audio data from different audio processing stages such as capture and playback, you can register the corresponding callbacks by calling <xref keyref="registerAudioFrameObserver"/>.</p>
Collaborator

I think "retrieve data" reads better than "pull" ("pull data" is a bit puzzling, though it is probably legacy wording). Consider whether the text above and the shortdesc should be changed as well.

Collaborator
@jinyuagora jinyuagora Jul 11, 2024

The pull/push data wording seems fine to me, and it is consistent with the method names. If that is acceptable to you, let's leave the shortdesc unchanged.
</section>
<section id="parameters" deliveryTarget="details">
<title><ph props="android apple cpp unreal bp flutter unity cs">Parameters</ph></title>
<parml props="android apple cpp unreal bp flutter unity cs">
<plentry props="cpp unreal bp unity flutter cs">
<pt>frame</pt>
<pd>Pointers to <xref keyref="AudioFrame"/>.</pd>
@@ -68,5 +71,5 @@
<li>The <apiname keyref="AudioFrame" /> instance, if the method call succeeds.</li>
<li>An error code, if the call fails.</li>
</ul> </section>
</refbody>
</refbody>
</reference>
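A hedged C++ sketch of the pull model this topic now splits into Call timing/Restrictions: enable the external sink before joining, then pull decoded, mixed audio on the app's own render thread. Obtaining IMediaEngine through queryInterface and the IAudioFrameObserverBase::AudioFrame field names are assumptions from the 4.x headers; the 48 kHz stereo layout and the buffer/playback names are illustrative:

    // Before joining: hand playback over to the app's external renderer.
    engine->setExternalAudioSink(true, 48000, 2);   // enabled, sample rate, channels
    engine->joinChannel("<token>", "<channel>", 0, options);

    agora::util::AutoPtr<agora::media::IMediaEngine> mediaEngine;
    mediaEngine.queryInterface(engine, agora::rtc::AGORA_IID_MEDIA_ENGINE);

    // On the render thread: pull 10 ms of mixed audio per iteration.
    agora::media::IAudioFrameObserverBase::AudioFrame frame;
    frame.type = agora::media::IAudioFrameObserverBase::FRAME_TYPE_PCM16;
    frame.samplesPerChannel = 480;                  // 10 ms at 48 kHz
    frame.samplesPerSec = 48000;
    frame.channels = 2;
    frame.buffer = myPcmBuffer;                     // app-owned buffer, illustrative name

    while (rendering) {                             // app-controlled flag
        mediaEngine->pullAudioFrame(&frame);        // buffer now holds decoded, mixed PCM
        playToDevice(frame.buffer, frame.samplesPerChannel * frame.channels);  // hypothetical helper
    }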
20 changes: 13 additions & 7 deletions en-US/dita/RTC-NG/API/api_imediaengine_pushaudioframe0.dita
Original file line number Diff line number Diff line change
@@ -23,15 +23,21 @@
<codeblock props="flutter" outputclass="language-dart">Future&lt;void> pushAudioFrame({required AudioFrame frame, int trackId = 0});</codeblock>
<codeblock props="reserve" outputclass="language-cpp"></codeblock></p>
</section>
<section id="detailed_desc">
<title>Details</title>
<note type="attention">
<section id="detailed_desc" deliveryTarget="details" otherprops="no-title">
<p>Call this method to push external audio frames through the audio track.</p>
</section>
<section id="timing" deliveryTarget="details">
<title>Call timing</title>
<p>Before calling this method to push external audio data, perform the following steps:<ol>
<li>Call <xref keyref="createCustomAudioTrack"/> to create a custom audio track and get the audio track ID.</li>
<li>Call <xref keyref="joinChannel2"/> to join the channel. In <xref keyref="ChannelMediaOptions"/>, set <parmname>publishCustomAudioTrackId</parmname> to the audio track ID that you want to publish, and set <parmname>publishCustomAudioTrack</parmname> to <codeph><ph keyref="true"/></codeph>.</li>
</ol></p>
</note> </section>
<section id="parameters">
</section>
<section id="restriction" deliveryTarget="details">
<title>Restrictions</title>
<p>None.</p>
</section>
<section id="parameters" deliveryTarget="details">
<title>Parameters</title>
<parml>
<plentry>
@@ -46,8 +52,8 @@
<section id="return_values">
<title><ph keyref="return-section-title"/></title>
<p props="flutter">When the method call succeeds, there is no return value; when it fails, the <xref keyref="AgoraRtcException"/> exception is thrown. You need to catch the exception and handle it accordingly. <ph props="cn">See <xref keyref="error-code-link"/> for details and resolution suggestions.</ph></p>
<ul>
<li props="cpp unreal bp unity electron rn cs">0: Success.</li>
<ul props="cpp unreal bp unity electron rn cs">
<li>0: Success.</li>
<li>&lt; 0: Failure. <ph props="cn">See <xref keyref="error-code-link"/> for details and resolution suggestions.</ph></li>
</ul> </section>
</refbody>
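Continuing the sketch from the createCustomAudioTrack hunk (track created and published), a hedged example of the push call itself. The AudioFrame layout mirrors the pull example and the buffer name is illustrative; trackId is the value returned by createCustomAudioTrack:

    // Fill one 10 ms, 48 kHz stereo PCM frame from the app's own capture pipeline.
    agora::media::IAudioFrameObserverBase::AudioFrame frame;
    frame.type = agora::media::IAudioFrameObserverBase::FRAME_TYPE_PCM16;
    frame.samplesPerChannel = 480;
    frame.samplesPerSec = 48000;
    frame.channels = 2;
    frame.buffer = externalPcmBuffer;   // app-owned buffer, illustrative name

    mediaEngine->pushAudioFrame(&frame, trackId);   // push through the custom audio track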
Original file line number Diff line number Diff line change
@@ -48,7 +48,7 @@
<title>Parameters</title>
<parml>
<plentry id="data">
<pt props="android">data</pt>
<pt props="android hmos">data</pt>
<pt props="cpp unreal bp electron unity rn flutter cs">imageBuffer</pt>
<pd>The buffer of the external encoded video frame.</pd>
</plentry>
@@ -57,7 +57,7 @@
<pd>Length of the externally encoded video frames.</pd>
</plentry>
<plentry id="frameinfo">
<pt props="android">frameInfo</pt>
<pt props="android hmos">frameInfo</pt>
<pt props="cpp unreal bp electron unity rn flutter cs">videoEncodedFrameInfo</pt>
<pd>Information about externally encoded video frames. See <xref keyref="EncodedVideoFrameInfo"/>.</pd>
</plentry>
4 changes: 2 additions & 2 deletions en-US/dita/RTC-NG/API/api_imediaengine_pushvideoframe.dita
Original file line number Diff line number Diff line change
@@ -32,7 +32,7 @@
</section>
<section id="detailed_desc">
<title>Details</title>
<dl outputclass="deprecated" props="android">
<dl outputclass="deprecated" props="android hmos">
<dlentry>
<dt>Deprecated:</dt>
<dd>If you need to push video frames in I422 format, you need to use this method; otherwise, use <xref keyref="pushVideoFrame3"/>.</dd>
@@ -46,7 +46,7 @@
<li>If you no longer need to capture external video data, you can call <xref keyref="destroyCustomVideoTrack"/> to destroy the custom video track.</li>
<li>If you only want to use the external video data for local preview and not publish it in the channel, you can call <xref keyref="muteLocalVideoStream"/> to cancel sending video stream or call <xref keyref="updateChannelMediaOptions"/> to set <parmname>publishCustomVideoTrack</parmname> to <codeph><ph keyref="false"/></codeph>.</li>
</ul></note>
<p props="android">You can push the video frame either by calling this method or by calling <xref keyref="pushVideoFrame3"/>. The difference is that this method does not support video data in Texture format.</p>
<p props="android hmos">You can push the video frame either by calling this method or by calling <xref keyref="pushVideoFrame3"/>. The difference is that this method does not support video data in Texture format.</p>
</section>
<section props="cpp" id="scenario">
<title>Applicable scenarios</title>
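A hedged C++ sketch of the raw-data path this topic describes: enable the external video source before joining, then push one frame per capture. setExternalVideoSource, EXTERNAL_VIDEO_SOURCE_TYPE::VIDEO_FRAME, and the ExternalVideoFrame field names are assumptions from the 4.x headers; the I420 buffer, resolution, and timestamp helper are illustrative:

    // Before joining the channel: switch to the external (raw-data) video source.
    mediaEngine->setExternalVideoSource(true, false,
        agora::media::EXTERNAL_VIDEO_SOURCE_TYPE::VIDEO_FRAME);

    agora::media::base::ExternalVideoFrame frame;
    frame.type = agora::media::base::ExternalVideoFrame::VIDEO_BUFFER_RAW_DATA;
    frame.format = agora::media::base::VIDEO_PIXEL_I420;
    frame.buffer = i420Buffer;              // app-captured I420 data, illustrative name
    frame.stride = 640;
    frame.height = 360;
    frame.timestamp = nowMs();              // hypothetical monotonic-clock helper

    mediaEngine->pushVideoFrame(&frame);    // call once per captured frame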
Original file line number Diff line number Diff line change
@@ -23,19 +23,26 @@
<codeblock props="flutter" outputclass="language-dart">void registerAudioFrameObserver(AudioFrameObserver observer);</codeblock>
<codeblock props="reserve" outputclass="language-cpp"></codeblock></p>
</section>
<section id="detailed_desc">
<title>Details</title>
<p>Call this method to register an audio frame observer object (register a callback). When you need the SDK to trigger <xref keyref="onMixedAudioFrame"/>, <xref keyref="onRecordAudioFrame"/>, <xref keyref="onPlaybackAudioFrame"/> or <xref keyref="onEarMonitoringAudioFrame"/> callback, you need to use this method to register the callbacks.</p>
<note type="attention">Ensure that you call this method before joining a channel.</note> </section>
<section id="parameters">
<section id="detailed_desc" deliveryTarget="details" otherprops="no-title">
<p>Call this method to register an audio frame observer object (register a callback). When you need the SDK to trigger the <xref keyref="onMixedAudioFrame"/>, <xref keyref="onRecordAudioFrame"/>, <xref keyref="onPlaybackAudioFrame"/>, <xref keyref="onPlaybackAudioFrameBeforeMixing"/> or <xref keyref="onEarMonitoringAudioFrame"/> callback, you need to use this method to register the callbacks.</p>
</section>
<section id="timing" deliveryTarget="details">
<title>Call timing</title>
<p>Call this method before joining a channel.</p>
</section>
<section id="restriction" deliveryTarget="details">
<title>Restrictions</title>
<p>None.</p>
</section>
<section id="parameters" deliveryTarget="details">
<title>Parameters</title>
<parml>
<plentry>
<pt props="android cpp unreal bp electron rn flutter">observer</pt>
<pt props="ios mac">delegate</pt>
<pt props="unity cs">audioFrameObserver</pt>
<pd>
<p>The observer instance. <ph>See <xref keyref="IAudioFrameObserver"/></ph>. <ph props="android mac ios unity cpp unreal bp cs">Set the value as <ph keyref="NULL"/> to release the instance. </ph><ph>Agora recommends calling this method after receiving <xref keyref="onLeaveChannel"/> to release the audio observer object.</ph></p>
<p><ph>The observer instance. See <xref keyref="IAudioFrameObserver"/>. </ph><ph props="android mac ios unity cpp unreal bp cs">Set the value as <ph keyref="NULL"/> to release the instance. </ph><ph>Agora recommends calling this method after receiving <xref keyref="onLeaveChannel"/> to release the audio observer object.</ph></p>
</pd>
</plentry>
<plentry props="unity cs">
@@ -65,5 +72,5 @@
<li><codeph><ph keyref="true"/></codeph>: Success.</li>
<li><codeph><ph keyref="false"/></codeph>: Failure. <ph props="cn">See <xref keyref="error-code-link"/> for details and resolution suggestions.</ph></li>
</ul></section>
</refbody>
</reference>
</refbody>
</reference>
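A hedged C++ sketch of the register/release cycle this topic now documents: register the observer before joining, and pass NULL after onLeaveChannel to release it. Only onPlaybackAudioFrame is shown; the other callbacks of the (assumed) IAudioFrameObserver interface must also be implemented, and the signature here follows the 4.x headers:

    class MyAudioObserver : public agora::media::IAudioFrameObserver {
     public:
      bool onPlaybackAudioFrame(const char* channelId, AudioFrame& audioFrame) override {
        // Mixed playback audio for this channel; process or record it here.
        return true;
      }
      // ...the remaining IAudioFrameObserver callbacks are implemented as needed...
    };

    MyAudioObserver observer;

    mediaEngine->registerAudioFrameObserver(&observer);        // before joining the channel
    engine->joinChannel("<token>", "<channel>", 0, options);

    // After receiving onLeaveChannel, release the observer object:
    mediaEngine->registerAudioFrameObserver(NULL);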