Commit

fix publish errors
Cilla-luodan committed Sep 23, 2024
1 parent 707ddc3 commit 8f8a39c
Showing 29 changed files with 88 additions and 85 deletions.
3 changes: 2 additions & 1 deletion dita/RTC-NG/API/api_imediaengine_setexternalvideosource.dita
@@ -41,7 +41,8 @@
<codeblock props="reserve" outputclass="language-cpp"></codeblock></p>
</section>
<section id="detailed_desc" deliveryTarget="details" otherprops="no-title">
<p>After calling this method to enable the external video source, you can call <xref keyref="pushVideoFrame"/> to push external video data to the SDK.</p>
<p props="android cpp apple framework">After calling this method to enable the external video source, you can call <xref keyref="pushVideoFrame"/> to push external video data to the SDK.</p>
<p props="hmos">After calling this method to enable the external video source, you can call <xref keyref="pushVideoFrame3"/> to push external video data to the SDK.</p>
</section>
<section id="timing" deliveryTarget="details">
<title>Call timing</title>
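The workflow in the hunk above (enable the external source, then push frames) can be illustrated with a minimal Java sketch. It assumes the 4.x Android SDK package io.agora.rtc2, the Android-side names setExternalVideoSource/pushExternalVideoFrame, and the ExternalVideoSourceType enum; verify the exact signatures against the SDK you ship.

    import io.agora.base.VideoFrame;
    import io.agora.rtc2.Constants;
    import io.agora.rtc2.RtcEngine;

    public class ExternalVideoSourceExample {
        // Enable the external (custom) video source once, before joining the channel.
        public static void enableCustomSource(RtcEngine engine) {
            // enable = true, useTexture = true, source type = raw video frames (assumed enum value).
            engine.setExternalVideoSource(true, true, Constants.ExternalVideoSourceType.VIDEO_FRAME);
        }

        // Push each captured frame to the SDK after the external source is enabled.
        public static void pushFrame(RtcEngine engine, VideoFrame frame) {
            engine.pushExternalVideoFrame(frame);
        }
    }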
2 changes: 1 addition & 1 deletion dita/RTC-NG/API/api_irtcengine_createcustomvideotrack.dita
@@ -31,7 +31,7 @@
<ol>
<li>Call this method to create a video track and get the video track ID.</li>
<li>When calling <xref keyref="joinChannel2"/> to join a channel, set <parmname>customVideoTrackId</parmname> in <xref keyref="ChannelMediaOptions"/> to the ID of the video track you want to publish, and set <parmname>publishCustomVideoTrack</parmname> to <codeph><ph keyref="true"/></codeph>.</li>
<li>Call <xref keyref="pushVideoFrame" props="apple cpp framework"/><xref keyref="pushVideoFrame3" props="android"/> and set <parmname>videoTrackId</parmname> to the video track ID specified in step 2 to publish the corresponding custom video source in the channel.</li>
<li>Call <xref keyref="pushVideoFrame" props="apple cpp framework"/><xref keyref="pushVideoFrame3" props="android hmos"/> and set <parmname>videoTrackId</parmname> to the video track ID specified in step 2 to publish the corresponding custom video source in the channel.</li>
</ol></p>
</section>
<section id="return_values">
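A hedged Java sketch of the three-step custom-track workflow described in the hunk above, assuming the 4.x Android SDK (io.agora.rtc2), that createCustomVideoTrack returns the track ID, and that pushExternalVideoFrame has an overload taking a videoTrackId; check these assumptions against the actual SDK.

    import io.agora.base.VideoFrame;
    import io.agora.rtc2.ChannelMediaOptions;
    import io.agora.rtc2.RtcEngine;

    public class CustomVideoTrackExample {
        public static int joinAndPublishCustomTrack(RtcEngine engine, String token, String channelId, int uid) {
            // Step 1: create a custom video track and obtain its ID.
            int videoTrackId = engine.createCustomVideoTrack();

            // Step 2: join the channel, publishing the custom track.
            ChannelMediaOptions options = new ChannelMediaOptions();
            options.customVideoTrackId = videoTrackId;
            options.publishCustomVideoTrack = true;
            engine.joinChannel(token, channelId, uid, options);
            return videoTrackId;
        }

        // Step 3: push frames tagged with the track ID from step 1.
        public static void pushFrame(RtcEngine engine, VideoFrame frame, int videoTrackId) {
            engine.pushExternalVideoFrame(frame, videoTrackId);
        }
    }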
6 changes: 3 additions & 3 deletions dita/RTC-NG/API/api_irtcengine_enableinearmonitoring2.dita
@@ -14,7 +14,7 @@
<section id="prototype">
<p outputclass="codeblock">
<codeblock props="android" outputclass="language-java">public abstract int enableInEarMonitoring(boolean enabled, int includeAudioFilters);</codeblock>
<codeblock props="hmos" outputclass="language-arkts">public abstract enableInEarMonitoring(enabled: boolean, includeAudioFilters: number): number;</codeblock>
<codeblock props="hmos" outputclass="language-arkts">public abstract enableInEarMonitoring(enabled: boolean, includeAudioFilters?: Constants.EarMontoringFilterType): number;</codeblock>
<codeblock props="ios mac" outputclass="language-objectivec">- (int)enableInEarMonitoring:(BOOL)enabled includeAudioFilters:(AgoraEarMonitoringFilterType)includeAudioFilters;</codeblock>
<codeblock props="cpp unreal" outputclass="language-cpp">virtual int enableInEarMonitoring(bool enabled, int includeAudioFilters) = 0;</codeblock>
<codeblock props="bp" outputclass="language-cpp">UFUNCTION(BlueprintCallable, Category = &quot;Agora|IRtcEngine&quot;)
@@ -57,8 +57,8 @@
</plentry>
<plentry>
<pt>includeAudioFilters</pt>
<pd props="ios mac cpp unreal electron unity rn flutter cs">耳返 Audio filter 类型。详见 <xref keyref="EAR_MONITORING_FILTER_TYPE"/>。</pd>
<pd props="android hmos bp">耳返 Audio filter 类型:
<pd props="hmos ios mac cpp unreal electron unity rn flutter cs">耳返 Audio filter 类型。详见 <xref keyref="EAR_MONITORING_FILTER_TYPE"/>。</pd>
<pd props="android bp">耳返 Audio filter 类型:
<ul id="ul_nwv_hcy_4qb">
<li><ph keyref="EAR_MONITORING_FILTER_NONE"/> (1 &lt;&lt; 0):不在耳返中添加 Audio filter。</li>
<li><ph keyref="EAR_MONITORING_FILTER_BUILT_IN_AUDIO_FILTERS"/> (1 &lt;&lt; 1): 在耳返中添加人声效果 Audio filter。如果你实现了美声、音效等功能,用户可以在耳返中听到添加效果后的声音。该枚举值支持使用按位或运算符(|)进行组合。</li>
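Because the filter values in the hunk above are bit flags, they can be combined with the bitwise OR operator. A minimal Java sketch, assuming the constants live in io.agora.rtc2.Constants and that a noise-suppression flag (EAR_MONITORING_FILTER_NOISE_SUPPRESSION) exists alongside the two values shown above:

    import io.agora.rtc2.Constants;
    import io.agora.rtc2.RtcEngine;

    public class InEarMonitoringExample {
        public static void enable(RtcEngine engine) {
            // Combine the built-in vocal-effect filter with noise suppression (assumed constant).
            int filters = Constants.EAR_MONITORING_FILTER_BUILT_IN_AUDIO_FILTERS
                    | Constants.EAR_MONITORING_FILTER_NOISE_SUPPRESSION;
            engine.enableInEarMonitoring(true, filters);
        }
    }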
@@ -29,7 +29,7 @@
<section id="detailed_desc">
<title>Details</title>
<p>The virtual background feature lets you replace the local user's original background with a static image or a dynamic video, blur the background, or segment the portrait from the background to achieve a portrait picture-in-picture effect. After the virtual background feature is enabled successfully, all users in the channel can see the custom background.</p>
<p>Both this method and <xref keyref="enableVirtualBackground"/> can be used to enable or disable the virtual background; the difference is that this method lets you specify the media source to which the virtual background applies when enabling it.</p>
<p props="android apple">Both this method and <xref keyref="enableVirtualBackground"/> can be used to enable or disable the virtual background; the difference is that this method lets you specify the media source to which the virtual background applies when enabling it.</p>
<p>Call this method after <xref keyref="enableVideo"/> or <xref keyref="startPreview2"/>.</p>
<note type="attention" conkeyref="enableVirtualBackground/hardware_req"/>
</section>
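A hedged Java sketch of the segmentation-only setup referenced above (custom background set to BACKGROUND_NONE). The VirtualBackgroundSource and SegmentationProperty class names, their fields, and the io.agora.rtc2.video package are assumptions from the 4.x Android API.

    import io.agora.rtc2.RtcEngine;
    import io.agora.rtc2.video.SegmentationProperty;
    import io.agora.rtc2.video.VirtualBackgroundSource;

    public class VirtualBackgroundExample {
        public static void enableSegmentationOnly(RtcEngine engine) {
            VirtualBackgroundSource source = new VirtualBackgroundSource();
            // BACKGROUND_NONE: segment the portrait without substituting a replacement background.
            source.backgroundSourceType = VirtualBackgroundSource.BACKGROUND_NONE;
            // Default segmentation parameters; adjust as needed for your scene.
            SegmentationProperty segmentation = new SegmentationProperty();
            // Call after enableVideo() or startPreview(), as the hunk above notes.
            engine.enableVirtualBackground(true, source, segmentation);
        }
    }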
1 change: 1 addition & 0 deletions dita/RTC-NG/API/api_irtcengine_playeffect3.dita
@@ -99,6 +99,7 @@
<p props="cpp framework">播放文件的地址,支持在线文件的 URL 地址、播放文件的绝对路径,需精确到文件名及后缀。支持的音频格式包括 MP3、AAC、M4A、MP4、WAV、3GP 等。<ph props="cpp">详见<xref keyref="filePath-link">支持的媒体格式</xref>。</ph></p>
<p props="android">播放文件的地址,支持以 <codeph>content://</codeph> 开头的 URI 地址、以 <codeph>/assets/</codeph> 开头的路径、在线文件的 URL 地址、本地文件的绝对路径,需精确到文件名及后缀。支持的音频格式包括 MP3、AAC、M4A、MP4、WAV、3GP。详见<xref keyref="filePath-link">支持的媒体格式</xref>。</p>
<p props="apple">播放文件的地址,支持以 <codeph>ipod-library//</codeph> 开头的文件路径、在线文件的 URL 地址、文件的绝对路径,需精确到文件名及后缀。支持的音频格式包括 MP3、AAC、M4A、MP4、WAV、3GP。详见<xref keyref="filePath-link">支持的媒体格式</xref>。</p>
<p props="hmos">播放文件的地址。</p>
<note type="attention" props="android cpp apple framework">如果你已通过 <xref keyref="preloadEffect" /> 将音效加载至内存,请确保该参数与 <apiname keyref="preloadEffect" /> 中设置的 <parmname>filePath</parmname> 相同。</note> </pd>
</plentry>
<plentry id="loopcount">
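A minimal Java usage sketch for playEffect with the filePath parameter documented above; the parameter order (soundId, filePath, loopCount, pitch, pan, gain, publish, startPos) is an assumption about the Android overload and should be checked.

    import io.agora.rtc2.RtcEngine;

    public class PlayEffectExample {
        public static void playBundledEffectOnce(RtcEngine engine) {
            int soundId = 1;                          // ID you assign to this audio effect
            String filePath = "/assets/effect.mp3";   // hypothetical path bundled in the app's assets
            engine.playEffect(
                    soundId,
                    filePath,
                    0,       // loopCount: 0 = play once
                    1.0,     // pitch: original pitch
                    0.0,     // pan: centered
                    100.0,   // gain: original volume
                    false,   // publish: play locally only, do not send to remote users
                    0);      // startPos: start from the beginning (ms)
        }
    }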
2 changes: 1 addition & 1 deletion dita/RTC-NG/API/api_irtcengine_setclientrole2.dita
@@ -34,7 +34,7 @@
</section>
<section id="detailed_desc" deliveryTarget="details" otherprops="no-title">
<p>The SDK sets the user role to audience by default. You can call this method to set the user role to broadcaster. The user role (<parmname>role</parmname>) determines the user's permissions at the SDK level, such as whether the user can publish streams.</p>
<p props="native unreal unity cs">The difference between this method and <xref keyref="setClientRole1"/> is that this method also lets you set the audience latency level (<parmname>audienceLatencyLevel</parmname>). <parmname>audienceLatencyLevel</parmname> must be used together with <parmname>role</parmname> to determine the services a user can enjoy within their permissions, for example, whether an audience member receives a low-latency or an ultra-low-latency video stream. Different latency levels affect billing; see <xref keyref="billing-streaming"/>.</p>
<p props="android cpp apple unreal unity cs">The difference between this method and <xref keyref="setClientRole1"/> is that this method also lets you set the audience latency level (<parmname>audienceLatencyLevel</parmname>). <parmname>audienceLatencyLevel</parmname> must be used together with <parmname>role</parmname> to determine the services a user can enjoy within their permissions, for example, whether an audience member receives a low-latency or an ultra-low-latency video stream. Different latency levels affect billing; see <xref keyref="billing-streaming"/>.</p>
</section>
<section id="timing" deliveryTarget="details">
<title>Call timing</title>
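A hedged Java sketch of setting the broadcaster/audience role together with the audience latency level discussed above; the ClientRoleOptions field and the Constants names are assumptions from the 4.x Android API.

    import io.agora.rtc2.ClientRoleOptions;
    import io.agora.rtc2.Constants;
    import io.agora.rtc2.RtcEngine;

    public class ClientRoleExample {
        public static void becomeLowLatencyAudience(RtcEngine engine) {
            ClientRoleOptions options = new ClientRoleOptions();
            // The latency level only takes effect for the audience role and affects billing.
            options.audienceLatencyLevel = Constants.AUDIENCE_LATENCY_LEVEL_LOW_LATENCY;
            engine.setClientRole(Constants.CLIENT_ROLE_AUDIENCE, options);
        }
    }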
2 changes: 1 addition & 1 deletion dita/RTC-NG/API/api_irtcengine_startaudiorecording.dita
@@ -45,7 +45,7 @@
</plentry>
<plentry>
<pt>quality</pt>
<pd>Recording quality. <ph props="hmoscpp unreal bp ios mac unity cs">See <xref keyref="AUDIO_RECORDING_QUALITY_TYPE"/>.</ph>
<pd>Recording quality. <ph props="hmos cpp unreal bp ios mac unity cs">See <xref keyref="AUDIO_RECORDING_QUALITY_TYPE"/>.</ph>
<ul props="android">
<li>0: Low quality. The sample rate is 32 kHz, and a 10-minute recording is about 1.2 MB.</li>
<li>1: Medium quality. The sample rate is 32 kHz, and a 10-minute recording is about 2 MB.</li>
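For orientation, a Java sketch of starting a medium-quality recording. On the 4.x Android SDK the parameters are typically wrapped in an AudioRecordingConfiguration rather than passed positionally as in the prototypes above; that wrapper, its field names, and the quality constant are assumptions.

    import io.agora.rtc2.AudioRecordingConfiguration;
    import io.agora.rtc2.Constants;
    import io.agora.rtc2.RtcEngine;

    public class AudioRecordingExample {
        public static void start(RtcEngine engine, String filePath) {
            AudioRecordingConfiguration config = new AudioRecordingConfiguration();
            config.filePath = filePath;                                 // absolute, app-writable path
            config.quality = Constants.AUDIO_RECORDING_QUALITY_MEDIUM;  // ~2 MB per 10 minutes at 32 kHz
            engine.startAudioRecording(config);
        }
    }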
7 changes: 4 additions & 3 deletions dita/RTC-NG/API/api_irtcengine_startlocalvideotranscoder.dita
@@ -38,15 +38,15 @@
<p>You can enable local video mixing in scenarios such as remote conferences, live streaming, and online education, making it easier for users to view and manage multiple video views, and enabling features such as portrait picture-in-picture.</p>
<p>The following is a typical scenario for implementing portrait picture-in-picture:
<ol>
<li>Call <xref keyref="enableVirtualBackground"/> and set the custom background image to <apiname keyref="BACKGROUND_NONE"/>, that is, segment the portrait from the background in the video captured by the camera.</li>
<li>Call <xref props="cpp unreal bp mac unity cs flutter electron" keyref="startScreenCapture2"/><xref props="ios android rn" keyref="startScreenCapture"/> to start capturing the screen sharing video stream.</li>
<li>Call <xref keyref="enableVirtualBackground" props="android cpp apple framework"/><xref keyref="enableVirtualBackground2" props="hmos"/> and set the custom background image to <apiname keyref="BACKGROUND_NONE"/>, that is, segment the portrait from the background in the video captured by the camera.</li>
<li>Call <xref props="cpp unreal bp mac unity cs flutter electron" keyref="startScreenCapture2"/><xref props="ios android hmos rn" keyref="startScreenCapture"/> to start capturing the screen sharing video stream.</li>
<li>Call this method and set the video source that captures the portrait as one of the video sources participating in local mixing, to get a portrait picture-in-picture in the mixed video.</li>
</ol></p>
</section>
<section id="timing" deliveryTarget="details">
<title>Call timing</title>
<ul>
<li>If you need to mix locally captured video streams, call this method after <xref keyref="startCameraCapture"/> or <xref props="cpp unreal bp mac unity cs flutter electron" keyref="startScreenCapture2"/><xref props="ios android rn" keyref="startScreenCapture"/>.</li>
<li>If you need to mix locally captured video streams, call this method after <xref keyref="startCameraCapture"/> or <xref props="cpp unreal bp mac unity cs flutter electron" keyref="startScreenCapture2"/><xref props="ios android hmos rn" keyref="startScreenCapture"/>.</li>
<li>If you want to publish the mixed video stream to the channel, set <parmname>publishTranscodedVideoTrack</parmname> in <xref keyref="ChannelMediaOptions"/> to <codeph><ph keyref="true"/></codeph> when calling <xref keyref="joinChannel2"/> or <xref keyref="updateChannelMediaOptions"/>.</li>
</ul>
</section>
@@ -59,6 +59,7 @@
<li props="cpp unreal bp mac unity electron flutter">在 macOS 平台上,最多支持 4 路摄像头采集的视频流 + 1 路屏幕共享流合图。</li>
<li props="cpp unreal bp unity rn flutter">在 Android 和 iOS 平台上,最多支持 2 路摄像头采集的视频流(需要设备本身支持双摄或支持外接摄像头)+ 1 路屏幕共享合图。</li>
<li props="android">在 Android 平台上,最多支持 2 路摄像头采集的视频流(需要设备本身支持双摄或支持外接摄像头)+ 1 路屏幕共享合图。</li>
<li props="hmos">在 HarmonyOS 平台上,最多支持 2 路摄像头采集的视频流(需要设备本身支持双摄或支持外接摄像头)+ 1 路屏幕共享合图。</li>
<li props="ios">在 iOS 平台上,最多支持 2 路摄像头采集的视频流(需要设备本身支持双摄或支持外接摄像头)+ 1 路屏幕共享合图。</li></ul></li>
<li>在进行合图配置时,需确保采集人像的摄像头视频流在合图中的图层编号大于屏幕共享流的图层编号,否则人像会被屏幕共享覆盖、无法显示在最终合图的视频流中。</li>
</ul>
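A hedged Java sketch of the portrait picture-in-picture composition described in the hunk above: the screen-share stream sits on a lower layer and the segmented camera portrait on a higher layer, so the portrait is not covered. The LocalTranscoderConfiguration/TranscodingVideoStream names, their fields, and the VideoSourceType constants are assumptions from the 4.x API.

    import java.util.ArrayList;

    import io.agora.rtc2.Constants;
    import io.agora.rtc2.LocalTranscoderConfiguration;
    import io.agora.rtc2.RtcEngine;

    public class LocalTranscoderExample {
        public static void startPortraitPictureInPicture(RtcEngine engine) {
            // Screen-share stream: fills the canvas, bottom layer.
            LocalTranscoderConfiguration.TranscodingVideoStream screen =
                    new LocalTranscoderConfiguration.TranscodingVideoStream();
            screen.sourceType = Constants.VideoSourceType.VIDEO_SOURCE_SCREEN_PRIMARY;
            screen.width = 1280;
            screen.height = 720;
            screen.zOrder = 1;

            // Camera stream carrying the segmented portrait: small window, higher layer.
            LocalTranscoderConfiguration.TranscodingVideoStream camera =
                    new LocalTranscoderConfiguration.TranscodingVideoStream();
            camera.sourceType = Constants.VideoSourceType.VIDEO_SOURCE_CAMERA_PRIMARY;
            camera.x = 920;
            camera.y = 40;
            camera.width = 320;
            camera.height = 180;
            camera.zOrder = 2;

            LocalTranscoderConfiguration config = new LocalTranscoderConfiguration();
            config.transcodingVideoStreams = new ArrayList<>();
            config.transcodingVideoStreams.add(screen);
            config.transcodingVideoStreams.add(camera);
            engine.startLocalVideoTranscoder(config);
        }
    }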
2 changes: 1 addition & 1 deletion dita/RTC-NG/API/api_irtcengine_startpreview.dita
@@ -34,7 +34,7 @@
<title>Restrictions</title>
<ul>
<li>Local preview enables the mirror mode by default.</li>
<li>After you leave the channel, local preview remains enabled. You need to call <xref keyref="stopPreview" props="native unreal unity"/><xref keyref="stopPreview2" props="flutter bp electron rn"/> to disable it.</li>
<li>After you leave the channel, local preview remains enabled. You need to call <xref keyref="stopPreview" props="android cpp apple unreal unity"/><xref keyref="stopPreview2" props="hmos flutter bp electron rn"/> to disable it.</li>
</ul>
</section>
<section id="return_values">
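A short Java sketch of the restriction above: leaving the channel does not stop the local preview, so stop it explicitly (method names assume the Android RtcEngine API).

    import io.agora.rtc2.RtcEngine;

    public class PreviewLifecycleExample {
        public static void endSession(RtcEngine engine) {
            engine.leaveChannel();
            // Local preview keeps running after leaveChannel(); turn it off explicitly.
            engine.stopPreview();
        }
    }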
2 changes: 1 addition & 1 deletion dita/RTC-NG/API/api_irtcengine_stopeffect.dita
@@ -28,7 +28,7 @@
<codeblock props="reserve" outputclass="language-cpp"></codeblock></p>
</section>
<section id="detailed_desc" deliveryTarget="details" otherprops="no-title">
<p>When you no longer need to play an audio effect file, call this method to stop playing it. If you only need to pause the playback, call <xref keyref="pauseEffect"/>.</p>
<p props="android cpp apple framework">When you no longer need to play an audio effect file, call this method to stop playing it. If you only need to pause the playback, call <xref keyref="pauseEffect"/>.</p>
</section>
<section id="timing" deliveryTarget="details">
<title>调用时机</title>
1 change: 1 addition & 0 deletions dita/RTC-NG/API/api_irtcengine_takesnapshot.dita
@@ -61,6 +61,7 @@
<li props="cpp unreal bp ios unity flutter rn">iOS: <codeph>/App Sandbox/Library/Caches/example.jpg</codeph></li>
<li props="cpp unreal bp mac unity flutter electron">macOS: <codeph>~/Library/Logs/example.jpg</codeph></li>
<li props="cpp unreal bp android unity flutter rn">Android: <codeph>/storage/emulated/0/Android/data/&lt;package name&gt;/files/example.jpg</codeph></li>
<li props="hmos">HarmonyOS: <codeph>/data/app/el2/100/base/PACKAGENAME/haps/ENTRYNAME/files/example.jpg</codeph></li>
</ul>
</p>
<note type="attention">请确保目录存在且可写。</note>
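A minimal Java sketch of taking a snapshot into an app-writable directory, per the note above; takeSnapshot(uid, filePath) with uid 0 meaning the local user is an assumption from the Android API.

    import android.content.Context;

    import io.agora.rtc2.RtcEngine;

    public class SnapshotExample {
        public static void snapshotLocalUser(RtcEngine engine, Context context) {
            // Use an app-private, writable directory so the path exists and is writable.
            String filePath = context.getExternalFilesDir(null).getAbsolutePath() + "/example.jpg";
            engine.takeSnapshot(0, filePath); // uid 0: the local user (assumed convention)
        }
    }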
6 changes: 3 additions & 3 deletions dita/RTC-NG/API/api_irtcengineex_leavechannelex2.dita
@@ -36,12 +36,12 @@
</section>
<section id="detailed_desc" deliveryTarget="details" otherprops="no-title">
<p>After you call this method, the SDK ends the audio and video interaction, leaves the current channel, and releases all resources related to the session.</p>
<p props="native unity cs unreal">After successfully joining a channel by calling <xref keyref="joinChannelEx"/>, you must call this method or <xref keyref="leaveChannelEx"/> to end the call; otherwise, the next call cannot start.</p>
<p props="flutter electron rn flutter bp">After successfully joining a channel by calling <xref keyref="joinChannelEx"/>, you must call this method to end the call; otherwise, the next call cannot start.</p>
<p props="android cpp apple unity cs unreal">After successfully joining a channel by calling <xref keyref="joinChannelEx"/>, you must call this method or <xref keyref="leaveChannelEx"/> to end the call; otherwise, the next call cannot start.</p>
<p props="hmos flutter electron rn flutter bp">After successfully joining a channel by calling <xref keyref="joinChannelEx"/>, you must call this method to end the call; otherwise, the next call cannot start.</p>
<note type="attention">
<ul>
<li>This method is an asynchronous operation; when the call returns, the user has not actually left the channel.</li>
<li props="native unity cs unreal">If you call <xref keyref="leaveChannel"/> or <xref keyref="leaveChannel2"/>, you leave the channels joined via <xref keyref="joinChannel1"/> or <xref keyref="joinChannel2"/> as well as the channels joined via <xref keyref="joinChannelEx"/>.</li>
<li props="android cpp apple unity cs unreal">If you call <xref keyref="leaveChannel"/> or <xref keyref="leaveChannel2"/>, you leave the channels joined via <xref keyref="joinChannel1"/> or <xref keyref="joinChannel2"/> as well as the channels joined via <xref keyref="joinChannelEx"/>.</li>
<li props="hmos flutter electron rn bp">If you call <xref keyref="leaveChannel2"/>, you leave the channels joined via <xref keyref="joinChannel2"/> as well as the channels joined via <xref keyref="joinChannelEx"/>.</li>
</ul></note>
</section>
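A hedged Java sketch of pairing joinChannelEx with leaveChannelEx on RtcEngineEx; the RtcConnection-based signatures are assumptions from the 4.x Android API.

    import io.agora.rtc2.ChannelMediaOptions;
    import io.agora.rtc2.IRtcEngineEventHandler;
    import io.agora.rtc2.RtcConnection;
    import io.agora.rtc2.RtcEngineEx;

    public class MultiChannelExample {
        public static RtcConnection joinSecondChannel(RtcEngineEx engine, String token, String channelId, int uid) {
            RtcConnection connection = new RtcConnection(channelId, uid);
            ChannelMediaOptions options = new ChannelMediaOptions();
            options.autoSubscribeAudio = true;
            engine.joinChannelEx(token, connection, options, new IRtcEngineEventHandler() {});
            return connection;
        }

        public static void leaveSecondChannel(RtcEngineEx engine, RtcConnection connection) {
            // Asynchronous: the user has not actually left when this call returns.
            engine.leaveChannelEx(connection);
        }
    }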
@@ -15,7 +15,7 @@
<p outputclass="codeblock">
<codeblock props="android" outputclass="language-java">void onPlayBufferUpdated(long playCachedBuffer);
</codeblock>
<codeblock props="hmos" outputclass="language-arkts"></codeblock>
<codeblock props="hmos" outputclass="language-arkts">onPlayBufferUpdated?:(playCachedBuffer:number) => void</codeblock>
<codeblock props="ios mac" outputclass="language-objectivec">- (void)AgoraRtcMediaPlayer:(id&lt;AgoraRtcMediaPlayerProtocol> _Nonnull)playerKit
didPlayBufferUpdated:(NSInteger)playCachedBuffer NS_SWIFT_NAME(AgoraRtcMediaPlayer(_:didPlayBufferUpdated:));</codeblock>
<codeblock props="cpp unreal" outputclass="language-cpp">virtual void onPlayBufferUpdated(int64_t playCachedBuffer) = 0;
@@ -15,7 +15,7 @@
<p outputclass="codeblock">
<codeblock props="android" outputclass="language-java">void onPlayerEvent(Constants.MediaPlayerEvent eventCode, long elapsedTime, String message);
</codeblock>
<codeblock props="hmos" outputclass="language-arkts">onPlayerEvent?:(eventCode:Constants.MediaPlayerEvent,elapsedTime:bigint,message:string) =&gt; void</codeblock>
<codeblock props="hmos" outputclass="language-arkts">onPlayerEvent?:(eventCode:Constants.MediaPlayerEvent,elapsedTime:number,message:string) => void</codeblock>
<codeblock props="ios mac" outputclass="language-objectivec">(void)AgoraRtcMediaPlayer:(id&lt;AgoraRtcMediaPlayerProtocol&gt; _Nonnull)playerKit
didOccurEvent:(AgoraMediaPlayerEvent)eventCode
elapsedTime:(NSInteger)elapsedTime
@@ -14,7 +14,7 @@
<section id="prototype">
<p outputclass="codeblock">
<codeblock props="android" outputclass="language-java">void onPositionChanged(long positionMs, long timestampMs);</codeblock>
<codeblock props="hmos" outputclass="language-arkts">onPositionChanged?:(positionMs:bigint,timestampMs:bigint) =&gt; void</codeblock>
<codeblock props="hmos" outputclass="language-arkts">onPositionChanged?:(positionMs:number,timestampMs:number) => void</codeblock>
<codeblock props="ios mac" outputclass="language-objectivec">- (void)AgoraMediaPlayer:(AgoraMediaPlayer *_Nonnull)playerKit
didChangedToPosition:(NSInteger)positionMs
atTimestamp:(NSTimeInterval)timestampMs NS_SWIFT_NAME(AgoraMediaPlayer(_:didChangedToPosition:atTimestamp:));</codeblock>