Add Dolby Vision Transcoding and Editing Support #1235

Open · wants to merge 13 commits into main
@@ -1280,6 +1280,38 @@ public void transcode_withOutputVideoMimeTypeAv1_completesSuccessfully() throws
assertThat(exportResult.videoMimeType).isEqualTo(MimeTypes.VIDEO_AV1);
}

@Test
public void transcode_withOutputVideoMimeTypeDolbyVision_completesSuccessfully() throws Exception {

Contributor:
On which device have you tested this that can produce Dolby Vision metadata?

Contributor Author:
OPPO flagship series (e.g. OPPO Find X6/X7 Pro/Ultra), vivo flagship series (e.g. vivo X90/Pro/Ultra, X100/Pro/Ultra) and Xiaomi flagship series. I tested it on my Pixel phone with Dolby Vision codec integration.

if (AndroidTestUtil.skipAndLogIfFormatsUnsupported(
context,
testId,
/* inputFormat= */ MP4_ASSET_FORMAT,
/* outputFormat= */ MP4_ASSET_FORMAT
.buildUpon()
.setSampleMimeType(MimeTypes.VIDEO_DOLBY_VISION)
.setCodecs(null)
.build())) {
return;
}
MediaItem mediaItem = MediaItem.fromUri(Uri.parse(MP4_ASSET_URI_STRING));
EditedMediaItem editedMediaItem = new EditedMediaItem.Builder(mediaItem).build();
Transformer transformer =
new Transformer.Builder(context).setVideoMimeType(MimeTypes.VIDEO_DOLBY_VISION).build();

ExportTestResult exportTestResult =
new TransformerAndroidTestRunner.Builder(context, transformer)
.build()
.run(testId, editedMediaItem);
ExportResult exportResult = exportTestResult.exportResult;

String actualMimeType =
retrieveTrackFormat(context, exportTestResult.filePath, C.TRACK_TYPE_VIDEO).sampleMimeType;
assertThat(actualMimeType).isEqualTo(MimeTypes.VIDEO_DOLBY_VISION);
assertThat(exportResult.exportException).isNull();

Contributor:
If there is any exception, the test would actually throw, so this check is not required.

Contributor Author:
I just followed lines 1278 and 1331 in transcode_withOutputVideoMimeTypeAv1_completesSuccessfully() and transcode_withOutputAudioMimeTypeAac_completesSuccessfully().

Contributor:
I am not sure why the existing tests have this assertion, but we can remove it for the newly added tests.

Contributor Author:
OK, I'll remove it.

assertThat(exportResult.durationMs).isGreaterThan(0);
assertThat(exportResult.videoMimeType).isEqualTo(MimeTypes.VIDEO_DOLBY_VISION);

Contributor:
Since we are checking the actual mime type, this check can be removed.

Contributor Author:
I just followed lines 1280 and 1333 in transcode_withOutputVideoMimeTypeAv1_completesSuccessfully() and transcode_withOutputAudioMimeTypeAac_completesSuccessfully() respectively.

}

@Test
public void transcode_withOutputAudioMimeTypeAac_completesSuccessfully() throws Exception {
MediaItem mediaItem = MediaItem.fromUri(Uri.parse(MP3_ASSET_URI_STRING));
@@ -15,6 +15,7 @@
*/
package androidx.media3.transformer.mh;

import static androidx.media3.common.MimeTypes.VIDEO_DOLBY_VISION;
import static androidx.media3.common.MimeTypes.VIDEO_H265;
import static androidx.media3.common.util.Assertions.checkState;
import static androidx.media3.test.utils.TestUtil.retrieveTrackFormat;
@@ -139,6 +140,43 @@ public void export_transmuxHlg10File() throws Exception {
assertThat(actualColorTransfer).isEqualTo(C.COLOR_TRANSFER_HLG);
}

@Test
public void export_transmuxDolbyVisionFile() throws Exception {

Contributor:
I think the transmuxing test case can also go into MediaItemExportTest, as it does not need to run on an actual device.

Contributor Author:
Could you give me more hints about it?

  1. I added the Dolby Vision test case here because there are export_transmuxHdr10File() and export_transmuxHlg10File() test cases in this file, and I implemented my test case the same way.
  2. In MediaItemExportTest, I found some transmux test cases, but I didn't find corresponding test cases for HLG and HDR10. Do I need to add a test case for Dolby Vision in that file?

Contributor:
I think these tests need to run on a real device because in Robolectric tests FrameworkMuxer does not write to the actual output file, and we are doing validations on the actual output file. So we should keep the test case here. Thanks!

Context context = ApplicationProvider.getApplicationContext();

if (Util.SDK_INT < 24) {

Contributor:
We have now started using assumeTrue(Util.SDK_INT >= 24). With this, the test is marked as skipped instead of passed, so the whole if block can be replaced with assumeTrue(Util.SDK_INT >= 24).

Contributor Author:
OK, I'll change the code.

Collaborator:
You can also use @SdkSuppress(minSdkVersion = 24) if your only condition is SDK level.

https://developer.android.com/reference/androidx/test/filters/SdkSuppress

This is preferable to assumeTrue imo because it skips the test "earlier", and just doesn't run it at all, instead of needing to mark it as 'skipped' or 'passed' (which has mixed support in different test running environments).

Contributor Author:
OK, I think this method is better. I'll change the code again.
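
For illustration, a minimal sketch of the annotation-based skip being discussed, assuming the SDK level is the only condition (the rest of the test body is unchanged):

import androidx.test.filters.SdkSuppress;

@Test
@SdkSuppress(minSdkVersion = 24)
public void export_transmuxDolbyVisionFile() throws Exception {
  // No manual Util.SDK_INT check is needed; the runner skips this test entirely below API 24.
  // ... rest of the test body ...
}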

// TODO: b/285543404 - Remove suppression once we can transmux H.265/HEVC before API 24.
recordTestSkipped(context, testId, /* reason= */ "Can't transmux H.265/HEVC before API 24");

Contributor:
The format-support check below should check whether the muxer supports this format, so this is not needed.

Contributor Author:
I just did the same thing as export_transmuxHdr10File() and export_transmuxHlg10File(). Shall we remove the above code from all three methods together?

Contributor:
I am not sure why the existing tests are written that way, but as per my understanding the decoder/encoder should not get involved for transmuxing, so the decoder check below should not be there. We should definitely check the muxer, which is done separately here and should remain.

You can leave the existing test as it is.

Contributor:
I think this is not fully resolved: the decoder check (AndroidTestUtil.skipAndLogIfFormatsUnsupported) is still present, and it should not be required for transmuxing.

Contributor Author:
OK, I'll remove this part of the check.

return;
}

if (AndroidTestUtil.skipAndLogIfFormatsUnsupported(
context,
testId,
/* inputFormat= */ MP4_ASSET_DOLBY_VISION_HDR_FORMAT,
/* outputFormat= */ null)) {
return;
}

Transformer transformer = new Transformer.Builder(context).build();
MediaItem mediaItem = MediaItem.fromUri(Uri.parse(MP4_ASSET_DOLBY_VISION_HDR));

ExportTestResult exportTestResult =
new TransformerAndroidTestRunner.Builder(context, transformer)
.build()
.run(testId, mediaItem);
@C.ColorTransfer
int actualColorTransfer =
retrieveTrackFormat(context, exportTestResult.filePath, C.TRACK_TYPE_VIDEO)
.colorInfo

Contributor:
This method is called twice; it can be called once and the result stored in a variable.

Contributor Author:
OK, I'll change it.
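
A rough sketch of that change, using names from the surrounding diff (the local variable name is illustrative):

Format videoTrackFormat =
    retrieveTrackFormat(context, exportTestResult.filePath, C.TRACK_TYPE_VIDEO);
// Read both properties from the single retrieved Format instead of calling the helper twice.
@C.ColorTransfer int actualColorTransfer = videoTrackFormat.colorInfo.colorTransfer;
String actualMimeType = videoTrackFormat.sampleMimeType;
assertThat(actualColorTransfer).isEqualTo(C.COLOR_TRANSFER_HLG);
assertThat(actualMimeType).isEqualTo(VIDEO_DOLBY_VISION);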

.colorTransfer;
assertThat(actualColorTransfer).isEqualTo(C.COLOR_TRANSFER_HLG);

Contributor:
Can we check the mime type as well?

Contributor Author:
Sure.

String actualMimeType =
retrieveTrackFormat(context, exportTestResult.filePath, C.TRACK_TYPE_VIDEO)
.sampleMimeType;
assertThat(actualMimeType).isEqualTo(VIDEO_DOLBY_VISION);
}

@Test
public void exportAndTranscode_hdr10File_whenHdrEditingIsSupported() throws Exception {
Context context = ApplicationProvider.getApplicationContext();
@@ -360,6 +398,71 @@ public void onFallbackApplied(
}
}

@Test
public void exportAndTranscode_dolbyVisionFile_whenHdrEditingUnsupported_toneMapsOrThrows()
throws Exception {
Context context = ApplicationProvider.getApplicationContext();
Format format = MP4_ASSET_DOLBY_VISION_HDR_FORMAT;
if (deviceSupportsHdrEditing(VIDEO_H265, format.colorInfo)) {
recordTestSkipped(context, testId, /* reason= */ "Device supports Dolby Vision editing.");
return;

Contributor:
We can possibly use assumeDeviceDoesNotSupportHdrEditing()?

Contributor Author:
Sorry, I'm not familiar with resolving conflicts on GitHub. Let's complete the other issues and then handle the conflict. (I forked from the main branch at the beginning of this contribution; this method did not exist at that time.)
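
For reference, a sketch of the suggested helper in place of the manual skip above; the exact parameters are an assumption, not confirmed against the current test utilities:

// Hypothetical signature; would replace the deviceSupportsHdrEditing(...) check and recordTestSkipped(...) call.
assumeDeviceDoesNotSupportHdrEditing(testId, format);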

}

if (AndroidTestUtil.skipAndLogIfFormatsUnsupported(
context, testId, /* inputFormat= */ format, /* outputFormat= */ null)) {

Contributor:
This method has been renamed to assumeFormatsSupported(). Sorry for the conflicts.

return;
}
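
A sketch of the renamed check, assuming assumeFormatsSupported() keeps the same parameters and skips the test via an assumption failure rather than returning a boolean:

assumeFormatsSupported(
    context, testId, /* inputFormat= */ format, /* outputFormat= */ null);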

AtomicBoolean isFallbackListenerInvoked = new AtomicBoolean();
AtomicBoolean isToneMappingFallbackApplied = new AtomicBoolean();
Transformer transformer =
new Transformer.Builder(context)
.addListener(
new Transformer.Listener() {
@Override
public void onFallbackApplied(

Contributor:
This method is deprecated; use the alternative.

Contributor:
This is still not resolved.

Contributor Author:
OK, I'll change the code to use:

default void onFallbackApplied(
    Composition composition,
    TransformationRequest originalTransformationRequest,
    TransformationRequest fallbackTransformationRequest) {
  MediaItem mediaItem = composition.sequences.get(0).editedMediaItems.get(0).mediaItem;
  onFallbackApplied(mediaItem, originalTransformationRequest, fallbackTransformationRequest);
}
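
A sketch of how the test's listener could move to the non-deprecated overload, keeping the same assertions as the current test:

@Override
public void onFallbackApplied(
    Composition composition,
    TransformationRequest originalTransformationRequest,
    TransformationRequest fallbackTransformationRequest) {
  isFallbackListenerInvoked.set(true);
  assertThat(originalTransformationRequest.hdrMode).isEqualTo(HDR_MODE_KEEP_HDR);
  isToneMappingFallbackApplied.set(
      fallbackTransformationRequest.hdrMode == HDR_MODE_TONE_MAP_HDR_TO_SDR_USING_OPEN_GL);
}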

MediaItem inputMediaItem,
TransformationRequest originalTransformationRequest,
TransformationRequest fallbackTransformationRequest) {
isFallbackListenerInvoked.set(true);
assertThat(originalTransformationRequest.hdrMode).isEqualTo(HDR_MODE_KEEP_HDR);
isToneMappingFallbackApplied.set(
fallbackTransformationRequest.hdrMode
== HDR_MODE_TONE_MAP_HDR_TO_SDR_USING_OPEN_GL);
}
})
.build();
MediaItem mediaItem = MediaItem.fromUri(Uri.parse(MP4_ASSET_DOLBY_VISION_HDR));
EditedMediaItem editedMediaItem =
new EditedMediaItem.Builder(mediaItem).setEffects(FORCE_TRANSCODE_VIDEO_EFFECTS).build();

try {
ExportTestResult exportTestResult =
new TransformerAndroidTestRunner.Builder(context, transformer)
.build()
.run(testId, editedMediaItem);
assertThat(isToneMappingFallbackApplied.get()).isTrue();
@C.ColorTransfer
int actualColorTransfer =
retrieveTrackFormat(context, exportTestResult.filePath, C.TRACK_TYPE_VIDEO)
.colorInfo
.colorTransfer;
assertThat(actualColorTransfer).isEqualTo(C.COLOR_TRANSFER_SDR);
} catch (ExportException exception) {
if (exception.getCause() != null) {
@Nullable String message = exception.getCause().getMessage();
if (message != null
&& (Objects.equals(message, "Decoding HDR is not supported on this device.")
|| message.contains(
"OpenGL ES 3.0 context support is required for HDR input or output.")
|| Objects.equals(message, "Device lacks YUV extension support."))) {
return;
}
}
throw exception;
}
}

private static boolean deviceSupportsHdrEditing(String mimeType, ColorInfo colorInfo) {
checkState(ColorInfo.isTransferHdr(colorInfo));
return !EncoderUtil.getSupportedEncodersForHdrEditing(mimeType, colorInfo).isEmpty();
@@ -42,6 +42,7 @@
import com.google.common.collect.ImmutableList;
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

/**
* Default implementation of {@link Codec.DecoderFactory} that uses {@link MediaCodec} for decoding.
@@ -135,6 +136,9 @@ public DefaultCodec createForVideoDecoding(
if (SDK_INT >= 31 && requestSdrToneMapping) {
mediaFormat.setInteger(
MediaFormat.KEY_COLOR_TRANSFER_REQUEST, MediaFormat.COLOR_TRANSFER_SDR_VIDEO);
} else if (SDK_INT >= 31 && Objects.equals(format.sampleMimeType, MimeTypes.VIDEO_DOLBY_VISION)) {
mediaFormat.setInteger(
MediaFormat.KEY_COLOR_TRANSFER_REQUEST, MediaFormat.COLOR_TRANSFER_HLG);
}

@Nullable
@@ -150,6 +150,13 @@ public static ImmutableList<Integer> getCodecProfilesForHdrFormat(
return ImmutableList.of(MediaCodecInfo.CodecProfileLevel.AV1ProfileMain10HDR10);
}
break;
case MimeTypes.VIDEO_DOLBY_VISION:
if (colorTransfer == C.COLOR_TRANSFER_HLG) {
return ImmutableList.of(
MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheSt);

Contributor (@SheenaChhabra, Jul 25, 2024):
Out of all the profiles, why have we chosen only this one?

Contributor Author:
Good question. All existing devices on the market that support Dolby Vision encoding encode the input stream to this profile only. We may extend this list in the future, but Dolby has not finalized which profiles should be supported for encoding.

}
// CodecProfileLevel does not support PQ for Dolby Vision.
break;
default:
break;
}
@@ -22,9 +22,12 @@

import android.annotation.SuppressLint;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.os.Build;
import android.util.SparseLongArray;
import androidx.annotation.RequiresApi;
import androidx.media3.common.C;
import androidx.media3.common.Format;
import androidx.media3.common.Metadata;
@@ -108,6 +111,10 @@ public int addTrack(Format format) throws MuxerException {
if (isVideo) {
mediaFormat = MediaFormat.createVideoFormat(sampleMimeType, format.width, format.height);
MediaFormatUtil.maybeSetColorInfo(mediaFormat, format.colorInfo);
if (sampleMimeType.equals(MimeTypes.VIDEO_DOLBY_VISION) && SDK_INT >= 33) {
mediaFormat.setInteger(MediaFormat.KEY_PROFILE, getDvProfile(format));

Contributor:
Can you please share the corresponding spec that explains these changes?

Contributor Author:
The spec for Dolby Vision profiles is here: https://professionalsupport.dolby.com/s/article/What-is-Dolby-Vision-Profile?language=en_US

And I'd like to add some explanation of why I made this code change:

In the Media3 library, FrameworkMuxer.addTrack(Format format) calls the system MediaMuxer.addTrack(mediaFormat) method without PROFILE/LEVEL settings.

This is OK for other video formats but causes an issue for Dolby Vision. The key point is in the AOSP Utils.cpp file: the convertMessageToMetadata() method checks KEY_PROFILE, and if this parameter is missing, BAD_VALUE is returned.

See: https://cs.android.com/android/platform/superproject/main/+/main:frameworks/av/media/libstagefright/Utils.cpp;drc=216a97bdad3c0f492094250b33d82abf27098932;l=2115

mediaFormat.setInteger(MediaFormat.KEY_LEVEL, getDvLevel(format));
}
try {
mediaMuxer.setOrientationHint(format.rotationDegrees);
} catch (RuntimeException e) {
@@ -276,9 +283,64 @@ private static ImmutableList<String> getSupportedVideoSampleMimeTypes() {
if (SDK_INT >= 24) {
supportedMimeTypes.add(MimeTypes.VIDEO_H265);
}
if (SDK_INT >= 33) {
supportedMimeTypes.add(MimeTypes.VIDEO_DOLBY_VISION);
}
if (SDK_INT >= 34) {
supportedMimeTypes.add(MimeTypes.VIDEO_AV1);
}
return supportedMimeTypes.build();
}

// Get Dolby Vision profile
// Refer to https://professionalsupport.dolby.com/s/article/What-is-Dolby-Vision-Profile
@RequiresApi(33)
private static int getDvProfile(Format format) {
// Currently, only profile 8 is supported for encoding
// TODO: set profile ID based on format.

Contributor:
What is the work involved for this TODO?

Contributor Author:
Currently, only profile 8 (H.265-based) is supported for encoding. In the future, Dolby may support other Dolby Vision profiles (e.g. profile 9/10); it's not finalized. If that happens, we will need to add logic here to decide which profile should be used based on the "format" parameter.

return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheSt;
}

// Get Dolby Vision level
// Refer to https://professionalsupport.dolby.com/s/article/What-is-Dolby-Vision-Profile

Contributor:
Please write it as a Javadoc comment. Same comment for the other method.

Contributor Author:
OK.
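
A sketch of the Javadoc form for getDvLevel(); the wording is illustrative:

/**
 * Returns the Dolby Vision level to set on the {@link MediaFormat} passed to the framework
 * muxer, derived from the format's width, height and frame rate.
 *
 * <p>See https://professionalsupport.dolby.com/s/article/What-is-Dolby-Vision-Profile.
 */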

@RequiresApi(33)
private static int getDvLevel(Format format) {
int level = -1;
int maxWidthHeight = Math.max(format.width, format.height);
float pps = format.width * format.height * format.frameRate;

if (maxWidthHeight <= 1280) {
if (pps <= 22118400) {
level = MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelHd24; // Level 01
} else { // pps <= 27648000
level = MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelHd30; // Level 02
}
} else if (maxWidthHeight <= 1920 && pps <= 49766400) {
level = MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelFhd24; // Level 03
} else if (maxWidthHeight <= 2560 && pps <= 62208000) {
level = MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelFhd30; // Level 04

Contributor:
Do you know why the values of these constants are not the same as the corresponding level numbers? For example, DolbyVisionLevelFhd30 = 8.

Contributor Author:
Because the codec profile/level constants are defined as bit flags by Android AOSP, and Dolby defined our profile/level constants to follow this rule. See https://cs.android.com/android/platform/superproject/main/+/main:frameworks/base/media/java/android/media/MediaCodecInfo.java;drc=6a4bef0f90e822e19866e53a98b85029bff04ea0;l=4301

I think Google defined it this way for efficiency. For example, to judge whether a level is supported by the current device, you only need a single comparison:
if ((appointedLevel & (DolbyVisionLevelFhd24 | DolbyVisionLevelFhd30)) != 0) {
  // The current device supports this level.
} else {
  // The current device doesn't support this level.
}

} else if (maxWidthHeight <= 3840) {

Contributor:
[Comment linked to an existing method that parses the Dolby Vision profile and level from a codec string, asking whether it could be reused here.]

Contributor Author:
It's for a different purpose. The method in your link PARSES the Dolby Vision profile and level from the codec string. That use case is playing streaming content: you can get the codec string from an MPEG-DASH manifest file, and parsing it gives you the profile/level info, which can be used to select which Dolby Vision decoder (H.264-based or H.265-based) should handle the stream. In that case, Format.codecs is a valid codec string.

In this contribution, the code decides which Dolby Vision profile/level should be used based on the incoming content's width/height/frame rate. In this case, Format.codecs is null.

if (pps <= 124416000) {
level = MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelFhd60; // Level 05
} else if (pps <= 199065600) {
level = MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd24; // Level 06
} else if (pps <= 248832000) {
level = MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd30; // Level 07
} else if (pps <= 398131200) {
level = MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd48; // Level 08
} else if (pps <= 497664000) {
level = MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd60; // Level 09
} else { // pps <= 995328000
level = MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd120; // Level 10
}
} else if (maxWidthHeight <= 7680) {
if (pps <= 995328000) {
level = MediaCodecInfo.CodecProfileLevel.DolbyVisionLevel8k30; // Level 11
} else { // pps <= 1990656000
level = MediaCodecInfo.CodecProfileLevel.DolbyVisionLevel8k60; // Level 12
}
}

return level;
}
}