Understanding 10-Bit Camera System in Android 13
Last Updated: 07 Feb, 2023
10-bit camera stream capture has been a long-requested feature from the community worldwide. Smartphone cameras have been evolving for a very long time, and we have arrived at a point where we no longer wish to carry heavy cameras, since our phones already ship with capable hardware. In this article, we will look at the latest improvement to the Camera API, which allows apps on Android 13 to shoot and record content using the more than one billion colors that a 10-bit pipeline can represent.
How to Check For Requirements?
The device must have a 10-bit or higher capable camera sensor with the corresponding ISP capability in order to handle 10-bit camera output.
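Below is a minimal sketch, assuming a CameraManager obtained from a Context via getSystemService(), of how an app can confirm that a camera advertises the 10-bit capability before offering the feature. The function name is illustrative; the capability constant is part of the API 33 Camera2 API.

```kotlin
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.hardware.camera2.CameraMetadata

fun supportsTenBitOutput(manager: CameraManager, cameraId: String): Boolean {
    val characteristics = manager.getCameraCharacteristics(cameraId)
    val capabilities = characteristics.get(
        CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES
    ) ?: return false
    // DYNAMIC_RANGE_TEN_BIT is reported only on Android 13+ devices whose sensor
    // and ISP can produce 10-bit output.
    return capabilities.contains(
        CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_DYNAMIC_RANGE_TEN_BIT
    )
}
```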
What is 10-Bit Capture?
On devices running Android 13 and higher, Android enables 10-bit camera output through dynamic range profiles that the camera client can set as part of the stream configuration. Device manufacturers can add support for 10-bit dynamic range profiles such as HLG10, HDR10, HDR10+, and Dolby Vision.
With 10-bit camera output support, camera clients can discover which 10-bit dynamic range profiles a device supports by calling getSupportedProfiles. The framework also exposes a DynamicRangeProfiles object that describes the supported profiles and, where applicable, their capture request constraints. Support for the HLG10 profile is mandatory. The recommended dynamic range profile is reported in the REQUEST_RECOMMENDED_TEN_BIT_DYNAMIC_RANGE_PROFILE field.
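Here is a minimal sketch, assuming Android 13 (API 33) and a valid cameraId, of how a camera client can query the supported and recommended 10-bit profiles. The function name and log tag are illustrative, not part of the platform API.

```kotlin
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.hardware.camera2.params.DynamicRangeProfiles
import android.util.Log

fun logTenBitProfiles(manager: CameraManager, cameraId: String) {
    val characteristics = manager.getCameraCharacteristics(cameraId)

    // All dynamic range profiles this camera supports; null on devices without 10-bit output.
    val profiles: DynamicRangeProfiles? =
        characteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES)
    val supported = profiles?.supportedProfiles ?: setOf(DynamicRangeProfiles.STANDARD)

    // The profile the device recommends for 10-bit output (HLG10 support is mandatory).
    val recommended = characteristics.get(
        CameraCharacteristics.REQUEST_RECOMMENDED_TEN_BIT_DYNAMIC_RANGE_PROFILE
    )

    Log.d("TenBit", "Supported profiles: $supported, recommended: $recommended")
}
```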
How Does a Third-Party App Differ from the Stock Camera App?
Google strongly advises making sure that the results of recording 10-bit videos with a third-party app are comparable to, if not the same as, those of the native camera app. As a result, adjustments made to the native app's exposure, dynamic range, and color should also be applied to third-party apps. Use the Camera2Video sample app on GitHub to test the video recording functionality of a third-party app that supports 10-bit camera output on your device. Because of the wide range of sensors, panels, viewing conditions, and vendor preferences, the following guidance illustrates the visible characteristics of HDR without prescribing precise measurements.
How to Implement the Stream?
To implement 10-bit capture, you need to adhere to the following conditions:
Step #1: Describe camera capabilities with ANDROID_REQUEST_AVAILABLE_CAPABILITIES_DYNAMIC_RANGE_TEN_BIT.
Step #2: Populate the ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP entry with a bitmap of the constraints of all supported dynamic range profiles.
Step #3: Support for the HLG10 profile is required. Additionally, you must report a recommended dynamic range profile to let camera clients know which format is best supported.
Step #4: Support the dynamic range profile values as part of the stream configuration for streams that use the P010 or implementation-defined (ImageFormat.PRIVATE) pixel formats (a client-side sketch follows these steps).
Step #5: Before informing the camera service, configure the static or dynamic metadata buffer of processed Gralloc 4 buffers according to the dynamic range profile.
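The steps above target device manufacturers. As a hedged illustration of the client side, the sketch below assumes an already-opened CameraDevice and a target Surface (preview or encoder) and shows how an app can opt a stream into the mandatory HLG10 profile when configuring a session; the function name is illustrative, while OutputConfiguration.setDynamicRangeProfile() is part of the API 33 Camera2 API.

```kotlin
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.params.DynamicRangeProfiles
import android.hardware.camera2.params.OutputConfiguration
import android.hardware.camera2.params.SessionConfiguration
import android.view.Surface
import java.util.concurrent.Executor

fun createHlg10Session(
    device: CameraDevice,
    surface: Surface,
    executor: Executor,
    callback: CameraCaptureSession.StateCallback
) {
    val output = OutputConfiguration(surface).apply {
        // Opt this stream into 10-bit HLG output; any device that supports
        // 10-bit capture is required to support HLG10.
        dynamicRangeProfile = DynamicRangeProfiles.HLG10
    }
    val sessionConfig = SessionConfiguration(
        SessionConfiguration.SESSION_REGULAR,
        listOf(output),
        executor,
        callback
    )
    device.createCaptureSession(sessionConfig)
}
```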
How to Test the Implementation?
We advise carrying out the following three rounds of validation to verify your implementation of 10-bit camera output and to make sure that third-party apps can enable the feature:
- Compare the native camera app against a third-party app.
- Verify the API functionality (a sketch of this check follows the list).
- Compare high dynamic range captures against standard dynamic range captures.
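As a sketch of the "verify the API functionality" round, the check below assumes a DynamicRangeProfiles object obtained from CameraCharacteristics as shown earlier; the function name and log tag are illustrative.

```kotlin
import android.hardware.camera2.params.DynamicRangeProfiles
import android.util.Log

fun verifyHlg10(profiles: DynamicRangeProfiles): Boolean {
    // HLG10 must be advertised on any device that claims 10-bit camera output.
    if (DynamicRangeProfiles.HLG10 !in profiles.supportedProfiles) return false

    // Profiles that can share a capture request with HLG10; an empty set means
    // there are no combination restrictions.
    val constraints = profiles.getProfileCaptureRequestConstraints(DynamicRangeProfiles.HLG10)
    // Whether HLG10 streams add extra capture latency compared to SDR.
    val extraLatency = profiles.isExtraLatencyPresent(DynamicRangeProfiles.HLG10)

    Log.d("TenBitTest", "HLG10 constraints=$constraints, extraLatency=$extraLatency")
    return true
}
```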
Use Cases of 10-Bit Capture
To compare the native camera app with third-party software, record videos of a variety of scenes using both the native camera app and the Camera2Video sample app.
The use cases which can be thought of are as follows:
- A vivid outdoor landscape with shiny objects, such as a car's chrome bumper, that reflect the light and create brilliant highlights. This validates the creation of vivid images with brightly colored highlights.
- Depending on the device's tuning, the HDR clip appears to have a brighter screen than the SDR clip in the bright output scene.
- Depending on headroom, the total scene's screen brightness for the HDR clip needs to be higher.
- In the mid-range, low dynamic range indoor capture, the color and tone of the HDR and SDR clips are similar, with the HDR capture possibly being brighter than the SDR one. The HDR capture should not be darker than the SDR one; if tuning decisions prevent this, make sure that the third-party app's behavior matches that of the built-in camera app.
- A setting that ranges from medium to low light with a bright object, like a candle or a little bright light, produces a wide brightness range. This demonstrates the dynamic range and auto-exposure behavior.
- A medium-sized, low-dynamic range scene, such as a natural inside setting in a house or workplace.
Conclusion
Comparing video captures made with SDR (no HDR profile) to HDR videos shows whether the benefits of using a 10-bit dynamic range profile over a conventional dynamic range profile are apparent in the captured footage. We hope this article helped you understand the main concept behind implementing 10-bit recording support and how it can benefit your app. Do check out the other articles in the Android 13 series.