I've been developing an Android app (using ChatGPT - I have no real Android development knowledge!) that animates a dog's mouth over the camera feed. I had the app working okay, but with some latency between the camera feed and the overlay (using CameraX and a PreviewView), so I tried to switch to an offscreen rendering approach: an EGL context and a MediaCodec surface that render the camera feed plus overlays into a single composited output.
However, I have not implemented this correctly, and the camera feed now just shows a black screen.
Here's ChatGPT's summary of what we had originally (working) and what we changed it to. Both approaches are saved on separate GitHub branches, which you will have access to: the previously working version is on the master branch, and the newly updated (not working) version is on a test branch.
What We Had Originally
- An Android app (Kotlin) using CameraX and a PreviewView to display the live camera feed.
- A BoundingBoxOverlay custom View drawn on top for YOLO pose detection (dog's mouth, eyes, etc.).
- We wanted to animate the dog's mouth in sync with audio but were limited by the traditional overlay.
What We Changed It To (Offscreen Rendering Approach)
- Created an EGL context and MediaCodec surface (via OffscreenRenderer.kt) to render the camera feed plus overlays into a single composited output.
- The camera feed is now provided by a SurfaceTexture created from an external OES texture (rather than using PreviewView).
- CameraX is configured to output frames to that SurfaceTexture.
- A custom pipeline composites the camera and any overlays (animated mouth, etc.) in OpenGL, then encodes to a video file with MediaCodec.
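For reference, this is roughly how I understand the CameraX-to-SurfaceTexture wiring is supposed to work (a minimal sketch, not the actual code from the branch; names like `oesTextureId`, `glHandler`, and `glExecutor` are placeholders for whatever OffscreenRenderer.kt actually uses):

```kotlin
// Sketch: feed CameraX Preview frames into a SurfaceTexture backed by an
// external OES texture, instead of a PreviewView.
val surfaceTexture = SurfaceTexture(oesTextureId).apply {
    // Should match the resolution CameraX actually delivers; a mismatch
    // here is one possible source of a black or garbled feed.
    setDefaultBufferSize(1280, 720)
    // Fires on each new camera frame; updateTexImage() must then be
    // called on the thread that owns the EGL context.
    setOnFrameAvailableListener({ frameAvailable = true }, glHandler)
}

val preview = Preview.Builder().build().also {
    it.setSurfaceProvider { request ->
        val surface = Surface(surfaceTexture)
        request.provideSurface(surface, glExecutor) { surface.release() }
    }
}
```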
What We're Trying to Achieve
A Snapchat-like pipeline:
- Grab the camera feed directly via a SurfaceTexture (external OES texture).
- Apply real-time overlays/filters (like mouth animation, hats, glasses) in OpenGL.
- Composite everything offscreen and encode it into a video file (for recording).
- Avoid the performance constraints of standard Android Views by relying on OpenGL compositing (similar to Snapchat filters).
Despite these changes, the feed currently displays a black screen. We suspect an issue with the GL pipeline, a resolution mismatch, or the camera frames not being drawn as intended.
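In case it helps narrow things down, these are the points where external-OES pipelines most commonly come out black (a sketch of what I'd check, not code from the repo; `frameAvailable`, `texMatrix`, and `oesTextureId` are assumed names):

```kotlin
// 1. On the GL thread, per frame: without updateTexImage() the OES
//    texture never receives camera pixels, which renders as black.
if (frameAvailable) {
    frameAvailable = false
    surfaceTexture.updateTexImage()
    surfaceTexture.getTransformMatrix(texMatrix) // apply in the vertex shader
}

// 2. The fragment shader must declare the external-image extension and
//    use samplerExternalOES; sampling an external texture through a
//    plain sampler2D also yields black on many devices.
val fragmentShader = """
    #extension GL_OES_EGL_image_external : require
    precision mediump float;
    varying vec2 vTexCoord;
    uniform samplerExternalOES uCameraTex;
    void main() { gl_FragColor = texture2D(uCameraTex, vTexCoord); }
""".trimIndent()

// 3. The camera texture must be bound as GL_TEXTURE_EXTERNAL_OES,
//    not GL_TEXTURE_2D.
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, oesTextureId)
```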
Goal: Fix the offscreen renderer so that camera frames plus overlays appear in real time, and video recording also works as expected.