
Live555 Streaming in Android Application: A Practical Approach from eInfochips

Since its 2008 debut, Android has grown into a widely adopted operating system, extending beyond smartphones and TVs to embedded devices such as vending machines, parking meters, binoculars, and medical devices. At eInfochips, we have contributed to 70+ embedded products based on the Android operating system, across applications including AR, VR, medical, and video conferencing solutions. Our customers frequently highlight live streaming from Android-powered embedded devices as a fundamental requirement across various use cases.

Leveraging our expertise in Android customization, this blog explores implementing live streaming on an Android-based embedded product, detailing how to port Live555 to the platform and design an architecture from application to native interface to enable live video streaming. 

Key Components 

First, let’s start with the basics of live streaming. Live streaming requires core hardware components such as a camera module with an Image Signal Processor (ISP), a hardware encoder, and a network interface such as Ethernet or Wi-Fi. On the software side, an implementation is required to encode the camera frames and transmit them over the network as TCP/UDP packets.

 

[Figure: Key components of the live streaming pipeline]

 

Bayer camera sensors provide RAW frame data, which is processed by the on-chip ISP to produce YUV frames. YUV sensors provide ISP-processed YUV frames directly, which can be passed on to the camera software stack (e.g., driver/HAL). These frames are propagated to the application layer through the standard Android OS architecture. Vendors such as NXP and Qualcomm have different vendor HAL implementations, but the interface with the application framework follows Android standards. At the application layer, the YUV frames are fed to hardware codecs, which encode them to H.264 or H.265. The encoded frames are then provided to Live555. (For debugging, the encoded frames can be dumped to a file and played back with tools such as VLC or FFmpeg.)

The Live555 source is available at Live555Media. It is implemented in C++ and released under the LGPL license; refer to the license terms for details. I used Live555 version 2023.11.07 (recorded in liveMedia/include/liveMedia_version.hh) along with Android 10 for one of our customers’ products. Live555 needs to be cross-compiled for the target architecture (ARM/ARM64) and can be integrated into the application through a Java Native Interface (JNI) implementation. Live555 uses the standard network socket APIs, which are supported at the JNI layer of an Android application. Encoded frames received by Live555 are packetized and transmitted over the network as a Real-Time Streaming Protocol (RTSP) video stream, which can be played by any RTSP-capable player such as VLC or FFmpeg.

 

 

[Figure: High-level architecture of the Android application]

The above diagram represents the high-level architecture of the Android application. The application consists of a Java/Kotlin implementation and a native C/C++ implementation, built as a dynamic library (or a group of libraries) loaded at runtime. It has three major sub-modules that run in parallel and depend on each other to accomplish the live video streaming use case.

Camera Manager 

Camera Manager is a Java/Kotlin implementation built on Android’s Camera2 APIs. It manages tasks such as opening and closing the camera device, creating capture sessions, and configuring capture requests to obtain YUV frames from the camera. YUV frames received at the application layer are put into a double-ended queue (yuv_deque).
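
The blog only names the deque (yuv_deque); the sketch below is one possible shape for it, with illustrative class and method names. A bounded, drop-oldest queue between the camera callback thread and the encoder thread keeps memory usage bounded (one of the challenges noted later) at the cost of occasionally losing a frame:

```java
import java.util.ArrayDeque;

/**
 * Thread-safe, bounded holder for YUV frames handed off from the
 * camera callback thread to the encoder thread. When the queue is
 * full, the oldest frame is dropped so memory stays bounded.
 * Class and method names are illustrative, not from the blog.
 */
public class YuvFrameQueue {
    private final ArrayDeque<byte[]> deque = new ArrayDeque<>();
    private final int capacity;
    private long dropped = 0;

    public YuvFrameQueue(int capacity) { this.capacity = capacity; }

    /** Called from the camera callback thread with a copied YUV buffer. */
    public synchronized void offer(byte[] yuvFrame) {
        if (deque.size() == capacity) {   // queue full: drop the oldest frame
            deque.pollFirst();
            dropped++;
        }
        deque.addLast(yuvFrame);
        notifyAll();                      // wake up the encoder thread
    }

    /** Called from the encoder thread; blocks until a frame is available. */
    public synchronized byte[] take() {
        try {
            while (deque.isEmpty()) wait();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return null;
        }
        return deque.pollFirst();
    }

    public synchronized int size() { return deque.size(); }
    public synchronized long droppedCount() { return dropped; }
}
```

In a real application the byte arrays would be copies of the `Image` planes delivered by an `ImageReader` callback; the drop-oldest policy is a design choice that favors low latency over completeness.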

Encoder 

Encoder is a Java/Kotlin implementation based on Android’s MediaCodec APIs, used for encoding the YUV frames on hardware codecs. The MediaCodec APIs expose various configurations for the hardware codecs, e.g., color format, bitrate, frame rate, and encoder type. MediaCodec works on an input/output buffer mechanism: YUV frame data is submitted to input buffers, and encoded data is retrieved from output buffers. The encoded data is then passed to the native layer via JNI.
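
One detail worth noting at this handoff: MediaCodec’s H.264/H.265 encoders emit NAL units in Annex-B form, i.e. prefixed with a 0x00000001 (or 0x000001) start code, while Live555’s discrete framer classes expect NAL units without that prefix. A common preprocessing step before handing a frame down via JNI is therefore stripping the leading start code; a minimal sketch (the helper name is ours, not from the blog):

```java
/**
 * Strips the leading Annex-B start code (0x00000001 or 0x000001)
 * from an encoded NAL unit, as produced by MediaCodec, so it can be
 * fed to a Live555 "discrete" framer, which expects bare NAL units.
 */
public class AnnexB {
    public static byte[] stripStartCode(byte[] nal) {
        int offset = 0;
        if (nal.length >= 4 && nal[0] == 0 && nal[1] == 0
                && nal[2] == 0 && nal[3] == 1) {
            offset = 4;                       // 4-byte start code
        } else if (nal.length >= 3 && nal[0] == 0 && nal[1] == 0
                && nal[2] == 1) {
            offset = 3;                       // 3-byte start code
        }
        byte[] out = new byte[nal.length - offset];
        System.arraycopy(nal, offset, out, 0, out.length);
        return out;
    }
}
```

Buffers without a start code pass through unchanged, so the helper is safe to apply to every output buffer.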

Live555 

The native C/C++ layer that contains the Live555 implementation must be compliant with the LGPL. The internal structure should be designed to subclass the Live555 library code without modifying it. A wrapper library can be created with a deque (enc_deque) holding the encoded frames received via JNI. Subclasses can then be created to start the RTSP server and to create the media subsession for H.264 or H.265. Encoded frames are picked from enc_deque and transmitted over the network.
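
Setting up the H.264 subsession typically also requires the SPS and PPS parameter sets, which the RTSP server advertises in the session description (sprop-parameter-sets in the SDP). MediaCodec delivers them together in a single buffer flagged BUFFER_FLAG_CODEC_CONFIG, as concatenated Annex-B NAL units, so they need to be split before being handed to the server. A sketch of such a splitter, with illustrative names:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Splits an Annex-B byte stream (e.g. MediaCodec's codec-config
 * buffer, which holds SPS followed by PPS for H.264) into individual
 * NAL units with the start codes removed. Names are illustrative.
 */
public class CodecConfigSplitter {
    public static List<byte[]> splitNalUnits(byte[] buf) {
        List<Integer> starts = new ArrayList<>();         // start-code positions
        List<Integer> payloadOffsets = new ArrayList<>(); // first payload byte
        for (int i = 0; i + 3 <= buf.length; i++) {
            if (buf[i] == 0 && buf[i + 1] == 0) {
                if (buf[i + 2] == 1) {                    // 3-byte start code
                    starts.add(i); payloadOffsets.add(i + 3); i += 2;
                } else if (i + 4 <= buf.length
                        && buf[i + 2] == 0 && buf[i + 3] == 1) { // 4-byte code
                    starts.add(i); payloadOffsets.add(i + 4); i += 3;
                }
            }
        }
        List<byte[]> nals = new ArrayList<>();
        for (int n = 0; n < starts.size(); n++) {
            int from = payloadOffsets.get(n);
            int to = (n + 1 < starts.size()) ? starts.get(n + 1) : buf.length;
            byte[] nal = new byte[to - from];
            System.arraycopy(buf, from, nal, 0, nal.length);
            nals.add(nal);
        }
        return nals;
    }
}
```

For H.264 the result is typically two NAL units (SPS, then PPS), which the native wrapper can cache and supply when the subsession is created.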

Challenges faced during the implementation 

  • Live555 does not provide a makefile configuration for the Android platform 
  • Handling a queue of YUV frames in the application is memory-intensive 
  • Proper synchronization is crucial for managing input buffers (YUV frames) and output buffers (encoded frames) effectively 
  • Live555 events (e.g., client connected, client disconnected) must be surfaced to the Android application layer 

 

By addressing these challenges, the outlined architecture was successfully implemented for one of our customers, achieving a streaming latency of 200 ms for 1080p@30fps, measured from the camera module to the RTSP player.

Conclusion

Android is widely used in embedded products, and the architecture described above can be implemented on such custom products. It is a good fit for products with live video streaming use cases, such as digital cameras, smart binoculars, smart glasses, civil engineering instruments, and medical devices.


Ronak Patel

Mr. Ronak Patel is currently serving as a Technical Lead in the Embedded Software group at eInfochips. He has worked on various products, including marine tablets, digital binoculars, and wearable devices running Android or Linux. Mr. Patel specializes in Android framework and middleware customization, with extensive experience in porting and optimizing different Android versions for embedded products on platforms such as Qualcomm (APQ8096, SDM845, QCS605, SDM429W) and NXP (iMX6).
