
Audio & Video Sync Issue #1649

Open · nirmala-ncompass opened this issue Nov 21, 2024 · 5 comments

nirmala-ncompass commented Nov 21, 2024

Hi @pedroSG94, my custom implementation is as follows:
I am recording audio and video to a file. I extended and modified the AndroidMuxerRecordController class so that I receive the audio and video buffers in a single method (onBufferAvailable). Based on the video frame count I split the stream into chunks and write each chunk to a different file, resetting the buffer PTS values for every file. Please check the code below. Audio and video are fine in the local recordings, but when I stitch those videos together in MediaLive, the video is fine while the audio plays with a slight delay; they are not in sync. Could you please help me set new PTS values for the buffers so that audio and video stay in sync within each file?

```java
private static final int VIDEO_TIME_BASE = 1000 * 1000; // microseconds
private static final int AUDIO_TIME_BASE = 1000 * 1000; // microseconds
private static final int VIDEO_FRAME_RATE = 30;
private static final long VIDEO_FRAME_DURATION = VIDEO_TIME_BASE / VIDEO_FRAME_RATE;
private static final int AUDIO_SAMPLE_RATE = 32000; // 32 kHz
private static final int AUDIO_FRAME_SIZE = 1024; // AAC frame size
private static final long AUDIO_FRAME_DURATION = (long) AUDIO_FRAME_SIZE * AUDIO_TIME_BASE / AUDIO_SAMPLE_RATE;

private long videoPtsCounter = 0;
private long audioPtsCounter = 0;

public void onBufferAvailable(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo, boolean isVideo) {
    synchronized (this) {
        // Copy the ByteBuffer to prevent overwriting
        ByteBuffer bufferCopy = ByteBuffer.allocate(byteBuffer.remaining());
        bufferCopy.put(byteBuffer);
        bufferCopy.flip();

        long calculatedPts;
        if (isVideo) {
            // Calculate video PTS from the frame index and a fixed 30 fps frame duration
            calculatedPts = videoPtsCounter * VIDEO_FRAME_DURATION;
            videoPtsCounter++;
        } else {
            // Calculate audio PTS from the AAC frame index
            calculatedPts = audioPtsCounter * AUDIO_FRAME_DURATION;
            audioPtsCounter++;
        }

        // Set the final adjusted PTS
        bufferInfo.presentationTimeUs = calculatedPts;

        // Create buffer data object
        BufferData bufferData = new BufferData(bufferCopy, new MediaCodec.BufferInfo(), isVideo);
        bufferData.bufferInfo.set(bufferInfo.offset, bufferInfo.size, bufferInfo.presentationTimeUs, bufferInfo.flags);

        // Track frame counts
        if (isVideo) {
            videoFrameCounter++;
        } else {
            audioFrameCounter++;
        }

        // Add buffer data to the active buffer list
        if (useFirstArray) {
            bufferList1.add(bufferData);
        } else {
            bufferList2.add(bufferData);
        }

        // Process the buffered chunk once we hit the video frame limit
        if (videoFrameCounter >= MAX_VIDEO_FRAMES) {
            processBufferInBackground(useFirstArray ? bufferList1 : bufferList2, false);
            if (useFirstArray) {
                bufferList1.clear();
            } else {
                bufferList2.clear();
            }
            audioPtsCounter = 0;
            videoPtsCounter = 0;
            audioFrameCounter = 0;
            videoFrameCounter = 0; // Reset frame counters
            useFirstArray = !useFirstArray; // Switch buffer list
        }
    }
}
```

pedroSG94 (Owner) commented

Hello,

Did you try using the clock to create the PTS instead of calculating it from the config?
You can try it for both (video/audio) or only for one of them as a test:

bufferInfo.presentationTimeUs = System.nanoTime() / 1000 - presentTimeUs;

Where presentTimeUs is captured when you start the encoder or receive the first frame:

System.nanoTime() / 1000
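
A minimal sketch of that clock-based approach, assuming the same onBufferAvailable hook as above (the presentTimeUs field comes from this comment; everything else is illustrative):

```java
// Sketch only: derive PTS from the wall clock instead of a frame counter.
private long presentTimeUs = -1; // captured when the first buffer arrives (or when the encoder starts)

public void onBufferAvailable(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo, boolean isVideo) {
    synchronized (this) {
        if (presentTimeUs < 0) {
            presentTimeUs = System.nanoTime() / 1000; // first buffer defines time zero
        }
        // Audio and video share one base, so both streams live on the same timeline
        bufferInfo.presentationTimeUs = System.nanoTime() / 1000 - presentTimeUs;
        // ... copy the buffer and enqueue it as in the original implementation ...
    }
}
```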

nirmala-ncompass commented Nov 25, 2024

Thanks @pedroSG94. I tried your logic for the PTS calculation, but in MediaLive the entire audio plays back crackled; in the local video it is fine. Basically I want the video and audio to be in sync and of good quality, and the buffers of every file need their PTS reset. Is there any way to find the received FPS in any class method? How can I achieve this custom implementation with good audio and video quality? Please give me some insights.
```java
if (isVideo && videoBaseTimestamp == -1) {
    videoBaseTimestamp = System.nanoTime() / 1000;
} else if (!isVideo && audioBaseTimestamp == -1) {
    audioBaseTimestamp = System.nanoTime() / 1000;
}

if (isVideo) {
    bufferInfo.presentationTimeUs = System.nanoTime() / 1000 - videoBaseTimestamp;
} else {
    bufferInfo.presentationTimeUs = System.nanoTime() / 1000 - audioBaseTimestamp;
}
```

pedroSG94 (Owner) commented

Hello,

I have another possible solution.

If you always have the same delay, you can try shifting the start PTS of the audio artificially while keeping your first way of calculating the PTS.
For example, if the audio has a delay of 500 ms, you can start its PTS at 500_000 (500 ms in microseconds, since presentationTimeUs is expressed in µs) instead of 0.
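
A minimal sketch of that idea on top of the counter-based calculation from the first snippet (the 500 ms figure and the AUDIO_START_OFFSET_US constant are illustrative; the real offset has to be measured):

```java
// Sketch only: compensate a constant audio delay by starting the audio PTS at an offset.
private static final long AUDIO_START_OFFSET_US = 500_000; // assumed 500 ms delay, in microseconds

long calculatedPts;
if (isVideo) {
    calculatedPts = videoPtsCounter * VIDEO_FRAME_DURATION;
    videoPtsCounter++;
} else {
    // Audio timeline begins at the offset instead of 0
    calculatedPts = AUDIO_START_OFFSET_US + audioPtsCounter * AUDIO_FRAME_DURATION;
    audioPtsCounter++;
}
bufferInfo.presentationTimeUs = calculatedPts;
```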

nirmala-ncompass (Author) commented

Hi @pedroSG94, audio and video are good and in sync on devices that deliver 30 FPS. On some devices the FPS is lower, around 14 to 28, and then audio and video fall out of sync. I am calculating VIDEO_FRAME_DURATION with a static 30 FPS value; is that the cause? Is there any way to know how many frames a device actually delivers? How can I calculate the PTS value in a way that supports all devices?
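
For reference, one way to estimate how many video frames a device actually delivers is to count buffers over a time window inside onBufferAvailable; this is only a sketch, not part of the library API:

```java
// Sketch only: measure the real incoming video frame rate over a one-second window.
private long fpsWindowStartUs = -1;
private int fpsWindowFrames = 0;
private float measuredFps = 30f; // fallback until the first window completes

private void updateMeasuredFps() { // call this for every video buffer
    long nowUs = System.nanoTime() / 1000;
    if (fpsWindowStartUs < 0) fpsWindowStartUs = nowUs;
    fpsWindowFrames++;
    long elapsedUs = nowUs - fpsWindowStartUs;
    if (elapsedUs >= 1_000_000) {
        measuredFps = fpsWindowFrames * 1_000_000f / elapsedUs;
        fpsWindowStartUs = nowUs;
        fpsWindowFrames = 0;
    }
}
```

The measured value could replace the static VIDEO_FRAME_RATE, although a clock-based video PTS avoids depending on the frame rate at all.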

pedroSG94 (Owner) commented

Hello,

You can try calculating only the video PTS using the clock, like this:

```java
if (isVideo && videoBaseTimestamp == -1) {
    videoBaseTimestamp = System.nanoTime() / 1000;
} else if (!isVideo && audioBaseTimestamp == -1) {
    audioBaseTimestamp = System.nanoTime() / 1000;
}

if (isVideo) {
    bufferInfo.presentationTimeUs = System.nanoTime() / 1000 - videoBaseTimestamp;
} else {
    bufferInfo.presentationTimeUs = System.nanoTime() / 1000 - audioBaseTimestamp;
}
```
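
Putting that suggestion together with the counter-based audio PTS from the first snippet, a minimal sketch could look like this (illustrative only, reusing videoBaseTimestamp, audioPtsCounter and AUDIO_FRAME_DURATION from the snippets above):

```java
// Sketch only: clock-based PTS for video, sample-counter PTS for audio.
long calculatedPts;
if (isVideo) {
    if (videoBaseTimestamp == -1) {
        videoBaseTimestamp = System.nanoTime() / 1000; // first video frame defines time zero
    }
    // Video follows the wall clock, so it matches the device's real (possibly variable) frame rate
    calculatedPts = System.nanoTime() / 1000 - videoBaseTimestamp;
} else {
    // Audio stays on the sample counter: 1024 samples per AAC frame at 32 kHz
    calculatedPts = audioPtsCounter * AUDIO_FRAME_DURATION;
    audioPtsCounter++;
}
bufferInfo.presentationTimeUs = calculatedPts;
```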
