Android & Kotlin

Camera2 API Android: Audio and Video out of Sync Issue


Introduction

If you are facing issues with video recording using the Camera2 API on Android, you are in the right place. The Camera2 API does not work well on a few devices, such as Samsung phones and the Nexus 5.

The following issues commonly occur with the Camera2 API on Android:

  • The recording completes successfully, but when you play the file the video freezes on the first frame.
  • The video appears frozen during playback, or the audio and video of the recorded clip are out of sync.
  • The audio plays completely first and only then does the video animate; if you look at the playback controls, the video starts when the seek bar already appears to be at the end.

All of the above are the same underlying issue.

It happens because the audio and video frames are out of sync, so the solution is to bring the audio and video tracks back into sync.

1. Prerequisites

If you want to know more about the Camera2 API, please read my previous blog, Camera2VideoRecording. In that post I built a plug-and-play solution for the Camera2 API.

2. Add the mp4parser dependency to build.gradle

dependencies {
    //Video Parser
    implementation 'com.googlecode.mp4parser:isoparser:1.1.22'
}
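After adding the dependency, sync the project so that the mp4parser classes used below (IsoFile, Movie, Mp4TrackImpl, DefaultMp4Builder, and so on) resolve; they all ship in the isoparser artifact.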

3. Update CameraFragment.java from the previous Camera2 video recording example

When we call the startRecordingVideo() utility method, a file is created in local storage. Get this file using the getCurrentFile() method.

startRecordingVideo();
// Receive the output file here
mOutputFilePath = getCurrentFile().getAbsolutePath();

mOutputFilePath is the path of the recorded file. Now fetch the audio and video tracks and check each track's first-sample delta. If the delta is greater than 10000, the audio and video tracks are out of sync. Why 10000? Because at 30 fps the normal video delta is about 3000, so 10000 is a comfortably safe threshold. If the tracks are out of sync, fix and rewrite them with the mp4parser library. The code looks like this:

private String parseVideo(String mFilePath) throws IOException {
    DataSource channel = new FileDataSourceImpl(mFilePath);
    IsoFile isoFile = new IsoFile(channel);
    List<TrackBox> trackBoxes = isoFile.getMovieBox().getBoxes(TrackBox.class);
    boolean isError = false;
    for (TrackBox trackBox : trackBoxes) {
        TimeToSampleBox.Entry firstEntry = trackBox.getMediaBox().getMediaInformationBox().getSampleTableBox().getTimeToSampleBox().getEntries().get(0);
        // Detect if the first sample is a problem and fix it in isoFile.
        // This is a hack: the audio deltas are 1024 for my files, and the video deltas are about 3000.
        // 10000 seems sufficient since for 30 fps the normal delta is about 3000.
        if (firstEntry.getDelta() > 10000) {
            isError = true;
            firstEntry.setDelta(3000);
        }
    }
    File file = getOutputMediaFile();
    String filePath = file.getAbsolutePath();
    if (isError) {
        Movie movie = new Movie();
        for (TrackBox trackBox : trackBoxes) {
            movie.addTrack(new Mp4TrackImpl(channel.toString() + "[" + trackBox.getTrackHeaderBox().getTrackId() + "]", trackBox));
        }
        movie.setMatrix(isoFile.getMovieBox().getMovieHeaderBox().getMatrix());
        Container out = new DefaultMp4Builder().build(movie);

        // Write the corrected container to a new output file
        FileChannel fc = new RandomAccessFile(filePath, "rw").getChannel();
        out.writeContainer(fc);
        fc.close();
        Log.d(TAG, "Finished correcting raw video");
        return filePath;
    }
    return mFilePath;
}
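Note that when a correction is needed, the fixed container is written to a brand-new file returned by getOutputMediaFile(), so the original recording is left untouched; parseVideo() simply returns whichever path should be used for playback.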

The getOutputMediaFile() method creates a new file and returns a File object.

/**
 * Create directory and return file
 * returning video file
 */
private File getOutputMediaFile() {
    // External sdcard file location
    File mediaStorageDir = new File(Environment.getExternalStorageDirectory(),
            VIDEO_DIRECTORY_NAME);
    // Create storage directory if it does not exist
    if (!mediaStorageDir.exists()) {
        if (!mediaStorageDir.mkdirs()) {
            Log.d(TAG, "Oops! Failed create "
                    + VIDEO_DIRECTORY_NAME + " directory");
            return null;
        }
    }
    String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss",
            Locale.getDefault()).format(new Date());
    File mediaFile;
 
    mediaFile = new File(mediaStorageDir.getPath() + File.separator
            + "VID_" + timeStamp + ".mp4");
    return mediaFile;
}
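Keep in mind that Environment.getExternalStorageDirectory() is deprecated as of API 29, and writing to it requires the WRITE_EXTERNAL_STORAGE permission on older versions. If the recording only needs to be played back inside your own app, a variant based on app-specific external storage avoids both issues. The following is just a sketch under those assumptions (the method name is illustrative, and it assumes it lives in the same fragment so getActivity(), TAG, and VIDEO_DIRECTORY_NAME are in scope):

private File getOutputMediaFileAppSpecific() {
    // App-specific external storage; no WRITE_EXTERNAL_STORAGE permission required
    File mediaStorageDir = new File(
            getActivity().getExternalFilesDir(Environment.DIRECTORY_MOVIES),
            VIDEO_DIRECTORY_NAME);
    // Create the storage directory if it does not exist
    if (!mediaStorageDir.exists() && !mediaStorageDir.mkdirs()) {
        Log.d(TAG, "Failed to create " + VIDEO_DIRECTORY_NAME + " directory");
        return null;
    }
    String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss",
            Locale.getDefault()).format(new Date());
    return new File(mediaStorageDir, "VID_" + timeStamp + ".mp4");
}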

The final fragment therefore looks like this:

CameraFragment.java
/**
 * A simple {@link Fragment} subclass.
 * Use the {@link CameraFragment#newInstance} factory method to
 * create an instance of this fragment.
 */
public class CameraFragment extends CameraVideoFragment {
 
    private static final String TAG = "CameraFragment";
    private static final String VIDEO_DIRECTORY_NAME = "AndroidWave";
    @BindView(R.id.mTextureView)
    AutoFitTextureView mTextureView;
    @BindView(R.id.mRecordVideo)
    ImageView mRecordVideo;
    @BindView(R.id.mVideoView)
    VideoView mVideoView;
    @BindView(R.id.mPlayVideo)
    ImageView mPlayVideo;
    Unbinder unbinder;
    private String mOutputFilePath;
 
 
    public CameraFragment() {
        // Required empty public constructor
    }
 
    /**
     * Use this factory method to create a new instance of
     * this fragment using the provided parameters.
     */
 
 
    public static CameraFragment newInstance() {
        CameraFragment fragment = new CameraFragment();
        Bundle args = new Bundle();
        fragment.setArguments(args);
        return fragment;
    }
 
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
 
    }
 
    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container,
                             Bundle savedInstanceState) {
        // Inflate the layout for this fragment
        View view = inflater.inflate(R.layout.fragment_camera, container, false);
        unbinder = ButterKnife.bind(this, view);
        return view;
    }
 
    @Override
    public int getTextureResource() {
        return R.id.mTextureView;
    }
 
    @Override
    protected void setUp(View view) {
 
    }
 
    @OnClick({R.id.mRecordVideo, R.id.mPlayVideo})
    public void onViewClicked(View view) {
        switch (view.getId()) {
            case R.id.mRecordVideo:
                /**
                 * If media is not recoding then start recording else stop recording
                 */
                if (mIsRecordingVideo) {
                    try {
                        stopRecordingVideo();
                        prepareViews();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
 
                } else {
                    startRecordingVideo();
                    mRecordVideo.setImageResource(R.drawable.ic_stop);
                    //Receive out put file here
                    mOutputFilePath = getCurrentFile().getAbsolutePath();
                }
                break;
            case R.id.mPlayVideo:
                mVideoView.start();
                mPlayVideo.setVisibility(View.GONE);
                break;
        }
    }
 
    private void prepareViews() {
        if (mVideoView.getVisibility() == View.GONE) {
            mVideoView.setVisibility(View.VISIBLE);
            mPlayVideo.setVisibility(View.VISIBLE);
            mTextureView.setVisibility(View.GONE);
            try {
                setMediaForRecordVideo();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
 
    private void setMediaForRecordVideo() throws IOException {
        mOutputFilePath = parseVideo(mOutputFilePath);
        // Set media controller
        mVideoView.setMediaController(new MediaController(getActivity()));
        mVideoView.requestFocus();
        mVideoView.setVideoPath(mOutputFilePath);
        mVideoView.seekTo(100);
        mVideoView.setOnCompletionListener(mp -> {
            // Reset player
            mVideoView.setVisibility(View.GONE);
            mTextureView.setVisibility(View.VISIBLE);
            mPlayVideo.setVisibility(View.GONE);
            mRecordVideo.setImageResource(R.drawable.ic_record);
        });
    }
 
    @Override
    public void onDestroyView() {
        super.onDestroyView();
        unbinder.unbind();
    }
 
    private String parseVideo(String mFilePath) throws IOException {
        DataSource channel = new FileDataSourceImpl(mFilePath);
        IsoFile isoFile = new IsoFile(channel);
        List<TrackBox> trackBoxes = isoFile.getMovieBox().getBoxes(TrackBox.class);
        boolean isError = false;
        for (TrackBox trackBox : trackBoxes) {
            TimeToSampleBox.Entry firstEntry = trackBox.getMediaBox().getMediaInformationBox().getSampleTableBox().getTimeToSampleBox().getEntries().get(0);
            // Detect if first sample is a problem and fix it in isoFile
            // This is a hack. The audio deltas are 1024 for my files, and video deltas about 3000
            // 10000 seems sufficient since for 30 fps the normal delta is about 3000
            if (firstEntry.getDelta() > 10000) {
                isError = true;
                firstEntry.setDelta(3000);
            }
        }
        File file = getOutputMediaFile();
        String filePath = file.getAbsolutePath();
        if (isError) {
            Movie movie = new Movie();
            for (TrackBox trackBox : trackBoxes) {
                movie.addTrack(new Mp4TrackImpl(channel.toString() + "[" + trackBox.getTrackHeaderBox().getTrackId() + "]", trackBox));
            }
            movie.setMatrix(isoFile.getMovieBox().getMovieHeaderBox().getMatrix());
            Container out = new DefaultMp4Builder().build(movie);
 
            // Write the corrected container to a new output file
            FileChannel fc = new RandomAccessFile(filePath, "rw").getChannel();
            out.writeContainer(fc);
            fc.close();
            Log.d(TAG, "Finished correcting raw video");
            return filePath;
        }
        return mFilePath;
    }
 
    /**
     * Create directory and return file
     * returning video file
     */
    private File getOutputMediaFile() {
        // External sdcard file location
        File mediaStorageDir = new File(Environment.getExternalStorageDirectory(),
                VIDEO_DIRECTORY_NAME);
        // Create storage directory if it does not exist
        if (!mediaStorageDir.exists()) {
            if (!mediaStorageDir.mkdirs()) {
                Log.d(TAG, "Oops! Failed create "
                        + VIDEO_DIRECTORY_NAME + " directory");
                return null;
            }
        }
        String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss",
                Locale.getDefault()).format(new Date());
        File mediaFile;
 
        mediaFile = new File(mediaStorageDir.getPath() + File.separator
                + "VID_" + timeStamp + ".mp4");
        return mediaFile;
    }
}

Download Sample Project – Camera2 API Android: Audio and Video out of Sync Issue


4 Comments

  1. Nilser Stip

    hi, very good tutorial … I am looking for a way to pause and resume recording, what I found is that several videos are recorded and then joined; could you implement that in a new tutorial?

  2. Works perfectly! Thanks a lot mate. This piece of code is golden.

  3. Brett Rose

    Hi there, I am having this problem on Samsung devices but when I use your code all it does is produce a black video. It overwrites the originally recorded video and doesn’t have video anymore. I’m not seeing any errors either.
