
Video Recording with Camera2 API in Android


Introduction

The Camera2 API is the upgraded camera device model in Android. Today many apps on the market, such as Instagram and Snapchat, ship rich camera features. Previously, we used the legacy Camera API for video and image capture.

Google introduced the Camera2 API in 2014 with Android Lollipop (API level 21). I recommend using Camera2 whenever you have no version constraint, because the Camera2 API is not supported below API level 21.

In the Camera2 model, the app submits capture requests to the camera device, each describing a single frame to capture. Requests are processed in order, and multiple requests can be in flight at a time.
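To make that request model concrete, here is a compressed sketch of the flow the rest of this article expands on: get a CameraManager, open a CameraDevice, create a CameraCaptureSession for your output Surface, and submit a repeating CaptureRequest. This is only an overview sketch; the class name is made up, and it assumes the CAMERA permission is already granted and that the Surface and Handler are created elsewhere.

import android.annotation.SuppressLint;
import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureRequest;
import android.os.Handler;
import android.support.annotation.NonNull;
import android.view.Surface;
import java.util.Collections;

public class Camera2FlowSketch {

  /** Opens the first camera and starts a repeating preview request on the given surface. */
  @SuppressLint("MissingPermission") // assumes the CAMERA permission was granted beforehand
  public static void startPreview(Context context, final Surface surface, final Handler handler)
      throws CameraAccessException {
    CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
    String cameraId = manager.getCameraIdList()[0];
    // 1. Open the camera device
    manager.openCamera(cameraId, new CameraDevice.StateCallback() {
      @Override
      public void onOpened(@NonNull CameraDevice device) {
        try {
          // 2. Build a capture request that targets the output surface
          final CaptureRequest.Builder builder =
              device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
          builder.addTarget(surface);
          // 3. Create a capture session for that surface
          device.createCaptureSession(Collections.singletonList(surface),
              new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession session) {
                  try {
                    // 4. Submit the request repeatedly to drive the preview
                    session.setRepeatingRequest(builder.build(), null, handler);
                  } catch (CameraAccessException e) {
                    e.printStackTrace();
                  }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                }
              }, handler);
        } catch (CameraAccessException e) {
          e.printStackTrace();
        }
      }

      @Override
      public void onDisconnected(@NonNull CameraDevice device) {
        device.close();
      }

      @Override
      public void onError(@NonNull CameraDevice device, int error) {
        device.close();
      }
    }, handler);
  }
}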

If you have already integrated the Camera2 API in your project and the recorded video appears frozen during playback, read our other article, Audio Video out of Sync Issue.

Prerequisite

  • Create a new Project
  • Set the minimum SDK to 21
  • Request the storage, microphone, and camera permissions
  • Set up the Camera2 API

Step-1. Project Setup

Create a new project in Android Studio: File menu => New Project => enter a project name => set the min SDK to 21 => select Empty Activity from the templates.

Step-2. Set minSdkVersion to 21

You must set minSdkVersion to 21, because the Camera2 API is not supported below API level 21. If you cannot raise the minimum SDK, guard every Camera2 code path at runtime, as shown in the sketch after the Gradle snippet.

android {
    compileSdkVersion 27
    defaultConfig {
        applicationId "com.androidwave.camera2video"
        minSdkVersion 21
        targetSdkVersion 27
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
        vectorDrawables.useSupportLibrary = true
    }
    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        }
    }
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}
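If you do have a hard constraint that prevents raising minSdkVersion to 21, every Camera2 entry point has to be guarded at runtime, and you would fall back to the legacy android.hardware.Camera API on older devices. A minimal guard, shown only as a sketch (the class name is an assumption, not part of the sample project):

import android.os.Build;

public final class CameraApiCheck {

  private CameraApiCheck() {
  }

  /** Camera2 (android.hardware.camera2) is only available on API 21 (Lollipop) and above. */
  public static boolean isCamera2Available() {
    return Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP;
  }
}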

Step-3. Add dependencies and request permissions

3.1 Add the dependencies below; Dexter handles runtime permission requests and ButterKnife handles view binding.
dependencies {
    implementation fileTree(include: ['*.jar'], dir: 'libs')
    implementation 'com.android.support:appcompat-v7:27.1.1'
    implementation 'com.android.support.constraint:constraint-layout:1.1.3'
    implementation 'com.android.support:support-v4:27.1.1'
    implementation 'com.android.support:design:27.1.1'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'com.android.support.test:runner:1.0.2'
    androidTestImplementation 'com.android.support.test.espresso:espresso-core:3.0.2'
    // request permission
    implementation 'com.karumi:dexter:4.2.0'
    // ButterKnife Dependency Injection
    implementation 'com.jakewharton:butterknife:8.8.1'
    annotationProcessor 'com.jakewharton:butterknife-compiler:8.8.1'
}
3.2 Declare the permissions below in AndroidManifest.xml
 <!-- declare storage, camera and audio permission -->
 <uses-permission android:name="android.permission.CAMERA"/>
 <uses-permission android:name="android.permission.RECORD_AUDIO"/>
 <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
3.3 Request the camera, audio, and storage permissions at runtime
  /**
   * Requesting permissions storage, audio and camera at once
   */
  public void requestPermission() {
    Dexter.withActivity(getActivity())
        .withPermissions(Manifest.permission.CAMERA,
            Manifest.permission.RECORD_AUDIO,
            Manifest.permission.READ_EXTERNAL_STORAGE,
            Manifest.permission.WRITE_EXTERNAL_STORAGE)
        .withListener(new MultiplePermissionsListener() {
          @Override
          public void onPermissionsChecked(MultiplePermissionsReport report) {
            // check if all permissions are granted or not
            if (report.areAllPermissionsGranted()) {
              if (mTextureView.isAvailable()) {
                openCamera(mTextureView.getWidth(), mTextureView.getHeight());
              } else {
                mTextureView.setSurfaceTextureListener(mSurfaceTextureListener);
              }
            }
            // check for permanent denial of any permission show alert dialog
            if (report.isAnyPermissionPermanentlyDenied()) {
              // open Settings activity
              showSettingsDialog();
            }
          }
          @Override
          public void onPermissionRationaleShouldBeShown(List<PermissionRequest> permissions,
              PermissionToken token) {
            token.continuePermissionRequest();
          }
        })
        .withErrorListener(
            error -> Toast.makeText(getActivity().getApplicationContext(), "Error occurred! ",
                Toast.LENGTH_SHORT).show())
        .onSameThread()
        .check();
  }
  /**
   * Showing Alert Dialog with Settings option in case of deny any permission
   */
  private void showSettingsDialog() {
    AlertDialog.Builder builder = new AlertDialog.Builder(getActivity());
    builder.setTitle(getString(R.string.message_need_permission));
    builder.setMessage(getString(R.string.message_permission));
    builder.setPositiveButton(getString(R.string.title_go_to_setting), (dialog, which) -> {
      dialog.cancel();
      openSettings();
    });
    builder.show();
  }
  // navigating settings app
  private void openSettings() {
    Intent intent = new Intent(Settings.ACTION_APPLICATION_DETAILS_SETTINGS);
    Uri uri = Uri.fromParts("package", getActivity().getPackageName(), null);
    intent.setData(uri);
    startActivityForResult(intent, 101);
  }
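In the full CameraVideoFragment listing later in this article, requestPermission() is called from onResume() together with starting the background thread, so the camera is reopened whenever the fragment returns to the foreground:

  @Override
  public void onResume() {
    super.onResume();
    startBackgroundThread();  // camera callbacks run on this background handler
    requestPermission();      // opens the camera once all permissions are granted
  }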

Step-4. Create AutoFitTextureView.java for the camera preview

Create an AutoFitTextureView class that extends TextureView. The Java code looks like this:

package com.androidwave.camera2video.camera;
import android.content.Context;
import android.util.AttributeSet;
import android.view.TextureView;
  public class AutoFitTextureView extends TextureView {
    private int mRatioWidth = 0;
    private int mRatioHeight = 0;
    public AutoFitTextureView(Context context) {
      this(context, null);
    }
    public AutoFitTextureView(Context context, AttributeSet attrs) {
      this(context, attrs, 0);
    }
    public AutoFitTextureView(Context context, AttributeSet attrs, int defStyle) {
      super(context, attrs, defStyle);
    }
    /**
     * Sets the aspect ratio for this view. The size of the view will be measured based on the ratio
     * calculated from the parameters. Note that the actual sizes of parameters don't matter, that
     * is, calling setAspectRatio(2, 3) and setAspectRatio(4, 6) make the same result.
     *
     * @param width Relative horizontal size
     * @param height Relative vertical size
     */
    public void setAspectRatio(int width, int height) {
      if (width < 0 || height < 0) {
        throw new IllegalArgumentException("Size cannot be negative.");
      }
      mRatioWidth = width;
      mRatioHeight = height;
      requestLayout();
    }
    @Override
    protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
      super.onMeasure(widthMeasureSpec, heightMeasureSpec);
      int width = MeasureSpec.getSize(widthMeasureSpec);
      int height = MeasureSpec.getSize(heightMeasureSpec);
      if (0 == mRatioWidth || 0 == mRatioHeight) {
        setMeasuredDimension(width, height);
      } else {
        if (width < height * mRatioWidth / mRatioHeight) {
          setMeasuredDimension(width, width * mRatioHeight / mRatioWidth);
        } else {
          setMeasuredDimension(height * mRatioWidth / mRatioHeight, height);
        }
      }
    }
  }
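As a quick usage sketch (the helper and view ID here are placeholders; the real layout appears in Step 10), the view is looked up like any other TextureView and told which aspect ratio to keep:

  // Hypothetical helper: bind the preview view and fix its aspect ratio.
  private void bindPreviewView(View root) {
    AutoFitTextureView textureView = root.findViewById(R.id.mTextureView);
    // Only the ratio of the two arguments matters, so (16, 9) and (1920, 1080) behave the same.
    textureView.setAspectRatio(16, 9);
  }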

Step-5. Open Camera

Check that all permissions are granted, then open the camera using the code below:

  /**
   * Tries to open a {@link CameraDevice}. The result is listened by `mStateCallback`.
   */
  private void openCamera(int width, int height) {
    final Activity activity = getActivity();
    if (null == activity || activity.isFinishing()) {
      return;
    }
    CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
    try {
      Log.d(TAG, "tryAcquire");
      if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
        throw new RuntimeException("Time out waiting to lock camera opening.");
      }
      /**
       * Use the first camera in the list (usually the rear/back camera)
       */
      String cameraId = manager.getCameraIdList()[0];
      CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
      StreamConfigurationMap map = characteristics
          .get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
      mSensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
      if (map == null) {
        throw new RuntimeException("Cannot get available preview/video sizes");
      }
      mVideoSize = chooseVideoSize(map.getOutputSizes(MediaRecorder.class));
      mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class),
          width, height, mVideoSize);
      int orientation = getResources().getConfiguration().orientation;
      if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
        mTextureView.setAspectRatio(mPreviewSize.getWidth(), mPreviewSize.getHeight());
      } else {
        mTextureView.setAspectRatio(mPreviewSize.getHeight(), mPreviewSize.getWidth());
      }
      configureTransform(width, height);
      mMediaRecorder = new MediaRecorder();
      if (ActivityCompat.checkSelfPermission(getActivity(), Manifest.permission.CAMERA)
          != PackageManager.PERMISSION_GRANTED) {
        // TODO: Consider calling
        requestPermission();
        return;
      }
      manager.openCamera(cameraId, mStateCallback, null);
    } catch (CameraAccessException e) {
      Log.e(TAG, "openCamera: Cannot access the camera.");
    } catch (NullPointerException e) {
      Log.e(TAG, "Camera2API is not supported on the device.");
    } catch (InterruptedException e) {
      throw new RuntimeException("Interrupted while trying to lock camera opening.");
    }
  }
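The code above simply uses getCameraIdList()[0], which on most devices is the rear camera. If you specifically want the front camera, you can filter the ID list by LENS_FACING. This small sketch is not part of the original sample; it reuses the CameraCharacteristics classes already imported in the fragment:

  /** Returns the ID of the first front-facing camera, or null if the device has none. */
  private static String findFrontCameraId(CameraManager manager) throws CameraAccessException {
    for (String cameraId : manager.getCameraIdList()) {
      CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
      Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
      if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
        return cameraId;
      }
    }
    return null;
  }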

Step-6. Choose the video size

Choose the video size and aspect ratio. The aspect ratio should be 4:3 or 16:9, and we don't use sizes larger than 1080p, since MediaRecorder cannot handle such high-resolution video.

  /**
   * In this sample, we choose a video size with a 4:3 aspect ratio. Also, we don't use sizes
   * larger than 1080p, since MediaRecorder cannot handle such a high-resolution video.
   *
   * @param choices The list of available sizes
   * @return The chosen video size
   */
  private static Size chooseVideoSize(Size[] choices) {
    for (Size size : choices) {
      if (1920 == size.getWidth() && 1080 == size.getHeight()) {
        return size;
      }
    }
    for (Size size : choices) {
      if (size.getWidth() == size.getHeight() * 4 / 3 && size.getWidth() <= 1080) {
        return size;
      }
    }
    Log.e(TAG, "Couldn't find any suitable video size");
    return choices[choices.length - 1];
  }
  /**
   * Given {@code choices} of {@code Size}s supported by a camera, chooses the smallest one whose
   * width and height are at least as large as the respective requested values, and whose aspect
   * ratio matches with the specified value.
   *
   * @param choices The list of sizes that the camera supports for the intended output class
   * @param width The minimum desired width
   * @param height The minimum desired height
   * @param aspectRatio The aspect ratio
   * @return The optimal {@code Size}, or an arbitrary one if none were big enough
   */
  private static Size chooseOptimalSize(Size[] choices, int width, int height, Size aspectRatio) {
    // Collect the supported resolutions that are at least as big as the preview Surface
    List<Size> bigEnough = new ArrayList<>();
    int w = aspectRatio.getWidth();
    int h = aspectRatio.getHeight();
    for (Size option : choices) {
      if (option.getHeight() == option.getWidth() * h / w &&
          option.getWidth() >= width && option.getHeight() >= height) {
        bigEnough.add(option);
      }
    }
    // Pick the smallest of those, assuming we found any
    if (bigEnough.size() > 0) {
      return Collections.min(bigEnough, new CompareSizesByArea());
    } else {
      Log.e(TAG, "Couldn't find any suitable preview size");
      return choices[0];
    }
  }
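chooseOptimalSize() relies on a CompareSizesByArea comparator. It is defined near the end of the full CameraVideoFragment listing, but is reproduced here so this step reads on its own:

  /**
   * Compares two {@code Size}s based on their areas.
   */
  static class CompareSizesByArea implements Comparator<Size> {
    @Override
    public int compare(Size lhs, Size rhs) {
      // Cast to long so the multiplication cannot overflow
      return Long.signum((long) lhs.getWidth() * lhs.getHeight()
          - (long) rhs.getWidth() * rhs.getHeight());
    }
  }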

Step-7. Show the video preview

Once the permissions above are granted, render the preview on the AutoFitTextureView:

  /**
   * Start the camera preview.
   */
  private void startPreview() {
    if (null == mCameraDevice || !mTextureView.isAvailable() || null == mPreviewSize) {
      return;
    }
    try {
      closePreviewSession();
      SurfaceTexture texture = mTextureView.getSurfaceTexture();
      assert texture != null;
      texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
      mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
      Surface previewSurface = new Surface(texture);
      mPreviewBuilder.addTarget(previewSurface);
      mCameraDevice.createCaptureSession(Collections.singletonList(previewSurface),
          new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(@NonNull CameraCaptureSession session) {
              mPreviewSession = session;
              updatePreview();
            }
            @Override
            public void onConfigureFailed(@NonNull CameraCaptureSession session) {
              Log.e(TAG, "onConfigureFailed: Failed ");
            }
          }, mBackgroundHandler);
    } catch (CameraAccessException e) {
      e.printStackTrace();
    }
  }
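onConfigured() hands the session to updatePreview(), which is shown in the full listing further down. In essence it enables auto control mode and starts a repeating request on the background handler:

  /**
   * Update the camera preview. {@link #startPreview()} needs to be called in advance.
   */
  private void updatePreview() {
    if (null == mCameraDevice) {
      return;
    }
    try {
      mPreviewBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
      mPreviewSession.setRepeatingRequest(mPreviewBuilder.build(), null, mBackgroundHandler);
    } catch (CameraAccessException e) {
      e.printStackTrace();
    }
  }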

Step-8. Set up the MediaRecorder

Now the preview is rendering on screen. Before we can start recording video, let's configure the MediaRecorder:

  private void setUpMediaRecorder() throws IOException {
    final Activity activity = getActivity();
    if (null == activity) {
      return;
    }
    mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
    mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    /**
     * create video output file
     */
    mCurrentFile = getOutputMediaFile();
    /**
     * set output file in media recorder
     */
    mMediaRecorder.setOutputFile(mCurrentFile.getAbsolutePath());
    CamcorderProfile profile = CamcorderProfile.get(CamcorderProfile.QUALITY_480P);
    mMediaRecorder.setVideoFrameRate(profile.videoFrameRate);
    mMediaRecorder.setVideoSize(profile.videoFrameWidth, profile.videoFrameHeight);
    mMediaRecorder.setVideoEncodingBitRate(profile.videoBitRate);
    mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
    mMediaRecorder.setAudioEncodingBitRate(profile.audioBitRate);
    mMediaRecorder.setAudioSamplingRate(profile.audioSampleRate);
    int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
    switch (mSensorOrientation) {
      case SENSOR_ORIENTATION_DEFAULT_DEGREES:
        mMediaRecorder.setOrientationHint(DEFAULT_ORIENTATIONS.get(rotation));
        break;
      case SENSOR_ORIENTATION_INVERSE_DEGREES:
        mMediaRecorder.setOrientationHint(INVERSE_ORIENTATIONS.get(rotation));
        break;
    }
    mMediaRecorder.prepare();
  }
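setUpMediaRecorder() writes to a file returned by getOutputMediaFile(), also defined in the full listing below. It creates an AndroidWave folder on external storage and names each recording with a timestamp:

  /**
   * Create the output directory (if needed) and return a timestamped .mp4 file.
   */
  private File getOutputMediaFile() {
    File mediaStorageDir = new File(Environment.getExternalStorageDirectory(), VIDEO_DIRECTORY_NAME);
    // Create the storage directory if it does not exist
    if (!mediaStorageDir.exists() && !mediaStorageDir.mkdirs()) {
      Log.d(TAG, "Failed to create " + VIDEO_DIRECTORY_NAME + " directory");
      return null;
    }
    String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.getDefault()).format(new Date());
    return new File(mediaStorageDir.getPath() + File.separator + "VID_" + timeStamp + ".mp4");
  }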

Step-9. With the MediaRecorder configured, start video recording via the code below

public void startRecordingVideo() {
    if (null == mCameraDevice || !mTextureView.isAvailable() || null == mPreviewSize) {
      return;
    }
    try {
      closePreviewSession();
      setUpMediaRecorder();
      SurfaceTexture texture = mTextureView.getSurfaceTexture();
      assert texture != null;
      texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
      mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
      List<Surface> surfaces = new ArrayList<>();
      /**
       * Surface for the camera preview set up
       */
      Surface previewSurface = new Surface(texture);
      surfaces.add(previewSurface);
      mPreviewBuilder.addTarget(previewSurface);
      //MediaRecorder setup for surface
      Surface recorderSurface = mMediaRecorder.getSurface();
      surfaces.add(recorderSurface);
      mPreviewBuilder.addTarget(recorderSurface);
      // Start a capture session
      mCameraDevice.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
        @Override
        public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
          mPreviewSession = cameraCaptureSession;
          updatePreview();
          getActivity().runOnUiThread(() -> {
            mIsRecordingVideo = true;
            // Start recording
            mMediaRecorder.start();
          });
        }
        @Override
        public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
          Log.e(TAG, "onConfigureFailed: Failed");
        }
      }, mBackgroundHandler);
    } catch (CameraAccessException | IOException e) {
      e.printStackTrace();
    }
  }

Step-10. Stop video recording via the code below.

  public void stopRecordingVideo() throws Exception {
    // UI
    mIsRecordingVideo = false;
    try {
      mPreviewSession.stopRepeating();
      mPreviewSession.abortCaptures();
    } catch (CameraAccessException e) {
      e.printStackTrace();
    }
    // Stop recording
    mMediaRecorder.stop();
    mMediaRecorder.reset();
  }
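In practice, startRecordingVideo() and stopRecordingVideo() are driven from a single toggle; the record button click handler in CameraFragment (section 10.1 below) does exactly this. A minimal version of that toggle:

  // Toggle recording from a button click; mirrors the click handler in CameraFragment below.
  public void toggleRecording() {
    if (mIsRecordingVideo) {
      try {
        stopRecordingVideo();
      } catch (Exception e) {
        e.printStackTrace();
      }
    } else {
      startRecordingVideo();
    }
  }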
That's everything: the CameraVideoFragment utility is now ready to use. Simply extend CameraVideoFragment instead of Fragment. The final version of CameraVideoFragment.java looks like this:
package com.androidwave.camera2video.camera;
import android.Manifest;
import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.res.Configuration;
import android.graphics.Matrix;
import android.graphics.RectF;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.net.Uri;
import android.os.Bundle;
import android.os.Environment;
import android.os.Handler;
import android.os.HandlerThread;
import android.provider.Settings;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.support.v7.app.AlertDialog;
import android.util.Log;
import android.util.Size;
import android.util.SparseIntArray;
import android.view.Surface;
import android.view.TextureView;
import android.view.View;
import android.widget.Toast;
import com.androidwave.camera2video.R;
import com.androidwave.camera2video.ui.base.BaseFragment;
import com.karumi.dexter.Dexter;
import com.karumi.dexter.MultiplePermissionsReport;
import com.karumi.dexter.PermissionToken;
import com.karumi.dexter.listener.PermissionRequest;
import com.karumi.dexter.listener.multi.MultiplePermissionsListener;
import java.io.File;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.Date;
import java.util.List;
import java.util.Locale;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;
  public abstract class CameraVideoFragment extends BaseFragment {
    private static final String TAG = "CameraVideoFragment";
    private static final int SENSOR_ORIENTATION_INVERSE_DEGREES = 270;
    private static final int SENSOR_ORIENTATION_DEFAULT_DEGREES = 90;
    private static final SparseIntArray INVERSE_ORIENTATIONS = new SparseIntArray();
    private static final SparseIntArray DEFAULT_ORIENTATIONS = new SparseIntArray();
    static {
      INVERSE_ORIENTATIONS.append(Surface.ROTATION_270, 0);
      INVERSE_ORIENTATIONS.append(Surface.ROTATION_180, 90);
      INVERSE_ORIENTATIONS.append(Surface.ROTATION_90, 180);
      INVERSE_ORIENTATIONS.append(Surface.ROTATION_0, 270);
    }
    static {
      DEFAULT_ORIENTATIONS.append(Surface.ROTATION_90, 0);
      DEFAULT_ORIENTATIONS.append(Surface.ROTATION_0, 90);
      DEFAULT_ORIENTATIONS.append(Surface.ROTATION_270, 180);
      DEFAULT_ORIENTATIONS.append(Surface.ROTATION_180, 270);
    }
    private File mCurrentFile;
    private static final String VIDEO_DIRECTORY_NAME = "AndroidWave";
    /**
     * An {@link AutoFitTextureView} for camera preview.
     */
    private AutoFitTextureView mTextureView;
    /**
     * A reference to the opened {@link CameraDevice}.
     */
    private CameraDevice mCameraDevice;
    /**
     * A reference to the current {@link CameraCaptureSession} for
     * preview.
     */
    private CameraCaptureSession mPreviewSession;
    /**
     * {@link TextureView.SurfaceTextureListener} handles several lifecycle events on a
     * {@link TextureView}.
     */
    private TextureView.SurfaceTextureListener mSurfaceTextureListener
        = new TextureView.SurfaceTextureListener() {
      @Override
      public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture,
          int width, int height) {
        openCamera(width, height);
      }
      @Override
      public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture,
          int width, int height) {
        configureTransform(width, height);
      }
      @Override
      public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
        return true;
      }
      @Override
      public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) {
      }
    };
    /**
     * The {@link Size} of camera preview.
     */
    private Size mPreviewSize;
    /**
     * The {@link Size} of video recording.
     */
    private Size mVideoSize;
    /**
     * MediaRecorder
     */
    private MediaRecorder mMediaRecorder;
    /**
     * Whether the app is recording video now
     */
    public boolean mIsRecordingVideo;
    /**
     * An additional thread for running tasks that shouldn't block the UI.
     */
    private HandlerThread mBackgroundThread;
    /**
     * A {@link Handler} for running tasks in the background.
     */
    private Handler mBackgroundHandler;
    /**
     * A {@link Semaphore} to prevent the app from exiting before closing the camera.
     */
    private Semaphore mCameraOpenCloseLock = new Semaphore(1);
    /**
     * {@link CameraDevice.StateCallback} is called when {@link CameraDevice} changes its status.
     */
    private CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
      @Override
      public void onOpened(@NonNull CameraDevice cameraDevice) {
        mCameraDevice = cameraDevice;
        startPreview();
        mCameraOpenCloseLock.release();
        if (null != mTextureView) {
          configureTransform(mTextureView.getWidth(), mTextureView.getHeight());
        }
      }
      @Override
      public void onDisconnected(@NonNull CameraDevice cameraDevice) {
        mCameraOpenCloseLock.release();
        cameraDevice.close();
        mCameraDevice = null;
      }
      @Override
      public void onError(@NonNull CameraDevice cameraDevice, int error) {
        mCameraOpenCloseLock.release();
        cameraDevice.close();
        mCameraDevice = null;
        Activity activity = getActivity();
        if (null != activity) {
          activity.finish();
        }
      }
    };
    private Integer mSensorOrientation;
    private CaptureRequest.Builder mPreviewBuilder;
    /**
     * In this sample, we choose a video size with a 4:3 aspect ratio. Also, we don't use sizes
     * larger than 1080p, since MediaRecorder cannot handle such a high-resolution video.
     *
     * @param choices The list of available sizes
     * @return The chosen video size
     */
    private static Size chooseVideoSize(Size[] choices) {
      for (Size size : choices) {
        if (1920 == size.getWidth() && 1080 == size.getHeight()) {
          return size;
        }
      }
      for (Size size : choices) {
        if (size.getWidth() == size.getHeight() * 4 / 3 && size.getWidth() <= 1080) {
          return size;
        }
      }
      Log.e(TAG, "Couldn't find any suitable video size");
      return choices[choices.length - 1];
    }
    /**
     * Given {@code choices} of {@code Size}s supported by a camera, chooses the smallest one whose
     * width and height are at least as large as the respective requested values, and whose aspect
     * ratio matches with the specified value.
     *
     * @param choices The list of sizes that the camera supports for the intended output class
     * @param width The minimum desired width
     * @param height The minimum desired height
     * @param aspectRatio The aspect ratio
     * @return The optimal {@code Size}, or an arbitrary one if none were big enough
     */
    private static Size chooseOptimalSize(Size[] choices, int width, int height, Size aspectRatio) {
      // Collect the supported resolutions that are at least as big as the preview Surface
      List<Size> bigEnough = new ArrayList<>();
      int w = aspectRatio.getWidth();
      int h = aspectRatio.getHeight();
      for (Size option : choices) {
        if (option.getHeight() == option.getWidth() * h / w &&
            option.getWidth() >= width && option.getHeight() >= height) {
          bigEnough.add(option);
        }
      }
      // Pick the smallest of those, assuming we found any
      if (bigEnough.size() > 0) {
        return Collections.min(bigEnough, new CompareSizesByArea());
      } else {
        Log.e(TAG, "Couldn't find any suitable preview size");
        return choices[0];
      }
    }
    public abstract int getTextureResource();
    @Override
    public void onViewCreated(final View view, Bundle savedInstanceState) {
      mTextureView = view.findViewById(getTextureResource());
    }
    @Override
    public void onResume() {
      super.onResume();
      startBackgroundThread();
      requestPermission();
    }
    @Override
    public void onPause() {
      closeCamera();
      stopBackgroundThread();
      super.onPause();
    }
    protected File getCurrentFile() {
      return mCurrentFile;
    }
    /**
     * Starts a background thread and its {@link Handler}.
     */
    private void startBackgroundThread() {
      mBackgroundThread = new HandlerThread("CameraBackground");
      mBackgroundThread.start();
      mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
    }
    /**
     * Stops the background thread and its {@link Handler}.
     */
    private void stopBackgroundThread() {
      mBackgroundThread.quitSafely();
      try {
        mBackgroundThread.join();
        mBackgroundThread = null;
        mBackgroundHandler = null;
      } catch (InterruptedException e) {
        e.printStackTrace();
      }
    }
    /**
     * Requesting permissions storage, audio and camera at once
     */
    public void requestPermission() {
      Dexter.withActivity(getActivity())
          .withPermissions(Manifest.permission.CAMERA,
              Manifest.permission.RECORD_AUDIO,
              Manifest.permission.READ_EXTERNAL_STORAGE,
              Manifest.permission.WRITE_EXTERNAL_STORAGE)
          .withListener(new MultiplePermissionsListener() {
            @Override
            public void onPermissionsChecked(MultiplePermissionsReport report) {
              // check if all permissions are granted or not
              if (report.areAllPermissionsGranted()) {
                if (mTextureView.isAvailable()) {
                  openCamera(mTextureView.getWidth(), mTextureView.getHeight());
                } else {
                  mTextureView.setSurfaceTextureListener(mSurfaceTextureListener);
                }
              }
              // check for permanent denial of any permission show alert dialog
              if (report.isAnyPermissionPermanentlyDenied()) {
                // open Settings activity
                showSettingsDialog();
              }
            }
            @Override
            public void onPermissionRationaleShouldBeShown(List<PermissionRequest> permissions,
                PermissionToken token) {
              token.continuePermissionRequest();
            }
          })
          .withErrorListener(
              error -> Toast.makeText(getActivity().getApplicationContext(), "Error occurred! ",
                  Toast.LENGTH_SHORT).show())
          .onSameThread()
          .check();
    }
    /**
     * Showing Alert Dialog with Settings option in case of deny any permission
     */
    private void showSettingsDialog() {
      AlertDialog.Builder builder = new AlertDialog.Builder(getActivity());
      builder.setTitle(getString(R.string.message_need_permission));
      builder.setMessage(getString(R.string.message_permission));
      builder.setPositiveButton(getString(R.string.title_go_to_setting), (dialog, which) -> {
        dialog.cancel();
        openSettings();
      });
      builder.show();
    }
    // navigating settings app
    private void openSettings() {
      Intent intent = new Intent(Settings.ACTION_APPLICATION_DETAILS_SETTINGS);
      Uri uri = Uri.fromParts("package", getActivity().getPackageName(), null);
      intent.setData(uri);
      startActivityForResult(intent, 101);
    }
    /**
     * Tries to open a {@link CameraDevice}. The result is listened by `mStateCallback`.
     */
    private void openCamera(int width, int height) {
      final Activity activity = getActivity();
      if (null == activity || activity.isFinishing()) {
        return;
      }
      CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
      try {
        Log.d(TAG, "tryAcquire");
        if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
          throw new RuntimeException("Time out waiting to lock camera opening.");
        }
        /**
         * Use the first camera in the list (usually the rear/back camera)
         */
        String cameraId = manager.getCameraIdList()[0];
        CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
        StreamConfigurationMap map = characteristics
            .get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
        mSensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
        if (map == null) {
          throw new RuntimeException("Cannot get available preview/video sizes");
        }
        mVideoSize = chooseVideoSize(map.getOutputSizes(MediaRecorder.class));
        mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class),
            width, height, mVideoSize);
        int orientation = getResources().getConfiguration().orientation;
        if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
          mTextureView.setAspectRatio(mPreviewSize.getWidth(), mPreviewSize.getHeight());
        } else {
          mTextureView.setAspectRatio(mPreviewSize.getHeight(), mPreviewSize.getWidth());
        }
        configureTransform(width, height);
        mMediaRecorder = new MediaRecorder();
        if (ActivityCompat.checkSelfPermission(getActivity(), Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
          // TODO: Consider calling
          requestPermission();
          return;
        }
        manager.openCamera(cameraId, mStateCallback, null);
      } catch (CameraAccessException e) {
        Log.e(TAG, "openCamera: Cannot access the camera.");
      } catch (NullPointerException e) {
        Log.e(TAG, "Camera2API is not supported on the device.");
      } catch (InterruptedException e) {
        throw new RuntimeException("Interrupted while trying to lock camera opening.");
      }
    }
    /**
     * Create directory and return file
     * returning video file
     */
    private File getOutputMediaFile() {
      // External sdcard file location
      File mediaStorageDir = new File(Environment.getExternalStorageDirectory(),
          VIDEO_DIRECTORY_NAME);
      // Create storage directory if it does not exist
      if (!mediaStorageDir.exists()) {
        if (!mediaStorageDir.mkdirs()) {
          Log.d(TAG, "Oops! Failed create "
              + VIDEO_DIRECTORY_NAME + " directory");
          return null;
        }
      }
      String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss",
          Locale.getDefault()).format(new Date());
      File mediaFile;
      mediaFile = new File(mediaStorageDir.getPath() + File.separator
          + "VID_" + timeStamp + ".mp4");
      return mediaFile;
    }
    /**
     * close camera and release object
     */
    private void closeCamera() {
      try {
        mCameraOpenCloseLock.acquire();
        closePreviewSession();
        if (null != mCameraDevice) {
          mCameraDevice.close();
          mCameraDevice = null;
        }
        if (null != mMediaRecorder) {
          mMediaRecorder.release();
          mMediaRecorder = null;
        }
      } catch (InterruptedException e) {
        throw new RuntimeException("Interrupted while trying to lock camera closing.");
      } finally {
        mCameraOpenCloseLock.release();
      }
    }
    /**
     * Start the camera preview.
     */
    private void startPreview() {
      if (null == mCameraDevice || !mTextureView.isAvailable() || null == mPreviewSize) {
        return;
      }
      try {
        closePreviewSession();
        SurfaceTexture texture = mTextureView.getSurfaceTexture();
        assert texture != null;
        texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
        mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        Surface previewSurface = new Surface(texture);
        mPreviewBuilder.addTarget(previewSurface);
        mCameraDevice.createCaptureSession(Collections.singletonList(previewSurface),
            new CameraCaptureSession.StateCallback() {
              @Override
              public void onConfigured(@NonNull CameraCaptureSession session) {
                mPreviewSession = session;
                updatePreview();
              }
              @Override
              public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                Log.e(TAG, "onConfigureFailed: Failed ");
              }
            }, mBackgroundHandler);
      } catch (CameraAccessException e) {
        e.printStackTrace();
      }
    }
    /**
     * Update the camera preview. {@link #startPreview()} needs to be called in advance.
     */
    private void updatePreview() {
      if (null == mCameraDevice) {
        return;
      }
      try {
        setUpCaptureRequestBuilder(mPreviewBuilder);
        HandlerThread thread = new HandlerThread("CameraPreview");
        thread.start();
        mPreviewSession.setRepeatingRequest(mPreviewBuilder.build(), null, mBackgroundHandler);
      } catch (CameraAccessException e) {
        e.printStackTrace();
      }
    }
    private void setUpCaptureRequestBuilder(CaptureRequest.Builder builder) {
      builder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
    }
    /**
     * Configures the necessary {@link Matrix} transformation to `mTextureView`.
     * This method should not to be called until the camera preview size is determined in
     * openCamera, or until the size of `mTextureView` is fixed.
     *
     * @param viewWidth The width of `mTextureView`
     * @param viewHeight The height of `mTextureView`
     */
    private void configureTransform(int viewWidth, int viewHeight) {
      Activity activity = getActivity();
      if (null == mTextureView || null == mPreviewSize || null == activity) {
        return;
      }
      int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
      Matrix matrix = new Matrix();
      RectF viewRect = new RectF(0, 0, viewWidth, viewHeight);
      RectF bufferRect = new RectF(0, 0, mPreviewSize.getHeight(), mPreviewSize.getWidth());
      float centerX = viewRect.centerX();
      float centerY = viewRect.centerY();
      if (Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation) {
        bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
        matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
        float scale = Math.max(
            (float) viewHeight / mPreviewSize.getHeight(),
            (float) viewWidth / mPreviewSize.getWidth());
        matrix.postScale(scale, scale, centerX, centerY);
        matrix.postRotate(90 * (rotation - 2), centerX, centerY);
      }
      mTextureView.setTransform(matrix);
    }
    private void setUpMediaRecorder() throws IOException {
      final Activity activity = getActivity();
      if (null == activity) {
        return;
      }
      mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
      mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
      mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
      /**
       * create video output file
       */
      mCurrentFile = getOutputMediaFile();
      /**
       * set output file in media recorder
       */
      mMediaRecorder.setOutputFile(mCurrentFile.getAbsolutePath());
      CamcorderProfile profile = CamcorderProfile.get(CamcorderProfile.QUALITY_480P);
      mMediaRecorder.setVideoFrameRate(profile.videoFrameRate);
      mMediaRecorder.setVideoSize(profile.videoFrameWidth, profile.videoFrameHeight);
      mMediaRecorder.setVideoEncodingBitRate(profile.videoBitRate);
      mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
      mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
      mMediaRecorder.setAudioEncodingBitRate(profile.audioBitRate);
      mMediaRecorder.setAudioSamplingRate(profile.audioSampleRate);
      int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
      switch (mSensorOrientation) {
        case SENSOR_ORIENTATION_DEFAULT_DEGREES:
          mMediaRecorder.setOrientationHint(DEFAULT_ORIENTATIONS.get(rotation));
          break;
        case SENSOR_ORIENTATION_INVERSE_DEGREES:
          mMediaRecorder.setOrientationHint(INVERSE_ORIENTATIONS.get(rotation));
          break;
      }
      mMediaRecorder.prepare();
    }
    public void startRecordingVideo() {
      if (null == mCameraDevice || !mTextureView.isAvailable() || null == mPreviewSize) {
        return;
      }
      try {
        closePreviewSession();
        setUpMediaRecorder();
        SurfaceTexture texture = mTextureView.getSurfaceTexture();
        assert texture != null;
        texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
        mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
        List<Surface> surfaces = new ArrayList<>();
        /**
         * Surface for the camera preview set up
         */
        Surface previewSurface = new Surface(texture);
        surfaces.add(previewSurface);
        mPreviewBuilder.addTarget(previewSurface);
        //MediaRecorder setup for surface
        Surface recorderSurface = mMediaRecorder.getSurface();
        surfaces.add(recorderSurface);
        mPreviewBuilder.addTarget(recorderSurface);
        // Start a capture session
        mCameraDevice.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
          @Override
          public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
            mPreviewSession = cameraCaptureSession;
            updatePreview();
            getActivity().runOnUiThread(() -> {
              mIsRecordingVideo = true;
              // Start recording
              mMediaRecorder.start();
            });
          }
          @Override
          public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
            Log.e(TAG, "onConfigureFailed: Failed");
          }
        }, mBackgroundHandler);
      } catch (CameraAccessException | IOException e) {
        e.printStackTrace();
      }
    }
    private void closePreviewSession() {
      if (mPreviewSession != null) {
        mPreviewSession.close();
        mPreviewSession = null;
      }
    }
    public void stopRecordingVideo() throws Exception {
      // UI
      mIsRecordingVideo = false;
      try {
        mPreviewSession.stopRepeating();
        mPreviewSession.abortCaptures();
      } catch (CameraAccessException e) {
        e.printStackTrace();
      }
      // Stop recording
      mMediaRecorder.stop();
      mMediaRecorder.reset();
    }
    /**
     * Compares two {@code Size}s based on their areas.
     */
    static class CompareSizesByArea implements Comparator<Size> {
      @Override
      public int compare(Size lhs, Size rhs) {
        // We cast here to ensure the multiplications won't overflow
        return Long.signum((long) lhs.getWidth() * lhs.getHeight() -
            (long) rhs.getWidth() * rhs.getHeight());
      }
    }
  }

All the utilities are in place. Now let's move on to the actual implementation.

10.1 Create a fragment named CameraFragment that extends the CameraVideoFragment utility class.
10.2 Create fragment_camera.xml and add the components below.
<?xml version="1.0" encoding="utf-8"?>
<android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".CameraFragment"
    >
  <com.androidwave.camera2video.camera.AutoFitTextureView
      android:id="@+id/mTextureView"
      android:layout_width="match_parent"
      android:layout_height="match_parent"
      app:layout_constraintBottom_toBottomOf="parent"
      app:layout_constraintEnd_toEndOf="parent"
      app:layout_constraintStart_toStartOf="parent"
      app:layout_constraintTop_toTopOf="parent"
      />
  <ImageView
      android:id="@+id/mRecordVideo"
      android:layout_width="wrap_content"
      android:layout_height="wrap_content"
      android:layout_marginStart="8dp"
      android:layout_marginEnd="8dp"
      android:layout_marginBottom="24dp"
      android:contentDescription="@string/play_stop"
      android:src="@drawable/ic_record"
      app:layout_constraintBottom_toBottomOf="parent"
      app:layout_constraintEnd_toEndOf="parent"
      app:layout_constraintStart_toStartOf="parent"
      />
  <VideoView
      android:id="@+id/mVideoView"
      android:layout_width="match_parent"
      android:layout_height="match_parent"
      android:visibility="gone"
      app:layout_constraintBottom_toBottomOf="parent"
      app:layout_constraintEnd_toEndOf="parent"
      app:layout_constraintStart_toStartOf="parent"
      app:layout_constraintTop_toTopOf="parent"
      />
  <ImageView
      android:id="@+id/mPlayVideo"
      android:layout_width="wrap_content"
      android:layout_height="wrap_content"
      android:layout_marginStart="8dp"
      android:layout_marginTop="8dp"
      android:layout_marginEnd="8dp"
      android:layout_marginBottom="8dp"
      android:src="@drawable/ic_play_button"
      android:visibility="gone"
      app:layout_constraintBottom_toBottomOf="parent"
      app:layout_constraintEnd_toEndOf="parent"
      app:layout_constraintStart_toStartOf="parent"
      app:layout_constraintTop_toTopOf="parent"
      />
</android.support.constraint.ConstraintLayout>
10.3 Provide the texture view resource by implementing the parent class's getTextureResource() method.
package com.androidwave.camera2video;
import android.os.Bundle;
import android.support.v4.app.Fragment;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.ImageView;
import android.widget.MediaController;
import android.widget.VideoView;
import com.androidwave.camera2video.camera.AutoFitTextureView;
import com.androidwave.camera2video.camera.CameraVideoFragment;
import butterknife.BindView;
import butterknife.ButterKnife;
import butterknife.OnClick;
import butterknife.Unbinder;
/**
 * A simple {@link Fragment} subclass.
 * Use the {@link CameraFragment#newInstance} factory method to
 * create an instance of this fragment.
 */
public class CameraFragment extends CameraVideoFragment {
    @BindView(R.id.mTextureView)
    AutoFitTextureView mTextureView;
    @BindView(R.id.mRecordVideo)
    ImageView mRecordVideo;
    @BindView(R.id.mVideoView)
    VideoView mVideoView;
    @BindView(R.id.mPlayVideo)
    ImageView mPlayVideo;
    Unbinder unbinder;
    private String mOutputFilePath;
    public CameraFragment() {
        // Required empty public constructor
    }
    /**
     * Use this factory method to create a new instance of
     * this fragment using the provided parameters.
     */
    public static CameraFragment newInstance() {
        CameraFragment fragment = new CameraFragment();
        Bundle args = new Bundle();
        fragment.setArguments(args);
        return fragment;
    }
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
    }
    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container,
                             Bundle savedInstanceState) {
        // Inflate the layout for this fragment
        View view = inflater.inflate(R.layout.fragment_camera, container, false);
        unbinder = ButterKnife.bind(this, view);
        return view;
    }
    @Override
    public int getTextureResource() {
        return R.id.mTextureView;
    }
    @Override
    protected void setUp(View view) {
    }
    @OnClick({R.id.mRecordVideo, R.id.mPlayVideo})
    public void onViewClicked(View view) {
        switch (view.getId()) {
            case R.id.mRecordVideo:
                /**
                 * If media is not recoding then start recording else stop recording
                 */
                if (mIsRecordingVideo) {
                    try {
                        stopRecordingVideo();
                        prepareViews();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                } else {
                    startRecordingVideo();
                    mRecordVideo.setImageResource(R.drawable.ic_stop);
                    //Receive out put file here
                    mOutputFilePath = getCurrentFile().getAbsolutePath();
                }
                break;
            case R.id.mPlayVideo:
                mVideoView.start();
                mPlayVideo.setVisibility(View.GONE);
                break;
        }
    }
    private void prepareViews() {
        if (mVideoView.getVisibility() == View.GONE) {
            mVideoView.setVisibility(View.VISIBLE);
            mPlayVideo.setVisibility(View.VISIBLE);
            mTextureView.setVisibility(View.GONE);
            setMediaForRecordVideo();
        }
    }
    private void setMediaForRecordVideo() {
        // Set media controller
        mVideoView.setMediaController(new MediaController(getActivity()));
        mVideoView.requestFocus();
        mVideoView.setVideoPath(mOutputFilePath);
        mVideoView.seekTo(100);
        mVideoView.setOnCompletionListener(mp -> {
            // Reset player
            mVideoView.setVisibility(View.GONE);
            mTextureView.setVisibility(View.VISIBLE);
            mPlayVideo.setVisibility(View.GONE);
            mRecordVideo.setImageResource(R.drawable.ic_record);
        });
    }
    @Override
    public void onDestroyView() {
        super.onDestroyView();
        unbinder.unbind();
    }
}

Step-11. Finally, host the fragment in an activity, as shown below.

package com.androidwave.camera2video;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.support.v7.app.AppCompatDelegate;
public class MainActivity extends AppCompatActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        getSupportActionBar().hide();
        AppCompatDelegate.setCompatVectorFromResourcesEnabled(true);
        if (null == savedInstanceState) {
            getSupportFragmentManager().beginTransaction()
                    .replace(R.id.container, CameraFragment.newInstance())
                    .commit();
        }
    }
}
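activity_main.xml is not shown in the article; a minimal version, assuming it only needs a container for the fragment, could be as simple as:

<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/container"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />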

In this Android example, we have learned how to record video using the Camera2 API. Happy coding 😁


Download the sample project: Video Recording with Camera2 API in Android
