This article walks through the main design and call flow of the UVC Camera project. The project drives an external USB camera on the Android platform, supporting operations such as open, close, video recording, and still capture.

(1) The project depends mainly on two libraries: libuvccamera and usbCameraCommon. libuvccamera is the low-level library; it contains the C++ implementation plus a few basic control classes such as USBMonitor and UVCCamera.

USBMonitor manages USB devices; through it you can pick out the USB camera device you need.

UVCCamera represents a single USB camera. Its core methods are all native, implemented in C++ and invoked through JNI. It is effectively a bridge class, forwarding events from the upper-layer controls down to the native implementation.

Focus, brightness, zoom, white balance, preview, open, close and so on are all ultimately carried out through this class.


CameraDialog exposes USBMonitor through a dialog, letting you discover and pick the device you want to operate on. Internally it uses the DeviceFilter class to narrow down the device list (without a filter, every USB device is listed). The devices themselves come from UsbManager and are then filtered, e.g.:


HashMap<String, UsbDevice> deviceList = mUsbManager.getDeviceList();


DeviceFilter is the class that does the actual filtering. It works from rules declared in an XML file: you declare the device classes you want (or don't want) there, parse the file with an XML parser, and then apply the resulting filter to obtain the deviceList you actually care about.
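The idea behind this filtering step can be reduced to a short sketch. The code below is plain Java with a hypothetical `MyUsbDevice` stand-in for `android.hardware.usb.UsbDevice` (class 14 is the USB Video class that UVC cameras report); it illustrates the filtering concept only, not the library's actual DeviceFilter code.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the idea behind DeviceFilter: keep only devices whose USB class
// matches a filter. MyUsbDevice is a hypothetical stand-in for
// android.hardware.usb.UsbDevice and its getDeviceClass().
public class DeviceFilterSketch {
    static final int USB_CLASS_VIDEO = 14;  // UVC devices report class 14 (Video)

    static class MyUsbDevice {
        final String name;
        final int deviceClass;
        MyUsbDevice(String name, int deviceClass) {
            this.name = name;
            this.deviceClass = deviceClass;
        }
    }

    // Narrow a device map the way DeviceFilter narrows UsbManager.getDeviceList()
    static Map<String, MyUsbDevice> filter(Map<String, MyUsbDevice> all, int wantedClass) {
        final Map<String, MyUsbDevice> result = new HashMap<>();
        for (Map.Entry<String, MyUsbDevice> e : all.entrySet()) {
            if (e.getValue().deviceClass == wantedClass) {
                result.put(e.getKey(), e.getValue());
            }
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, MyUsbDevice> deviceList = new HashMap<>();
        deviceList.put("/dev/bus/usb/001/002", new MyUsbDevice("keyboard", 3)); // HID class
        deviceList.put("/dev/bus/usb/001/003", new MyUsbDevice("webcam", USB_CLASS_VIDEO));
        Map<String, MyUsbDevice> cameras = filter(deviceList, USB_CLASS_VIDEO);
        System.out.println(cameras.size());  // only the webcam survives
    }
}
```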


(2) usbCameraCommon is the high-level wrapper library. It drives the camera by calling into USBMonitor, UVCCamera, and the other classes above.

The key class at this level is AbstractUVCCameraHandler, which wraps the calls into UVCCamera and exposes an API for the Activity to use.

AbstractUVCCameraHandler is itself a Handler; its declaration is:


abstract class AbstractUVCCameraHandler extends Handler{...}


In other words, it is mainly responsible for sending and receiving messages: every method called on it is forwarded to an inner class. What it really maintains internally is a Thread subclass called CameraThread, declared as:


static final class CameraThread extends Thread {...}


CameraThread is a static nested class. Every method invoked on AbstractUVCCameraHandler is forwarded to CameraThread as a message; when there are no messages, it simply idles. The forwarding looks like this:

public void handleMessage(final Message msg) {
		final CameraThread thread = mWeakThread.get();
		if (thread == null) return;
		switch (msg.what) {
		case MSG_OPEN:
			thread.handleOpen((USBMonitor.UsbControlBlock)msg.obj);
			break;
		case MSG_CLOSE:
			thread.handleClose();
			break;
		case MSG_PREVIEW_START:
			thread.handleStartPreview(msg.obj);
			break;
		case MSG_PREVIEW_STOP:
			thread.handleStopPreview();
			break;
		case MSG_CAPTURE_STILL:
			thread.handleCaptureStill((String)msg.obj);
			break;
		case MSG_CAPTURE_START:
			thread.handleStartRecording();
			break;
		case MSG_CAPTURE_STOP:
			thread.handleStopRecording();
			break;
		case MSG_MEDIA_UPDATE:
			thread.handleUpdateMedia((String)msg.obj);
			break;
		case MSG_RELEASE:
			thread.handleRelease();
			break;
		default:
			throw new RuntimeException("unsupported message:what=" + msg.what);
		}
	}
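Note that the handler reaches its worker only through mWeakThread, a WeakReference, so a message that arrives after the thread is gone is simply dropped rather than keeping the thread alive. A minimal plain-Java analogue of this dispatch pattern (all names hypothetical; Android's Message is reduced to a bare int code):

```java
import java.lang.ref.WeakReference;

// Sketch of the weak-reference dispatch used by AbstractUVCCameraHandler:
// the handler forwards "what" codes to the worker, and silently drops them
// once the worker has been garbage collected.
public class WeakDispatchSketch {
    static final int MSG_OPEN = 0;
    static final int MSG_CLOSE = 1;

    static class Worker {
        int opened;
        void handleOpen()  { opened++; }
        void handleClose() { opened--; }
    }

    static class HandlerSketch {
        private final WeakReference<Worker> mWeakThread;

        HandlerSketch(Worker worker) {
            mWeakThread = new WeakReference<>(worker);
        }

        // Mirrors handleMessage: look up the worker, bail out if it is gone,
        // otherwise dispatch on the message code.
        boolean handleMessage(int what) {
            final Worker thread = mWeakThread.get();
            if (thread == null) return false;   // worker already collected
            switch (what) {
            case MSG_OPEN:  thread.handleOpen();  break;
            case MSG_CLOSE: thread.handleClose(); break;
            default:
                throw new RuntimeException("unsupported message:what=" + what);
            }
            return true;
        }
    }
}
```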

CameraThread in turn holds a UVCCamera instance, and every method called on CameraThread ends up operating on that instance. Two examples:

Event flow for starting the preview:

public void handleStartPreview(final Object surface) {
			if (DEBUG) Log.v(TAG_THREAD, "handleStartPreview:");
			if ((mUVCCamera == null) || mIsPreviewing) return;
			try {
				mUVCCamera.setPreviewSize(mWidth, mHeight, 1, 31, mPreviewMode, mBandwidthFactor);
			} catch (final IllegalArgumentException e) {
				try {
					// fallback to YUV mode
					mUVCCamera.setPreviewSize(mWidth, mHeight, 1, 31, UVCCamera.DEFAULT_PREVIEW_MODE, mBandwidthFactor);
				} catch (final IllegalArgumentException e1) {
					callOnError(e1);
					return;
				}
			}
			if (surface instanceof SurfaceHolder) {
				mUVCCamera.setPreviewDisplay((SurfaceHolder)surface);
			} else if (surface instanceof Surface) {
				mUVCCamera.setPreviewDisplay((Surface)surface);
			} else {
				mUVCCamera.setPreviewTexture((SurfaceTexture)surface);
			}
			mUVCCamera.startPreview();
			mUVCCamera.updateCameraParams();
			synchronized (mSync) {
				mIsPreviewing = true;
			}
			callOnStartPreview();
		}


Starting a recording:


public void handleStartRecording() {
			if (DEBUG) Log.v(TAG_THREAD, "handleStartRecording:");
			try {
				if ((mUVCCamera == null) || (mMuxer != null)) return;
				final MediaMuxerWrapper muxer = new MediaMuxerWrapper(".mp4");	// if you record audio only, ".m4a" is also OK.
				MediaVideoBufferEncoder videoEncoder = null;
				switch (mEncoderType) {
				case 1:	// for video capturing using MediaVideoEncoder
					new MediaVideoEncoder(muxer, getWidth(), getHeight(), mMediaEncoderListener);
					break;
				case 2:	// for video capturing using MediaVideoBufferEncoder
					videoEncoder = new MediaVideoBufferEncoder(muxer, getWidth(), getHeight(), mMediaEncoderListener);
					break;
				// case 0:	// for video capturing using MediaSurfaceEncoder
				default:
					new MediaSurfaceEncoder(muxer, getWidth(), getHeight(), mMediaEncoderListener);
					break;
				}
				if (true) {
					// for audio capturing
					new MediaAudioEncoder(muxer, mMediaEncoderListener);
				}
				muxer.prepare();
				muxer.startRecording();
				if (videoEncoder != null) {
					mUVCCamera.setFrameCallback(mIFrameCallback, UVCCamera.PIXEL_FORMAT_NV21);
				}
				synchronized (mSync) {
					mMuxer = muxer;
					mVideoEncoder = videoEncoder;
				}
				callOnStartRecording();
			} catch (final IOException e) {
				callOnError(e);
				Log.e(TAG, "startCapture:", e);
			}
		}


As you can see, everything ultimately goes through mUVCCamera.


UVCCamera then calls straight into the native layer.

So an event is actually handled along this path:

Activity --> UVCCameraHandler --> AbstractUVCCameraHandler --> CameraThread --> UVCCamera --> native

Why does the Activity talk to UVCCameraHandler first? Look at the construction process.

The sample Activity references UVCCameraHandler, i.e. UVCCameraHandler is the class that actually operates the camera. AbstractUVCCameraHandler is abstract and cannot be instantiated; UVCCameraHandler extends it:


public class UVCCameraHandler extends AbstractUVCCameraHandler {...}


So once the Activity has instantiated UVCCameraHandler, every event sent to it is delivered as a message to CameraThread, and CameraThread in turn drives the UVCCamera instance.

Here is the instantiation, as written in the Activity:

final View view = findViewById(R.id.camera_view);
		mUVCCameraView = (CameraViewInterface)view;
		mUSBMonitor = new USBMonitor(this, mOnDeviceConnectListener);
		mCameraHandler = UVCCameraHandler.createHandler(this, mUVCCameraView,
			USE_SURFACE_ENCODER ? 0 : 1, PREVIEW_WIDTH, PREVIEW_HEIGHT, PREVIEW_MODE);


A few extra lines are quoted here on purpose; they will come up again below.


Start with this call:

mCameraHandler = UVCCameraHandler.createHandler(this, mUVCCameraView,
   USE_SURFACE_ENCODER ? 0 : 1, PREVIEW_WIDTH, PREVIEW_HEIGHT, PREVIEW_MODE);

which expands to:

public static final UVCCameraHandler createHandler(
			final Activity parent, final CameraViewInterface cameraView,
			final int encoderType, final int width, final int height, final int format) {

		return createHandler(parent, cameraView, encoderType, width, height, format, UVCCamera.DEFAULT_BANDWIDTH);
	}


which in turn expands to:


public static final UVCCameraHandler createHandler(
			final Activity parent, final CameraViewInterface cameraView,
			final int encoderType, final int width, final int height, final int format, final float bandwidthFactor) {

		final CameraThread thread = new CameraThread(UVCCameraHandler.class, parent, cameraView, encoderType, width, height, format, bandwidthFactor);
		thread.start();
		return (UVCCameraHandler)thread.getHandler();
	}


In other words, instantiating UVCCameraHandler in the Activity really constructs a CameraThread and returns the UVCCameraHandler that the CameraThread holds. So the moment UVCCameraHandler is instantiated, the CameraThread is created and running as well.
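This pattern, construct a worker thread, start it, and block until the thread has built its own handler, can be sketched in plain Java. In the sketch below, wait/notify stands in for the Looper handshake the real getHandler() performs, and all names are illustrative:

```java
// Sketch of the createHandler() pattern: the caller starts a worker thread
// and blocks until that thread has created its own handler object.
public class CreateHandlerSketch {

    static class CameraThreadSketch extends Thread {
        private final Object mSync = new Object();
        private Object mHandler;            // created on the worker thread

        @Override
        public void run() {
            synchronized (mSync) {
                mHandler = new Object();    // real code: new UVCCameraHandler(this)
                mSync.notifyAll();          // wake anyone blocked in getHandler()
            }
            // real code: Looper.loop() would process the message queue here
        }

        // Mirrors CameraThread.getHandler(): wait until run() has set mHandler.
        Object getHandler() throws InterruptedException {
            synchronized (mSync) {
                while (mHandler == null) mSync.wait();
                return mHandler;
            }
        }
    }

    static Object createHandler() throws InterruptedException {
        final CameraThreadSketch thread = new CameraThreadSketch();
        thread.start();
        return thread.getHandler();         // non-null once the thread has run
    }
}
```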


And CameraThread does indeed hold a reference to the handler:

static final class CameraThread extends Thread {
		private static final String TAG_THREAD = "CameraThread";
		private final Object mSync = new Object();
		private final Class<? extends AbstractUVCCameraHandler> mHandlerClass;
		private final WeakReference<Activity> mWeakParent;
		private final WeakReference<CameraViewInterface> mWeakCameraView;
		private final int mEncoderType;
		private final Set<CameraCallback> mCallbacks = new CopyOnWriteArraySet<CameraCallback>();
		private int mWidth, mHeight, mPreviewMode;
		private float mBandwidthFactor;
		private boolean mIsPreviewing;
		private boolean mIsRecording;
		/**
		 * shutter sound
		 */
		private SoundPool mSoundPool;
		private int mSoundId;
		private AbstractUVCCameraHandler mHandler;
		/**
		 * for accessing UVC camera
		 */
		private UVCCamera mUVCCamera;
		/**
		 * muxer for audio/video recording
		 */
		private MediaMuxerWrapper mMuxer;
		private MediaVideoBufferEncoder mVideoEncoder;

		/**
		 *
		 * @param clazz Class extends AbstractUVCCameraHandler
		 * @param parent parent Activity
		 * @param cameraView for still capturing
		 * @param encoderType 0: use MediaSurfaceEncoder, 1: use MediaVideoEncoder, 2: use MediaVideoBufferEncoder
		 * @param width
		 * @param height
		 * @param format either FRAME_FORMAT_YUYV(0) or FRAME_FORMAT_MJPEG(1)
		 * @param bandwidthFactor
		 */
		CameraThread(final Class<? extends AbstractUVCCameraHandler> clazz,
			final Activity parent, final CameraViewInterface cameraView,
			final int encoderType, final int width, final int height, final int format,
			final float bandwidthFactor) {

			super("CameraThread");
			mHandlerClass = clazz;
			mEncoderType = encoderType;
			mWidth = width;
			mHeight = height;
			mPreviewMode = format;
			mBandwidthFactor = bandwidthFactor;
			mWeakParent = new WeakReference<Activity>(parent);
			mWeakCameraView = new WeakReference<CameraViewInterface>(cameraView);
			loadShutterSound(parent);
		}
...
}


Here mHandlerClass is the UVCCameraHandler class object, and mHandler is the handler instance itself.


(3) In the Activity call shown above, the initialization code is:

final View view = findViewById(R.id.camera_view);
		mUVCCameraView = (CameraViewInterface)view;
		mUSBMonitor = new USBMonitor(this, mOnDeviceConnectListener);
		mCameraHandler = UVCCameraHandler.createHandler(this, mUVCCameraView,
			USE_SURFACE_ENCODER ? 0 : 1, PREVIEW_WIDTH, PREVIEW_HEIGHT, PREVIEW_MODE);

Here mUSBMonitor scans for devices and finds the target USB camera, while mUVCCameraView provides the preview view, which is handed to UVCCameraHandler and on to CameraThread, stored in the weak-reference field:


private final WeakReference<CameraViewInterface> mWeakCameraView;


So CameraThread is the linchpin: it exposes the API upward and performs the actual operations downward.

That covers preview display, shutter-sound playback, driving UVCCamera, recording, and the rest.
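One detail worth noting from this section: CameraThread reaches the Activity and the preview view only through WeakReferences, so a finished Activity can be garbage collected even while the camera thread is still alive. A tiny plain-Java illustration of that behavior (the Object stands in for CameraViewInterface):

```java
import java.lang.ref.WeakReference;

// Illustrates why CameraThread stores its Activity/view in WeakReferences:
// a weak reference does not keep its referent alive, so once the strong
// reference is gone the thread just sees null instead of leaking the view.
public class WeakViewSketch {
    public static void main(String[] args) {
        Object view = new Object();                 // stands in for the preview view
        WeakReference<Object> weakView = new WeakReference<>(view);

        System.out.println(weakView.get() == view); // still strongly reachable

        view = null;                                // Activity destroyed, strong ref gone
        System.gc();                                // hint only; collection not guaranteed
        // After collection, weakView.get() returns null and the thread
        // skips its UI work instead of touching a dead Activity.
    }
}
```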

(4) The native layer

The native layer is entered from the Java class UVCCamera; the call chain is jni --> UVCCamera --> UVCCamera.cpp --> UVCPreview.cpp. (A screenshot listing some of the UVCCamera.cpp methods originally appeared here.)



For example, startPreview and stopPreview both delegate to UVCPreview, since UVCCamera.cpp holds a pointer to the UVCPreview instance:


UVCPreview *mPreview;


The delegation looks like this:

int UVCCamera::startPreview() {
	ENTER();

	int result = EXIT_FAILURE;
	if (mDeviceHandle) {
		return mPreview->startPreview();
	}
	RETURN(result, int);
}

int UVCCamera::stopPreview() {
	ENTER();
	if (LIKELY(mPreview)) {
		mPreview->stopPreview();
	}
	RETURN(0, int);
}


The corresponding startPreview and stopPreview in UVCPreview.cpp are:


int UVCPreview::startPreview() {
	ENTER();

	int result = EXIT_FAILURE;
	if (!isRunning()) {
		mIsRunning = true;
		pthread_mutex_lock(&preview_mutex);
		{
			if (LIKELY(mPreviewWindow)) {
				result = pthread_create(&preview_thread, NULL, preview_thread_func, (void *)this);
			}
		}
		pthread_mutex_unlock(&preview_mutex);
		if (UNLIKELY(result != EXIT_SUCCESS)) {
			LOGW("UVCCamera::window does not exist/already running/could not create thread etc.");
			mIsRunning = false;
			pthread_mutex_lock(&preview_mutex);
			{
				pthread_cond_signal(&preview_sync);
			}
			pthread_mutex_unlock(&preview_mutex);
		}
	}
	RETURN(result, int);
}

int UVCPreview::stopPreview() {
	ENTER();
	bool b = isRunning();
	if (LIKELY(b)) {
		mIsRunning = false;
		pthread_cond_signal(&preview_sync);
		pthread_cond_signal(&capture_sync);
		if (pthread_join(capture_thread, NULL) != EXIT_SUCCESS) {
			LOGW("UVCPreview::terminate capture thread: pthread_join failed");
		}
		if (pthread_join(preview_thread, NULL) != EXIT_SUCCESS) {
			LOGW("UVCPreview::terminate preview thread: pthread_join failed");
		}
		clearDisplay();
	}
	clearPreviewFrame();
	clearCaptureFrame();
	pthread_mutex_lock(&preview_mutex);
	if (mPreviewWindow) {
		ANativeWindow_release(mPreviewWindow);
		mPreviewWindow = NULL;
	}
	pthread_mutex_unlock(&preview_mutex);
	pthread_mutex_lock(&capture_mutex);
	if (mCaptureWindow) {
		ANativeWindow_release(mCaptureWindow);
		mCaptureWindow = NULL;
	}
	pthread_mutex_unlock(&capture_mutex);
	RETURN(0, int);
}
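The start/stop pair above is a standard worker-loop pattern: start spawns a thread guarded by a running flag, and stop clears the flag and joins the thread. A compact Java analogue (hypothetical names, with the frame handling reduced to a counter):

```java
// Java analogue of UVCPreview::startPreview/stopPreview: a loop thread
// guarded by a volatile flag, stopped by clearing the flag and joining.
public class PreviewLoopSketch {
    private volatile boolean mIsRunning;
    private Thread mPreviewThread;
    volatile int frames;                   // stands in for delivered preview frames

    public synchronized void startPreview() {
        if (mIsRunning) return;            // already previewing
        mIsRunning = true;
        mPreviewThread = new Thread(() -> {
            while (mIsRunning) {           // real code: pull frames from the UVC device
                frames++;
                try {
                    Thread.sleep(1);       // simulate per-frame work
                } catch (InterruptedException e) {
                    break;
                }
            }
        }, "preview_thread");
        mPreviewThread.start();
    }

    public synchronized void stopPreview() {
        if (!mIsRunning) return;
        mIsRunning = false;                // real code also signals condition variables
        try {
            mPreviewThread.join();         // mirrors pthread_join(preview_thread, ...)
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        mPreviewThread = null;             // real code releases the ANativeWindows here
    }
}
```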


(5) The UVCCameraTextureView class. This class renders the camera preview on screen and is very important:


public class UVCCameraTextureView extends AspectRatioTextureView   
	implements TextureView.SurfaceTextureListener, CameraViewInterface {
...}


Due to time constraints, its details are left for a later article...