【Purpose】

This article traces the MediaPlayer processing flow from the application layer down to the framework layer, so that multimedia-related problems encountered in day-to-day work are easier to analyze. A sequence diagram of the whole flow is given below; reading the rest of the article against it makes the flow much more intuitive.

(Figure: sequence diagram of the overall VideoView / MediaPlayer playback flow)

【Example】

First, the layout file.

It simply places a VideoView as the widget used to play the video.

The VideoView is initialized first, setVideoPath() is then used to point it at a local video file, a controller is attached for pause/resume and fast-forward/rewind, and finally start() is called to begin playback.
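
Put together, the activity looks roughly like the sketch below. This is a minimal illustration rather than the original project's code: the layout name activity_player, the id video_view and the file path are made-up placeholders.

import android.app.Activity;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class PlayerActivity extends Activity {

    private VideoView mVideoView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_player);

        // 1. Get the VideoView declared in the layout
        mVideoView = (VideoView) findViewById(R.id.video_view);

        // 2. Point it at a local video file
        mVideoView.setVideoPath("/sdcard/Movies/demo.mp4");

        // 3. Attach a controller for pause/play and seek
        MediaController controller = new MediaController(this);
        mVideoView.setMediaController(controller);

        // 4. Start playback
        mVideoView.start();
    }
}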

【Code Analysis】

Let's start with VideoView.setVideoPath().

It first calls openVideo() to open the video, then calls requestLayout() and invalidate() to re-measure and redraw the view.
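
For reference, the body of setVideoPath() is roughly the following, a trimmed sketch of the framework source with error handling and a few fields omitted:

// VideoView.java (simplified)
public void setVideoPath(String path) {
    setVideoURI(Uri.parse(path));
}

public void setVideoURI(Uri uri, Map<String, String> headers) {
    mUri = uri;                // remember what to play
    mHeaders = headers;
    mSeekWhenPrepared = 0;     // no pending seek for a fresh source
    openVideo();               // create and prepare the MediaPlayer
    requestLayout();           // re-measure the view ...
    invalidate();              // ... and redraw it
}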

Now step into openVideo().

release() is executed first, because start() may already have been called for a previous source. Audio focus is then requested, and finally a MediaPlayer object is created and put through a series of initialization calls, the core ones being (a simplified sketch follows the list):

  • setDataSource()
  • setDisplay()
  • prepareAsync()
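
A trimmed sketch of openVideo() is shown below; subtitle handling and most listener wiring are left out, and the audio-focus call differs between Android versions:

// VideoView.java (simplified)
private void openVideo() {
    if (mUri == null || mSurfaceHolder == null) {
        return;                          // need both a data source and a surface
    }

    release(false);                      // tear down any previously started player

    AudioManager am = (AudioManager) getContext().getSystemService(Context.AUDIO_SERVICE);
    am.requestAudioFocus(null, AudioManager.STREAM_MUSIC, AudioManager.AUDIOFOCUS_GAIN);

    try {
        mMediaPlayer = new MediaPlayer();             // triggers the class-load / constructor work described below
        mMediaPlayer.setOnPreparedListener(mPreparedListener);
        mMediaPlayer.setOnCompletionListener(mCompletionListener);
        mMediaPlayer.setOnErrorListener(mErrorListener);

        mMediaPlayer.setDataSource(getContext(), mUri);   // core call 1
        mMediaPlayer.setDisplay(mSurfaceHolder);          // core call 2
        mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        mMediaPlayer.setScreenOnWhilePlaying(true);
        mMediaPlayer.prepareAsync();                      // core call 3 (asynchronous)
        mCurrentState = STATE_PREPARING;
    } catch (IOException | IllegalArgumentException ex) {
        mCurrentState = STATE_ERROR;
        mTargetState = STATE_ERROR;
    }
}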

Before these core calls, some preparatory work is done while the MediaPlayer class is loaded and while the object is constructed (see the sketch after this list):

  • System.loadLibrary("media_jni");
  • native_init();
  • new EventHandler();
  • native_setup();
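
Roughly, these four steps live in the static initializer and the constructor of MediaPlayer.java. A simplified sketch follows; the exact native_setup() signature and the looper handling vary between Android versions:

// MediaPlayer.java (simplified)
public class MediaPlayer {

    static {
        System.loadLibrary("media_jni");   // 1. load libmedia_jni.so
        native_init();                     // 2. let native code cache field/method IDs
    }

    private EventHandler mEventHandler;

    public MediaPlayer() {
        Looper looper = (Looper.myLooper() != null) ? Looper.myLooper() : Looper.getMainLooper();
        mEventHandler = (looper != null) ? new EventHandler(this, looper) : null;  // 3. receives native events

        // 4. hand a weak reference down so native code can post events back to this object
        native_setup(new WeakReference<MediaPlayer>(this));
    }

    private static native void native_init();
    private native void native_setup(Object mediaplayerThis);
}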

The media_jni.so shared library is loaded first. This involves JNI, the bridge that lets Java and C/C++ talk to each other; any call from the Java layer into a native method has to go through JNI.
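
On the Java side this bridge is nothing more than a handful of native declarations in MediaPlayer.java with no method bodies; the implementations live in the C++ file. A sketch follows, and the exact modifiers and parameter lists vary between Android versions:

// Declared in MediaPlayer.java, implemented in android_media_MediaPlayer.cpp
private static native final void native_init();
private native final void native_setup(Object mediaplayerThis);
private native void _setDataSource(FileDescriptor fd, long offset, long length)
        throws IOException, IllegalArgumentException, IllegalStateException;
private native void _start() throws IllegalStateException;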

After the library is loaded, native_init() is executed. The native source for this library lives in android_media_MediaPlayer.cpp, so let's open that file and look at the implementation of native_init().

The first thing to notice is method registration. JNI methods can be registered either statically or dynamically; dynamic registration is used here, so the Java-side native_init() maps to the native function android_media_MediaPlayer_native_init().

android_media_MediaPlayer_native_init() mainly looks up a number of Java-side fields and methods and caches their IDs in the fields_t fields structure.

Once native_init() has run, an EventHandler object is created. Its job is to receive the messages that the native layer sends back up:

private class EventHandler extends Handler
   {
       private MediaPlayer mMediaPlayer;
 
       public EventHandler(MediaPlayer mp, Looper looper) {
           super(looper);
           mMediaPlayer = mp;
       }
 
       @Override
       public void handleMessage(Message msg) {
           if (mMediaPlayer.mNativeContext == 0) {
               Log.w(TAG, "mediaplayer went away with unhandled events");
               return;
           }
           switch(msg.what) {
           case MEDIA_PREPARED:
               try {
                   scanInternalSubtitleTracks();
               } catch (RuntimeException e) {
                   // send error message instead of crashing;
                   // send error message instead of inlining a call to onError
                   // to avoid code duplication.
                   Message msg2 = obtainMessage(
                           MEDIA_ERROR, MEDIA_ERROR_UNKNOWN, MEDIA_ERROR_UNSUPPORTED, null);
                   sendMessage(msg2);
               }
 
               OnPreparedListener onPreparedListener = mOnPreparedListener;
               if (onPreparedListener != null)
                   onPreparedListener.onPrepared(mMediaPlayer);
               return;
 
           case MEDIA_DRM_INFO:
               Log.v(TAG, "MEDIA_DRM_INFO " + mOnDrmInfoHandlerDelegate);
 
               if (msg.obj == null) {
                   Log.w(TAG, "MEDIA_DRM_INFO msg.obj=NULL");
               } else if (msg.obj instanceof Parcel) {
                   // The parcel was parsed already in postEventFromNative
                   DrmInfo drmInfo = null;
 
                   OnDrmInfoHandlerDelegate onDrmInfoHandlerDelegate;
                   synchronized (mDrmLock) {
                       if (mOnDrmInfoHandlerDelegate != null && mDrmInfo != null) {
                           drmInfo = mDrmInfo.makeCopy();
                       }
                       // local copy while keeping the lock
                       onDrmInfoHandlerDelegate = mOnDrmInfoHandlerDelegate;
                   }
 
                   // notifying the client outside the lock
                   if (onDrmInfoHandlerDelegate != null) {
                       onDrmInfoHandlerDelegate.notifyClient(drmInfo);
                   }
               } else {
                   Log.w(TAG, "MEDIA_DRM_INFO msg.obj of unexpected type " + msg.obj);
               }
               return;
 
           case MEDIA_PLAYBACK_COMPLETE:
               {
                   mOnCompletionInternalListener.onCompletion(mMediaPlayer);
                   OnCompletionListener onCompletionListener = mOnCompletionListener;
                   if (onCompletionListener != null)
                       onCompletionListener.onCompletion(mMediaPlayer);
               }
               stayAwake(false);
               return;
 
           case MEDIA_STOPPED:
               {
                   TimeProvider timeProvider = mTimeProvider;
                   if (timeProvider != null) {
                       timeProvider.onStopped();
                   }
               }
               break;
 
           case MEDIA_STARTED:
               // fall through
           case MEDIA_PAUSED:
               {
                   TimeProvider timeProvider = mTimeProvider;
                   if (timeProvider != null) {
                       timeProvider.onPaused(msg.what == MEDIA_PAUSED);
                   }
               }
               break;
 
           case MEDIA_BUFFERING_UPDATE:
               OnBufferingUpdateListener onBufferingUpdateListener = mOnBufferingUpdateListener;
               if (onBufferingUpdateListener != null)
                   onBufferingUpdateListener.onBufferingUpdate(mMediaPlayer, msg.arg1);
               return;
 
           case MEDIA_SEEK_COMPLETE:
               OnSeekCompleteListener onSeekCompleteListener = mOnSeekCompleteListener;
               if (onSeekCompleteListener != null) {
                   onSeekCompleteListener.onSeekComplete(mMediaPlayer);
               }
               // fall through
 
           case MEDIA_SKIPPED:
               {
                   TimeProvider timeProvider = mTimeProvider;
                   if (timeProvider != null) {
                       timeProvider.onSeekComplete(mMediaPlayer);
                   }
               }
               return;
 
           case MEDIA_SET_VIDEO_SIZE:
               OnVideoSizeChangedListener onVideoSizeChangedListener = mOnVideoSizeChangedListener;
               if (onVideoSizeChangedListener != null) {
                   onVideoSizeChangedListener.onVideoSizeChanged(
                       mMediaPlayer, msg.arg1, msg.arg2);
               }
               return;
 
           case MEDIA_ERROR:
               Log.e(TAG, "Error (" + msg.arg1 + "," + msg.arg2 + ")");
               boolean error_was_handled = false;
               OnErrorListener onErrorListener = mOnErrorListener;
               if (onErrorListener != null) {
                   error_was_handled = onErrorListener.onError(mMediaPlayer, msg.arg1, msg.arg2);
               }
               {
                   mOnCompletionInternalListener.onCompletion(mMediaPlayer);
                   OnCompletionListener onCompletionListener = mOnCompletionListener;
                   if (onCompletionListener != null && ! error_was_handled) {
                       onCompletionListener.onCompletion(mMediaPlayer);
                   }
               }
               stayAwake(false);
               return;
 
           case MEDIA_INFO:
               switch (msg.arg1) {
               case MEDIA_INFO_VIDEO_TRACK_LAGGING:
                   Log.i(TAG, "Info (" + msg.arg1 + "," + msg.arg2 + ")");
                   break;
               case MEDIA_INFO_METADATA_UPDATE:
                   try {
                       scanInternalSubtitleTracks();
                   } catch (RuntimeException e) {
                       Message msg2 = obtainMessage(
                               MEDIA_ERROR, MEDIA_ERROR_UNKNOWN, MEDIA_ERROR_UNSUPPORTED, null);
                       sendMessage(msg2);
                   }
                   // fall through
 
               case MEDIA_INFO_EXTERNAL_METADATA_UPDATE:
                   msg.arg1 = MEDIA_INFO_METADATA_UPDATE;
                   // update default track selection
                   if (mSubtitleController != null) {
                       mSubtitleController.selectDefaultTrack();
                   }
                   break;
               case MEDIA_INFO_BUFFERING_START:
               case MEDIA_INFO_BUFFERING_END:
                   TimeProvider timeProvider = mTimeProvider;
                   if (timeProvider != null) {
                       timeProvider.onBuffering(msg.arg1 == MEDIA_INFO_BUFFERING_START);
                   }
                   break;
               }
 
               OnInfoListener onInfoListener = mOnInfoListener;
               if (onInfoListener != null) {
                   onInfoListener.onInfo(mMediaPlayer, msg.arg1, msg.arg2);
               }
               // No real default action so far.
               return;
 
           case MEDIA_NOTIFY_TIME:
                   TimeProvider timeProvider = mTimeProvider;
                   if (timeProvider != null) {
                       timeProvider.onNotifyTime();
                   }
               return;
 
           case MEDIA_TIMED_TEXT:
               OnTimedTextListener onTimedTextListener = mOnTimedTextListener;
               if (onTimedTextListener == null)
                   return;
               if (msg.obj == null) {
                   onTimedTextListener.onTimedText(mMediaPlayer, null);
               } else {
                   if (msg.obj instanceof Parcel) {
                       Parcel parcel = (Parcel)msg.obj;
                       TimedText text = new TimedText(parcel);
                       parcel.recycle();
                       onTimedTextListener.onTimedText(mMediaPlayer, text);
                   }
               }
               return;
 
           case MEDIA_SUBTITLE_DATA:
               final OnSubtitleDataListener extSubtitleListener;
               final Handler extSubtitleHandler;
               synchronized(this) {
                   if (mSubtitleDataListenerDisabled) {
                       return;
                   }
                   extSubtitleListener = mExtSubtitleDataListener;
                   extSubtitleHandler = mExtSubtitleDataHandler;
               }
               if (msg.obj instanceof Parcel) {
                   Parcel parcel = (Parcel) msg.obj;
                   final SubtitleData data = new SubtitleData(parcel);
                   parcel.recycle();
 
                   mIntSubtitleDataListener.onSubtitleData(mMediaPlayer, data);
 
                   if (extSubtitleListener != null) {
                       if (extSubtitleHandler == null) {
                           extSubtitleListener.onSubtitleData(mMediaPlayer, data);
                       } else {
                           extSubtitleHandler.post(new Runnable() {
                               @Override
                               public void run() {
                                   extSubtitleListener.onSubtitleData(mMediaPlayer, data);
                               }
                           });
                       }
                   }
               }
               return;
 
           case MEDIA_META_DATA:
               OnTimedMetaDataAvailableListener onTimedMetaDataAvailableListener =
                   mOnTimedMetaDataAvailableListener;
               if (onTimedMetaDataAvailableListener == null) {
                   return;
               }
               if (msg.obj instanceof Parcel) {
                   Parcel parcel = (Parcel) msg.obj;
                   TimedMetaData data = TimedMetaData.createTimedMetaDataFromParcel(parcel);
                   parcel.recycle();
                   onTimedMetaDataAvailableListener.onTimedMetaDataAvailable(mMediaPlayer, data);
               }
               return;
 
           case MEDIA_NOP: // interface test message - ignore
               break;
 
           case MEDIA_AUDIO_ROUTING_CHANGED:
                   broadcastRoutingChange();
                   return;
 
           case MEDIA_TIME_DISCONTINUITY:
               final OnMediaTimeDiscontinuityListener mediaTimeListener;
               final Handler mediaTimeHandler;
               synchronized(this) {
                   mediaTimeListener = mOnMediaTimeDiscontinuityListener;
                   mediaTimeHandler = mOnMediaTimeDiscontinuityHandler;
               }
               if (mediaTimeListener == null) {
                   return;
               }
               if (msg.obj instanceof Parcel) {
                   Parcel parcel = (Parcel) msg.obj;
                   parcel.setDataPosition(0);
                   long anchorMediaUs = parcel.readLong();
                   long anchorRealUs = parcel.readLong();
                   float playbackRate = parcel.readFloat();
                   parcel.recycle();
                   final MediaTimestamp timestamp;
                   if (anchorMediaUs != -1 && anchorRealUs != -1) {
                       timestamp = new MediaTimestamp(
                               anchorMediaUs /*Us*/, anchorRealUs * 1000 /*Ns*/, playbackRate);
                   } else {
                       timestamp = MediaTimestamp.TIMESTAMP_UNKNOWN;
                   }
                   if (mediaTimeHandler == null) {
                       mediaTimeListener.onMediaTimeDiscontinuity(mMediaPlayer, timestamp);
                   } else {
                       mediaTimeHandler.post(new Runnable() {
                           @Override
                           public void run() {
                               mediaTimeListener.onMediaTimeDiscontinuity(mMediaPlayer, timestamp);
                           }
                       });
                   }
               }
               return;
 
           case MEDIA_RTP_RX_NOTICE:
               final OnRtpRxNoticeListener rtpRxNoticeListener = mOnRtpRxNoticeListener;
               if (rtpRxNoticeListener == null) {
                   return;
               }
               if (msg.obj instanceof Parcel) {
                   Parcel parcel = (Parcel) msg.obj;
                   parcel.setDataPosition(0);
                   int noticeType;
                   int[] data;
                   try {
                       noticeType = parcel.readInt();
                       int numOfArgs = parcel.dataAvail() / 4;
                       data = new int[numOfArgs];
                       for (int i = 0; i < numOfArgs; i++) {
                           data[i] = parcel.readInt();
                       }
                   } finally {
                       parcel.recycle();
                   }
                   mOnRtpRxNoticeExecutor.execute(() ->
                           rtpRxNoticeListener
                                   .onRtpRxNotice(mMediaPlayer, noticeType, data));
               }
               return;
 
           default:
               Log.e(TAG, "Unknown message type " + msg.what);
               return;
           }
       }
   }
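
How do messages from the native layer reach this handler? Native code calls back into a static method on MediaPlayer, roughly as sketched below; the real postEventFromNative() has extra special cases for DRM info and "started as next" events:

// MediaPlayer.java (simplified): called from android_media_MediaPlayer.cpp.
// mediaplayer_ref is the WeakReference that was handed to native_setup().
private static void postEventFromNative(Object mediaplayer_ref,
        int what, int arg1, int arg2, Object obj) {
    final MediaPlayer mp = (MediaPlayer) ((WeakReference) mediaplayer_ref).get();
    if (mp == null) {
        return;                              // the Java object is already gone
    }
    if (mp.mEventHandler != null) {
        Message m = mp.mEventHandler.obtainMessage(what, arg1, arg2, obj);
        mp.mEventHandler.sendMessage(m);     // dispatched to EventHandler.handleMessage()
    }
}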

Finally, native_setup() is executed. This is also a native method, and its implementation in android_media_MediaPlayer.cpp is android_media_MediaPlayer_native_setup().

It mainly creates a native MediaPlayer object. This MediaPlayer inherits from BnMediaPlayerClient and IMediaDeathNotifier:

class MediaPlayer : public BnMediaPlayerClient,
                    public virtual IMediaDeathNotifier
{
public:
    MediaPlayer();
    ~MediaPlayer();
            void            died();
            void            disconnect();
 
            status_t        setDataSource(
                    const sp<IMediaHTTPService> &httpService,
                    const char *url,
                    const KeyedVector<String8, String8> *headers);
 
    virtual status_t        setDataSource(int fd, int64_t offset, int64_t length);
            status_t        setDataSource(const sp<IDataSource> &source);
            status_t        setVideoSurfaceTexture(
                                    const sp<IGraphicBufferProducer>& bufferProducer);
            status_t        setListener(const sp<MediaPlayerListener>& listener);
            status_t        getBufferingSettings(BufferingSettings* buffering /* nonnull */);
            status_t        setBufferingSettings(const BufferingSettings& buffering);
            status_t        prepare();
            status_t        prepareAsync();
            status_t        start();
            status_t        stop();
    virtual status_t        pause();
            bool            isPlaying();
            status_t        setPlaybackSettings(const AudioPlaybackRate& rate);
            status_t        getPlaybackSettings(AudioPlaybackRate* rate /* nonnull */);
            status_t        setSyncSettings(const AVSyncSettings& sync, float videoFpsHint);
            status_t        getSyncSettings(
                                    AVSyncSettings* sync /* nonnull */,
                                    float* videoFps /* nonnull */);
            status_t        getVideoWidth(int *w);
            status_t        getVideoHeight(int *h);
            status_t        seekTo(
                    int msec,
                    MediaPlayerSeekMode mode = MediaPlayerSeekMode::SEEK_PREVIOUS_SYNC);
            status_t        notifyAt(int64_t mediaTimeUs);
            status_t        getCurrentPosition(int *msec);
            status_t        getDuration(int *msec);
            status_t        reset();
            status_t        setAudioStreamType(audio_stream_type_t type);
            status_t        getAudioStreamType(audio_stream_type_t *type);
            status_t        setLooping(int loop);
            bool            isLooping();
            status_t        setVolume(float leftVolume, float rightVolume);
    virtual void            notify(int msg, int ext1, int ext2, const Parcel *obj = NULL);
            status_t        invoke(const Parcel& request, Parcel *reply);
            status_t        setMetadataFilter(const Parcel& filter);
            status_t        getMetadata(bool update_only, bool apply_filter, Parcel *metadata);
            status_t        setAudioSessionId(audio_session_t sessionId);
            audio_session_t getAudioSessionId();
            status_t        setAuxEffectSendLevel(float level);
            status_t        attachAuxEffect(int effectId);
            status_t        setParameter(int key, const Parcel& request);
            status_t        getParameter(int key, Parcel* reply);
            status_t        setRetransmitEndpoint(const char* addrString, uint16_t port);
            status_t        setNextMediaPlayer(const sp<MediaPlayer>& player);
 
            media::VolumeShaper::Status applyVolumeShaper(
                                    const sp<media::VolumeShaper::Configuration>& configuration,
                                    const sp<media::VolumeShaper::Operation>& operation);
            sp<media::VolumeShaper::State> getVolumeShaperState(int id);
            // Modular DRM
            status_t        prepareDrm(const uint8_t uuid[16], const Vector<uint8_t>& drmSessionId);
            status_t        releaseDrm();
            // AudioRouting
            status_t        setOutputDevice(audio_port_handle_t deviceId);
            audio_port_handle_t getRoutedDeviceId();
            status_t        enableAudioDeviceCallback(bool enabled);
 
private:
            void            clear_l();
            status_t        seekTo_l(int msec, MediaPlayerSeekMode mode);
            status_t        prepareAsync_l();
            status_t        getDuration_l(int *msec);
            status_t        attachNewPlayer(const sp<IMediaPlayer>& player);
            status_t        reset_l();
            status_t        doSetRetransmitEndpoint(const sp<IMediaPlayer>& player);
            status_t        checkStateForKeySet_l(int key);
 
    sp<IMediaPlayer>            mPlayer;
    thread_id_t                 mLockThreadId;
    Mutex                       mLock;
    Mutex                       mNotifyLock;
    Condition                   mSignal;
    sp<MediaPlayerListener>     mListener;
    void*                       mCookie;
    media_player_states         mCurrentState;
    int                         mCurrentPosition;
    MediaPlayerSeekMode         mCurrentSeekMode;
    int                         mSeekPosition;
    MediaPlayerSeekMode         mSeekMode;
    bool                        mPrepareSync;
    status_t                    mPrepareStatus;
    audio_stream_type_t         mStreamType;
    Parcel*                     mAudioAttributesParcel;
    bool                        mLoop;
    float                       mLeftVolume;
    float                       mRightVolume;
    int                         mVideoWidth;
    int                         mVideoHeight;
    audio_session_t             mAudioSessionId;
    float                       mSendLevel;
    struct sockaddr_in          mRetransmitEndpoint;
    bool                        mRetransmitEndpointValid;
};

The members of this class correspond almost one-to-one to those of the Java-layer MediaPlayer.

With initialization complete, the three core calls mentioned earlier are executed. Let's go through them in turn, starting with setDataSource().

A branch is taken here: for a local video, setDataSource() is used; for a network video, nativeSetDataSource() is used. Since this example plays a local video, the setDataSource() path applies. MediaPlayer has many overloaded setDataSource() methods, and they ultimately funnel into _setDataSource(); note the leading underscore, which marks it as a native method.
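
Sketched in code, the dispatch looks roughly like this. It is heavily trimmed: content:// handling, headers and cookies are left out, and the hidden helper MediaHTTPService is shown only to indicate where the network path goes:

// MediaPlayer.java (simplified)
public void setDataSource(Context context, Uri uri)
        throws IOException, IllegalArgumentException, IllegalStateException {
    final String scheme = uri.getScheme();
    if (ContentResolver.SCHEME_FILE.equals(scheme)) {
        setDataSource(uri.getPath());        // local file: goes through a FileDescriptor
    } else {
        // http/https/rtsp: the URL string itself is handed to native code
        nativeSetDataSource(
                MediaHTTPService.createHttpServiceBinderIfNecessary(uri.toString()),
                uri.toString(), null /* keys */, null /* values */);
    }
}

public void setDataSource(String path) throws IOException {
    try (FileInputStream is = new FileInputStream(path)) {
        // every FileDescriptor-based overload eventually funnels into the native _setDataSource()
        _setDataSource(is.getFD(), 0, 0x7ffffffffffffffL);
    }
}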

The native counterpart of _setDataSource() is android_media_MediaPlayer_setDataSourceCallback().

It mainly fetches the native MediaPlayer object and calls its setDataSource() method. The implementations of all the native MediaPlayer methods live in mediaplayer.cpp.

setDataSource() first obtains the MediaPlayerService, then calls the service's create() method to create a Client. The Client class looks like this:

class Client : public BnMediaPlayer {
        // IMediaPlayer interface
        virtual void            disconnect();
        virtual status_t        setVideoSurfaceTexture(
                                        const sp<IGraphicBufferProducer>& bufferProducer);
        virtual status_t        setBufferingSettings(const BufferingSettings& buffering) override;
        virtual status_t        getBufferingSettings(
                                        BufferingSettings* buffering /* nonnull */) override;
        virtual status_t        prepareAsync();
        virtual status_t        start();
        virtual status_t        stop();
        virtual status_t        pause();
        virtual status_t        isPlaying(bool* state);
        virtual status_t        setPlaybackSettings(const AudioPlaybackRate& rate);
        virtual status_t        getPlaybackSettings(AudioPlaybackRate* rate /* nonnull */);
        virtual status_t        setSyncSettings(const AVSyncSettings& rate, float videoFpsHint);
        virtual status_t        getSyncSettings(AVSyncSettings* rate /* nonnull */,
                                                float* videoFps /* nonnull */);
        virtual status_t        seekTo(
                int msec,
                MediaPlayerSeekMode mode = MediaPlayerSeekMode::SEEK_PREVIOUS_SYNC);
        virtual status_t        getCurrentPosition(int* msec);
        virtual status_t        getDuration(int* msec);
        virtual status_t        reset();
        virtual status_t        notifyAt(int64_t mediaTimeUs);
        virtual status_t        setAudioStreamType(audio_stream_type_t type);
        virtual status_t        setLooping(int loop);
        virtual status_t        setVolume(float leftVolume, float rightVolume);
        virtual status_t        invoke(const Parcel& request, Parcel *reply);
        virtual status_t        setMetadataFilter(const Parcel& filter);
        virtual status_t        getMetadata(bool update_only,
                                            bool apply_filter,
                                            Parcel *reply);
        virtual status_t        setAuxEffectSendLevel(float level);
        virtual status_t        attachAuxEffect(int effectId);
        virtual status_t        setParameter(int key, const Parcel &request);
        virtual status_t        getParameter(int key, Parcel *reply);
        virtual status_t        setRetransmitEndpoint(const struct sockaddr_in* endpoint);
        virtual status_t        getRetransmitEndpoint(struct sockaddr_in* endpoint);
        virtual status_t        setNextPlayer(const sp<IMediaPlayer>& player);
 
        virtual media::VolumeShaper::Status applyVolumeShaper(
                                        const sp<media::VolumeShaper::Configuration>& configuration,
                                        const sp<media::VolumeShaper::Operation>& operation) override;
        virtual sp<media::VolumeShaper::State> getVolumeShaperState(int id) override;
 
        sp<MediaPlayerBase>     createPlayer(player_type playerType);
 
        virtual status_t        setDataSource(
                        const sp<IMediaHTTPService> &httpService,
                        const char *url,
                        const KeyedVector<String8, String8> *headers);
 
        virtual status_t        setDataSource(int fd, int64_t offset, int64_t length);
 
        virtual status_t        setDataSource(const sp<IStreamSource> &source);
        virtual status_t        setDataSource(const sp<IDataSource> &source);
 
 
        sp<MediaPlayerBase>     setDataSource_pre(player_type playerType);
        status_t                setDataSource_post(const sp<MediaPlayerBase>& p,
                                                   status_t status);
 
                void            notify(int msg, int ext1, int ext2, const Parcel *obj);
 
                pid_t           pid() const { return mPid; }
        virtual status_t        dump(int fd, const Vector<String16>& args);
 
                audio_session_t getAudioSessionId() { return mAudioSessionId; }
        // Modular DRM
        virtual status_t prepareDrm(const uint8_t uuid[16], const Vector<uint8_t>& drmSessionId);
        virtual status_t releaseDrm();
        // AudioRouting
        virtual status_t setOutputDevice(audio_port_handle_t deviceId);
        virtual status_t getRoutedDeviceId(audio_port_handle_t* deviceId);
        virtual status_t enableAudioDeviceCallback(bool enabled);
 
    private:
        class AudioDeviceUpdatedNotifier: public AudioSystem::AudioDeviceCallback
        {
        public:
            AudioDeviceUpdatedNotifier(const sp<MediaPlayerBase>& listener) {
                mListener = listener;
            }
            ~AudioDeviceUpdatedNotifier() {}
 
            virtual void onAudioDeviceUpdate(audio_io_handle_t audioIo,
                                             audio_port_handle_t deviceId);
 
        private:
            wp<MediaPlayerBase> mListener;
        };
 
        friend class MediaPlayerService;
                                Client( const sp<MediaPlayerService>& service,
                                        pid_t pid,
                                        int32_t connId,
                                        const sp<IMediaPlayerClient>& client,
                                        audio_session_t audioSessionId,
                                        uid_t uid);
                                Client();
        virtual                 ~Client();
 
                void            deletePlayer();
 
        sp<MediaPlayerBase>     getPlayer() const { Mutex::Autolock lock(mLock); return mPlayer; }
 
 
 
        // @param type Of the metadata to be tested.
        // @return true if the metadata should be dropped according to
        //              the filters.
        bool shouldDropMetadata(media::Metadata::Type type) const;
 
        // Add a new element to the set of metadata updated. Noop if
        // the element exists already.
        // @param type Of the metadata to be recorded.
        void addNewMetadataUpdate(media::Metadata::Type type);
 
        // Disconnect from the currently connected ANativeWindow.
        void disconnectNativeWindow_l();
 
        status_t setAudioAttributes_l(const Parcel &request);
 
        class Listener : public MediaPlayerBase::Listener {
        public:
            Listener(const wp<Client> &client) : mClient(client) {}
            virtual ~Listener() {}
            virtual void notify(int msg, int ext1, int ext2, const Parcel *obj) {
                sp<Client> client = mClient.promote();
                if (client != NULL) {
                    client->notify(msg, ext1, ext2, obj);
                }
            }
        private:
            wp<Client> mClient;
        };
 
        mutable     Mutex                         mLock;
                    sp<MediaPlayerBase>           mPlayer;
                    sp<MediaPlayerService>        mService;
                    sp<IMediaPlayerClient>        mClient;
                    sp<AudioOutput>               mAudioOutput;
                    pid_t                         mPid;
                    status_t                      mStatus;
                    bool                          mLoop;
                    int32_t                       mConnId;
                    audio_session_t               mAudioSessionId;
                    audio_attributes_t *          mAudioAttributes;
                    uid_t                         mUid;
                    sp<ANativeWindow>             mConnectedWindow;
                    sp<IBinder>                   mConnectedWindowBinder;
                    struct sockaddr_in            mRetransmitEndpoint;
                    bool                          mRetransmitEndpointValid;
                    sp<Client>                    mNextClient;
                    sp<MediaPlayerBase::Listener> mListener;
 
        // Metadata filters.
        media::Metadata::Filter mMetadataAllow;  // protected by mLock
        media::Metadata::Filter mMetadataDrop;  // protected by mLock
 
        // Metadata updated. For each MEDIA_INFO_METADATA_UPDATE
        // notification we try to update mMetadataUpdated which is a
        // set: no duplicate.
        // getMetadata clears this set.
        media::Metadata::Filter mMetadataUpdated;  // protected by mLock
 
        std::vector<DeathNotifier> mDeathNotifiers;
        sp<AudioDeviceUpdatedNotifier> mAudioDeviceUpdatedListener;
#if CALLBACK_ANTAGONIZER
                    Antagonizer*                  mAntagonizer;
#endif
    }; // Client

Client inherits from BnMediaPlayer, and BnMediaPlayer in turn derives from IMediaPlayer. Client is also an inner class of MediaPlayerService, so when setDataSource() is later called on the player object obtained from MediaPlayerService::create(), what actually runs is Client::setDataSource() inside MediaPlayerService.

For reference, BnMediaPlayer derives from BnInterface<IMediaPlayer>, which in turn derives from IMediaPlayer and BBinder; this is the standard native-side (Bn*) Binder pattern.

setDataSource() first determines the player type. MediaPlayer has had three kinds of players: StagefrightPlayer, NuPlayer and TestPlayer. StagefrightPlayer was removed in Android 7.0 because of serious security vulnerabilities that had been exploited by attackers, so from then on NuPlayer is used by default.

Once the player type has been determined, setDataSource_pre() is executed:

sp<MediaPlayerBase> MediaPlayerService::Client::setDataSource_pre(
        player_type playerType)
{
    ALOGV("player type = %d", playerType);
 
    // create the right type of player
    sp<MediaPlayerBase> p = createPlayer(playerType);
    if (p == NULL) {
        return p;
    }
 
    std::vector<DeathNotifier> deathNotifiers;
 
    // Listen to death of media.extractor service
    sp<IServiceManager> sm = defaultServiceManager();
    sp<IBinder> binder = sm->getService(String16("media.extractor"));
    if (binder == NULL) {
        ALOGE("extractor service not available");
        return NULL;
    }
    deathNotifiers.emplace_back(
            binder, [l = wp<MediaPlayerBase>(p)]() {
        sp<MediaPlayerBase> listener = l.promote();
        if (listener) {
            ALOGI("media.extractor died. Sending death notification.");
            listener->sendEvent(MEDIA_ERROR, MEDIA_ERROR_SERVER_DIED,
                                MEDIAEXTRACTOR_PROCESS_DEATH);
        } else {
            ALOGW("media.extractor died without a death handler.");
        }
    });
 
    {
        using ::android::hidl::base::V1_0::IBase;
 
        // Listen to death of OMX service
        {
            sp<IBase> base = ::android::hardware::media::omx::V1_0::
                    IOmx::getService();
            if (base == nullptr) {
                ALOGD("OMX service is not available");
            } else {
                deathNotifiers.emplace_back(
                        base, [l = wp<MediaPlayerBase>(p)]() {
                    sp<MediaPlayerBase> listener = l.promote();
                    if (listener) {
                        ALOGI("OMX service died. "
                              "Sending death notification.");
                        listener->sendEvent(
                                MEDIA_ERROR, MEDIA_ERROR_SERVER_DIED,
                                MEDIACODEC_PROCESS_DEATH);
                    } else {
                        ALOGW("OMX service died without a death handler.");
                    }
                });
            }
        }
 
        // Listen to death of Codec2 services
        {
            for (std::shared_ptr<Codec2Client> const& client :
                    Codec2Client::CreateFromAllServices()) {
                sp<IBase> base = client->getBase();
                deathNotifiers.emplace_back(
                        base, [l = wp<MediaPlayerBase>(p),
                               name = std::string(client->getServiceName())]() {
                    sp<MediaPlayerBase> listener = l.promote();
                    if (listener) {
                        ALOGI("Codec2 service \"%s\" died. "
                              "Sending death notification.",
                              name.c_str());
                        listener->sendEvent(
                                MEDIA_ERROR, MEDIA_ERROR_SERVER_DIED,
                                MEDIACODEC_PROCESS_DEATH);
                    } else {
                        ALOGW("Codec2 service \"%s\" died "
                              "without a death handler.",
                              name.c_str());
                    }
                });
            }
        }
    }
 
    Mutex::Autolock lock(mLock);
 
    mDeathNotifiers.clear();
    mDeathNotifiers.swap(deathNotifiers);
    mAudioDeviceUpdatedListener = new AudioDeviceUpdatedNotifier(p);
 
    if (!p->hardwareOutput()) {
        mAudioOutput = new AudioOutput(mAudioSessionId, IPCThreadState::self()->getCallingUid(),
                mPid, mAudioAttributes, mAudioDeviceUpdatedListener);
        static_cast<MediaPlayerInterface*>(p.get())->setAudioSink(mAudioOutput);
    }
 
    return p;
}

It first calls createPlayer() to create the NuPlayer object.

Inside, the player is created through MediaPlayerFactory's createPlayer() method.

A NuPlayerDriver object is created there first.

In NuPlayerDriver's constructor, the NuPlayer object is finally created via AVNuFactory::get()->createNuPlayer().

After the NuPlayer object has been created, its init() method is called to finish some initialization work.

Back in MediaPlayerService.cpp, once setDataSource_pre() has created the player, setDataSource_post() is executed; its second argument is the status returned by setDataSource() in NuPlayerDriver.cpp.

That setDataSource() in turn calls setDataSourceAsync() in NuPlayer.cpp.

As you can see, the video path passed in at the application layer is ultimately handed over to NuPlayer. How NuPlayer demuxes and decodes the video will be analyzed in detail in a later article.

Back in VideoView.java, after openVideo() has executed mMediaPlayer.setDataSource(), it calls mMediaPlayer.setDisplay() to set the target on which the video will be displayed. This ties into Surface and Window on the graphics side, which a later article will cover in detail.
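
For reference, the SurfaceHolder passed to setDisplay() comes from the SurfaceHolder.Callback that VideoView (a SurfaceView subclass) registers on itself; a simplified sketch:

// VideoView.java (simplified): wiring between the surface lifecycle and the player
public VideoView(Context context, AttributeSet attrs) {
    super(context, attrs);
    getHolder().addCallback(mSHCallback);    // listen for surface creation/destruction
}

SurfaceHolder.Callback mSHCallback = new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        mSurfaceHolder = holder;
        openVideo();                 // a surface now exists, so (re)create the player
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
        mSurfaceWidth = w;
        mSurfaceHeight = h;
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        mSurfaceHolder = null;
        release(true);               // the output surface is gone, tear the player down
    }
};

openVideo() then hands this mSurfaceHolder to mMediaPlayer.setDisplay(), which is how the decoded frames end up on screen.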

After mMediaPlayer.setDisplay(), mMediaPlayer.prepareAsync() is called; it also ends up in NuPlayer's prepareAsync(), following the same kind of path as setDataSource().

Finally, when the application-layer VideoView calls start() to begin playback, the call likewise ends up in NuPlayer's start(), again along the same kind of path as setDataSource().
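
Because prepare runs asynchronously, VideoView.start() only forwards to MediaPlayer.start() once the player is actually prepared; otherwise it just records the intent and onPrepared() starts playback later. A simplified sketch:

// VideoView.java (simplified)
@Override
public void start() {
    if (isInPlaybackState()) {          // prepared / playing / paused / completed
        mMediaPlayer.start();           // Java MediaPlayer -> JNI -> MediaPlayerService -> NuPlayer
        mCurrentState = STATE_PLAYING;
    }
    mTargetState = STATE_PLAYING;       // otherwise remember the intent; onPrepared() will start playback
}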

That completes the walkthrough of the video playback flow from the application-layer VideoView down to the native-layer NuPlayer. Plenty of details have been skipped along the way, but with the overall framework in hand, drilling into individual features later becomes much easier.