Stanford iOS App Development Course Notes (Lecture 5): Protocols and Gesture Recognizers




This lecture has four parts: autorotation, protocols, gesture recognition, and a custom-UIView demo.




1. Autorotation


- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)orientation
{
    return UIInterfaceOrientationIsPortrait(orientation); // only support portrait
    // return YES;                                        // or: support all orientations
    // return (orientation != UIInterfaceOrientationPortraitUpsideDown); // or: anything but upside down
}


// The four orientations are portrait, portrait upside-down, landscape left, and landscape right.



When the device rotates, the view's bounds change, and the frames of its subviews (and of their subviews in turn) change as well.


The rules for how they change are the struts and springs (the autosizing settings in Interface Builder).
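In code, struts and springs correspond to the view's autoresizingMask property. A minimal sketch, assuming a hypothetical parentView and made-up frame values:

```objc
// Pin the subview to the top of its superview (fixed top strut) and let its
// width track the superview's width (flexible horizontal spring).
UIView *subview = [[UIView alloc] initWithFrame:CGRectMake(10, 10, 300, 44)];
subview.autoresizingMask = UIViewAutoresizingFlexibleWidth |
                           UIViewAutoresizingFlexibleBottomMargin;
[parentView addSubview:subview];
```

A flexible margin acts like a spring; omitting a flag makes that strut rigid.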


When a view's bounds change, drawRect: is not called by default.



[Screenshot: the autosizing (struts and springs) control in Xcode's Size inspector]




The red I-beams around the outside of the square are the struts; the red arrows in the middle are the springs.


The little red-and-white screen to the right animates how your view will change when its superview resizes.


The white rectangle is the superview; the red one is the view you have selected.


This control is rarely used on the iPhone because the screen is so small.


Fortunately, UIView has a property to control what happens to its drawn bits when its bounds change:




Three ways to control it:


1.


@property (nonatomic) UIViewContentMode contentMode;
UIViewContentMode{Left,Right,Top,Bottom,BottomLeft,BottomRight,TopLeft,TopRight}


// Move your view's bits to the specified corner or edge (no scaling)




2. Scaling content modes:


UIViewContentModeScale{ToFill,AspectFill,AspectFit} // bit stretching/shrinking


These are fill, aspect fill, and aspect fit, respectively. ScaleToFill is the default; it stretches or shrinks the bits to fill the new space, which can distort the image. The aspect modes preserve the aspect ratio, either filling the space (possibly clipping) or fitting inside it.


UIViewContentModeRedraw // causes drawRect: to be called again (redraw)



3.


@property (nonatomic) CGRect contentStretch;


Specifies a sub-region of the content to stretch.
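Putting the first two options together, a hypothetical sketch of setting a view's contentMode (the image-view setup is elided; these are standard UIKit constants):

```objc
UIImageView *imageView = ...;
// 1. Move the bits without scaling them:
imageView.contentMode = UIViewContentModeTopLeft;
// 2. Or scale them; aspect fit keeps the aspect ratio, letterboxing if needed:
imageView.contentMode = UIViewContentModeScaleAspectFit;
// Or ask for drawRect: to be called again whenever the bounds change:
imageView.contentMode = UIViewContentModeRedraw;
```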


Initializing a UIView


Typical Code


- (void)setup { ... }

- (void)awakeFromNib { [self setup]; }

- (id)initWithFrame:(CGRect)aRect
{
    self = [super initWithFrame:aRect];
    [self setup];
    return self;
}




2. Protocols


@protocol Foo <Other, NSObject>        // implementors must implement Other and NSObject too
- (void)doSomething; // implementors must implement this (methods are @required by default)
@optional
- (int)getSomething; // implementors do not have to implement this
- (void)doSomethingOptionalWithArgument:(NSString *)argument; // also optional
@required
- (NSArray *)getManySomethings:(int)howMany; // back to being "must implement"
@property (nonatomic, strong) NSString *fooProp; // note that you must specify strength
@end


A protocol has no corresponding @implementation; it is just a collection of method and @property declarations.




It can be declared in its own header file or in the header file of another class.


The syntax for implementing a protocol and advertising that you do:


#import "Foo.h"                                                   // importing the header file that declares the Foo @protocol
@interface MyClass : NSObject <Foo> // MyClass is saying it implements the Foo @protocol
...
@end




1. You must implement all non-optional methods.


2. You can declare variables typed with a protocol:


id <Foo> obj = [[MyClass alloc] init]; // the compiler is happy with this


id <Foo> obj = [NSArray array]; // compiler complains: NSArray does not claim to implement Foo


3. You can pass them as arguments, and type properties with them:


@property (nonatomic, weak) id <Foo> myFooProperty;  // properties too!  
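A minimal sketch of a class adopting the Foo protocol from above (MyClass is the hypothetical implementor from the slide; the method bodies are placeholders):

```objc
#import "Foo.h"

@interface MyClass : NSObject <Foo>
@end

@implementation MyClass
@synthesize fooProp = _fooProp; // protocol @propertys are not auto-synthesized; implementors must do it

- (void)doSomething
{
    // required by the protocol
}

- (NSArray *)getManySomethings:(int)howMany
{
    return @[]; // also required
}

// the @optional methods (getSomething, doSomethingOptionalWithArgument:) may simply be omitted
@end
```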



The main use of protocols:


implementing delegates and data sources. // delegate, dataSource
Delegates are almost always weak, because the object set as a delegate is usually the owner or creator of the delegating object.


For example, a controller typically sets itself as its view's delegate or data source; you do not want the two to point strongly at each other.


The view only points weakly back at the controller; the view cannot do much on its own anyway, and once the controller is gone the view is of no use either.



The UIScrollView example:


@protocol UIScrollViewDelegate
@optional
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)sender;
- (void)scrollViewDidEndDragging:(UIScrollView *)sender willDecelerate:(BOOL)decelerate;
@end



@interface UIScrollView : UIView
@property (nonatomic, weak) id <UIScrollViewDelegate> delegate;
@end



@interface MyViewController : UIViewController <UIScrollViewDelegate>
@property (nonatomic, weak) IBOutlet UIScrollView *scrollView;
@end
@implementation MyViewController
- (void)setScrollView:(UIScrollView *)scrollView {
    _scrollView = scrollView;
    self.scrollView.delegate = self; // compiler won't complain
}
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)sender { return ...; }
@end




3. Gesture Recognition: UIGestureRecognizer


How do we get touch events?


We can be notified of the raw touch events,


or we can respond to predefined gestures.


The latter uses UIGestureRecognizer; it is an abstract class, so we use its concrete subclasses.



Using a gesture recognizer takes two steps:


first create a recognizer and attach it to a UIView, then handle the gesture when it is recognized.


The first step is done by the controller;


the second step here is done by the UIView itself.


Adding a gesture recognizer to a UIView from the controller:


- (void)setPannableView:(UIView *)pannableView
{
    _pannableView = pannableView;
    UIPanGestureRecognizer *pangr =
        [[UIPanGestureRecognizer alloc] initWithTarget:pannableView action:@selector(pan:)];
    [pannableView addGestureRecognizer:pangr];
}


Gestures can conflict, but the system sorts out potential conflicts between recognizers for you.


The target is the handler invoked once the gesture is recognized; here the view itself handles it.


But pan: names the action message that will be sent when the gesture is recognized; the view is not the sender of that message, the gesture recognizer is.



Three methods of UIPanGestureRecognizer:


- (CGPoint)translationInView:(UIView *)aView;
- (CGPoint)velocityInView:(UIView *)aView;
- (void)setTranslation:(CGPoint)translation inView:(UIView *)aView;


The first reports how far the touch has moved: the translation accumulated since the gesture began (or since it was last reset).


The second reports the velocity of the movement.


The third resets the translation; it is the setter for the first method.



The gesture recognizer's state machine:


@property (readonly) UIGestureRecognizerState state;


The state starts at Possible, the recognizer's initial state.


If the gesture is discrete (e.g., a tap): Recognized.


If it is continuous: Began, then Changed as it moves, and finally Ended when the finger lifts.


There are also Failed and Cancelled states, entered when recognition fails or something (e.g., an incoming phone call) interrupts the gesture.
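A handler often switches on the state. A sketch, assuming the handler lives in the view or controller the recognizer targets (the comments mark where real work would go):

```objc
- (void)pan:(UIPanGestureRecognizer *)recognizer
{
    switch (recognizer.state) {
        case UIGestureRecognizerStateBegan:
            // finger went down and started moving
            break;
        case UIGestureRecognizerStateChanged:
            // finger is still moving; update incrementally
            break;
        case UIGestureRecognizerStateEnded:
            // finger lifted; finish up
            break;
        case UIGestureRecognizerStateCancelled:
            // interrupted, e.g. by an incoming call; undo partial work
            break;
        default:
            break;
    }
}
```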



What does pan: look like?


- (void)pan:(UIPanGestureRecognizer *)recognizer
{
    if ((recognizer.state == UIGestureRecognizerStateChanged) ||
        (recognizer.state == UIGestureRecognizerStateEnded)) {
        CGPoint translation = [recognizer translationInView:self];
        // move something in myself (I'm a UIView) by translation.x and translation.y
        // for example, if I were a graph and my origin were set by an @property called origin
        self.origin = CGPointMake(self.origin.x + translation.x, self.origin.y + translation.y);
        [recognizer setTranslation:CGPointZero inView:self];
    }
}



Other concrete gesture recognizers:


UIPinchGestureRecognizer     // pinch (zoom)
// scale / velocity

UIRotationGestureRecognizer  // rotation: two fingers down, then rotate; reported in radians, not degrees
// rotation / velocity

UISwipeGestureRecognizer     // swipe with one or more fingers
// direction / numberOfTouchesRequired

UITapGestureRecognizer       // tap
// numberOfTapsRequired / numberOfTouchesRequired
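For example, a double-tap recognizer could be configured and attached like this (the handler name doubleTap: is made up for illustration):

```objc
UITapGestureRecognizer *tapgr =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(doubleTap:)];
tapgr.numberOfTapsRequired = 2;    // fire only on double taps
tapgr.numberOfTouchesRequired = 1; // with a single finger
[self.view addGestureRecognizer:tapgr];
```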



4. Demo


The full demo can be downloaded here: https://github.com/junxianhu/Happiness

Happiness

It draws a smiley face in drawRect:, then captures gestures: pinch to zoom in and out, and pan up or down to morph the face between a smile and a frown.



[Screenshots of the running Happiness demo]



Here are the main pieces of code:


ViewController.m


//
// ViewController.m
// Happiness
//
// Created by cipher on 15/10/7.
// Copyright (c) 2015 com.lab1411.cipher. All rights reserved.
//

#import "ViewController.h"
#import "FaceView.h"

// Adopt the protocol here, in the private class extension: <FaceViewDataSource>
@interface ViewController () <FaceViewDataSource>
@property (nonatomic, weak) IBOutlet FaceView *faceView;
@end


@implementation ViewController

@synthesize happiness = _happiness;
@synthesize faceView = _faceView;

- (void)setHappiness:(int)happiness {
    _happiness = happiness;
    [self.faceView setNeedsDisplay];
}

- (void)setFaceView:(FaceView *)faceView {
    _faceView = faceView;

    // The recognizer added here must be a concrete subclass such as
    // UIPinchGestureRecognizer; the abstract UIGestureRecognizer recognizes nothing
    [self.faceView addGestureRecognizer:[[UIPinchGestureRecognizer alloc] initWithTarget:self.faceView action:@selector(pinch:)]];

    // Pan up/down to control the smile
    [self.faceView addGestureRecognizer:[[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handleHappinessGesture:)]];

    self.faceView.dataSource = self;
}

- (void)handleHappinessGesture:(UIPanGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateChanged || gesture.state == UIGestureRecognizerStateEnded) {
        CGPoint translation = [gesture translationInView:self.faceView];
        self.happiness -= translation.y / 2;
        [gesture setTranslation:CGPointZero inView:self.faceView];
    }
}

- (float)smileForFaceView:(FaceView *)sender {
    return (self.happiness - 50) / 50.0;
}

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation {
    return YES;
}

@end




FaceView.h


//
// FaceView.h
// Happiness
//
// Created by cipher on 15/10/7.
// Copyright (c) 2015 com.lab1411.cipher. All rights reserved.
//

#import <UIKit/UIKit.h>

@class FaceView;
@protocol FaceViewDataSource

// FaceView is not defined until below; the forward @class declaration
// above resolves this reference
- (float)smileForFaceView:(FaceView *)sender;

@end



@interface FaceView : UIView

@property (nonatomic) CGFloat scale;

// Public so the controller can use it as a gesture action
- (void)pinch:(UIPinchGestureRecognizer *)gesture;

// The data source, typed with the protocol
@property (nonatomic, weak) IBOutlet id <FaceViewDataSource> dataSource;

@end




FaceView.m


//
// FaceView.m
// Happiness
//
// Created by cipher on 15/10/7.
// Copyright (c) 2015 com.lab1411.cipher. All rights reserved.
//

#import "FaceView.h"

#define DEFAULT_SCALE 0.90

#define EYE_H 0.35
#define EYE_Y 0.35
#define EYE_RADIUS 0.10

#define MOUTH_H 0.45
#define MOUTH_Y 0.40
#define MOUTH_SMILE 0.25

@implementation FaceView

@synthesize dataSource = _dataSource;
@synthesize scale = _scale;

- (CGFloat)scale {
    if (!_scale) {
        return DEFAULT_SCALE; // fall back to the default until a scale has been set
    } else {
        return _scale;
    }
}

- (void)setScale:(CGFloat)scale {
    if (scale != _scale) {
        _scale = scale;
        [self setNeedsDisplay]; // any scale change requires a redraw
    }
}

- (void)pinch:(UIPinchGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateChanged || gesture.state == UIGestureRecognizerStateEnded) {
        self.scale *= gesture.scale;
        gesture.scale = 1; // reset so the next callback reports an incremental scale
    }
}

- (void)setup {
    self.contentMode = UIViewContentModeRedraw; // redraw (don't stretch the bits) on bounds changes
}

- (void)awakeFromNib {
    [self setup];
}

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        [self setup];
    }
    return self;
}

- (void)drawCircleAtPoint:(CGPoint)p withRadius:(CGFloat)radius inContext:(CGContextRef)context {
    UIGraphicsPushContext(context);
    CGContextBeginPath(context);
    CGContextAddArc(context, p.x, p.y, radius, 0, 2 * M_PI, YES);
    CGContextStrokePath(context);
    UIGraphicsPopContext();
}


- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();

    // draw face circle
    CGPoint midPoint;
    midPoint.x = self.bounds.origin.x + self.bounds.size.width / 2;
    midPoint.y = self.bounds.origin.y + self.bounds.size.height / 2;

    CGFloat size = self.bounds.size.width / 2;
    if (self.bounds.size.height < self.bounds.size.width) {
        size = self.bounds.size.height / 2;
    }
    size *= self.scale;

    CGContextSetLineWidth(context, 5.0);
    [[UIColor blueColor] setStroke];

    [self drawCircleAtPoint:midPoint withRadius:size inContext:context];

    // draw eyes
    CGPoint eyePoint;
    eyePoint.x = midPoint.x - size * EYE_H;
    eyePoint.y = midPoint.y - size * EYE_Y;
    [self drawCircleAtPoint:eyePoint withRadius:size * EYE_RADIUS inContext:context];
    eyePoint.x += size * EYE_H * 2;
    [self drawCircleAtPoint:eyePoint withRadius:size * EYE_RADIUS inContext:context];

    // no nose
    // draw mouth
    CGPoint mouthStart;
    mouthStart.x = midPoint.x - MOUTH_H * size;
    mouthStart.y = midPoint.y + MOUTH_Y * size;
    CGPoint mouthEnd = mouthStart;
    mouthEnd.x += MOUTH_H * size * 2;
    CGPoint mouthCP1 = mouthStart;
    mouthCP1.x += MOUTH_H * size * 2 / 3;
    CGPoint mouthCP2 = mouthEnd;
    mouthCP2.x -= MOUTH_H * size * 2 / 3;

    // float smile = 1; // this should be delegated; it's our view's data
    float smile = [self.dataSource smileForFaceView:self];
    if (smile < -1) {
        smile = -1;
    }
    if (smile > 1) {
        smile = 1;
    }

    CGFloat smileOffset = MOUTH_SMILE * size * smile;
    mouthCP1.y += smileOffset;
    mouthCP2.y += smileOffset;

    CGContextBeginPath(context);
    CGContextMoveToPoint(context, mouthStart.x, mouthStart.y);
    CGContextAddCurveToPoint(context, mouthCP1.x, mouthCP1.y, mouthCP2.x, mouthCP2.y, mouthEnd.x, mouthEnd.y);
    CGContextStrokePath(context);
}


@end