I have looked at Apple's sample Touches application, but it does not address my problem. In the sample app, when a touch event occurs, the view is kept under the location of the touch. That keeps the logic simple: they just look up the view whose frame contains the touch location.
That does not work in my scenario. Here is my situation.
A view contains a number of subviews. The idea is to let the user fling one of the subviews in the direction of their gesture. I want the touchesBegan event to find the view whose center is closest to the touch.
Then I want the touchesEnded event to move that same view with a velocity determined by the begin and end events. That velocity is not necessarily the same as the finger's velocity, so I cannot simply "attach" the view to the touch location the way Apple does in the sample app.
I thought about using the touch object to tag the view identified in touchesBegan and comparing it against the touch object in the touchesEnded event, but that did not work: the touch objects in the touchesBegan and touchesEnded events were different.
So what am I missing? How can I preserve the relationship between a touch and the view it is supposed to move?
Answer 0 (score: 0)
The touch objects will not be identical. A touch object changes with the touched view, the location, and so on. For details, see the UITouch class reference.
Suggestion 1:
I implemented the same functionality in my project by subclassing UIView and adding the touch handlers in the subclass. I then created instances of this subclass instead of plain UIViews.
@interface myView : UIView
@end

@implementation myView

// Override the touch-handling methods as needed:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
}

@end
Suggestion 2:
Do the same thing with a UIPanGestureRecognizer. That is the simpler approach.
UIPanGestureRecognizer *panGesture = [[[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panGestureMoveAround:)] autorelease];
[panGesture setMaximumNumberOfTouches:2];
[panGesture setDelegate:self];
[yourSubView addGestureRecognizer:panGesture];

- (void)panGestureMoveAround:(UIPanGestureRecognizer *)gesture
{
    UIView *piece = [gesture view];
    // Helper from Apple's Touches sample: moves the layer's anchor point under the touch.
    [self adjustAnchorPointForGestureRecognizer:gesture];

    if ([gesture state] == UIGestureRecognizerStateBegan || [gesture state] == UIGestureRecognizerStateChanged) {
        CGPoint translation = [gesture translationInView:[piece superview]];
        [piece setCenter:CGPointMake([piece center].x + translation.x, [piece center].y + translation.y)];
        [gesture setTranslation:CGPointZero inView:[piece superview]];
    }
}
An example is here.
Answer 1 (score: 0)
Here is my solution. In this solution, scaledView is the view within which the arrow views move. The arrows are instances of the spriteView class, a subclass of UIView.
//There could be multiple subviews whose rectangles include the point. Find the one whose center is closest.
-(spriteView *)findArrowContainingTouch:(UITouch *)touch inView:(UIView *)scaledView atPoint:(CGPoint)touchPoint
{
    spriteView *touchedView = nil;
    float bestDistance2 = 9999999999.9;
    float testDistance2 = 0;
    for (spriteView *arrow in scaledView.subviews) {
        if (arrow.tag == ARROWTAG) {
            if (CGRectContainsPoint(arrow.frame, touchPoint)) {
                testDistance2 = [self distance2Between:touchPoint and:arrow.center];
                if (testDistance2 < bestDistance2) {
                    bestDistance2 = testDistance2;
                    touchedView = arrow;
                }
            }
        }
    }
    return touchedView;
}
The method distance2Between computes the squared distance between two points.
-(spriteView *)findArrowTouchedAtLocation:(CGPoint)p inView:(UIView *)scaledView
{
    spriteView *arrow = nil;
    for (spriteView *testArrow in scaledView.subviews) {
        if (testArrow.tag == ARROWTAG) {
            if ((p.x == testArrow.touch.x) && (p.y == testArrow.touch.y)) {
                arrow = testArrow;
            }
        }
    }
    return arrow;
}
Not every subview of scaledView is an arrow, so I use the constant ARROWTAG to identify the arrows.
#pragma mark - Touches
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UIView *scaledView = [self getScaledView];
    for (UITouch *touch in touches) {
        CGPoint touchLocation = [touch locationInView:scaledView];
        spriteView *arrow = [self findArrowContainingTouch:touch inView:scaledView atPoint:touchLocation];
        if (arrow != nil) {
            //Record the original location of the touch event in a property, originalTouchLocation, of the arrow instance. Additionally, store the same point in the touch property.
            //Both properties are necessary. originalTouchLocation will be used in `touchesEnded` and is not available in the `touch` object, so the information is stored separately.
            //The `touch` property, a CGPoint, is stored in order to identify the view. It is updated by every touch event; the new value is used by the next event to find the appropriate view.
            arrow.touch = CGPointMake(touchLocation.x, touchLocation.y);
            arrow.originalTouchLocation = CGPointMake(touchLocation.x, touchLocation.y);
            arrow.debugFlag = YES;
            arrow.timeTouchBegan = touch.timestamp;
        }
    }
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UIView *scaledView = [self getScaledView];
    for (UITouch *touch in touches) {
        CGPoint touchLocation = [touch locationInView:scaledView];
        CGPoint previousLocation = [touch previousLocationInView:scaledView];
        //previousLocation is used to find the right view. It must be in the coordinate system of the same view used in `touchesBegan`.
        spriteView *arrow = [self findArrowTouchedAtLocation:previousLocation inView:scaledView];
        if (arrow != nil) {
            arrow.touch = CGPointMake(touchLocation.x, touchLocation.y);
        }
    }
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UIView *scaledView = [self getScaledView];
    for (UITouch *touch in touches) {
        CGPoint touchLocation = [touch locationInView:scaledView];
        CGPoint previousLocation = [touch previousLocationInView:scaledView];
        spriteView *arrow = [self findArrowTouchedAtLocation:previousLocation inView:scaledView];
        if (arrow != nil) {
            arrow.touch = CGPointMake(touchLocation.x, touchLocation.y);
            float strokeAngle = [self findAngleFrom:arrow.originalTouchLocation to:touchLocation];
            float strokeDistance = sqrt([self distance2Between:arrow.originalTouchLocation and:touchLocation]);
            NSTimeInterval timeElapsed = touch.timestamp - arrow.timeTouchBegan;
            float newArrowSpeed = strokeDistance / timeElapsed / 100; //might want to use a different conversion factor, but this one works quite well
            arrow.transform = CGAffineTransformMakeRotation(strokeAngle);
            arrow.currentSpeed = newArrowSpeed;
        }
    }
}
-(void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    UIView *scaledView = [self getScaledView];
    for (UITouch *touch in touches) {
        CGPoint previousLocation = [touch previousLocationInView:scaledView];
        spriteView *arrow = [self findArrowTouchedAtLocation:previousLocation inView:scaledView];
        if (arrow != nil) {
            arrow.originalTouchLocation = CGPointMake(99999.0, 99999.0);
            NSLog(@"Arrow original location erased");
        }
    }
}
Answer 2 (score: 0)
The touch objects are actually the same in touchesBegan and touchesEnded. That turned out to be the way to track the touches.