iphone - Three Layers of Pan Gesture Recogniser Confusion


[A sketch of the app's view hierarchy]

Whilst developing my app, I have come up against a problem with having many pan gesture recognisers. The first pan gesture recogniser is on the MainViewController, the parent of the RecipeSearchVC. This gesture recogniser slides the whole view left or right. The second pan gesture recogniser is in the RecipeSearchParametersVC, the parent of a page view controller. The third pan gesture recogniser is added to a UIControl wheel nested inside a view controller represented in the UIPageViewController.

I know this sounds insane, and it could be argued that it is poor design. However, I believe the three could work cohesively.

When trying to rotate the wheel, it rotates for a second or two before the gesture is taken over by either the UIPageViewController or the MainViewController. More often than not, the MainViewController takes over. What techniques can I employ to clearly separate each of these gesture recognisers?
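As a point of reference, one common way to keep an outer recogniser from stealing touches meant for a nested control is the UIGestureRecognizerDelegate method gestureRecognizerShouldBegin:. The sketch below assumes the MainViewController holds a reference to the wheel control in a hypothetical wheelControl property; it refuses to start the slide-out pan when the touch lands on the wheel:

// In MainViewController, acting as the pan recogniser's delegate.
// (wheelControl is an assumed property name for illustration.)
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
    CGPoint location = [gestureRecognizer locationInView:self.wheelControl];

    // Let the wheel keep touches that fall inside its bounds
    return ![self.wheelControl pointInside:location withEvent:nil];
}

Alternatively, requireGestureRecognizerToFail: can order two recognisers explicitly, making the outer one wait until the inner one has failed.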

Edit:

Apologies for the vagueness of my description when it comes to the pan gesture recognisers. The MainViewController has its own UIPanGestureRecognizer to allow it to move the view left or right. The RecipeSearchParametersVC only has a UIPanGestureRecognizer because of the UIPageViewController it contains. I did not add that gesture recogniser myself; it comes with the UIPageViewController. The UIControl's gesture recognisers allow it to track the rotation it should undergo.

Taking the advice given, I may remove the gestures from the page view controller and substitute them with buttons. It was intended to work like the images found in iBooks (which can be scrolled to reveal more images), and I thought it would work fine.

UIControl UIPanGestureRecognizer code

/**
 *  Sent to the control when a touch related to the given event enters the control's bounds
 *
 *  @param  touch   UITouch object that represents a touch on the receiving control during tracking
 *  @param  event   Event object encapsulating the information specific to the user event
 */
- (BOOL)beginTrackingWithTouch:(UITouch *)touch
                     withEvent:(UIEvent *)event
{
    [super beginTrackingWithTouch:touch withEvent:event];

    CGPoint touchPoint = [touch locationInView:self];

    //  Filter out touches too close to the centre of the wheel
    CGFloat magnitudeFromCentre = [self calculateDistanceFromCentre:touchPoint];

    if (magnitudeFromCentre < 40)
        return NO;

    //  Calculate the distance from the centre
    CGFloat deltaX = touchPoint.x - _container.center.x;
    CGFloat deltaY = touchPoint.y - _container.center.y;

    //  Calculate the arctangent of opposite (y axis) over adjacent (x axis) to get the angle
    _deltaAngle = atan2(deltaY, deltaX);

    _startTransform = _container.transform;

    //  While the selection is in limbo, set the sector image's alpha to the minimum value before changing the current one
    [self getSectorByValue:_currentSector].alpha = kMinimumAlpha;

    return YES;
}

Unfortunately, due to the nature of the controller hierarchy, I was forced to rethink the design of the app.

The MainViewController's UIPanGestureRecognizer has stayed as is. The UIPageViewController's UIControl has been moved to a separate static view controller.

This works far better, but it is not yet ideal. The UIPageViewController still steals horizontal panning, but that can be fixed by implementing buttons as an alternative to scrolling.
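If page turns are driven by buttons anyway, one option is to disable the UIPageViewController's own pan recognisers entirely. A minimal sketch, assuming a pageViewController property; note that the gestureRecognizers property is only populated for the page-curl transition style, so for the scroll style this loop would find nothing:

// Disable the page view controller's built-in pan recognisers so they
// can no longer steal horizontal drags from the wheel control.
for (UIGestureRecognizer *recognizer in self.pageViewController.gestureRecognizers) {
    if ([recognizer isKindOfClass:[UIPanGestureRecognizer class]]) {
        recognizer.enabled = NO;  // page turns now happen via buttons only
    }
}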

The UIControl did not actually have a gesture recogniser; I override beginTrackingWithTouch:withEvent: and the other UIControl tracking methods to track touches.
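For completeness, the companion override in the UIControl tracking cycle would apply the rotation as the touch moves. This is a sketch built on the ivars from the code above (_container, _deltaAngle, _startTransform), not the post's actual implementation:

- (BOOL)continueTrackingWithTouch:(UITouch *)touch
                        withEvent:(UIEvent *)event
{
    [super continueTrackingWithTouch:touch withEvent:event];

    CGPoint touchPoint = [touch locationInView:self];

    //  Current angle of the touch relative to the wheel's centre
    CGFloat deltaX = touchPoint.x - _container.center.x;
    CGFloat deltaY = touchPoint.y - _container.center.y;
    CGFloat angle  = atan2(deltaY, deltaX);

    //  Rotate the wheel by the change in angle since tracking began
    CGFloat angleDifference = angle - _deltaAngle;
    _container.transform = CGAffineTransformRotate(_startTransform, angleDifference);

    return YES;
}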

I suppose the answer should be: if you are layering that many gestures, you're doing it wrong.

