Cocos2D is a great library, and there are several wrappers out there for adding functionality contained within the UIKit into Cocos2D games.
Shortly after gesture recognizers were added to the iOS SDK, some wrapper code was released so that Cocos2D developers could use gesture recognizers within their games, but this required a different syntax and modifications to the Cocos2D source.
Today I came across a nice category from Krzysztof Zabłocki that adds gesture recognizer support to CCNode, allowing you to attach recognizers in Cocos2D games using the same syntax you would use with a UIGestureRecognizer on a UIView.
Two versions are provided, one for Cocos2D v1 and one for v2.
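As an illustration, attaching a recognizer to a node might look like the sketch below. This is only a sketch: the header name and the exact API surface of the category are assumptions here, beyond the claim that it mirrors the familiar UIView-style `addGestureRecognizer:` method.

```objc
// Sketch only: assumes the category header is named
// "CCNode+SFGestureRecognizers.h" and that it adds a UIView-style
// addGestureRecognizer: method to CCNode.
#import "cocos2d.h"
#import "CCNode+SFGestureRecognizers.h"

@implementation GameLayer

- (void)addGesturesToSprite:(CCSprite *)sprite
{
    // Exactly the same syntax you would use with a UIView.
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(spriteTapped:)];
    [sprite addGestureRecognizer:tap];
}

- (void)spriteTapped:(UITapGestureRecognizer *)recognizer
{
    // React to a tap on the sprite here.
}

@end
```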
One of the things that sets the iPhone’s user interface apart from the competition is Apple’s incredible attention to detail. This attention does not only extend to every single pixel on the screen but, perhaps even more important, also to gesture recognition: the standard iOS controls do a lot of smart things in order to identify the gesture the user intended to make and to allow multiple gestures to be executed at the same time if appropriate.
For example, scroll views must distinguish between a simple tap and the beginning of a swipe gesture. Similarly, the built-in map view control allows pinching and scrolling in one simultaneous gesture (put two fingers on the screen and you can switch seamlessly between pinching and scrolling without having to “restart” the gesture).
Half-assed Gesture Recognition Is Not Good Enough
We should set ourselves the same high standards for our own apps. In this article, I would like to show you an example where the half-assed implementation of gesture recognition seems to work well enough at first glance. But as the user continues to use the app, they quickly stumble upon little irritations, small annoyances in the user interface that they perhaps cannot even pinpoint but notice nevertheless. And while these irritations do not seem to be a big deal to many developers, I want to show you how getting rid of them can improve the user experience tremendously.
The example made use of gesture recognizers to allow the master view to be swiped on and off screen. However, I created and configured those objects in code rather than with Interface Builder. In this post I will show how easy it is to create gesture recognizers with IB and eliminate some code.
A quick recap
Just by way of a recap, the example app I used in the last post made use of three gesture recognizers. These were created in the viewDidLoad method of the DetailViewController and added to the detail view. The code used to create the three objects is shown below:
Each gesture recognizer is allocated and initialised with its target and action; there is some gesture-specific configuration, such as setting the direction of the swipe or the number of taps we expect; and then finally the gesture is added to the view.
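The setup follows the standard pattern shown in this sketch. The selector names (`tapDetected:`, `swipeDetected:`) are placeholders, not necessarily the names used in the example project:

```objc
// Sketch of the viewDidLoad setup; selector names are illustrative.
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Tap gesture: configure the taps required before adding it.
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(tapDetected:)];
    tap.numberOfTapsRequired = 1;
    [self.view addGestureRecognizer:tap];

    // Swipe gestures: one recognizer per direction we want to handle.
    UISwipeGestureRecognizer *swipeLeft = [[UISwipeGestureRecognizer alloc]
        initWithTarget:self action:@selector(swipeDetected:)];
    swipeLeft.direction = UISwipeGestureRecognizerDirectionLeft;
    [self.view addGestureRecognizer:swipeLeft];

    UISwipeGestureRecognizer *swipeRight = [[UISwipeGestureRecognizer alloc]
        initWithTarget:self action:@selector(swipeDetected:)];
    swipeRight.direction = UISwipeGestureRecognizerDirectionRight;
    [self.view addGestureRecognizer:swipeRight];
}
```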
Gesture Recognizers in Interface Builder
Interface Builder contains a full set of gesture recognizers in its objects library. So instead of creating these objects manually with code we can just drag the required gestures onto our view in the UYLDetailViewController XIB.
Before I do that though I need to correct an error in the example I showed last time. The gesture recognizers were assigned to the top-level UIView, which includes the toolbar and the toolbar button. This meant that any attempt to tap on the toolbar button was intercepted by the gesture recognizer, preventing the button action from being called.
To correct this error I have introduced a subview that covers the entire user interface with the exception of the toolbar. The gesture recognizers can then be assigned to this subview, leaving taps on the toolbar button to be handled by the button. For this example we need one Tap Gesture Recognizer object and two Swipe Gesture Recognizer objects. Dragging these objects onto the subview leaves us with a XIB file that looks as follows:
It is the second view in this list that the gestures are assigned to. If you find your gestures are not working check the Referencing Outlet Collections connection in Interface Builder to ensure they are connected to the correct view:
With the gestures added, configuration becomes easier as the inspector shows you the gesture-specific configuration options. So the tap gesture allows you to set the number of taps and touches you want to recognize:

The swipe gesture allows you to set the direction of the swipe (left, right, up, down) and the number of touches:
The final step is to set the target-action for each gesture. Using IB you can create the target-action by control-clicking on the object and then dragging into the implementation file. IB will then prompt you for the name of the action to insert:
Since in this case we already have the action methods defined, we will take a slightly different approach. We will first fix the methods to add IBAction hints for Xcode and then, using the same control-click method, drag each recognizer object to its method. As you mouse over the action it highlights to show the connection:
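Adding the IBAction hint is just a matter of changing the return type of each handler from void to IBAction. IBAction compiles to void, so the runtime behaviour is unchanged; it simply makes the methods visible as connection targets in Interface Builder. The method names below are placeholders for whatever the handlers are called in the project:

```objc
// IBAction (which compiles to void) exposes these methods as
// connection targets in Interface Builder.
- (IBAction)tapDetected:(UITapGestureRecognizer *)sender;
- (IBAction)swipeDetected:(UISwipeGestureRecognizer *)sender;
```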
With the gestures configured and connected there is nothing more to do other than clean up the now redundant code from the view controller. A quick compile and run should confirm that the gestures are recognised and function as before.
The updated Xcode project can be found here or in my CodeExamples github repository.