How iOS handles touches: the responder chain, touch event handling, gesture recognizers, and scroll views

I started out trying to make a scrollview with buttons in it. I ended up diving deep into the pool of touch events and gesture recognizers. Here’s what I found.

The problem with putting a UIButton in a scroll view is that when you tap the button, there’s a noticeable delay before the button highlights. Not satisfied with this, I started Googling around. I found a Stack Overflow answer and it all worked, but how did it work?

I needed to go back to the start.

What happens after you touch an iPhone screen?

  • How does a touch find the object that should respond to it?
  • How does the view handle the touch without gesture recognizers?
  • How do views use gesture recognizers?
    • How do gesture recognizers interact with each other?


How does a touch find the object that should respond to it?

The responder chain.

Basically, the window takes the touch and checks which of its subviews the touch is in (aka hit testing). That view checks its own subviews, and so on, until there are no more subviews to check and we’ve found the deepest view in the view tree that contains the touch. Conveniently, views are responders (UIView inherits from UIResponder), so they can handle touch events.

If that view isn’t able to handle the touch, the touch is forwarded back up the responder chain to its superview, and so on, until a responder that can handle it is found.
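Conceptually, hit testing is a recursive walk down the view tree. Here’s a simplified sketch of what UIView’s default hitTest:withEvent: does (the real UIKit implementation also skips hidden views, views with userInteractionEnabled set to NO, and nearly transparent views):

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if (![self pointInside:point withEvent:event]) {
        return nil; // touch isn't in this view, stop descending
    }
    // Check subviews front to back (the last subview is frontmost).
    for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
        CGPoint convertedPoint = [subview convertPoint:point fromView:self];
        UIView *hitView = [subview hitTest:convertedPoint withEvent:event];
        if (hitView) {
            return hitView; // deepest subview containing the touch
        }
    }
    return self; // no subview contains the touch, so this view is it
}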

How does the view handle the touch without gesture recognizers?

Touches have four phases: began, moved, ended, and cancelled. When a touch enters a phase, it calls the corresponding method on the view it’s in.

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;


What happens if you start touching in viewA and then move out of that view without lifting your finger? Does the touch call the methods on viewA or on viewA’s superview?

Turns out the messages are sent to the view where the touch began.

Within these four methods, you get all the touches involved, and you can use them to define gestures and trigger actions. For example, to make your view recognize swipes, you can record where the touch starts in touchesBegan:withEvent:, track how far it has moved in touchesMoved:withEvent:, and fire an action once the distance passes a threshold. That’s exactly what people did before gesture recognizers were introduced in iOS 3.2.
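A minimal sketch of that pre-iOS 3.2 approach in a UIView subclass (the 50-point threshold, the _startPoint ivar, and the handleSwipe method are my own inventions):

// Detect a horizontal swipe by hand, without a gesture recognizer.
// _startPoint is a CGPoint instance variable on the view.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    _startPoint = [touch locationInView:self];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGFloat deltaX = [touch locationInView:self].x - _startPoint.x;
    if (fabs(deltaX) > 50.0) { // arbitrary swipe threshold
        [self handleSwipe];    // hypothetical action method
    }
}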

Code examples from Apple

How do views use gesture recognizers?

Gesture recognizers (GRs) are awesome because they take touch event handling to a higher level. Instead of tracking individual touches, a gesture recognizer simply tells you when you should act on a user’s touches.

Also, if you need to recognize a new type of gesture, you don’t need to make a new subclass of UIView. You can just make a new subclass of GR and add it to a vanilla UIView.

After the gesture is recognized, the GR calls an action on a target and that’s where your app reacts to the touch.
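For example, a plain UIView can recognize taps with just a target-action pair (handleTap: is a method you define yourself):

// In a view controller.
- (void)viewDidLoad
{
    [super viewDidLoad];
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleTap:)];
    [self.view addGestureRecognizer:tap];
}

// Called when the tap gesture is recognized.
- (void)handleTap:(UITapGestureRecognizer *)recognizer
{
    CGPoint location = [recognizer locationInView:recognizer.view];
    NSLog(@"tapped at %@", NSStringFromCGPoint(location));
}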

Gesture recognizers get first dibs. Touch events go to them before being handled by the view. That means that when a gesture recognizer recognizes a touch, it can prevent the delivery of touchesEnded:withEvent: to the view, so the view gets touchesCancelled:withEvent: instead.

By default, while a swipe gesture hasn’t been recognized yet, the view still receives touchesBegan: and touchesMoved:. If you don’t want those events delivered, set delaysTouchesBegan to YES on the gesture recognizer.

There are awesome diagrams in the Apple docs for gesture recognizers (especially, Fig. 1-3, 1-5, 1-6).

How do gesture recognizers interact with other gesture recognizers?

By default, when a view has more than one GR, the order in which the GRs receive the touches can differ each time.

You can define the order of GRs with requireGestureRecognizerToFail:.

You can prevent touches from reaching a GR by setting its delegate and implementing gestureRecognizer:shouldReceiveTouch:.
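A sketch of both techniques, assuming a view with a single-tap and a double-tap recognizer (the recognizer names are my own):

// Make the single-tap recognizer wait until double-tap has failed.
[singleTap requireGestureRecognizerToFail:doubleTap];

// In the delegate (an object adopting UIGestureRecognizerDelegate):
// keep touches that start in a button from ever reaching the recognizer.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
       shouldReceiveTouch:(UITouch *)touch
{
    return ![touch.view isKindOfClass:[UIButton class]];
}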

More interactions between GRs

How did this help solve my problem?

The original problem was that a button in a scroll view had a delay in responding to touches. The reason is that scroll views have a property called delaysContentTouches, which is YES by default.

Turn delaysContentTouches to NO and you can tap buttons without delay, but now you can’t scroll. That’s because “default control actions prevent overlapping gesture recognizer behavior”. The default control action here is tapping the button. The gesture recognizer being prevented was the scroll view’s panGestureRecognizer.

In other words, when a touch starts in a button inside a scroll view, the button takes precedence: every touch starting in the button is interpreted as a tap, so you can’t scroll.

In addition, I had to subclass UIScrollView and override touchesShouldCancelInContentView: to return YES, so that touches that turn into drags are cancelled on the button and the scroll view is able to pan.
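Putting the two pieces together, the subclass is small (the class name TBScrollView is my own):

// UIScrollView subclass: buttons highlight immediately, but drags
// that start on a button still scroll.
@interface TBScrollView : UIScrollView
@end

@implementation TBScrollView

- (instancetype)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        self.delaysContentTouches = NO; // no highlight delay on buttons
    }
    return self;
}

// Let the scroll view cancel the button's touches once a drag starts.
- (BOOL)touchesShouldCancelInContentView:(UIView *)view
{
    if ([view isKindOfClass:[UIButton class]]) {
        return YES;
    }
    return [super touchesShouldCancelInContentView:view];
}

@end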



  1. User touches screen (ex. drags scrollview).
  2. Screen finds the view the user touched (i.e. scrollview).
  3. The view uses any gesture recognizers to process the touch, otherwise handles the touch itself (i.e. scrollview’s panGestureRecognizer).
  4. Gesture recognizers or the view trigger actions that change what appears on screen (i.e. scrollview moves) or changes the model.

Playing with iOS MapKit Part 3: Making it Pretty

Playing With MapKit

Part 1 MapKit Tutorials 

Part 2 Reverse Geocoding and Custom Annotation Callouts

Part 3 Making it Pretty

Part 4 Race Conditions, Other Bugs, and Wrap-up

After getting down the basic functionality, I wanted to polish up the app to give it a quality feel and to dig deeper into the nitty-gritty details of MapKit.

So I made a list of things that could be improved and proceeded to knock each of them out one by one.

To do list

*I had the most fun with this one.

  1. Remove the jumpy effect on loading
  2. Move bar buttons from toolbar to buttons on the map
  3. Make the callout for the first search result come up
  4. Make the zip code and city on the same line
  5. Make the map center and zoom when the current location is tapped
  6. After clicking the textbox, resign the first responder when the map is clicked
  7. Add Directions
  8. *Zoom to include the destination in the RouteViewController
  9. Intelligently choose walking vs driving directions
  10. Make the starting region of the route, the same as that of the previous view

1. Remove the jumpy effect on loading

When the app loaded, the default map showed the United States; when the user location was updated, the map would re-center on the current location, producing a jumpy effect (I didn’t know about setCenterCoordinate:animated: at the time). The jump came a couple of seconds after opening the app, depending on the internet connection, and led to a laggy feeling.

Design Choice

Initially, I chose not to center the map on the current location and instead let the user tap the current-location button to zoom in, simply because that was the easiest fix.

After some iterations, I decided it was better just to zoom in automatically.

Back to list

2. Move bar buttons from toolbar to buttons on the map

Many of the map apps that I’ve seen have floating buttons on the map instead of a toolbar. I planned to do this, but it didn’t seem worth the effort.

Design Choice

After looking at the default maps app and seeing that it used a toolbar too, I decided to save time and stick with the toolbar. I just separated the buttons to make it more symmetrical.

Back to list

3. Make the callout for the first search result come up

Before, I had the callout for the last search result come up. This led to the map whipping over to the last location whenever it was outside the visible region.

Design Choice

Make the first search result come up.

I used enumerateObjectsUsingBlock: to save the first item. In doing this, I found out that I needed to declare my firstAnnotation variable with __block, so that the change made inside the block would be visible outside it. (More on Stack Overflow.)
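A sketch of that enumeration (the annotations array name is assumed):

// __block lets the block write back to firstAnnotation.
__block MKPointAnnotation *firstAnnotation = nil;
[self.annotations enumerateObjectsUsingBlock:^(MKPointAnnotation *annotation,
                                               NSUInteger idx, BOOL *stop) {
    if (idx == 0) {
        firstAnnotation = annotation;
        *stop = YES; // only need the first one
    }
}];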

How do I select a MKPointAnnotation?

[self.mapView selectAnnotation:firstAnnotation animated:YES];

Back to list

4. Make the zip code and city on the same line

Custom Callout View

The view for the address callout was too small to show the whole line for city, state, and zip code.

Design Choice

The easiest way to fix this was to make the view wider and shorten the zip to five numbers.

Afterwards, I ran into problems with special places that had an extra line for the place name, so I had to add extra code to parse the formatted address lines in cases where there were two or three lines.

In the final version, the callout view is a bit too big, so future work could size it dynamically, maybe even with an animation.

Back to list

5. Make the map center and zoom when the current location is tapped

Easy: call the same zoomIn method that the zoom-in button calls.

Back to list

6. After clicking the textbox, resign the first responder when the map is clicked

The touchesBegan:withEvent: method fires when I touch outside the textfield onto the map, so this was easy to fix.

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.searchText resignFirstResponder];
}

Back to list

7. Add Directions

With the route shown, it was natural to want to show the directions and the distances involved.

Design Choice

The easiest thing was to show the directions in a table view controller and segue to it from a UIBarButton on the navigation bar of the directions page. This is pretty standard, so I just passed the instructions to a table view controller and displayed them in the cells.
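A sketch of the cell setup, assuming the controller keeps the MKRoute’s steps in a self.steps array and the cells use a style that shows a detail label:

// Each MKRouteStep carries the instruction text and distance in meters.
- (UITableViewCell *)tableView:(UITableView *)tableView
         cellForRowAtIndexPath:(NSIndexPath *)indexPath
{
    UITableViewCell *cell =
        [tableView dequeueReusableCellWithIdentifier:@"StepCell"
                                        forIndexPath:indexPath];
    MKRouteStep *step = self.steps[indexPath.row];
    cell.textLabel.text = step.instructions;
    cell.detailTextLabel.text =
        [NSString stringWithFormat:@"%.0f m", step.distance];
    return cell;
}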

The distances are currently in meters. That’s not very intuitive, so it would be nice to show them in blocks.

Back to list

*8. Zoom to include the destination in the RouteViewController

The idea is that when I selected a venue outside the visible range, I wanted the map to zoom to fit both the destination and my current location. While tedious, executing this was not hard once I sat down to work backwards from

[self.routeMap setRegion:self.adjustedRegion animated:YES];

How do I zoom in to fit two locations on a map?


  • Get the coordinates for your location and the destination location.
  • Find the difference in latitude and longitude and scale this up by a multiplier. A multiplier of 1 will have both locations on the edges of the screen.
  • Make sure to get the absolute difference. Negative numbers are no good.
  • Make a MKCoordinateSpan with the ABSOLUTE differences in latitude and longitude.
  • Find the average latitude and longitude to get the center for the region.
  • Make the MKCoordinateRegion with the centerCoordinate and the span.
  • Find the adjustedRegion by calling regionThatFits on the region in the last step.
  • Set the MKMapView to that adjustedRegion.
- (void)resizeRegionWithDestination:(MKMapItem *)destination 
                       userLocation:(MKUserLocation *)userLocation
{
    NSLog(@"need to resize");
    CLLocationCoordinate2D destinationCoordinate = 
        destination.placemark.coordinate;
    CLLocationCoordinate2D userCoordinate = userLocation.coordinate;
    double scaleFactor = 2;
    // Difference in latitude/longitude, scaled up so the two points
    // aren't pinned to the edges of the screen.
    CLLocationDegrees latitudeDelta = 
        (userCoordinate.latitude - 
         destinationCoordinate.latitude) * scaleFactor;
    CLLocationDegrees longitudeDelta = 
        (userCoordinate.longitude - 
         destinationCoordinate.longitude) * scaleFactor;
    // Spans must be positive, so take the absolute differences.
    if (latitudeDelta < 0) {
        latitudeDelta = latitudeDelta * -1;
    }
    if (longitudeDelta < 0) {
        longitudeDelta = longitudeDelta * -1;
    }
    MKCoordinateSpan span = 
        MKCoordinateSpanMake(latitudeDelta, longitudeDelta);
    // The midpoint between the two locations is the region's center.
    CLLocationDegrees averageLatitude = 
        (userCoordinate.latitude + 
         destinationCoordinate.latitude) / 2;
    CLLocationDegrees averageLongitude = 
        (userCoordinate.longitude + 
         destinationCoordinate.longitude) / 2;
    CLLocationCoordinate2D centerCoordinate = 
        CLLocationCoordinate2DMake(averageLatitude, averageLongitude);

    MKCoordinateRegion region = 
        MKCoordinateRegionMake(centerCoordinate, span);

    self.adjustedRegion = [self.routeMap regionThatFits:region];
    [self.routeMap setRegion:self.adjustedRegion animated:YES];
}

The cool part about this code is that it doesn’t care whether it’s zooming in or out. It just zooms to the right region. I had to play with the scaleFactor a couple of times to get it to look right; 2 seems to work pretty well.

Back to list

9. Intelligently choose walking vs driving directions

To do this, I needed to get the estimated time for walking directions.

A note on directions: if the MKDirectionsRequest has its transportType set to walking and the destination is not walkable, the service will return no results. That’s why the transport type defaults to any.

MKDirectionsRequest *request = [[MKDirectionsRequest alloc] init];
request.source = [MKMapItem mapItemForCurrentLocation];
request.destination = self.destination;
request.requestsAlternateRoutes = NO;
request.transportType = MKDirectionsTransportTypeWalking;
MKDirections *estimatedTimeOfArrival = [[MKDirections alloc] initWithRequest:request];
[estimatedTimeOfArrival calculateETAWithCompletionHandler:^(MKETAResponse *response, NSError *error) {
    // do something with the response
}];
After I knew that there was no error, I would check to see if the response.expectedTravelTime was greater than 15 minutes (how much I’d be willing to walk). If it was, I would change the MKDirectionsRequest’s transportType to include driving and find the directions with the modified request.
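A sketch of that logic inside the completion handler (calculateDirectionsWithRequest: is a hypothetical helper that runs the actual directions request; the 15-minute cutoff is the one mentioned above):

[estimatedTimeOfArrival calculateETAWithCompletionHandler:^(MKETAResponse *response, NSError *error) {
    if (error) {
        return;
    }
    if (response.expectedTravelTime > 15 * 60) {
        // More than 15 minutes on foot: allow driving instead.
        request.transportType = MKDirectionsTransportTypeAny;
    }
    [self calculateDirectionsWithRequest:request]; // hypothetical helper
}];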

With the asynchronous call within the completion block, this might be a good candidate for wrapping the asynchronous calls using promises.

Back to list

10. Make the starting region of the route, the same as that of the previous view

Pretty simple. Just pass the region from the first MKMapView to the search results view controller, and on to the RouteViewController, in prepareForSegue:sender:.

Back to list

At the end of this process, I had an app that I could actually show people, and that felt really good.

Up next: squashing bugs.