In this blog post I will discuss and demonstrate iOS 7's full screen layout updates, new and updated gesture recognizers, and new gesture recognizer delegate methods. Click below to download the example application, which demonstrates the use of these new and updated features.

https://blogs.captechconsulting.com/sites/default/files/blog_posts/GesturesAndFullscreenDemo_1.zip

New Layout and Status Bar Updates

With iOS 7 Apple makes full use of every pixel of screen real estate available on their devices. No longer is it the norm to use visual frames, insets, and drop shadows to create a 3-D illusion of items on a single screen. Instead the emphasis is on creating a more realistic sense of depth by treating each object as a separate part of a hierarchy that combines to form the entire user experience.

Full Screen and Translucency

A large part of iOS 7's new interface design is the change to a true full screen layout for views, along with the use of translucency. This creates a more realistic illusion that each item on the screen is layered on top of the others. A major change developers will notice is that the top and bottom bars no longer reposition a controller's main view. These views now fill the entire screen and extend under the bars. The "Wants Full Screen" option in the storyboard has been deprecated, and a new "Extend Edges" category has been added with three options: "Under Top Bars", "Under Bottom Bars", and "Under Opaque Bars." Unchecking these options returns the view controller to behavior similar to iOS 6. The same thing can be done programmatically by setting the extended layout edges to none:

[self setEdgesForExtendedLayout:UIRectEdgeNone];

This code is in the example app but commented out to make full use of the extended layout.
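The UIRectEdge values form a bitmask, so individual edges can also be combined rather than opting out entirely. A minimal sketch (not taken from the example app):

```objc
// Extend under the bottom bar only; the view stops at the top bar.
[self setEdgesForExtendedLayout:UIRectEdgeBottom];

// Or combine edges with a bitwise OR to extend under both bars:
[self setEdgesForExtendedLayout:UIRectEdgeTop | UIRectEdgeBottom];
```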

Another notable change in iOS 7 is that the top and bottom bars, including the status bar, now default to a new translucent effect. Views behind these bars remain somewhat visible but heavily blurred and distorted. Views that require user input, such as an action sheet, an alert view, or the new Control Center, also use this translucent effect, again creating the illusion of objects layered on top of each other on the screen. Apps are also encouraged to hide their top and bottom bars when it seems natural for the user to want an expanded view of the content. The updated Safari, Maps, and Photos apps all make use of this feature. Scrolling a page in Safari hides the address bar and the toolbar, devoting more of the screen to the web page; tapping the map or an image produces a similar effect to allow a larger viewable area. All three apps enter and exit full screen mode using simple gestures: scrolling for Safari and a single tap for Maps and Photos. The example app reproduces the Photos effect, where a single tap anywhere on the screen toggles the top/bottom bars and status bar between hidden and visible. See the screenshots below:

Safari with top/bottom bars visible then hidden in full screen mode

Apple Maps with top/bottom bars visible then hidden

Apple Photos with top/bottom bars visible then hidden. Notice in full screen mode the background color turns black to allow better view contrast.
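The single-tap toggle described above can be sketched roughly as follows; the method name toggleBars: and the use of a navigation controller's toolbar are assumptions for illustration, not the exact code from the download:

```objc
// Hypothetical tap handler: toggles the status bar, navigation bar,
// and toolbar together, Photos-style.
- (void)toggleBars:(UITapGestureRecognizer *)recognizer
{
    BOOL hidden = ![[UIApplication sharedApplication] isStatusBarHidden];
    [[UIApplication sharedApplication] setStatusBarHidden:hidden
                                            withAnimation:UIStatusBarAnimationFade];
    [self.navigationController setNavigationBarHidden:hidden animated:YES];
    [self.navigationController setToolbarHidden:hidden animated:YES];
}
```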

New and Updated iOS 7 Gestures

iOS 7 introduces a few new and updated gesture-based shortcuts. Some can be performed system-wide, including on the Springboard, in an active app, and on the lock screen, while others work only in specific areas such as an active app or the Springboard.

Slide one finger down from the top edge of the screen – This gesture existed in iOS 6 and is used to access Notification Center. It works system-wide, including on the lock screen. In iOS 7 the gesture is the same, but Notification Center has been modified and improved to include more options and information. If the active app is in full screen mode this becomes a two-step gesture: the first swipe brings down a tab and the second reveals Notification Center. This gesture cannot be turned off.

Slide one finger up from the bottom edge of the screen – This gesture is new in iOS 7 and is used to access the new Control Center. Control Center acts as a shortcut to many common features that previously had to be accessed through different parts of the Settings app. Commonly used features like Airplane Mode, the LED flashlight, and screen brightness can now be reached from anywhere on the Springboard, in an active app, or on the lock screen. Like the Notification Center gesture, this becomes a two-step gesture if the active app is in full screen mode. In Settings, this gesture can be disabled separately for the lock screen and for use within apps, but it cannot be disabled on the Springboard.

Slide one finger from the left edge of the screen to the right – This new gesture works only within apps and is a shortcut for tapping the back button in a UINavigationController stack or in a browser window.

Slide one finger down from anywhere on the Springboard – This is a new way to access the iOS search function. You no longer have to be on the first page and swipe left to right an extra time to reach the search screen; search can now be accessed from any page of the Springboard. This gesture works only on the Springboard, not in an active app.

Slide one finger up while in the app switcher to remove an active app – The app switcher is now a full screen feature that shows both the open apps and the most recent content for each. With iOS 7 it is easier and more efficient to design apps that update their content while in the background, and in the app switcher the user can see an app's updated content without bringing it to the foreground. For example, if a new text message arrives, the user can read it from the app switcher without actually switching to Messages and leaving the current app. Any open app can be removed by touching it and sliding upward. This gesture works on multiple apps simultaneously: place one finger on each app to be removed, and a single upward gesture removes every touched app at once.

Apple has optimized background multitasking to be more efficient, saving battery power and allowing apps to do more while in the background. It is important to note that removing an application from the app switcher also prevents it from performing any background processing until it is launched again. For more information, please read our previous post about the new multitasking features in iOS 7.

New Gesture Recognizers

UIScreenEdgePanGestureRecognizer

This is a new gesture recognizer class that looks specifically for panning gestures that start from a defined screen edge. It inherits from UIPanGestureRecognizer and adds only one new property, edges, which determines which edge of the screen will recognize the pan. The Apple documentation states that the edges property can hold multiple edges, but as of this posting it works with only a single edge; specifying more than one prevents the gesture from being recognized and firing at all. This should be fixed in a later version of iOS 7. Also, as of this writing the Xcode 5 Interface Builder does not list this gesture with the others, so these recognizers can only be created programmatically. The example app has a screen edge gesture for every edge. Each is initialized and added to the primary view of the view controller:

swipeRight = [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipeRight:)];
[swipeRight setEdges:UIRectEdgeLeft];
[swipeRight setDelegate:self];
[self.view addGestureRecognizer:swipeRight];
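The target method receives the recognizer itself, so its state can be checked before acting. A sketch of a handler for the recognizer above (the body here is illustrative, not the example app's exact code):

```objc
// Hypothetical handler: fires once when the left-edge pan completes.
- (void)handleSwipeRight:(UIScreenEdgePanGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateRecognized)
    {
        NSLog(@"Pan from the left edge recognized");
        // Perform the navigation or other action here.
    }
}
```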

These gestures can be helpful for full screen apps, such as photo editors or games, that would otherwise require an extra tap or other gesture to reveal the back button or other navigation controls. Instead, the user can swipe from one edge of the screen and immediately perform the intended action. On the other hand, three of the four screen edges are already reserved by Apple's system-wide gestures. According to Apple's gesture guidelines:

Avoid associating different actions with the standard gestures. Unless your app is a game, redefining the meaning of a standard gesture may confuse people and make your app harder to use.

Avoid creating custom gestures that invoke the same actions as the standard gestures. People are used to the behavior of the standard gestures and they don't appreciate being expected to learn different ways to do the same thing.

Despite this, all four edges can be used if necessary by taking a few steps in your app. If your app is not in full screen mode, meaning the status bar is visible, the top and bottom edges are reserved for the iOS 7 system gestures; for your custom gestures on those edges to be recognized, the app must be in full screen mode. This is not quite as straightforward as you might think. For the setStatusBarHidden: method to work, your app's Info.plist file must be updated to include two new entries:

  • "View controller-based status bar appearance" = NO
  • "Status bar is initially hidden" = YES or NO
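In the raw Info.plist these entries correspond to the keys UIViewControllerBasedStatusBarAppearance and UIStatusBarHidden; a fragment with the status bar initially visible might look like:

```xml
<!-- Info.plist fragment -->
<key>UIViewControllerBasedStatusBarAppearance</key>
<false/>
<key>UIStatusBarHidden</key>
<false/>
```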

In the example app the status bar is toggled between visible and hidden with a single tap:

[[UIApplication sharedApplication] setStatusBarHidden:![[UIApplication sharedApplication] isStatusBarHidden] withAnimation:UIStatusBarAnimationFade];

One point from the Apple documentation is worth repeating here: when dealing with edge gestures, iOS 7 knows the orientation of the device and will always match the gesture edge to the current orientation, as long as that orientation is supported by your app.

interactivePopGestureRecognizer

This is a new gesture recognizer embedded in UINavigationController. It uses the new UIScreenEdgePanGestureRecognizer to provide the swipe-to-go-back functionality. The recognizer can be disabled in a couple of different ways depending on your needs. First, you can disable it for a single view controller by setting that controller as the recognizer's delegate and then adding your custom gesture as a replacement. This disables the recognizer only for the current view controller and does not affect other controllers in the stack. The second way is to disable the recognizer entirely:

[self.navigationController.interactivePopGestureRecognizer setEnabled:NO];

This disables the recognizer for the entire navigation stack, no matter which view controller the code resides in. It can likewise be re-enabled for the entire stack, either in the same view controller where it was disabled or in any other controller along the stack.
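The per-controller approach can be sketched as follows; taking over the delegate in viewDidAppear: is the described technique, while which delegate method to implement is a design choice (here gestureRecognizerShouldBegin: simply refuses the pop gesture for this controller):

```objc
// Hypothetical: claim the pop recognizer's delegate for this controller only.
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    self.navigationController.interactivePopGestureRecognizer.delegate = self;
}

// UIGestureRecognizerDelegate: block the built-in pop gesture here so a
// custom edge gesture can take its place.
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
    return NO;
}
```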

Filtering Gestures

iOS 7 makes a few changes to UIGestureRecognizerDelegate that allow gesture failure requirements to work across the view hierarchy. Previously, if two gesture recognizers existed in a view and one required the other to fail, the requireGestureRecognizerToFail: instance method had to be used at creation time to define their relationship. With iOS 7, UIGestureRecognizerDelegate has been updated to allow failure requirements to be specified at runtime. In the example app there are three UITapGestureRecognizers:

singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(screenTapped)];
[singleTap setNumberOfTapsRequired:1];
[singleTap setDelegate:self];

doubleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(screenDoubleTapped)];
[doubleTap setNumberOfTapsRequired:2];
[doubleTap setDelegate:self];

tripleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(screenTripleTapped)];
[tripleTap setNumberOfTapsRequired:3];
[tripleTap setDelegate:self];

Without further configuration, every double tap will also be interpreted as two single taps, and a triple tap as all three. Before iOS 7 this was solved with the following code:

[singleTap requireGestureRecognizerToFail:doubleTap];
[doubleTap requireGestureRecognizerToFail:tripleTap];

Because this must be done when the gestures are added to a view, there is no way to remove the failure requirement without removing the gestures, reinitializing them, and adding them to the view again. That approach still works in iOS 7, but the relationship can now also be defined at runtime using one of two new delegate methods:

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRequireFailureOfGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldBeRequiredToFailByGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer

These delegate methods let you lazily evaluate gestures as they are interpreted and handle the behavior accordingly. In the example app I included a Boolean flag for the failure requirements, so they can easily be ignored at runtime:

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRequireFailureOfGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    BOOL result = NO;

    if (useGestureRequirements)
    {
        if (gestureRecognizer == singleTap && otherGestureRecognizer == doubleTap)
            result = YES;

        if (gestureRecognizer == doubleTap && otherGestureRecognizer == tripleTap)
            result = YES;
    }

    return result;
}
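The second delegate method expresses the same relationship from the other recognizer's point of view: returning YES says that gestureRecognizer must fail before otherGestureRecognizer can fire. A mirrored sketch using the same recognizers and flag (an illustration, not code from the download) might look like:

```objc
// Hypothetical mirror of the method above: doubleTap must fail before
// singleTap fires, and tripleTap must fail before doubleTap fires.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldBeRequiredToFailByGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    if (!useGestureRequirements)
        return NO;

    if (gestureRecognizer == doubleTap && otherGestureRecognizer == singleTap)
        return YES;

    if (gestureRecognizer == tripleTap && otherGestureRecognizer == doubleTap)
        return YES;

    return NO;
}
```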

Summary

iOS 7 has brought many new and welcome changes to how we interact with Apple devices and the many apps developed for them. In this post we discussed the full screen layout changes and how they affect application design, the new gesture recognizers and how to use them in an application, and the new gesture-based shortcuts built into iOS 7. The example code that you can download here includes basic examples of using edge swipe gestures with and without the existing system gestures, toggling full screen mode and its effects, and making use of the gesture recognizer delegate methods.