iOS App Development

iOS App Development is a two-semester course: Part I, INFO1-CE9236, and Part II, INFO1-CE9704. The term “iPhone” is to be understood as “iPhone, iPad, and iPod Touch”. This document is online at

http://i5.nyu.edu/~mm64/INFO1-CE9236/syllabus.html
http://i5.nyu.edu/~mm64/INFO1-CE9704/syllabus.html

Catalog descriptions

The instructor described the courses as follows, each description limited to 100 words (not counting the prerequisites).

Part I, INFO1-CE9236

Create applications for the Apple iPhone, iPad, and iPod Touch using the iPhone operating system iOS. Part I of this two-semester course begins with the Objective-C language, a combination of C and Smalltalk. We then write apps in the Macintosh Xcode development environment, using Apple’s Cocoa Touch API of classes, protocols, and objects. Build the visual interface of the app with touch-sensitive windows and views, containing buttons, sliders, tables, and slot machines. Distinguish between taps and swipes, responding to them with animations. Explore the roles played by objects as delegates, targets, controllers, and data sources, integrated into the Model-View-Controller design pattern.

Prerequisites: experience with loops, “if” statements, and functions in a language such as C, C++, Objective-C, Java, JavaScript, Perl, PHP, or Adobe Flash ActionScript. Students do not need an iPhone, iPad, or iPod Touch, but must have an Intel Mac.

Part II, INFO1-CE9704

Part II of this two-semester course is centered around graphics. We begin by reviewing the core concepts of Part I: memory management and the various relationships between objects. Then we render 3D objects with OpenGL ES, the industry standard for texture and lighting. Create games with Cocos2d, a framework for animating sprites and directing scene transitions. Write threaded apps that do multiple jobs simultaneously; synchronize audio with graphics. Pull photos and videos from the camera, RSS feeds from the web. Read and write SQLite databases. Build graphical interfaces for the GPS and accelerometer; interface with Google maps.

Prerequisite: INFO1-CE9236. Students do not need an iPhone, iPad, or iPod Touch, but must have an Intel Mac.

Prerequisites

Students should have lots of experience touching and tapping their iPhones, e.g., navigating the tables in the Settings app. Students are not required to know the Objective-C language. NYU offers no courses in Objective-C, and this language is little used outside of the Mac and iPhone worlds. But students should have some experience in a language with loops, “if” statements, functions, and preferably objects and methods. Examples are Java, JavaScript, Perl, PHP, the ActionScript language of Adobe Flash, etc. Knowledge of HTML and JavaScript will permit the student to do more with UIWebViews and Google maps.

The prerequisite for Part II is Part I. But more than half of the students arriving in Part II will have only a shaky knowledge of Part I, so we have to budget time at the beginning for review.

Hardware and software

Apps are written in Xcode, a free application that runs only on Mac OS X. Apps are tested on the iPhone Simulator that comes with Xcode. To do their homework, students must have access to a Mac.

The current version of Xcode and the iPhone SDK run only on the current version of Mac OS X (Snow Leopard). For this reason, Apple keeps older versions of Xcode freely available. NYU does not have the current Xcode or SDK at its computer centers or at the 48 Cooper Square computer labs, exasperating the students who keep their own Macs current.

Students do not need an iPhone for this course. But there is a deep satisfaction in seeing your app running on the iPhone.

Textbook

No textbook—the entire content of the course is online. The documentation for the iPhone’s NS (NextStep) and UI (User Interface) classes is online. The Human Interface Guidelines for the iPhone and iPad are online.

Grading policy

No tests. Grades will be determined by the apps submitted as homework. Each app will be submitted by uploading it to the GitHub website, where the entire class can see it.

Outline of topics (20 lectures of 3 hours each)

For pedagogical purposes, topics are not covered strictly in the order listed below. For example, the sections on controls and audio/video are interleaved to keep the students awake: we want something dramatic to happen when a control is triggered. A button will play video; a switch will start and stop audio; a slider will change the volume. Similarly, Objective-C collections are presented on a need-to-know basis: sets when we detect a touch, arrays when we write data to a file.

Part I, INFO1-CE9236

  1. The Objective-C Language. iPhone apps must be written in Objective-C, a superset of the language C. We cover enough of Objective-C to begin writing apps.
    1. The C subset of Objective-C. Variables, including pointers and enumerations (for identifying buttons in an action sheet). Structures: CGPoint, CGSize; and CGRect, a big structure that contains smaller structures inside of it. Functions, arguments, and return values.
    2. Objective-C features added to C. Classes and objects, messages and methods, arguments passed to a method. Inheritance, for building a series of bigger classes out of a smaller one: NSObject, NSString, NSMutableString. Protocols and categories.
    3. Idiosyncrasies of Objective C. “Selectors” for designating a method of a class. Objects that hold a number or a structure (NSNumber, NSValue), needed because an Objective-C collection can hold only objects, not numbers or structures.
    4. Memory management. An object is safe from the memory manager as long as the object’s retainCount remains positive. The retain and release methods; autorelease and the autorelease pool.
    5. Object creation and destruction. alloc and init at birth; dealloc at death.
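The retain-count rules above can be modeled in plain C. This is a toy sketch with invented names (ToyObject, toy_alloc, etc.), not Apple’s actual implementation: alloc starts the count at 1, retain raises it, and release lowers it, deallocating at zero.

```c
#include <stdlib.h>

/* A toy model of the Cocoa retain/release rules. */
typedef struct {
    int retainCount;
} ToyObject;

ToyObject *toy_alloc(void) {
    ToyObject *obj = malloc(sizeof *obj);
    obj->retainCount = 1;          /* the creator owns the object */
    return obj;
}

ToyObject *toy_retain(ToyObject *obj) {
    obj->retainCount++;            /* another owner claims the object */
    return obj;
}

void toy_release(ToyObject *obj) {
    if (--obj->retainCount == 0)   /* last owner gone: "dealloc" */
        free(obj);
}
```

An object with one retain for each owner, and one release per retain, is safe for exactly as long as someone still owns it.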

  2. Compile an app with Xcode. The app is not written and debugged on an iPhone—you would go blind. It is written in Xcode, a free application that runs only on a Mac. It is also debugged on the Mac, on an iPhone Simulator that comes with Xcode.
    1. App startup and termination.
      1. The path into the app. main calls UIApplicationMain, which creates two objects: the application and the application delegate. The application delegate creates the window and makes it visible.
      2. The path out of the app: the applicationWillTerminate: method of the application delegate; the dealloc method of each object.
    2. Write an empty app that does nothing. Write a “Hello, World!” app that displays a message. Discard the app’s nib file.
    3. Modify the “Hello, World!” app. Change the font, color, size, position. Display real information instead of a canned greeting. Print debugging output in the debugger (gdb) window.
    4. Take a snapshot of the app. Give the app an icon and a launch image.
    5. Internationalize the app. Localize the strings in the app and in the Info.plist file. Create separate .lproj directories for each language.
    6. Walk through the app with the GNU gdb debugger. Set breakpoints, print variables.
    7. Archive and compress the files of the app with zip and unzip.
    8. Download the app to an iPhone. Certification signing requests; public and private keys in the keychain; certificates, App ID, and Provisioning Profiles. Register the device’s 40-character hex code.

  3. Still-life graphics. Draw pictures but do not yet have them move or respond to a touch. This section picks up where we left off in the modifications to the “Hello, World!” app in §2.3 above.
    1. Draw graphics in the drawRect: method of a UIView using a CGContextRef. (UI stands for User Interface; CG for Core Graphics.)
    2. Lines, rectangles, ellipses, jpeg images, etc. Shapes can be outlined or filled in.
    3. The CTM (Current Transformation Matrix) lets us apply a “linear transformation” to a picture. Three transformations are possible, in various combinations: translate (move), scale (stretch), rotate. Degrees vs. radians. Sine vs. cosine.

  4. Touches and animations. An app can detect a touch on the screen and react by changing the picture.
    1. Multi-touch. There may be more than one touch in progress at any given moment. The touches are delivered to the app in a set of touches.
    2. Detect when a touch starts, moves, or ends. The touchesBegan:withEvent: method of class UIResponder.
    3. Animate the response to a touch. An “animation” in this sense is any gradual (not instantaneous) change in the position, size, shape, color, or opacity of an object. Set the animation’s repetition count, speed, and the abruptness of the start and end. Animate a movie title: Gone with the Wind scrolls in from the right; the Star Wars prologue scrolls in from the bottom. The beginAnimations and endAnimations class methods of UIView.
    4. Animate the CGAffineTransform, a linear transformation analogous to the CTM in §3.3. Concatenate two or more transformations.
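Concatenation is matrix multiplication, so the order matters: scale-then-translate is not translate-then-scale. A sketch in plain C using the same six-number layout as CGAffineTransform (the function names here are invented):

```c
/* A 2-D affine transform, laid out like CGAffineTransform:
   x' = a*x + c*y + tx,  y' = b*x + d*y + ty. */
typedef struct { double a, b, c, d, tx, ty; } Affine;

/* Concatenate: apply t1 first, then t2 (matrix product t1 * t2). */
Affine concat(Affine t1, Affine t2) {
    Affine r;
    r.a  = t1.a  * t2.a + t1.b  * t2.c;
    r.b  = t1.a  * t2.b + t1.b  * t2.d;
    r.c  = t1.c  * t2.a + t1.d  * t2.c;
    r.d  = t1.c  * t2.b + t1.d  * t2.d;
    r.tx = t1.tx * t2.a + t1.ty * t2.c + t2.tx;
    r.ty = t1.tx * t2.b + t1.ty * t2.d + t2.ty;
    return r;
}

/* Apply a transform to a point. */
void apply(Affine t, double x, double y, double *ox, double *oy) {
    *ox = t.a * x + t.c * y + t.tx;
    *oy = t.b * x + t.d * y + t.ty;
}
```

Scaling by 2 and then translating by (3, 0) carries the point (1, 1) to (5, 2); translating first would give (8, 2).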

  5. Collections. An Objective-C collection is a big object that holds (pointers to) smaller objects. They come in two flavors: mutable and immutable.
    1. Set: the touches are delivered to an app in a set.
    2. Array: the points in a drawing can be saved in an array. Build a two-dimensional array or a tree out of an array of arrays.
    3. Dictionary: a map or associative array of keys and values. Dictionaries hold the properties in a .plist file, the attributes of a file on the disk, and the settings of an audio recorder.
    4. An application: an etch-a-sketch that saves the picture in a file when the application terminates, and recreates the picture when the application is relaunched.
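The “array of arrays” idea in §5.2 is just a two-dimensional structure built from one-dimensional pieces. In plain C the same shape is an array of pointers to arrays — a sketch (NSArray does the bookkeeping for us in Objective-C):

```c
#include <stdlib.h>

/* Build a rows-by-cols grid as an array of arrays: the outer array
   holds one pointer per row, each pointing at an inner array of ints. */
int **make_grid(int rows, int cols) {
    int **grid = malloc(rows * sizeof *grid);   /* the outer array */
    for (int r = 0; r < rows; r++)
        grid[r] = calloc(cols, sizeof **grid);  /* one inner array per row */
    return grid;
}

void free_grid(int **grid, int rows) {
    for (int r = 0; r < rows; r++)
        free(grid[r]);                          /* inner arrays first */
    free(grid);
}
```

grid[2][3] really performs two lookups: first into the outer array, then into the inner one — exactly how a nested NSMutableArray behaves.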

  6. Controls and the target/action pattern. A control is an input device such as a button or slider. When stimulated, a control sends a message to a “target” object.
    1. Setting up the target/action mechanism: the addTarget:action:forControlEvents: method of class UIControl, and its selector and bitmask arguments. There can be more than one message and more than one target.
    2. Various types of control: button, switch, slider, page control (a row of dots for turning the page), segmented control (a row of buttons), date picker (looks like a Las Vegas slot machine), text field.
    3. Resize and tilt the controls with a linear transformation.
    4. A timer can be treated as an invisible button that presses itself after an interval of time.
    5. Targets vs. delegates. A control can have many target objects, but only one delegate object. Any object can be a target, but only an object that adopts a protocol (§1.2) can be a delegate.
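The target/action mechanism can be modeled in plain C with a function pointer. This is a sketch with invented names (Control, control_trigger), not the UIControl implementation: the control remembers who to notify (the target) and what message to send (the action).

```c
#include <stddef.h>

typedef struct Control Control;

/* An "action" is the message the control sends when triggered. */
typedef void (*Action)(void *target, Control *sender);

struct Control {
    void  *target;   /* who receives the message */
    Action action;   /* which message to send */
};

/* e.g., the user taps the button. */
void control_trigger(Control *c) {
    if (c->action != NULL)
        c->action(c->target, c);
}

/* A sample action: the target is an int counter to increment. */
void increment(void *target, Control *sender) {
    (void)sender;
    (*(int *)target)++;
}
```

addTarget:action:forControlEvents: fills in exactly these two slots — plus a bitmask saying which events (touch up, value changed, ...) should fire the action.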

  7. Audio/visual media.
    1. Vibration (iPhone only).
    2. Three ways to play audio. AudioServicesPlaySystemSound is for a short file (warning beep, sound effect). AVAudioPlayer and its delegate are for recording and playing a longer file (music). AudioQueue is for sound that must be synchronized with graphics. (Might have to move AudioQueue to Part II of this course.)
    3. Record audio with AVAudioSession, AVAudioRecorder, and their delegates.
    4. Video. Create a MPMoviePlayerController. Send a notification to the notification center when the movie is finished.

  8. View controllers. A view is a visible object: a page, photograph, or playing board. A view controller handles the view’s interactions with the rest of the program, allowing the view to concentrate narrowly on issues of background color, foreground color, font, etc. In an ideal world, every view would have a view controller hovering over it like a guardian angel.
    1. A view controller that controls a single view. The controller creates the view on demand, and asks it to redraw itself when the device’s orientation changes (portrait vs. landscape).
    2. Hierarchies of view controllers. A tab bar controller controls a set of other view controllers, each controlling a view of its own. It lets the user visit the views in any order. A navigation controller is similar, but lets the user visit the views only in a predetermined order. A modal view controller displays a temporary view, with the intent of returning the user to his or her previous view.
    3. The model-view-controller paradigm. The view and controller are now two separate objects. The app’s data structure will reside in a third object called the model. The view and the model do not talk directly to each other; their intermediary is the controller.
    4. Nib files and why we don’t use them. An “interface builder” file can create controllers and views and connect them together. At this early stage, however, it’s simpler to do the job in the app.

  9. View classes. They come in all shapes and sizes.
    1. Still-life views: UILabel for text, UIImageView for a picture, UIWebView for a page of HTML. They can change, but only under stimulation from the rest of the app, not by themselves.
    2. Changing views: UIProgressView when you can estimate what percentage of the job has been done, UIActivityIndicatorView when you can’t.
    3. Views that perform input: UIAlertView demands a button press; UIActionSheet demands a decision. UIPickerView looks like another slot machine.
    4. Large views that perform input and output: UITextView for text, UITableView for a list of items.

  10. Table view. The table view in §9.4 is the most complicated view.
    1. A table view and its data source. The data source is a separate object, often thought of as the model in the model-view-controller paradigm.
    2. A table view and its delegate. The items in a table view can be deleted, edited, and reordered. New ones can be inserted.
    3. Navigation controllers and table views. A tree-like data structure requires a hierarchy of two different kinds of view controllers: a navigation controller above, and a series of table view controllers below. The tree that backs up this impressive structure will be built recursively out of an array of arrays.
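The tree behind the navigation/table hierarchy is recursive: each node is either a leaf (one table row) or an array of children (a subtable). A sketch of the structure and a recursive walk over it, in plain C with invented names:

```c
#include <stddef.h>

/* Each node is a leaf (children == NULL) or a subtable of children. */
typedef struct Node {
    const char   *title;     /* the row's text */
    struct Node **children;  /* NULL for a leaf */
    int           count;     /* number of children */
} Node;

/* Count the rows in the whole tree by recursing into subtables --
   the same traversal a navigation controller performs one level
   at a time as the user drills down. */
int count_leaves(const Node *n) {
    if (n->children == NULL)
        return 1;                         /* a plain row */
    int total = 0;
    for (int i = 0; i < n->count; i++)
        total += count_leaves(n->children[i]);
    return total;
}
```

In the app, tapping a row whose node has children pushes a new table view controller for that subtable onto the navigation controller’s stack.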

Part II, INFO1-CE9704

  1. Files and directories. Media files are stored in the app’s “bundle” in the iPhone’s flash memory. When the app is terminated and launched again, the files from its previous launch will still be there.
    1. The app’s bundle; the home directory and temporary directory.
    2. Find and open a file with a given name and extension. Print the file’s dictionary of attributes.
    3. Read and write the file’s contents via an NSData object.
    4. Loop through the files and subdirectories accessible to the app with an NSFileManager.
    5. Save the state of an app that is being terminated. Restore the state when the app is launched again.
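Saving and restoring state reduces to writing a structure to a file at termination and reading it back at launch. A minimal sketch in plain C (the AppState fields and the file path are invented for illustration; the real app would write into its home directory):

```c
#include <stdio.h>

/* The state we want to survive a relaunch. */
typedef struct { int level; double score; } AppState;

/* Write the state to disk; returns 1 on success, 0 on failure. */
int save_state(const char *path, const AppState *s) {
    FILE *f = fopen(path, "wb");
    if (!f) return 0;
    size_t n = fwrite(s, sizeof *s, 1, f);
    fclose(f);
    return n == 1;
}

/* Read the state back; returns 1 on success, 0 on failure. */
int load_state(const char *path, AppState *s) {
    FILE *f = fopen(path, "rb");
    if (!f) return 0;
    size_t n = fread(s, sizeof *s, 1, f);
    fclose(f);
    return n == 1;
}
```

In Objective-C the same job is usually done with an NSData object or a property list, but the idea — serialize at exit, deserialize at launch — is identical.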

  2. XML and RSS. XML is a stricter, more general relative of HTML. RSS is Really Simple Syndication, a format for XML web feeds that are frequently updated.
    1. DOM: the Document Object Model, familiar to JavaScript programmers.
    2. Download an XML file. Example: the seven-day forecast from www.noaa.gov.
    3. Two ways to parse XML: libxml2, a parser where you loop from node to node; and SAX (“Simple API for XML”), a parser with callback functions.
    4. An app that reads an RSS feed.

  3. Databases. The database engine that runs on the iPhone is called “SQLite”; it implements most of the SQL language.
    1. Play with a database file interactively on a Mac before we put one on an iPhone. A database contains tables; a table contains records; a record contains fields. Primary keys and foreign keys.
    2. The SQLite language: create, dump, and drop a table; insert, select, update, and delete a record. Joins and subqueries.
    3. Create an app that reads and writes records, and displays them in a UIWebView, UITextView, or UITableView.
    4. SQLite statements compiled with sqlite3_prepare_v2 let us receive numbers from the database as numbers, not as strings.
    5. Add a geographical distance function to the SQLite language with sqlite3_create_function.
    6. Download a database from the web and install it on the iPhone. Find all zipcodes within 5 miles of a given latitude and longitude. Check out the New York City Data Mine.
    7. Access an online database with an NSURLRequest. The GET and POST methods of HTTP. Text formats: JSON and comma-separated values.

  4. Location awareness. An iPhone has a GPS. An app can send its latitude and longitude to Google and get a map and street address. The app can talk directly to Google via JavaScript, or indirectly via the objects in MapKit.
    1. Get the latitude/longitude with a CLLocationManager object and its delegate. Periodic updates, failure detection.
    2. Embed a JavaScript function in an HTML file. The <SCRIPT> and <DIV> tags.
    3. An app that calls the JavaScript function. The stringByEvaluatingJavaScriptFromString: method of class UIWebView. Debug the function with JavaScript alert.
    4. Send the latitude/longitude to maps.google.com and get a map via classes Map and LatLng in Version 3 of the Google Maps JavaScript API. Render the map with controls for Map/Satellite/Hybrid and zoom level, colored markers, info windows (cartoon balloons), etc.
    5. Display the traffic layer on top of a roadmap. Google Sky, Moon, and Mars.
    6. Convert between latitude/longitude and street address using the Google server.

  5. The accelerometer. The iPhone can detect acceleration and the force of gravity.
    1. The X, Y, and Z axes form a “right-handed” coördinate system. Yaw, pitch, and roll. G-forces: punch vs. hug.
    2. The UIAccelerometer object and its delegate. Periodic updates.
    3. Display the acceleration/gravity graphically. A plumb bob.
    4. Distinguish between acceleration and gravity with high-pass and low-pass filters.
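The high-pass/low-pass separation in §5.4 is a one-line filter. A sketch in plain C: the low-pass output tracks the slowly-changing gravity component, and subtracting it from the raw sample leaves the high-pass (sudden acceleration) component. The smoothing constant kFilterFactor is a typical value, chosen here for illustration.

```c
#define kFilterFactor 0.1   /* smaller = smoother, slower to respond */

/* One step of an exponential low-pass filter: blend the new sample
   into the running average. Gravity survives; jolts are smoothed away. */
double lowpass(double prev, double sample) {
    return sample * kFilterFactor + prev * (1.0 - kFilterFactor);
}

/* The high-pass component is whatever the low-pass filter removed:
   sudden acceleration with gravity subtracted out. */
double highpass(double prev_lowpass, double sample) {
    return sample - lowpass(prev_lowpass, sample);
}
```

Feed the filter a steady 1 g and its output converges to 1.0, while the high-pass output decays to zero — exactly the behavior we want for telling a tilt from a shake.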

  6. Multithreading. An app can do two or more things at the same time.
    1. An animation considered as a separate thread.
    2. A timer (class NSTimer) considered as a separate thread.
    3. Official multithreading with NSOperation and its subclasses NSInvocationOperation and NSBlockOperation. Set up an operation queue.
    4. @synchronized properties of objects.
    5. For the time being, the apps run one at a time. But an app can voluntarily turn itself off and nominate a successor; our example will be an app that launches Safari. Make an app launchable by other apps on the same iPhone.
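The synchronization in §6.4 can be seen in miniature with POSIX threads, which is what Cocoa threading is built on. A sketch (not NSOperation itself): two threads increment a shared counter, and a mutex plays the role of an @synchronized block.

```c
#include <pthread.h>
#include <stddef.h>

static long counter = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

/* Each thread adds 100000 to the shared counter. Without the mutex,
   the two threads' read-modify-write cycles would interleave and
   increments would be lost. */
static void *worker(void *arg) {
    (void)arg;
    for (int i = 0; i < 100000; i++) {
        pthread_mutex_lock(&lock);     /* enter the critical section */
        counter++;
        pthread_mutex_unlock(&lock);   /* leave it */
    }
    return NULL;
}

long run_two_threads(void) {
    pthread_t a, b;
    counter = 0;
    pthread_create(&a, NULL, worker, NULL);
    pthread_create(&b, NULL, worker, NULL);
    pthread_join(a, NULL);             /* wait for both to finish */
    pthread_join(b, NULL);
    return counter;
}
```

With the mutex the result is always exactly 200000; comment out the lock/unlock pair and the count comes up short, unpredictably.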

  7. The camera and photo albums. An app can get images and movies via a controller.
    1. The UIImagePickerController and its delegate. The UIImagePickerController doubles as a modally presented navigation controller (Part I, §8.2), so it also requires a navigation controller delegate.
    2. Sources of input: the camera, photo albums, or saved photos. Pick a movie or still image.
    3. Customize the camera controls.
    4. Edit video with UIVideoEditorController.

  8. Cocos2d graphics. A framework for building games.
    1. Sprites and animation.
    2. A parent sprite containing child sprites inside of it.
    3. Tiled maps containing rectangles or hexagons.
    4. Switch between scenes using a director.
    5. Give a sprite mass, weight, and momentum.

  9. OpenGL ES graphics: geometry, perspective, and motion. GL stands for Graphics Library; ES for Embedded Systems. The graphics in §§3 and 4 of Part I, and the Cocos2d graphics in §8 above, were two-dimensional. In this section they are three-dimensional. iPhones have two versions of OpenGL ES: 1.1, with a fixed-function transformation and fragment pipeline, and 2.0, with shaders and program objects.
    1. The world of OpenGL ES is built out of triangles. Build a square out of two triangles. Build a cube out of six squares. Build a building out of many cubes. Triangles (GL_TRIANGLES) vs. triangle strips (GL_TRIANGLE_STRIP) and triangle fans (GL_TRIANGLE_FAN). Pass arrays of vertices and subscripts to the OpenGL functions.
    2. Viewports and the field of view. Projections: orthogonal vs. perspective. The viewing frustum.
    3. The projection and modelview matrices. Transformations implemented as matrices: translate (move), scale (magnify), rotate, project (flatten 3D to 2D), shear, etc. Gimbal lock and quaternions.
    4. Animation by interpolation between key frames. Linear vs. spherical interpolation.
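Linear interpolation between two key frames is the simplest of the schemes in §9.4: slide each coordinate from its value in the first frame to its value in the second as t runs from 0 to 1. A sketch in plain C (the Vertex type is invented for illustration):

```c
/* A vertex in the 3-D world of OpenGL ES. */
typedef struct { double x, y, z; } Vertex;

/* Linear interpolation: t = 0 gives frame a, t = 1 gives frame b,
   t = 0.5 gives the midpoint. Applied per-vertex, per-frame. */
Vertex lerp(Vertex a, Vertex b, double t) {
    Vertex v;
    v.x = a.x + (b.x - a.x) * t;
    v.y = a.y + (b.y - a.y) * t;
    v.z = a.z + (b.z - a.z) * t;
    return v;
}
```

Linear interpolation is fine for positions; for rotations it cuts the corner, which is why orientations are interpolated spherically (with quaternions) instead.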

  10. OpenGL ES graphics: lighting, color, and texture.
    1. Enable color and shade models.
    2. The three components of a light source: specular, diffuse, ambient. Specify the light’s color, intensity, and position.
    3. Spotlights: direction and angular cutoff.
    4. Materials: front or back; specular, diffuse, and ambient. GL_SHININESS for shiny objects. GL_EMISSION for objects that glow in the dark.
    5. Texture mapping: paint a picture onto an object. Load a texture from a UIImage object or a PVRTC file. Pad the image to satisfy the aspect ratio requirements. Map the ST axes of the texture onto the UV axes of the object’s surface. Cover a long surface by tiling a texture.

  11. Objective-C and C++. Have Objective-C objects and C++ objects coexist in the same app, and even in the same source file.
    1. Call the member functions of a C++ object inside of an Objective-C method.
    2. Call the methods of an Objective-C object inside of C++ code using the Objective-C runtime library.