Android App Development
http://oit2.scps.nyu.edu/~meretzkm/INFO1-CE9416/
mark.meretzky@gmail.com

  1. iOS: my parallel course on Apple iOS in the language Swift.

Description for the 14-week course

An Android app is written on your Mac or PC, not on your phone or tablet. This 14-week course therefore begins with the installation on your laptop of Android Studio, the Integrated Development Environment (IDE) for Android. We then cover the basics of the two languages in which an Android app is written, Java and the Extensible Markup Language (XML). We outline the lifecycle of an app: the Java objects that are created and destroyed when an app is launched, when it is covered by another app and uncovered again, and when its orientation changes from portrait to landscape.

We show how an app can draw text and two-dimensional graphics, and position them on the screen with XML “layouts”. Our first examples do not move, but we soon proceed to animation. The more complicated animations require “multi-threading”, allowing the app to do two things at the same time. We also make the screen touch-sensitive by adding buttons, sliders, and other input controls, and by recognizing standard gestures such as swipes, taps, and pinches.
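The multi-threading mentioned above rests on plain Java threads. The following stand-alone sketch shows the core idea only: a background thread does slow work while the first thread keeps running. (On Android, the result would be handed back to the UI thread, e.g. with a Handler; this example simply joins the worker. The class and variable names are invented for illustration.)

```java
// A background thread does slow work while the main thread stays free.
public class BackgroundWork {
    public static void main(String[] args) throws InterruptedException {
        final StringBuilder result = new StringBuilder();

        Thread worker = new Thread(new Runnable() {
            @Override
            public void run() {
                // Pretend this is a slow task, such as a network download.
                result.append("done");
            }
        });

        worker.start();   // the main thread is not blocked by this call
        System.out.println("main thread keeps running");
        worker.join();    // wait here for the background thread to finish
        System.out.println("worker says: " + result);
    }
}
```

On Android, touching a view from the worker thread would throw an exception, which is why the hand-off back to the foreground thread matters.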

The middle part of the course deals with databases and other sources of data, hosted on the phone itself or in the cloud. To read and write a database, Android uses the SQLite version of the Structured Query Language (SQL). To display the records of the database on the screen, we connect Java objects such as cursors, adapters, view binders, and adapter views. But we also insulate the database from these screen objects by packaging it as a “content provider”.

Building on the interactive machinery in the first part of the course, and the databases in the second, we present a variety of applications including Google Maps, the GPS, and the accelerometers. We download data in CSV and JSON formats from remote servers. We capture and play back audio and video, providing additional examples of multi-threading and communication between a background “service” object and a foreground “activity” object. We perform speech recognition by using an “intent” object to launch an activity object and get back a transcript of the words. Each of these topics reinforces the basic theme of the course: to make Java objects coöperate with each other and with the parts of the app written in XML.
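As a taste of the downloaded-data work, here is a stand-alone sketch of parsing one CSV record into typed Java fields. The column layout (name, latitude, longitude) is invented for illustration; a real feed defines its own columns, and a real parser would also handle quoted fields containing commas.

```java
// Split one CSV record into fields and convert them to Java types.
public class CsvDemo {
    public static void main(String[] args) {
        String line = "Washington Square Park,40.7308,-73.9973";
        String[] fields = line.split(",");   // naive split: no quoted commas

        String name = fields[0];
        double latitude = Double.parseDouble(fields[1]);
        double longitude = Double.parseDouble(fields[2]);

        System.out.println(name + " is at " + latitude + ", " + longitude);
    }
}
```

In the course, records like this one end up as markers on a Google Map or as rows behind an adapter view.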

For the entire content of the course, please see
http://oit2.scps.nyu.edu/~meretzkm/android/