Object Oriented Programming
In December 1979, Steve Jobs and several Apple engineers visited Xerox PARC, where they were shown three crucial technologies: the Graphical User Interface, Object Oriented Programming, and Networking. Jobs immediately recognized the importance of the first, but he later admitted that he did not understand the significance of the other two.
By the time Jobs and a few carefully chosen engineers left Apple to form NeXT, all three technologies were front and center in their minds. In particular, engineers at NeXT adopted Objective-C as the programming language for the NeXT computer, and invented Interface Builder to make Object Oriented Programming a much easier, visual experience. When NeXT returned to Apple, this Object Oriented Programming layer was renamed Cocoa.
Object Oriented Programming begins with a useful metaphor. Imagine that a typical program is built from a collection of small independent pieces called "objects". Each object has a memory, its "instance variables", and defines a series of "messages" that it can send or receive. These messages form the procedures or functions implemented by the object. Objects interact by sending messages to each other. The objects are not created statically when the program begins; instead they are created dynamically by the program as needed, and thrown away when no longer necessary.
For example, a window on the computer screen may correspond to a window object in the code. This window object knows how to resize the window or move it to a new position on screen, how to temporarily hide or show the window, and how to permanently close it. When the user asks for a new window, a new window object is created to handle this window; when the user closes a window, its window object is released from memory. Thus code objects are created dynamically rather than all at once when the program starts, because users create windows dynamically as needed, rather than all at once at bootup.
Programmers do not need to create objects from scratch. Instead Cocoa has a rich store of pre-built objects which programmers can request as needed. When Cocoa provides such an object, it is already "hooked up". Thus if the programmer requests a window, the window appears on the screen able to resize itself or move elsewhere without the programmer writing any code at all.
Before object oriented programming, this could be accomplished by providing a large library of pre-written code. But that approach is messy and intractable; a programmer then sees gigantic sections of code that he or she did not write and can barely understand, and must insert his or her own code inside these incomprehensible chunks. In object oriented programming, the code which makes an object work is hidden. Programmers aren't distracted by it, and cannot see it even if they want to. A programmer can send a message to a window asking it to resize itself, but never sees all the smaller tasks and special cases the code considers to make that happen.
It wouldn't be very useful to get a library of pre-built objects from Apple which a programmer cannot modify. If I want a window on the screen, I'll happily let Apple resize and move it, but I need to control the contents of the window. Maybe I want a window to show editable text, or perhaps a picture, or maybe a movie. So the most crucial feature of object oriented programming is that a programmer can request a prebuilt object and then modify it by changing the behavior of some of its built-in procedures.
For example, window objects have a content region and a "draw" procedure which draws the contents of the content region. This draw procedure is already hooked up, so it is called whenever the window resizes or moves elsewhere. The prebuilt draw routine does some very complicated things. It calculates the visible portion of the content region and sets a "clip region" so that when the drawing actually occurs, it only draws in this visible portion, and not into another window covering part of the window being drawn. And then, after all this preparation, the draw routine of the default window actually draws. It erases the previous contents and then draws nothing.
When a programmer asks for a window object, he or she accepts all of this default behavior except the final command to "draw nothing." This part is overridden, and instead the new object draws some text, or a picture, or shows a movie. So the programmer has to write code for the things that make a program unique, but does not have to write code for the mundane tasks which, while complicated, are understood and already written by Apple.
Automatic Reference Counting
A Cocoa program can create big objects, like those controlling windows, and also lots of little objects. Suppose I have a number 3.14159 and I want to round it to an integer 3. Or perhaps I want to write that number as a string of text. Cocoa has an object called NSNumber to handle tasks like these. A programmer can create an NSNumber object initialized to the value 3.14159, and then ask the object for its "integer value" or its "string value". Then the NSNumber object should be thrown away.
A typical procedure in Cocoa might create five small objects at the start, use these objects to do something, and then throw them all away at the end. But often a procedure creates objects and then sends these objects to other procedures. Suppose a procedure creates an object and sends this object to window B and window C. Suppose later window B is closed. Should it throw the object away? YES if C is already closed, but NO otherwise.
Originally Apple had very elaborate rules to handle such memory management. But other object oriented languages adopted an easy way to handle it called "garbage collection". At regular intervals, the program would pause and go through the list of all objects currently in the system, determining how many other pieces were still using each object. If nobody was still using an object, it was thrown away. At one point, Apple adopted garbage collection for Cocoa. But then they created the iPhone, which is programmed in Cocoa. Garbage collection would mean that the phone would become unresponsive at random moments. That was unacceptable.
Instead, Apple invented a method called Automatic Reference Counting, or ARC. The rough idea is that objects keep track of the number of parts of the system using them. Whenever one of these parts of the system quits, it reduces the reference count of all objects it was using by one. When the reference count of an object reaches zero, that object throws itself away.
Eventually Apple adopted ARC for Cocoa programs on all platforms, but a programmer could opt into ARC on a source-file-by-source-file basis. Over a long period that summer, I gradually adopted ARC for all of TeXShop. The process was painful, and for a long time there were no new releases or bug fixes as file after file was switched over.
Instabilities remained, and some of my collaborators were extremely helpful tracking them down. In the end, the ARC battle was worth it. A key lesson is that when Apple makes a significant change, it is best to bite the bullet and adopt the change. Avoiding change is easier in the short run, and dangerous in the end.