
iPhoneSDK Tutorial
Chapter 11. Graphics and Drawing





11.0 Preview

iPhone OS provides two primary paths for creating high-quality graphics in our applications: OpenGL ES, or native rendering using Quartz, Core Animation, and UIKit.

The OpenGL frameworks are geared primarily toward game development or applications that require high frame rates. OpenGL is a C-based interface used to create 2D and 3D content on desktop computers. iPhone OS supports OpenGL drawing through the OpenGL ES (OpenGL for Embedded Systems) framework, which supports both the OpenGL ES 1.1 and OpenGL ES 2.0 specifications. OpenGL ES is designed specifically for use on embedded hardware and differs in many ways from desktop versions of OpenGL.


For developers who want a more object-oriented drawing approach, iPhone OS provides Quartz, Core Animation, and the graphics support in UIKit. Quartz is the main drawing interface, providing support for path-based drawing, anti-aliased rendering, gradient fill patterns, images, colors, coordinate-space transformations, and PDF document creation, display, and parsing. UIKit provides Objective-C wrappers for Quartz images and color manipulations. Core Animation provides the underlying support for animating changes in many UIKit view properties and can also be used to implement custom animations.

TwoNavigationBars


In this chapter, we will build the application shown in the picture above, first with Quartz 2D and then with OpenGL ES.
The application has a bar across the top and the bottom, each holding a segmented control. The control at the top lets us change the drawing color, and the one at the bottom lets us change the shape. When we touch and drag, the selected shape is drawn in the selected color.



11.1 Quartz Drawing

When we use Quartz to do our drawing, we usually add the drawing code to the view doing the drawing. The drawRect: method is called every time a view needs to redraw itself, so that is where the Quartz function calls go. To be more specific, we create a subclass of UIView and add Quartz function calls to that class's drawRect: method; as a result, the Quartz code runs whenever the view redraws itself.
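
For example, a minimal sketch of such a subclass might look like this (the class name MyQuartzView is just illustrative, not part of the project we're about to build):

#import <UIKit/UIKit.h>

@interface MyQuartzView : UIView {
}
@end

@implementation MyQuartzView
- (void)drawRect:(CGRect)rect {
    // Quartz drawing calls go here; this method runs every time the view redraws itself.
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 1.0);
}
@end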


11.1.1 Quartz Graphics Context

As in the rest of Core Graphics, drawing in Quartz 2D happens in a graphics context. Every view has its own context, so when we want to draw in a view, we retrieve the current context, make our Quartz drawing calls against it, and let the context take care of the actual rendering.

The code that retrieves the current context is:
CGContextRef context = UIGraphicsGetCurrentContext();
Once we have the graphics context, we can draw into it by passing it to Core Graphics drawing functions. For example, the following code draws a 2-pixel-wide line in the context:

CGContextSetLineWidth(context, 2.0);
CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);
CGContextMoveToPoint(context, 100.0f, 100.0f);
CGContextAddLineToPoint(context, 200.0f, 200.0f);
CGContextStrokePath(context);

The first and second calls specify the line width and the stroke color. In Core Graphics, two colors are associated with drawing actions: the stroke color and the fill color.

Contexts have a sort of invisible "pen" associated with them that does the line drawing. When we call CGContextMoveToPoint(), we move that invisible pen to a new location without actually drawing anything. So the third line says that we are about to draw, starting at (100, 100). The next function adds a line from the current pen location to the specified point. Even so, at this point we haven't drawn anything we can actually see.

We're creating a shape, a line, or some other path, but it contains no color or anything else to make it visible. It's like writing in invisible ink: until we do something to make it visible, our line can't be seen. The next step, then, is to tell Quartz to render the path using CGContextStrokePath(). This function makes our line visible using the width and stroke color we set.
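
Stroking is only one way to make a path visible. A context also carries a fill color, so a path can be filled, or filled and stroked in a single call with CGContextDrawPath(), which we'll use later in this chapter. A minimal sketch, continuing with the same context (the rectangle values are arbitrary):

    CGContextSetFillColorWithColor(context, [UIColor blueColor].CGColor);
    CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);
    CGContextAddRect(context, CGRectMake(50.0f, 50.0f, 100.0f, 80.0f));
    CGContextDrawPath(context, kCGPathFillStroke);   // fill the interior, then stroke the outline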


11.1.2 Coordinate System

In a view's coordinate system, (0,0) is in the upper-left corner, with y values increasing downward. Quartz's native coordinate system puts (0,0) in the lower-left corner, with y values increasing upward. Fortunately, the graphics context that UIKit hands us in drawRect: has already been flipped to match the view's upper-left origin, so the Quartz code in this chapter can use view coordinates directly. OpenGL ES, which we'll use later in the chapter, keeps its origin at the lower-left corner, which is why that code subtracts y values from the view's height.
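
When we do need to convert between the two conventions, the conversion is just a flip of the y value. Here is a minimal sketch, written inside a view's code and using the view's frame height, the same conversion the OpenGL ES code performs later in this chapter:

    CGPoint viewPoint = CGPointMake(100.0f, 100.0f);             // upper-left-origin (view) coordinates
    CGFloat flippedY  = self.frame.size.height - viewPoint.y;    // flip the y axis
    CGPoint glPoint   = CGPointMake(viewPoint.x, flippedY);      // lower-left-origin coordinates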


11.1.3 Colors

UIKit provides an Objective-C class for working with colors: UIColor. We can't use a UIColor object directly in Core Graphics calls, but since UIColor is essentially a wrapper around a CGColor, we can retrieve a CGColor reference from a UIColor instance by using its CGColor property.

CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);

Here, we created a UIColor instance using a convenience method called redColor, and then retrieved its CGColor property and passed that into the function.
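
Any UIColor works the same way. For example, a custom color built from RGB components can be handed to a Core Graphics function through the same property (assuming context is the current graphics context retrieved as shown earlier; the component values here are arbitrary):

    UIColor *customColor = [UIColor colorWithRed:0.2f green:0.4f blue:0.8f alpha:1.0f];
    CGContextSetFillColorWithColor(context, customColor.CGColor);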


11.1.4 Images

Quartz allows us to draw images directly into a context. This is another place where an Objective-C class, UIImage, can be used as an alternative to working with the underlying Core Graphics data structure, CGImage. The UIImage class contains methods that draw its image into the current context. We need to specify where the image should appear, either with a CGPoint that identifies the location of the image's upper-left corner or with a CGRect that frames the image, resizing it to fit the frame if necessary. For example, we can draw a UIImage at a point in the current context like this:

CGPoint drawPoint = CGPointMake(100.0f, 100.0f);
[image drawAtPoint:drawPoint];
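
If we want the image scaled to a particular rectangle instead of drawn at its natural size, UIImage also provides drawInRect:, which resizes the image to fill the given frame (the rectangle here is arbitrary):

    CGRect imageRect = CGRectMake(50.0f, 50.0f, 150.0f, 100.0f);
    [image drawInRect:imageRect];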


11.2 Quartz Application

Create a new project with the view-based application template and name it QuartzApp. We're going to do custom drawing in the view, so we need a subclass of UIView. Create a new Cocoa Touch Class file, choosing Objective-C class with UIView for "Subclass of," and name it QuartzAppView.m.

QuartzAppView

We also need a header file for constants that are shared by more than one class, so let's create "Constants.h" using the Empty File template.

Constants

Our application will offer an option to select a random color, but UIColor doesn't have a method that returns a random color, so we'll write one ourselves. We're going to put the code into a category on UIColor. Create two more files using the Empty File template and name them "UIColorRandom.h" and "UIColorRandom.m".

Here are the files in our project.

FilesSoFar

11.2.1 Random Color

Here is "UIColorRandom.h":

#import <UIKit/UIKit.h>

@interface UIColor(Random)
+(UIColor *)randomColor;
@end

and "UIColorRandom.m"

#import "UIColorRandom.h"

@implementation UIColor(Random)
+(UIColor *)randomColor
{
    static BOOL seeded = NO;
    if (!seeded) {
        seeded = YES;
        srandom(time(NULL));
    }
    CGFloat red =  (CGFloat)random()/(CGFloat)RAND_MAX;
    CGFloat blue = (CGFloat)random()/(CGFloat)RAND_MAX;
    CGFloat green = (CGFloat)random()/(CGFloat)RAND_MAX;
    return [UIColor colorWithRed:red green:green blue:blue alpha:1.0f];
}
@end

We declared a static variable that tells us whether this is the first time through the method. The first time the method is called, we seed the random number generator. Once we've made sure the generator is seeded, we generate three random CGFloats with values between 0.0 and 1.0 and use them to create a new color. The generated colors are opaque, with alpha = 1.0.
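
Any class that imports "UIColorRandom.h" can now ask UIColor for a random color just as if randomColor were a built-in convenience method, which is exactly what our view will do later:

    self.currentColor = [UIColor randomColor];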


11.2.2 Application Constants

Here are the constants defined in the "Constants.h" file. There is one constant for each of the options that the user can select using the segmented controls.

typedef enum {
    kLineShape  = 0,
    kRectShape,
    kEllipseShape,
    kImageShape
} ShapeType;

typedef enum {
    kRedColorTab = 0,
    kBlueColorTab,
    kYellowColorTab,
    kGreenColorTab,
    kRandomColorTab
} ColorTabIndex;
#define degreesToRadian(x) (M_PI * (x) / 180.0)

The two enumeration types are declared using typedef. One represents the available shape options; the other represents the available color options. The values of these constants correspond to the segments of the two segmented controls we will create. The degreesToRadian macro converts an angle in degrees to radians, which is what the C trigonometric functions expect; we'll need it later when drawing the ellipse with OpenGL ES.
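
As a quick illustration of the macro (a minimal sketch; the radius and center values are arbitrary), points around a circle can be computed like this, which is the same idea the OpenGL ES ellipse code uses later in this chapter:

    CGFloat radius = 50.0f;
    CGPoint center = CGPointMake(160.0f, 240.0f);
    for (int angle = 0; angle < 360; angle += 2) {
        CGFloat x = center.x + cos(degreesToRadian(angle)) * radius;
        CGFloat y = center.y + sin(degreesToRadian(angle)) * radius;
        // (x, y) is a point on the circle's outline
    }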


11.3 QuartzApp Skeleton

Our drawing will be done in a subclass of UIView, so let's set up that class; for now, just its skeleton.

Here is "QuarzAppView.h."

#import <UIKit/UIKit.h>
#import "Constants.h"

@interface QuartzAppView : UIView {
    CGPoint        firstTouch;
    CGPoint        lastTouch;
    UIColor        *currentColor;
    ShapeType      shapeType;
    UIImage        *drawImage;
    BOOL           useRandomColor;
}
@property CGPoint firstTouch;
@property CGPoint lastTouch;
@property (nonatomic, retain) UIColor *currentColor;
@property ShapeType shapeType;
@property (nonatomic, retain) UIImage *drawImage;
@property BOOL useRandomColor;
@end

The first two instance variables, firstTouch and lastTouch, will track the user's finger as it drags across the screen. We'll store the location where the user first touches the screen in firstTouch. We'll also store the location of the user's finger in lastTouch while dragging and when the drag ends. Our code will use these two variables to determine where to draw the requested shape.

Then, we define a color to hold the user's color selection and a ShapeType to keep track of the shape the user wants to draw. Next is a UIImage property that will hold the image to draw when the user selects the rightmost segment of the control on the bottom toolbar. The last property is a Boolean that keeps track of whether the user has requested a random color.

Now, let's look at the implementation file, "QuartzAppView.m."

#import "QuartzAppView.h"
#import "UIColorRandom.h"

@implementation QuartzAppView
@synthesize firstTouch;
@synthesize lastTouch;
@synthesize currentColor;
@synthesize shapeType;
@synthesize drawImage;
@synthesize useRandomColor;


- (id)initWithCoder:(NSCoder*)coder
{
    if ( ( self = [super initWithCoder:coder] ) ) {
        self.currentColor = [UIColor redColor];
        self.useRandomColor = NO;
        if (drawImage == nil)
            self.drawImage = [UIImage imageNamed:@"tiger.png"];
    }
    return self;
}

- (id)initWithFrame:(CGRect)frame {
    if (self = [super initWithFrame:frame]) {
        // Initialization code
    }
    return self;
}

- (void)drawRect:(CGRect)rect {
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (useRandomColor)
        self.currentColor = [UIColor randomColor];
    UITouch *touch = [touches anyObject];
    firstTouch = [touch locationInView:self];
    lastTouch = [touch locationInView:self];
    [self setNeedsDisplay];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    lastTouch = [touch locationInView:self];
    
    [self setNeedsDisplay];
    
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    lastTouch = [touch locationInView:self];
	
    [self setNeedsDisplay];
}

- (void)dealloc {
    [currentColor release];
    [drawImage release];
    [super dealloc];
}
@end

We first implement initWithCoder: because this view is loaded from a nib. Keep in mind that object instances in nibs are stored as archived objects. As a result, when an object instance is loaded from a nib, neither init nor initWithFrame: is ever called. Instead, initWithCoder: is used, so this is where we need to add any initialization code. In our case, we set the initial color to red, initialize useRandomColor to NO, and load the image file that we'll draw.

touchesBegan:withEvent: is called when the user's finger first touches the screen. In that method, we change the color using the new randomColor method if the user has selected the random color option. After that, we store the touch location so that we know where the user first touched the screen, and we indicate that our view needs to be redrawn by calling setNeedsDisplay on self.

touchesMoved:withEvent: is called continuously while the user drags a finger across the screen. All we do here is store the new location in lastTouch and indicate that the screen needs to be redrawn.

Finally, touchesEnded:withEvent: is called when the user lifts that finger off the screen. We store the final location in the lastTouch variable and again indicate that the view needs to be redrawn.



11.4 Adding Outlets

We need an outlet to the top segmented control, plus two action methods: one that will be called when a new color is selected and another that will be called when a new shape is selected.

Here is the interface file, "QuartzAppViewController.h"

#import <UIKit/UIKit.h>

@interface QuartzAppViewController : UIViewController {
    UISegmentedControl *colorControl;
}
@property (nonatomic, retain) IBOutlet UISegmentedControl *colorControl;
- (IBAction)changeColor:(id)sender;
- (IBAction)changeShape:(id)sender;
@end

and implementation file, "QuartzAppViewController.m."

#import "QuartzAppViewController.h"
#import "QuartzAppView.h"
#import "Constants.h"

@implementation QuartzAppViewController
@synthesize colorControl;

- (IBAction)changeColor:(id)sender {
    UISegmentedControl *control = sender;
    NSInteger index = [control selectedSegmentIndex];
    
    QuartzAppView *quartzView = (QuartzAppView *)self.view;
    
    switch (index) {
        case kRedColorTab:
            quartzView.currentColor = [UIColor redColor];
            quartzView.useRandomColor = NO;
            break;
        case kBlueColorTab:
            quartzView.currentColor = [UIColor blueColor];
            quartzView.useRandomColor = NO;
            break;
        case kYellowColorTab:
            quartzView.currentColor = [UIColor yellowColor];
            quartzView.useRandomColor = NO;
            break;
        case kGreenColorTab:
            quartzView.currentColor = [UIColor greenColor];
            quartzView.useRandomColor = NO;
            break;
        case kRandomColorTab:
            quartzView.useRandomColor = YES;
            break;
        default:
            break;
    }
}

- (IBAction)changeShape:(id)sender {
    UISegmentedControl *control = sender;
    [(QuartzAppView *)self.view setShapeType:[control
                                              selectedSegmentIndex]];
    
    if ([control selectedSegmentIndex] == kImageShape)
        colorControl.hidden = YES;
    else
        colorControl.hidden = NO;
}

- (void)didReceiveMemoryWarning {
	// Releases the view if it doesn't have a superview.
    [super didReceiveMemoryWarning];
	
	// Release any cached data, images, etc that aren't in use.
}

- (void)viewDidUnload {
	// Release any retained subviews of the main view.
	// e.g. self.myOutlet = nil;
    self.colorControl = nil;
    [super viewDidUnload];
}

- (void)dealloc {
    [colorControl release];
    [super dealloc];
}

@end

In the changeColor: method, we look at which segment was selected and create a new color, setting the view's currentColor property; when the random color option is selected, we just set useRandomColor to YES instead. All the drawing code lives in the view itself.

In the changeShape: method, we set the view's shapeType property to the selected segment index from the sender, and we hide or show the colorControl based on whether the Image segment was selected.



11.5 ViewController.xib

Now, it's time for the view. We need to add the segmented controls to the nib and hook up the actions and outlets.

Let's open QuartzAppViewController.xib in Interface Builder. Using the Identity Inspector, change the class of the view from UIView to QuartzAppView.

Grab a Navigation Bar from the library and put it at the top of the view. Then, place a Segmented Control on top of the Navigation Bar. After that, increase the number of segments from 2 to 5 using the Attributes Inspector, and set the labels as in the picture below.

NavigationBar

Control-drag from the File's Owner icon to the segmented control, and select colorControl outlet.

ConnectionToSegmentControl

With the segmented control selected, bring up the Connections Inspector. Drag from the Value Changed event to File's Owner and select the changeColor: action.

ValueChangedToOwners

Drag a Toolbar from the library and place it at the bottom of the view. Then drag a Segmented Control and drop it on top of the Toolbar. Center it by placing a Flexible Space Bar Button Item at each end. Next, increase the number of segments from 2 to 4 and label them as in the picture below.

TwoNavigationBars

Next, in the Connections Inspector for this segmented control, connect the Value Changed event to File's Owner's changeShape: action method.

ChangeShape


11.6 Drawing Shapes

Let's look at "QuartzAppView.m"

- (void)drawRect:(CGRect)rect {
    
    CGContextRef context = UIGraphicsGetCurrentContext();
    
    CGContextSetLineWidth(context, 2.0);
    CGContextSetStrokeColorWithColor(context, currentColor.CGColor);
    
    switch (shapeType) {
        case kLineShape:
            CGContextMoveToPoint(context, firstTouch.x, firstTouch.y);
            CGContextAddLineToPoint(context, lastTouch.x, lastTouch.y);
            CGContextStrokePath(context);
            break;
        case kRectShape:
            break;
        case kEllipseShape:
            break;
        case kImageShape:            
            break;
        default:
            break;
    }
}

We retrieve a reference to the current context so that we know where to draw, and then we set the line width to 2.0.

    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 2.0);

After that, we set the color for stroking lines. Since this function needs a CGColor, and UIColor has a CGColor property, we use that property of our currentColor instance variable to pass the correct color to the function:

    CGContextSetStrokeColorWithColor(context, currentColor.CGColor);

The last function,

    CGContextStrokePath(context);

will stroke the line we just defined, using the width and stroke color we set earlier.

Build and Run.

LineDrawingPic

Now, it's time to draw the other shapes: the rectangle, the ellipse, and the image. Here is the updated drawRect: method.

- (void)drawRect:(CGRect)rect {
    
    CGContextRef context = UIGraphicsGetCurrentContext();
    
    CGContextSetLineWidth(context, 2.0);
    CGContextSetStrokeColorWithColor(context, currentColor.CGColor);
    CGContextSetFillColorWithColor(context, currentColor.CGColor);
    CGRect currentRect = CGRectMake (
                                     (firstTouch.x > lastTouch.x) ? lastTouch.x : firstTouch.x,
                                     (firstTouch.y > lastTouch.y) ? lastTouch.y : firstTouch.y,
                                     fabsf(firstTouch.x - lastTouch.x),
                                     fabsf(firstTouch.y - lastTouch.y));
    
    switch (shapeType) {
        case kLineShape:
            CGContextMoveToPoint(context, firstTouch.x, firstTouch.y);
            CGContextAddLineToPoint(context, lastTouch.x, lastTouch.y);
            CGContextStrokePath(context);
            break;
        case kRectShape:
            CGContextAddRect(context, currentRect);
            CGContextDrawPath(context, kCGPathFillStroke);
            break;
        case kEllipseShape:
            CGContextAddEllipseInRect(context, currentRect);
            CGContextDrawPath(context, kCGPathFillStroke);
            break;
        case kImageShape: 
        {
            CGFloat horizontalOffset = drawImage.size.width / 2;
            CGFloat verticalOffset = drawImage.size.height / 2;
            CGPoint drawPoint = CGPointMake(lastTouch.x - horizontalOffset,
                                            lastTouch.y - verticalOffset);
            [drawImage drawAtPoint:drawPoint];            
            break;
        }
        default:
            break;
    }
}
RectDrawingPic EllipseDrawingPic

The result for image drawing is:

ImageDrawingPic



11.7 OpenGL ES

We're going to make an application using OpenGL ES. The application will be very similar to the previous one. So, while we're making a new project, we'll borrow some files from the QuartzApp project.

Let's create a new view-based application and name it GLApp. We need to copy Constants.h, UIColorRandom.h, UIColorRandom.m, and tiger.png from the QuartzApp project into the GLApp project.

Here is "GLAppViewController.h":

#import <UIKit/UIKit.h>
#import "Constants.h"

@interface GLAppViewController : UIViewController {
    UISegmentedControl *colorControl;
}
@property (nonatomic, retain) IBOutlet UISegmentedControl *colorControl;
- (IBAction)changeColor:(id)sender;
- (IBAction)changeShape:(id)sender;
@end

and "GLAppViewController.m":

#import "GLAppViewController.h"
#import "GLAppView.h"
#import "UIColorRandom.h"

@implementation GLAppViewController

@synthesize colorControl;

- (IBAction)changeColor:(id)sender {
    UISegmentedControl *control = sender;
    NSInteger index = [control selectedSegmentIndex];
    
    GLAppView *glView = (GLAppView *)self.view;
    
    switch (index) {
        case kRedColorTab:
            glView.currentColor = [UIColor redColor];
            glView.useRandomColor = NO;
            break;
        case kBlueColorTab:
            glView.currentColor = [UIColor blueColor];
            glView.useRandomColor = NO;
            break;
        case kYellowColorTab:
            glView.currentColor = [UIColor yellowColor];
            glView.useRandomColor = NO;
            break;
        case kGreenColorTab:
            glView.currentColor = [UIColor greenColor];
            glView.useRandomColor = NO;
            break;
        case kRandomColorTab:
            glView.useRandomColor = YES;
            break;
        default:
            break;
    }
}

- (IBAction)changeShape:(id)sender {
    UISegmentedControl *control = sender;
    [(GLAppView *)self.view setShapeType:[control selectedSegmentIndex]];
    if ([control selectedSegmentIndex] == kImageShape)
        [colorControl setHidden:YES];
    else
        [colorControl setHidden:NO];
}

- (void)viewDidUnload {
	// Release any retained subviews of the main view.
	// e.g. self.myOutlet = nil;
    self.colorControl = nil;
    [super viewDidUnload];
}

- (void)dealloc {
    [colorControl release];
    [super dealloc];
}

@end

These files are not much different from those in the previous project; here, we're referencing a view called GLAppView instead of QuartzAppView.

Before moving on, we need four files for OpenGL ES:
Texture2D.h, Texture2D.m, OpenGLES2DView.h, and OpenGLES2DView.m.
What are these files for?

  • They make drawing images in OpenGL ES much easier than it otherwise would be.
  • They configure OpenGL ES to do 2D drawing.

OpenGL ES doesn't have sprites or images, but it does have textures, which are used to make objects look realistic. The way we draw an image in OpenGL ES is to draw a square and then map a texture onto that square so that it exactly matches the square's size. Texture2D encapsulates that process into a single class.
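
To give a feel for what Texture2D saves us, here is a minimal sketch of drawing a textured square with OpenGL ES 1.1 (it assumes a texture has already been created and bound; the coordinate values are arbitrary):

    GLfloat squareVertices[] = {
        0.0f,   0.0f,      // bottom left
        100.0f, 0.0f,      // bottom right
        0.0f,   100.0f,    // top left
        100.0f, 100.0f     // top right
    };
    GLfloat squareTexCoords[] = {
        0.0f, 1.0f,        // texture coordinates may need flipping,
        1.0f, 1.0f,        // depending on how the image data was loaded
        0.0f, 0.0f,
        1.0f, 0.0f
    };
    glEnable(GL_TEXTURE_2D);
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, squareVertices);
    glTexCoordPointer(2, GL_FLOAT, 0, squareTexCoords);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);   // two triangles that form the square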

OpenGLES2DView is a subclass of UIView that uses OpenGL ES to do its drawing. OpenGL ES's coordinate system is inherently three-dimensional; OpenGLES2DView sets things up so that the OpenGL 3D world maps directly onto the pixels of our 2D view.

To use the OpenGLES2DView class, we subclass it and implement the draw method to do our actual drawing. We can also implement any other methods we need in our view.

Let's create a new file using the Cocoa Touch Class template, with Objective-C class and NSObject for "Subclass of" selected. Name it GLAppView.m.

Here is the "GLAppView.h."

#import <UIKit/UIKit.h>
#import "Constants.h"
#import "Texture2D.h"
#import "OpenGLES2DView.h"


@interface GLAppView : OpenGLES2DView {
    CGPoint        firstTouch;
    CGPoint        lastTouch;
    UIColor        *currentColor;
    BOOL        useRandomColor;
    
    ShapeType    shapeType;
    
    Texture2D    *sprite;
}
@property CGPoint firstTouch;
@property CGPoint lastTouch;
@property (nonatomic, retain) UIColor *currentColor;
@property BOOL useRandomColor;
@property ShapeType shapeType;
@property (nonatomic, retain) Texture2D *sprite;
@end

Instead of using UIImage as in QuartzAppView.h, we use a Texture2D to simplify drawing images into an OpenGL ES context. We also change the superclass from UIView to OpenGLES2DView so that our view becomes an OpenGL ES-backed view set up for 2D drawing.

And the implementation file, "GLAppView.m":

#import "GLAppView.h"
#import "UIColorRandom.h"

@implementation GLAppView
@synthesize firstTouch;
@synthesize lastTouch;
@synthesize currentColor;
@synthesize useRandomColor;
@synthesize shapeType;
@synthesize sprite;
- (id)initWithCoder:(NSCoder*)coder {
    if (self = [super initWithCoder:coder]) {
        self.currentColor = [UIColor redColor];
        self.useRandomColor = NO;
        Texture2D *newSprite = [[Texture2D alloc] initWithImage:
                                [UIImage imageNamed:@"tiger.png"]];
        self.sprite = newSprite;
        [newSprite release];   // sprite is a retain property, so release our extra reference
        glBindTexture(GL_TEXTURE_2D, sprite.name);
    }
    return self;
}

- (void)draw  {
    glLoadIdentity();
    
    glClearColor(0.78f, 0.78f, 0.78f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    
    CGColorRef color = currentColor.CGColor;
    const CGFloat *components = CGColorGetComponents(color);
    CGFloat red = components[0];
    CGFloat green = components[1];
    CGFloat blue = components[2];
    
    glColor4f(red,green, blue, 1.0);
    
    switch (shapeType) {
        case kLineShape: {
            glDisable(GL_TEXTURE_2D);
            GLfloat vertices[4];
            
            // Convert coordinates
            vertices[0] =  firstTouch.x;
            vertices[1] = self.frame.size.height - firstTouch.y;
            vertices[2] = lastTouch.x;
            vertices[3] = self.frame.size.height - lastTouch.y;
            glLineWidth(2.0);
            glVertexPointer (2, GL_FLOAT , 0, vertices);
            glDrawArrays (GL_LINES, 0, 2);
            break;
        }
        case kRectShape: {
            glDisable(GL_TEXTURE_2D);
            // Calculate bounding rect and store in vertices
            GLfloat vertices[8];
            GLfloat minX = (firstTouch.x > lastTouch.x) ?
            lastTouch.x : firstTouch.x;
            GLfloat minY = (self.frame.size.height - firstTouch.y > 
                            self.frame.size.height - lastTouch.y) ? 
            self.frame.size.height - lastTouch.y : 
            self.frame.size.height - firstTouch.y;
            GLfloat maxX = (firstTouch.x > lastTouch.x) ?
            firstTouch.x : lastTouch.x;
            GLfloat maxY = (self.frame.size.height - firstTouch.y > 
                            self.frame.size.height - lastTouch.y) ? 
            self.frame.size.height - firstTouch.y : 
            self.frame.size.height - lastTouch.y;
            
            vertices[0] = maxX;
            vertices[1] = maxY;
            vertices[2] = minX;
            vertices[3] = maxY;
            vertices[4] = minX;
            vertices[5] = minY;
            vertices[6] = maxX;
            vertices[7] = minY;
            
            glVertexPointer (2, GL_FLOAT , 0, vertices);
            glDrawArrays (GL_TRIANGLE_FAN, 0, 4);
            break;
        }
        case kEllipseShape: {
            glDisable(GL_TEXTURE_2D);
            GLfloat vertices[720];
            GLfloat xradius = (firstTouch.x > lastTouch.x) ?
            (firstTouch.x - lastTouch.x)/2 : 
            (lastTouch.x - firstTouch.x)/2;
            GLfloat yradius = (self.frame.size.height - firstTouch.y > 
                               self.frame.size.height - lastTouch.y) ? 
            ((self.frame.size.height - firstTouch.y) - 
             (self.frame.size.height - lastTouch.y))/2 : 
            ((self.frame.size.height - lastTouch.y) -
             (self.frame.size.height - firstTouch.y))/2; 
            for (int i = 0; i < 720; i+=2) {
                GLfloat xOffset = (firstTouch.x > lastTouch.x) ?
                lastTouch.x + xradius 
                : firstTouch.x + xradius;
                GLfloat yOffset = (self.frame.size.height - firstTouch.y > 
                                   self.frame.size.height - lastTouch.y) ?
                self.frame.size.height - lastTouch.y + yradius : 
                self.frame.size.height - firstTouch.y + yradius;
                vertices[i] = (cos(degreesToRadian(i))*xradius) + xOffset;
                vertices[i+1] = (sin(degreesToRadian(i))*yradius) +
                yOffset;
                
            }
            glVertexPointer (2, GL_FLOAT , 0, vertices);
            glDrawArrays (GL_TRIANGLE_FAN, 0, 360);
            break;
            
        }
        case kImageShape:
            glEnable(GL_TEXTURE_2D);
            [sprite drawAtPoint:CGPointMake(lastTouch.x, 
                                            self.frame.size.height - lastTouch.y)];
            break;
        default:
            break;
    }
    
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}
- (void)dealloc {
    [currentColor release];
    [sprite release];
    [super dealloc];
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (useRandomColor)
        self.currentColor = [UIColor randomColor];
    
    UITouch* touch = [[event touchesForView:self] anyObject];
    firstTouch = [touch locationInView:self];
    lastTouch = [touch locationInView:self];
    [self draw];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {  
    
    UITouch *touch = [touches anyObject];
    lastTouch = [touch locationInView:self];
    
    [self draw];
    
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    lastTouch = [touch locationInView:self];
    
    [self draw];
}
@end

Because this view is loaded from a nib, we added an initWithCoder: method. In it, we create and assign a UIColor to currentColor, set useRandomColor to NO, and create our Texture2D object, binding its texture so it's ready for drawing.

After the initWithCoder: method comes our draw method. I'll skip the details of the OpenGL ES drawing process.

We will get a lot of link errors if we Build and Run at this point, because we need to link two more frameworks into our project. Right-click on the Frameworks folder in the Groups & Files pane in Xcode and select Existing Frameworks... from the Add submenu. Navigate to OpenGLES.framework and QuartzCore.framework and add them to the project.

FrameworkSelection

Build and Run.

We should get the same results as in the QuartzApp project.