Comp 271-400 Week 2

Lewis Tower 410, 4:15-8:15



    Bailey chapter 1, on objects generally.
    Bailey chapter 2, on assertions. We will come back to this later; you can skim it for now.
    Bailey chapter 3, on a Vector class

    Morin chapter 1, sections 1.1 and 1.2
        One slight peculiarity of Morin is that he refers to the array-based List implementation of chapter 2 as an ArrayStack.

Primary text: Bailey, online, and maybe Morin, also online.

Information about MSDNAA is in the Intro to C++ section


Suppose we have a List&lt;String&gt; L that has data in it. How can we print out the entries?

1. while loop

int i = 0;    // Java
while (i < L.size()) {
    System.out.println(L.get(i));
    i++;
}
For a while loop, the loop variable 'i' must be declared before the while.

2. for loop ("classic for")

for (int i = 0; i < L.size(); i++) {    // Java
    System.out.println(L.get(i));
}

Note that I've chosen to declare i within the loop here. You can do that or else declare the loop variable as in the while loop example above.

3. for-each loop

for (String s : L)
    System.out.println(s);

Note that we don't have get(i) here; the for-each loop uses the String variable s as the "loop variable". Note that s must be declared within the loop, as shown. Java takes care of assigning to s each element of L, in turn.

4. Iterator loop

Iterator<String> it = L.iterator();
while (it.hasNext()) {
    String s = it.next();
    System.out.println(s);
}

This is an iterator. Iterators were, in a sense, a predecessor to the for-each loop. Both iterators and for-each work for any Collection, not just ArrayList. Why would you use an Iterator rather than the for-each loop? There are times when the for-each structure just does not work; consider a single loop that takes one element from each of two lists on every pass. You can't do that with a for-each loop, because a for-each loop traverses just one of the lists.
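A sketch of that two-list situation, assuming two lists of equal length (the method name mergeAlternating is made up for illustration):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

public class TwoLists {
    // Interleave two lists: one element from each list on every pass.
    // A single for-each loop could traverse only one of them.
    static List<String> mergeAlternating(List<String> a, List<String> b) {
        List<String> result = new ArrayList<>();
        Iterator<String> ita = a.iterator();
        Iterator<String> itb = b.iterator();
        while (ita.hasNext() && itb.hasNext()) {
            result.add(ita.next());
            result.add(itb.next());
        }
        return result;
    }

    public static void main(String[] args) {
        List<String> a = Arrays.asList("one", "two");
        List<String> b = Arrays.asList("uno", "dos");
        System.out.println(mergeAlternating(a, b));   // [one, uno, two, dos]
    }
}
```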

An iterator is a precise way of keeping track of the "current position" in a list. The object representing the iterator has two pieces: a reference to the original list, and a current position.

You may be in deep trouble if you modify the list while iterating over it; the iterators for the standard Java collections are "fail-fast" and will typically throw a ConcurrentModificationException.
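A sketch of the safe alternative: a for-each loop uses an iterator internally, so removing elements from the list mid-loop trips the iterator's modification check; removing through the iterator itself is allowed. (The method name removeSafely is made up for illustration.)

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class SafeRemove {
    // Remove every occurrence of target through the iterator itself.
    static void removeSafely(List<String> list, String target) {
        Iterator<String> it = list.iterator();
        while (it.hasNext())
            if (it.next().equals(target)) it.remove();   // legal: iterator does the removal
    }

    public static void main(String[] args) {
        List<String> L = new ArrayList<>();
        L.add("alpha"); L.add("beta"); L.add("gamma");

        // By contrast, this loop modifies L behind the iterator's back
        // and throws ConcurrentModificationException:
        //     for (String s : L) if (s.equals("alpha")) L.remove(s);

        removeSafely(L, "alpha");
        System.out.println(L);   // [beta, gamma]
    }
}
```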

Introduction to C++

Here are a few notes on this: Intro to C++

What about installing it?

Macs sometimes have xcode. Or you can get it at (or maybe the Apple App Store).

For windows, you can install MS Visual Studio, or mingw. The link to the MSDNAA site for Visual Studio keeps changing; right now it seems to be called Microsoft Imagine and is at

Be sure to click register the first time you connect. Your account identifier is your Loyola email address, with the "".

Binary Search

See sorting.html#binsearch


Pre- and Post-conditions

Bailey addresses these in Chapter 2.

A simple example of a precondition is that the function Math.sqrt(double x) requires that x>=0. The postcondition is something that is true afterwards, on the assumption that the precondition held (in this case, that the value returned is a "good" floating-point approximation to the square root of x). Note that sometimes precondition X is replaced in Java with the statement that "an exception is thrown if X is false"; this is probably best thought of as amounting to the same thing.

Note that it is up to the caller of a function to verify the precondition. Sometimes (though not always) the function verifies that the preconditions hold.

An invariant is a statement that is both pre and post: if it holds at the start, then it still holds at the end. The classic example is a loop invariant.

    int sum = 0;
    int n = 0;
    while (n < 100) {    // invariant: sum = 1+2+...+n
       n += 1;
       sum += n;
    }
    // afterwards: n == 100, so sum = 1+2+...+100

We're not going to obsess about these, but they're good to be familiar with. Most loop invariants are either not helpful or are hard to write down; sometimes, however, they can really help clear up what is going on.

Consider again the Ratio class. One version of the gcd() method was recursive: it calls itself. But we also had an iterative version:

// pre: a>=0, b>=0
int gcd(int a, int b) {
    while (a>0 && b>0) {
       if (a>=b) a = a % b;
       else b = b % a;
    }
    if (a==0) return b; else return a;
}

Is there an invariant we can use here? Basically, the gcd of a and b never changes. How do we write that?
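One way to write it down, as a sketch: if a0 and b0 are the original arguments, the invariant is gcd(a,b) == gcd(a0,b0). We can check it at the top of each pass with an assert; the helper refGcd below is a hypothetical recursive reference version used only for the check (run with java -ea so asserts are enabled):

```java
public class GcdInvariant {
    // Hypothetical recursive reference version, used only to state the invariant.
    static int refGcd(int a, int b) {
        return (a == 0) ? b : refGcd(b % a, a);
    }

    // pre: a>=0, b>=0
    static int gcd(int a, int b) {
        final int g0 = refGcd(a, b);       // gcd of the original arguments
        while (a > 0 && b > 0) {
            assert refGcd(a, b) == g0;     // invariant: the gcd of a and b never changes
            if (a >= b) a = a % b;
            else b = b % a;
        }
        return (a == 0) ? b : a;
    }

    public static void main(String[] args) {
        System.out.println(gcd(24, 36));   // 12
    }
}
```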

Ratio Recursion

The gcd() method on Bailey page 9 is recursive: it calls itself. How does this work?

There are a few separate issues. First, we note that gcd(a,b) = gcd(a,b%a), always; any divisor of a and b is a divisor of b%a (which has the form b-ka), and any divisor of a and b%a is a divisor of b.

The second issue, though, is how it can even be legal for a function to call itself. Internally, the runtime system handles this by creating a separate set of local variables for each call to gcd(). This is done on the so-called runtime stack. This means that different calls to gcd(), with different parameter values, don't interact or interfere.

Finally, there's the question of whether rgcd() ever returns. One way to prove this is to argue that the first parameter to rgcd() keeps getting smaller. We stop when it reaches 0, as it must. The atomic case in the recursion is the case that involves no further recursive calls; in the gcd() example it is the case when a==0.
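A minimal sketch of a recursive version along these lines (Bailey's actual code may differ in details):

```java
public class Rgcd {
    // pre: a >= 0, b >= 0
    static int rgcd(int a, int b) {
        if (a == 0) return b;     // atomic case: no further recursive calls
        return rgcd(b % a, a);    // b % a < a, so the first parameter keeps getting smaller
    }

    public static void main(String[] args) {
        System.out.println(rgcd(24, 36));   // 12
    }
}
```

Each call to rgcd() gets its own copies of a and b on the runtime stack, which is why the calls don't interfere with one another.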

Object Semantics

See objects.html#semantics.

Linked List

See lists.html#linked.

List-related examples:

Table of Factors

This is the example on Bailey page 88. Let us construct a table of all the k<=n together with a list of all the factors (prime or not) of each k, and ask how much space is needed. This turns out to be O(n log n). The running time to construct the table varies with how clever the algorithm is: it can be O(n^2) [check all i<k for divisibility], O(n^(3/2)) [check all i<sqrt(k)], or O(n log n) [Sieve of Eratosthenes].
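A sketch of the sieve-style construction: for each i, append i to the factor list of every multiple of i up to n. The total number of entries written is n/1 + n/2 + ... + n/n, which is about n log n, so both the time and the space come out O(n log n). (The method name buildTable is made up for illustration.)

```java
import java.util.ArrayList;
import java.util.List;

public class FactorTable {
    // After the call, factors.get(k) lists all factors of k (prime or not), k <= n.
    static List<List<Integer>> buildTable(int n) {
        List<List<Integer>> factors = new ArrayList<>();
        for (int k = 0; k <= n; k++) factors.add(new ArrayList<>());
        for (int i = 1; i <= n; i++)
            for (int k = i; k <= n; k += i)    // every multiple of i, sieve-style
                factors.get(k).add(i);         // i is a factor of k
        return factors;
    }

    public static void main(String[] args) {
        System.out.println(buildTable(12).get(12));   // [1, 2, 3, 4, 6, 12]
    }
}
```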

Space in a string

The answer depends on whether we're concerned with the worst case or the average case (we are almost never interested in the best case). For the average case, the answer typically depends on the probability distribution of the data.

More complexity

A function is said to be polynomial if it is O(n^k) for some fixed k; quadratic growth is a special case.
So far we've been looking mainly at running time. We can also consider space needs.

Chapter 6: Sorting

See sorting.html#sorting


Recursion starts at Bailey page 94
See recursion.html