March 11, 2014 4:52 PM
Change The Battle From Arguments To Tests
In his recent article on the future of the news business, Marc Andreessen has a great passage in his section on ways for the journalism industry to move forward:
Experimentation: You may not have all the right answers up front, but running many experiments changes the battle for the right way forward from arguments to tests. You get data, which leads to correctness and ultimately finding the right answers.
I love that clause: "running many experiments changes the battle for the right way forward from arguments to tests".
While programming, it's easy to get caught up in what we know about the code we have just written and assume that this somehow empowers us to declare sweeping truths about what to do next.
When students are first learning to program, they often fall into this trap -- despite the fact that they don't know much at all. From other courses, though, they are used to thinking for a bit, drawing some conclusions, and then expressing strongly-held opinions. Why not do it with their code, too?
No matter who we are, whenever we do this, sometimes we are right, and sometimes, we are wrong. Why leave it to chance? Run a simple little experiment. Write a snippet of code that implements our idea, and run it. See what happens.
Programs let us test our ideas, even the ideas we have about the program we are writing. Why settle for abstract assertions when we can do better? In the end, even well-reasoned assertions are so much hot air. I learned this from Ward Cunningham: It's all talk until the tests run.
March 08, 2014 10:18 AM
Sometimes a Fantasy
This week I saw a link to The Turing School of Software & Design, "a seven-month, full-time program for people who want to become professional developers". It reminded me of Neumont University, a ten-year-old school that offers a B.S. degree program in computer science that students can complete in two and a half years.
While riding the bike, I occasionally fantasize about doing something like this. With the economics of universities changing so quickly [ 1 | 2 ], there is an opportunity for a new kind of higher education. And there's something appealing about being able to work closely with a cadre of motivated students on the full spectrum of computer science and software development.
This could be an accelerated form of traditional CS instruction, without the distractions of other things, or it could be something different. Traditional university courses are pretty confining. "This course is about algorithms. That one is about programming languages." It would be fun to run a studio in which students serve as apprentices making real stuff, all of us learning as we go along.
A few years ago, one of our ChiliPLoP hot topic groups conducted a greenfield thought experiment to design an undergrad CS program outside of the constraints of any existing university structure. Student advancement was based on demonstrating professional competencies, not completing packaged courses. It was such an appealing idea! Of course, there was a lot of hard work to be done working out the details.
My view of university is still romantic, though. I like the idea of students engaging the great ideas of humanity that lie outside their major. These days, I think it's conceivable to include the humanities and other disciplines in a new kind of CS education. In a recent blog entry, Hollis Robbins floats the idea of Home College for the first year of a liberal arts education. The premise is that there are "thousands of qualified, trained, energetic, and underemployed Ph.D.s [...] struggling to find stable teaching jobs". Hiring a well-rounded tutor could be a lot less expensive than a year at a private college, and more lucrative for the tutor than adjuncting.
Maybe a new educational venture could offer more than targeted professional development in computing or software. Include a couple of humanities profs, maybe a social scientist, and it could offer a more complete undergraduate education -- one that is economical both in time and money.
But the core of my dream is going broad and deep in CS without the baggage of a university. Sometimes a fantasy is all you need. Other times...
March 07, 2014 2:24 PM
Take Small Steps
If a CS major learns only one habit of professional practice in four years, it should be:
Take small steps.
If things aren't working, take smaller steps.
I once heard Kent Beck say something similar, in the context of TDD and XP. When my colleague Mark Jacobson works with students who are struggling, he uses a similar mantra: Solve a simpler problem. As Dr. Nick notes, students and professionals alike should scale the step size according to their level of knowledge or their confidence about the problem.
When I tweeted these thoughts yesterday, two pieces of related advice came in:
- Slow down. -- Big steps are usually a sign of trying to hurry. Beginners are especially prone to this.
- Lemma: Keep moving. -- Small steps keep us moving more reliably. We can always fool ourselves into believing that the next big step is all we need...
Of course, I've always been a fan of baby steps and unusual connections to agile software development. They apply quite nicely to learners in many settings.
March 01, 2014 11:35 AM
A Few Old Passages
I was looking over a couple of files of old notes and found several quotes that I still like, usually from articles I enjoyed as well. They haven't found their way into a blog entry yet, but they deserve to see the light of day.
From a short note on the tendency even among scientists to believe unsubstantiated claims, both in and out of the professional context:
It's hard work, but I suspect the real challenge will lie in persuading working programmers to say "evidence, please" more often.
More programmers and computer scientists are trying to collect and understand data these days, but I'm not sure we've made much headway in getting programmers to ask for evidence.
Sometimes, Code Before Math
From a discussion of The Expectation Maximization Algorithm:
The code is a lot simpler to understand than the math, I think.
I often understand the language of code more quickly than the language of math. Reading, or even writing, a program sometimes helps me understand a new idea better than reading the math. Theory is, however, great for helping me to pin down what I have learned more formally.
Grin, Wave, Nod
From Iteration Inside and Out, a review of the many ways we loop over stuff in programs:
Right now, the Rubyists are grinning, the Smalltalkers are furiously waving their hands in the air to get the teacher's attention, and the Lispers are just nodding smugly in the back row (all as usual).
As a big fan of all three languages, I am occasionally conflicted. Grin? Wave? Nod? Look like the court jester by doing all three simultaneously?
February 25, 2014 3:31 PM
Abraham Lincoln on Reading the Comment Section
From Abraham Lincoln's last public address:
As a general rule, I abstain from reading the reports of attacks upon myself, wishing not to be provoked by that to which I cannot properly offer an answer.
These remarks came two days after Robert E. Lee surrendered at Appomattox Court House. Lincoln was facing abuse from the North and the South, and from within his party and without.
The great ones speak truths that outlive their times.
February 22, 2014 2:05 PM
MOOCs: Have No Fear! -- Or Should We?
The Grumpy Economist has taught a MOOC and says in his analysis of MOOCs:
The grumpy response to moocs: When Gutenberg invented moveable type, universities reacted in horror. "They'll just read the textbook. Nobody will come to lectures anymore!" It didn't happen. Why should we worry now?
The calming effect of his rather long entry is mitigated by other predictions, such as:
However, no question about it, the deadly boring hour and a half lecture in a hall with 100 people by a mediocre professor teaching utterly standard material is just dead, RIP. And universities and classes which offer nothing more to their campus students will indeed be pressed.
In downplaying the potential effects of MOOCs, Cochrane seems mostly to be speaking about research schools and more prestigious liberal arts schools. Education is but one of the "goods" being sold by such schools; prestige and connections are often the primary benefits sought by students there.
I usually feel a little odd when I read comments on teaching from people who teach mostly graduate students and mostly at big R-1 schools. I'm not sure their experience of teaching is quite the same as the experience of most university professors. Consequently, I'm suspicious of the prescriptions and predictions they make for higher education, because our personal experiences affect our view of the world.
That said, Cochrane's blog spends a lot of time talking about the nuts and bolts of creating MOOCs, and his comments on fixed and marginal costs are on the mark. (He may be grumpy, but he is an economist!) And a few of his remarks about teaching apply just as well to undergrads at a state teaching university as they do to U. of Chicago's doctoral program in economics. One that stood out:
Most of my skill as a classroom teacher comes from the fact that I know most of the wrong answers as well as the right ones.
All discussions of MOOCs ultimately include the question of revenue. Cochrane reminds us that universities...
... are, in the end, nonprofit institutions that give away what used to be called knowledge and is now called intellectual property.
The question now, though, is how schools can afford to give away knowledge as state support for public schools declines sharply and relative cost structure makes it hard for public and private schools alike to offer education at a price reasonable for their respective target audiences. The R-1s face a future just as challenging as the rest of us; how can they afford to support researchers who spend most of their time creating knowledge, not teaching it to students?
MOOCs are a weird wrench thrown into this mix. They seem to taketh away as much as they giveth. Interesting times.
February 21, 2014 3:35 PM
Sticking with a Good Idea
My algorithms students and I recently considered the classic problem of finding the closest pair of points in a set. Many of them were able to produce a typical brute-force approach, such as:
    minimum ← ∞
    for i ← 1 to n do
        for j ← i+1 to n do
            distance ← sqrt((x[i] - x[j])² + (y[i] - y[j])²)
            if distance < minimum then
                minimum ← distance
                first ← i
                second ← j
    return (first, second)
Alas, this is an O(n²) process, so we considered whether we might do better with a divide-and-conquer approach. It did not look promising, though. Divide-and-conquer doesn't let us solve the sub-problems independently. What if the closest pair straddles two partitions?
This is a common theme in computing, and problem solving more generally. We try a technique only to find that it doesn't quite work. Something doesn't fit, or a feature of the domain violates a requirement of the technique. It's tempting in such cases to give up and settle for something less.
Experienced problem solvers know not to give up too quickly. Many of the great advances in computing came under conditions just like this. Consider Leonard Kleinrock and the theory of packet switching.
In a Computing Conversations podcast published last year, Kleinrock talks about his Ph.D. research. He was working on the problem of how to support a lot of bursty network traffic on a shared connection. (You can read a summary of the podcast in an IEEE Computer column also published last year.)
His wonderful idea: apply the technique of time sharing from multi-user operating systems. The system could break all messages into "packets" of a fixed size, let messages take turns on the shared line, then reassemble each message on the receiving end. This would give every message a chance to move without making short messages wait too long behind large ones.
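The core of the idea fits in a few lines of code. Here is a toy sketch in Python -- all names and formats here are mine, purely for illustration, not any real protocol:

```python
# A toy sketch of packet switching's core idea: split messages into
# fixed-size packets, let them take turns on a shared line, and
# reassemble each message on the receiving end.

PACKET_SIZE = 4  # bytes per packet; tiny, for demonstration only

def packetize(msg_id, message):
    """Split a message into (msg_id, sequence number, chunk) packets."""
    return [(msg_id, seq, message[i:i + PACKET_SIZE])
            for seq, i in enumerate(range(0, len(message), PACKET_SIZE))]

def interleave(*streams):
    """Round-robin the packet streams onto the shared line."""
    line = []
    queues = [list(s) for s in streams]
    while any(queues):
        for q in queues:
            if q:
                line.append(q.pop(0))
    return line

def reassemble(line):
    """Regroup packets by message id and restore each message in order."""
    messages = {}
    for msg_id, seq, chunk in line:
        messages.setdefault(msg_id, []).append((seq, chunk))
    return {m: b"".join(chunk for _, chunk in sorted(parts))
            for m, parts in messages.items()}

short = packetize(1, b"hi!")
long = packetize(2, b"a much longer message")
wire = interleave(long, short)
# The short message finishes after one turn instead of waiting
# behind all six packets of the long one.
print(reassemble(wire))
```

Notice that the short message occupies only one slot on the line, which is exactly the fairness property Kleinrock was after.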
Thus was born the idea of packet switching. But there was a problem. Kleinrock says:
I set up this mathematical model and found it was analytically intractable. I had two choices: give up and find another problem to work on, or make an assumption that would allow me to move forward. So I introduced a mathematical assumption that cracked the problem wide open.
His "independence assumption" made it possible for him to complete his analysis and optimize the design of a packet-switching network. But an important question remained: Was his simplifying assumption too big a cheat? Did it skew the theoretical results in such a way that his model was no longer a reasonable approximation of how networks would behave in the real world?
Again, Kleinrock didn't give up. He wrote a program instead.
I had to write a program to simulate these networks with and without the assumption. ... I simulated many networks on the TX-2 computer at Lincoln Laboratories. I spent four months writing the simulation program. It was a 2,500-line assembly language program, and I wrote it all before debugging a single line of it. I knew if I didn't get that simulation right, I wouldn't get my dissertation.
High-stakes programming! In the end, Kleinrock was able to demonstrate that his analytical model was sufficiently close to real-world behavior that his design would work. Every one of us reaps the benefit of his persistence every day.
Sometimes, a good idea poses obstacles of its own. We should not let those obstacles beat us without a fight. Often, we just have to find a way to make it work.
This lesson applies quite nicely to using divide-and-conquer on the closest pairs problem. In this case, we don't make a simplifying assumption; we solve the sub-problem created by our approach:
After finding a candidate for the closest pair, we check to see if there is a closer pair straddling our partitions. The distance between the candidate points constrains the area we have to consider quite a bit, which makes the post-processing step feasible. The result is an O(n log n) algorithm that improves significantly on brute force.
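For readers who want to see the shape of the solution, here is one possible rendering in Python. This is my own sketch, not the code from class, and re-sorting the strip at each level technically gives O(n log² n) -- close enough to illustrate the idea:

```python
import math

def brute_force(points):
    """O(n^2) scan over all pairs; fine for small base cases."""
    best = (math.inf, None, None)
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = math.dist(points[i], points[j])
            if d < best[0]:
                best = (d, points[i], points[j])
    return best

def closest_pair(points):
    """Divide and conquer with the straddling-pair check."""
    return _closest(sorted(points))     # sort by x once, up front

def _closest(pts):
    if len(pts) <= 3:
        return brute_force(pts)
    mid = len(pts) // 2
    mid_x = pts[mid][0]
    best = min(_closest(pts[:mid]), _closest(pts[mid:]))
    d = best[0]
    # Only points within d of the dividing line can straddle it.
    strip = sorted((p for p in pts if abs(p[0] - mid_x) < d),
                   key=lambda p: p[1])
    for i, p in enumerate(strip):
        # In y-order, each point needs comparing to only a few neighbors.
        for q in strip[i + 1:i + 8]:
            if q[1] - p[1] >= d:
                break
            dist = math.dist(p, q)
            if dist < d:
                best = (dist, p, q)
                d = dist
    return best
```

A natural way to check the sketch is the theme of this entry itself: run both versions on random point sets and confirm they find pairs at the same distance.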
This algorithm, like packet switching, comes from sticking with a good idea and finding a way to make it work. This is a lesson every computer science student and novice programmer needs to learn.
There is a complementary lesson to be learned, of course: knowing when to give up on an idea and move on to something else. Experience helps us tell the two situations apart, though never with perfect accuracy. Sometimes, we just have to follow an idea long enough to know when it's time to move on.
February 19, 2014 4:12 PM
Teaching for the Perplexed and the Traumatized
On the need for empathy when writing about math for the perplexed and the traumatized, Steven Strogatz says:
You have to help them love the questions.
Teachers learn this eventually. If students love the questions, they will do an amazing amount of work searching for answers.
Strogatz is writing about writing, but everything he says applies to teaching as well, especially teaching undergraduates and non-majors. If you teach only grad courses in a specialty area, you may be able to rely on the students to provide their own curiosity and energy. Otherwise, having empathy, making connections, and providing Aha! moments are a big part of being successful in the classroom. Stories trump formal notation.
This semester, I've been trying a particular angle on achieving this trifecta of teaching goodness: I try to open every class session with a game or puzzle that the students might care about. From there, we delve into the details of algorithms and theoretical analysis. I plan to write more on this soon.
February 16, 2014 10:48 AM
Experience Happens When You Keep Showing Up
You know what they say about good design coming from experience, and experience coming from bad design? That phenomenon is true of most things non-trivial. Here's an example from men's college basketball.
The University of Florida has a veteran team. The University of Kentucky has a young team. Florida's players are very good, but not generally considered to be in the same class as Kentucky's highly-regarded players. Yesterday, the two teams played a close game on Kentucky's home floor.
Once they fell behind by five with less than two minutes remaining, Kentucky players panicked. Florida players didn't. Why not? "Well, we have a veteran group here that's panicked before -- that's been in this situation and not handled it well," [Patric] Young said.
How did Florida's players maintain their composure at the end of a tight game on the road against another good team? They had been in that same situation three times before, and failed. They didn't panic this time in large part because they had panicked before and learned from those experiences.
Kentucky's starters have played a total of 124 college games. Florida's four seniors have combined to play 491. That's a lot of experience -- a lot of opportunities to panic, to guess wrong, to underestimate a situation, or otherwise to come up short. And a lot of opportunities to learn.
The young players at Kentucky hurt today. As the author of the linked game report notes, Florida's players have hurt like that before, for coming up short in much the same way, "and they used that pain to get better".
It turns out that composure comes from experience, and experience comes from lack of composure.
As a teacher, I try to convince students not to shy away from the error messages their compiler gives them, or from the convoluted code they eventually sneak past it. Those are the experiences they'll eventually look back to when they are capable, confident programmers. They just need the opportunity to learn.
February 14, 2014 3:07 PM
Do Things That Help You Become Less Wrong
My students and I debriefed a programming assignment in class yesterday. In the middle of class, I said, "Now for a big question: How do you know your code is correct?"
There were a lot of knowing smiles and a lot of nervous laughter. Most of them don't.
Sure, they ran a few test cases, but after making additions and changes to the code, some were just happy that it still ran. The output looked reasonable, so it must be done. I suggested that they might want to think more about testing.
This morning I read a great quote from Nathan Marz that I will share with my students:
Feedback is everything. Most of the time you're wrong, and feedback is the only way to realize your mistakes and help you become less wrong. This applies to everything.
Most of the time you're wrong. Do things that help you become less wrong. Getting feedback, early and often, is one of the best ways to do this.
A comment by a student earlier in the period foreshadowed our discussion of testing, which made me feel even better. In response to the retrospective question, "What design or programming choices went well for you?", he answered unit tests.
That set me up quite nicely to segue from manual testing into automated testing. If you aren't testing your code early and often, then manual testing is a huge improvement. But you can do even better by pushing those test cases into a form that can be executed quickly and easily, with the code doing the tedious work of verifying the output.
My students are writing code in many different languages, so I showed them testing frameworks in Ruby, Java, and Python. The code looks simple, even with the boilerplate imposed by the frameworks.
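As a sketch of what I mean, here is Python's version of the idea. The `median` function stands in for whatever small piece of an assignment a student might otherwise check by hand; the function and its cases are my own example, not code from the course:

```python
import unittest

def median(values):
    """The kind of small function students were checking by eyeball."""
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    if n % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

class TestMedian(unittest.TestCase):
    # Each manual "run it and look at the output" check becomes a
    # one-line assertion that the machine verifies every time.
    def test_odd_length(self):
        self.assertEqual(median([3, 1, 2]), 2)

    def test_even_length(self):
        self.assertEqual(median([4, 1, 3, 2]), 2.5)

    def test_single_element(self):
        self.assertEqual(median([7]), 7)
```

Running `python -m unittest` finds and executes these automatically. Minitest in Ruby and JUnit in Java have much the same shape.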
The big challenges in getting students to write unit tests are the same as for getting professionals to write them: lack of time, and misplaced confidence. I hope that a few of my students will see that the real time sink is debugging bad code, and that fear of changing code is a sign of lacking confidence. The best way to become confident is to have evidence.
The student who extolled unit tests works in Racket and so has test cases in RackUnit. He set me up nicely for a future discussion, too, when he admitted out loud that he wrote his tests first. This time, it was I who smiled knowingly.