
Teaching Real-World Programming

Philip Guo

January 7, 2013

In this post, I describe a ubiquitous style of programming that, to my knowledge, has never been formally taught in the classroom.

User Comments (16)

Taking a course in programming is like expecting to learn a new language by being handed a dictionary. I despair at the horrible state that so-called computer language courses are in. There is no teaching involved, and in many cases where I requested assistance with my homework, it became apparent that the homework was never reviewed.

I don't find it surprising that the computer science field has such a difficult time attracting students. They are expected to spend hundreds of dollars on books and are then basically told to read them and teach themselves programming.

TDH

Well, I for one won't be taking one of your classes. Cobbling together all sorts of stuff and hoping you can create abstractions later, modularise, and refactor is NOT the way to write serious software. If I presented this way of programming to the medical regulatory authorities, I would be laughed at. At best this method could be used for script kiddies' apps on iPhones. It's all anyone seems to talk about. The REAL programming world is one of proper specification, design, and test.

Doug
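For readers unfamiliar with the style under discussion: as characterized above, it means getting a concrete script working first and extracting abstractions only once the behavior is understood. A minimal Python sketch of that progression (the file name and task are purely illustrative, not from the article):

    # Step 1: a quick script cobbled together from snippets.
    # Everything is hard-coded, but it runs and answers the question.
    total, count = 0.0, 0
    with open("scores.txt") as f:      # assumes one number per line
        for line in f:
            if line.strip():
                total += float(line)
                count += 1
    print("mean:", total / count)

    # Step 2: once it works, refactor the working logic into an
    # abstraction that later code can reuse and test.
    def mean_of_file(path):
        """Return the mean of the numbers (one per line) in `path`."""
        with open(path) as f:
            values = [float(line) for line in f if line.strip()]
        return sum(values) / len(values)

    print("mean:", mean_of_file("scores.txt"))

The point of the refactor is that the second version can be imported and tested, while the first can only be run.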

I can't help thinking that there may be a place for a software craftsmanship qualification (unfortunately, it's not an "engineering" discipline yet for the most part), but I hope CS maintains its focus on CS independently of that.

Well, I am an amateur programmer, but when I studied programming I had to write whole applications, sometimes from scratch and sometimes from templates, which were sometimes quite complete, sometimes skeletal, and which didn't always help very much. (Sometimes it's easier to work from scratch, but the templates forced us to use a certain design strategy.) In my written exam I had to write a number of classes from scratch, with nothing but paper and pen. Thus, I am not sure I entirely relate to the spoon-fed approach outlined in the article. Clearly, approaches to IT education differ somewhat.

Courses did teach a single language, which is why I took several courses, covering C++, C#, and Java. I pretty well follow the same steps that you do, but then I was writing my own applications from day one, long before commencing formal study, and so I had already discovered these things. What I wasn't taught was the use of some of the tools that are in vogue in industry at the moment, but then I'm an amateur, so I guess that doesn't matter to me so much.

Would call this Cowboy Coding, wouldn't ya?

I teach the software "Practicum" at Franklin University, which puts sophomores and juniors on a development team with a senior as the manager/project-manager/technical lead. Each team develops an application they can complete within the semester, and demos it at the end.

The class focuses on process: the team needs to write full requirements, a design document, and a user manual; the manager needs to create a schedule for the project with milestones, assign tickets each week for the work, and keep the software in a repository accessible through SVN or Git.

The process creates more overhead than you might accept in the real world for a small project, but it gives a view of how a software team really works. Initiative, out-of-the-box thinking, and communication take place in practice, not just in theory.

One of the things I stress in the real world is to "write it down" before you build it. Even for small projects I build myself, I find I save time and headaches by writing down what I'm trying to accomplish, how I'm going to do it, and what tools and environment I'll use. Building on Philip's comments, I would also write down "where" to look for code snippets, "how" they would go together, and a clear statement of "what" I'm trying to accomplish.
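As one concrete illustration (a hypothetical sketch, not from the comment above): those notes can live right at the top of the project's entry point as a Python module docstring, so they travel with the code. The project and file names here are made up.

    """Project notes -- written down before building.

    WHAT:  Merge the weekly CSV exports into one summary report.
    HOW:   Read each file with the csv module, aggregate by date,
           then write a single report.csv at the end.
    WHERE: Snippet sources -- the csv module docs and our team wiki
           page on report formats.
    TOOLS: Python 3, standard library only; runs from a cron job.
    """

    def main():
        # Implementation follows the notes above.
        pass

    if __name__ == "__main__":
        main()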

For myself, I enjoy writing code, and I find that when I don't write things down at the beginning, I tend to wander.