Improving university for developers

ian – Sat, 2006-10-14 05:43

For a long time, I've maintained that I took two useful subjects during my four-year university education: a data structures course, and the subject where I met a future employer.

What could be done to improve this situation? How could university study be made more relevant?

Formal study is a bit like cross-training for a sport. It'll be beneficial, no doubt, but the best training will be to play the sport itself. The best way to become a good coder is going to be to get lots of practice coding; formal education should simply guide you and give you the benefit of experience. Hence, theory, best practices, and lots of homework.

University is a very time-consuming way to become qualified to work as a developer. I spent most of my time learning things that I'd never use again - analog electronics, physics and higher-level maths. My degree was basically electrical engineering with some computer science thrown in.

The time could have been better spent learning how to be an effective developer. The course was called software engineering, but it had depressingly little of the engineering side: bugtracking, source control and builds. These are the three things that effective teams take for granted and that ineffective teams have never heard of. I've interviewed so many people who have never dealt with them, and I think that should change.

A secondary problem is that it's just not hard enough to get a degree in computer science or software engineering nowadays. I did a lot of tutoring and exam marking, and given the option, I would not pass most of the candidates. In one particular exam, the average final exam mark was 30%. They were all scaled up because the department was not allowed to fail so many students. No wonder there are so many low-quality developers circulating.

When looking at the final marks for a group of students, you can easily see the good ones, the ones you'd hire into your company. They're consistently getting high grades, or writing thoughtful answers to questions, or taking a programming assignment just that little bit further to see what will happen. They don't give run-of-the-mill textbook responses or do the bare minimum assignment. And yet, there's no way to reward that excellence - their 95 raw mark will get scaled to 98, while someone who got a 25 raw mark will get scaled to 50 and pass. Both will end up with the same degree at the end.

Imagine a degree that prepares you for the reality of professional software development. What would it entail?

Source control

How could I be allowed to graduate without being forcibly exposed to source control systems?

I did use one for one subject, but that was entirely voluntary. We tried to get it hosted on the university servers, but the administrative hassle was so great that we just ran it off one team member's home machine.

Nowadays, source control is both important and complicated enough that students need to be taught about it.

It's important because if a team isn't using it, it probably should be. If you were straight out of uni, trying to find your way in a crazy working world, and there was no source control system in place, would you do anything about it? Would you know that it's not right? Did you have a bout of anxiety when performing your first checkin on commercial code? What if it's not right? What if it breaks the build? What if I damage someone else's file and they get mad? What if?

How does branching work? What's the difference between deleting and obliterating a file? What source control system should I recommend for a particular situation, and why? How do I detect if someone is tampering with a codebase?

These are all crucial for any professional software developer, and very easy to address at an educational level.
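To make the branching question concrete, here's a toy model of what's going on underneath. This is my own illustration, not how any real system is implemented: a branch is little more than a named pointer into a diverging line of history.

    # Toy model of source control history, to illustrate branching.
    # Real systems (CVS, Subversion, etc.) store far more metadata
    # and handle merging properly; this is illustrative only.

    class Commit:
        def __init__(self, message, parent=None):
            self.message = message
            self.parent = parent  # the previous commit on this line of history

    class Branch:
        """A branch is just a named pointer to the latest commit."""
        def __init__(self, name, head=None):
            self.name = name
            self.head = head

        def commit(self, message):
            self.head = Commit(message, parent=self.head)

    # Trunk accumulates history...
    trunk = Branch("trunk")
    trunk.commit("initial import")
    trunk.commit("add login page")

    # ...and a release branch diverges from trunk's current head.
    # Both lines now share an ancestor but evolve independently.
    release = Branch("release-1.0", head=trunk.head)
    release.commit("fix critical login bug")
    trunk.commit("start work on reporting")

    print(release.head.message, "<-", release.head.parent.message)
    # fix critical login bug <- add login page

Once that model clicks, tool-specific questions - like the difference between deleting a file (history kept) and obliterating it (history erased) - become much easier to reason about.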

Bugtracking

When I left university, I went to work for the Honeywell Software Centre. They had the most wonderful bugtracking system that I have ever worked with. You'd come in in the morning and check your bug summary; your bugs would already be prioritised for you. You'd complete some work and close the bugs, your manager would be notified, and he'd assign you more bugs. He was doing the same thing with his manager above him. Thus, Steve Yegge's "bugtracker as workqueue" vision is realised. The software was great, the procedures were great, and they were universally accepted by all staff. Hence, a successful system.
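The heart of that workflow is simple enough to sketch. The following is not Honeywell's software - just a toy model of the "bugtracker as workqueue" idea, with invented names and bug numbers throughout:

    import heapq

    class BugQueue:
        """Each developer pulls the highest-priority open bug
        assigned to them. Purely illustrative."""

        def __init__(self):
            self._queues = {}  # developer -> heap of (priority, bug id, title)

        def assign(self, developer, priority, bug_id, title):
            heapq.heappush(self._queues.setdefault(developer, []),
                           (priority, bug_id, title))

        def next_bug(self, developer):
            queue = self._queues.get(developer)
            return heapq.heappop(queue) if queue else None

    tracker = BugQueue()
    tracker.assign("ian", priority=1, bug_id=1042, title="crash on startup")
    tracker.assign("ian", priority=3, bug_id=1043, title="typo in About box")
    print(tracker.next_bug("ian"))  # (1, 1042, 'crash on startup') - most urgent first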

Since that was my first and only exposure to bugtracking, I assumed that all development shops were like that. Since then, I've been exposed to a raft of poorly implemented bugtracking systems. The biggest problem is simply that people don't use them, even if ordered to. Close behind is that people don't know how to file useful bugs.

Students should have some exposure to bugtrackers - particularly since many will begin in a testing role. One week of explanation, a couple of small assignments where they file bugs. Easy.

Builds

I love nightly builds. I sleep better at night knowing that at a moment's notice, I can release a fairly up-to-date version of my code.

In order for automated builds to work, you need an effective source control system. You also need to be able to build in one step. Thus, the ability to produce automated builds speaks positively of the rest of your process.

You can keep your team synchronised easily and lose less time when someone checks in a bad change. You can add in automated testing and move your focus from firefighting to incremental, stable change. You can increase the build frequency and move towards continuous integration zen. Nightly builds aren't the goal, but they are a crucial milestone. Once you're at that point, Everything Will Be OK.
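To show how little machinery is involved, here's a sketch of a nightly build driver. The commands are placeholders - I'm assuming Subversion and make here, so substitute whatever your source control and build tools actually are:

    import datetime
    import subprocess
    import sys

    # Placeholder commands - swap in your real checkout/build/test steps.
    STEPS = [
        ["svn", "update"],         # sync to the latest source
        ["make", "clean", "all"],  # the all-important one-step build
        ["make", "test"],          # automated tests, if you have them
    ]

    def nightly_build():
        for step in STEPS:
            if subprocess.run(step).returncode != 0:
                print("BUILD BROKEN at:", " ".join(step), file=sys.stderr)
                return False
        print(f"Build OK at {datetime.datetime.now():%Y-%m-%d %H:%M}")
        return True

    if __name__ == "__main__":
        sys.exit(0 if nightly_build() else 1)

Schedule that to run every night and you've hit the milestone; everything after that is refinement.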

So why don't students know what they're working towards? In a degree titled Software Engineering, why are we still being taught to use the waterfall methodology? Why aren't we learning the major foundations of 'software engineering'? Why aren't we shown a better way to do things, so that students know what things could be like and try to effect some change?

Communication Skills

Somehow, we're getting university graduates who cannot explain an opinion or a decision or speak in front of a group.

It boggles my mind to think that any sort of professional can operate without being able to communicate effectively - in documents, in email or in person.

On a daily basis, I'm asked to explain complicated concepts to nontechnical people, give opinions and justify decisions that I've made. This isn't just something that's nice to have - it's an essential part of the job. Ever seen management's eyes glaze over while a coder explains the depths of their code and their decisions? It's not just about avoiding boredom - it's an efficiency issue.

Time estimation and management

What task are developers asked to do several times each week that reliably makes them cringe? Giving estimates! It's a game you can't win. If you give a long estimate, management will ask you to reduce it. If you give a short estimate, you'll look bad when you miss it.

My experience has been that estimates are reasonably accurate for the case where things go smoothly. Every so often - say, once a week - an issue crops up that sucks up huge amounts of time. There's a bug in your compiler, or something just refuses to work, or performance is unacceptable. These are the things that blow out deadlines badly, not development progressing 20% slower than expected.

I remember doing some trivial time estimation exercises at university, but the essential step that was missing was actually using the estimates. It's all well and good to write a fake project plan that says Task B will take two weeks, but that does nobody any good. Give students some large projects. Have them estimate in a way that might actually work - by speccing out the work down to the hour. Then try to meet the estimates. If you wanted to be harsh you could use the difference between estimated time and actual time to determine the final mark, but that would probably encourage cheating.
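A minimal sketch of the kind of estimate-versus-actual log I have in mind - the tasks and numbers are invented for illustration:

    # Toy estimate-vs-actual log for a student project.
    tasks = [
        {"task": "parser",   "estimated_hours": 8,  "actual_hours": 10},
        {"task": "database", "estimated_hours": 6,  "actual_hours": 6},
        {"task": "UI",       "estimated_hours": 12, "actual_hours": 25},  # the weekly blowout
    ]

    for t in tasks:
        ratio = t["actual_hours"] / t["estimated_hours"]
        print(f"{t['task']:10s} estimated {t['estimated_hours']:3d}h, "
              f"actual {t['actual_hours']:3d}h ({ratio:.1f}x)")

    total_est = sum(t["estimated_hours"] for t in tasks)
    total_act = sum(t["actual_hours"] for t in tasks)
    print(f"overall: {total_act / total_est:.2f}x the original estimate")

Even that crude a record teaches the real lesson: the total is dominated by the occasional blowout, not by uniform slippage.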

I'm still astounded that so many people can't manage their own time effectively. I think this might be part of the success of Agile methodologies - people have no system for tracking and prioritising their own work queue, so Agile gives them one. Suddenly, they're getting important stuff done and there are fewer complaints about jobs not being done. Who'd have thought it?

At some point, everyone's going to need to spec out a project and then assign tasks. Most graduates won't need to do that immediately - but it's good to have some experience and some thinking before the first project goes belly-up. Form groups of four students. Give them assignments that are way too large for any individual to complete in time. Ideally, they should be too large for even the team to complete in time. One person is the leader and has to assign work to the others. Every three weeks, the leader changes and they all start a new assignment.

Reality

I remember complaining about deadlines a lot at university. We had an overwhelmingly large amount of work to do. Things only got worse - instead of a large amount of work, there's now an infinite amount of work to do, and some of it needs to be done yesterday. So why not teach students how to deal with crazy deadlines?

I propose a largish assignment, announced in the lecture and by email, where the deadline is *today*. Not at the lecture? Not being at meetings doesn't make the deadline go away in reality! For every day that you take to complete the assignment, you lose 20%. You need correctness checks on the assignment submissions - just like reality - or people will submit crap. And you need to make it worth a decent chunk of the final grade, or people won't take it seriously.

Most programmers - especially new recruits - will be working on someone else's code. It'll be big, and it'll be poorly written, and there'll be a significant amount of time just spent comprehending it. Have an assignment where students have to make modifications to an existing body of code. Add a new syntactic construct to gcc, for example.

As an extension, make the existing code garbage and difficult to understand - preferably in a language that the student hasn't seen before. Or combine the two - student A writes a program in an unfamiliar language. Student B, chosen at random, modifies the program to perform something new. As a bonus task, critique the program and suggest ways in which it could be improved.

I'd love to see two lecturers give students conflicting information on the same assignment. This won't go down well in a university environment - predictable marking schemes are considered sacrosanct - but it's extremely commonplace in the real world. Different people give you different instructions, priorities and deadlines all of the time. You need to resolve these differences and make sure everyone (or everyone important) is happy.

Change the marking scheme of an assignment after it's submitted. Customers change their minds all of the time.

Make students encounter a bug in something that they take as perfect - the compiler or the operating system. Such bugs rarely appear, but you need to be confident in your diagnosis when one does.

Give an assignment with conflicting requirements - students either need to negotiate their way out of it or come up with a compromise.

Computer Science theory

There's no escaping it. Just keep it relevant. People can - and will need to - learn domain-specific theory as they encounter it. University can't teach for every situation that a developer will encounter.

Practice

Students need to be exposed to so many languages that they all look the same. Throw a new language at them every month - C, Java, Python, JavaScript, PIC assembler, MIPS assembler, VHDL, Lisp, and PHP would be a good mix. After that, they'll be able to pick up anything new without thinking about it.

Just code, code, code. Experience is the best teacher. Expose the students to as many different problems as possible. Once you've solved a class of problem, you're much better equipped next time it comes up.


Don't neglect the basics

All good stuff, speaking as a non-software engineer. But do you software engineering types want to be programmers, or engineers?

If all you want to do is program in a nice environment that some capable person has set up for you, that's fine - you don't need to know anything outside your world. And surely the most efficient way to learn that is an on-the-job apprenticeship and a TAFE course in the currently fashionable programming language.

If you want to have a deeper understanding of what's going on, and be able to adapt to new challenges, a knowledge of the basics will serve you well. Even analogue electronics, which is used to build the digital electronics that computers are built from. It all comes back to basics eventually, and learning about these arcane things can only make your mind more flexible and give you a broader perspective.

So, learn the specifics of the trade by all means, but don't neglect the underlying theory. Things like version control and build regimes aren't really complicated and you can easily pick them up on the job, although I completely agree that students should be exposed to the concepts as part of other work.

Sean (not verified) – Thu, 2006-10-19 02:54

Functional

Functional Programming
Object Oriented Programming

Yeah, I attended subjects for both, but I never managed to grasp either that well while I was at uni. Both are a fairly large departure from plain sequential, procedural development - they need you to view problems from a different angle. A subject that wrapped TDD and OO together would probably work much better than just throwing Java at students and telling them to code in an OO fashion.

Also, unit testing would have been a good subject in itself. I only heard of it in passing while at uni, but it's one of the things I do every day. Hmm, Refactoring and Patterns would also fall into that camp, but they're less useful until you actually have experience with the problems they solve.
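For what it's worth, the test-first loop is small enough to show in one file. A made-up example using Python's standard unittest module:

    import unittest

    # Tests written first: they describe the behaviour we want
    # from a class that doesn't exist yet.
    class TestAccount(unittest.TestCase):
        def test_deposit_increases_balance(self):
            account = Account()
            account.deposit(50)
            self.assertEqual(account.balance, 50)

        def test_cannot_withdraw_more_than_balance(self):
            account = Account()
            with self.assertRaises(ValueError):
                account.withdraw(10)

    # Then just enough code to make the tests pass.
    class Account:
        def __init__(self):
            self.balance = 0

        def deposit(self, amount):
            self.balance += amount

        def withdraw(self, amount):
            if amount > self.balance:
                raise ValueError("insufficient funds")
            self.balance -= amount

    if __name__ == "__main__":
        unittest.main()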

Managing production systems - upgrades, patching, change management - is also one of those tasks that isn't really talked about much but ends up becoming very important.

In many ways I think that software engineering would be better taught with apprenticeships. Have students work part-time and go to uni part-time for four years. I mean, most of the things you're talking about - source control, task management, continuous integration etc. - don't come into play until you work on medium to large systems. I know as a student I never worked on anything close to that size.

Tristian (not verified) – Mon, 2006-10-16 12:32
