Improving university for developers
For a long time, I've maintained that I took two useful subjects during my four-year university education: a data structures course, and the subject where I met a future employer.
What could be done to improve this situation? How could university study be made more relevant?
Formal study is a bit like cross-training for a sport. It'll be beneficial, no doubt, but the best training will be to play the sport itself. The best way to become a good coder is going to be to get lots of practice coding; formal education should simply guide you and give you the benefit of experience. Hence, theory, best practices, and lots of homework.
University is a very time-consuming way to become qualified to work as a developer. I spent most of my time learning things that I'd never use again - analog electronics, physics and higher-level maths. My degree was basically electrical engineering with some computer science thrown in.
The time could have been better spent on how to be an effective developer - the course was called software engineering, but it contained depressingly little actual engineering. Bugtracking, source control and builds: these are the three things that effective teams take for granted and that ineffective teams have never heard of. I've interviewed so many people who have never dealt with any of them, and I think that should change.
A secondary problem is that it's just not hard enough to get a degree in computer science or software engineering nowadays. I did a lot of tutoring and exam marking, and given the option, I would not have passed most of the candidates. In one particular exam, the average mark was 30%. The marks were all scaled up because the department was not allowed to fail so many students. No wonder there are so many low-quality developers circulating.
When looking at the final marks for a group of students, you can easily pick the good ones, the ones you'd hire into your company. They're consistently getting high grades, writing thoughtful answers to questions, or taking a programming assignment just that little bit further to see what will happen. They don't give run-of-the-mill textbook responses or do the bare minimum. And yet, there's no way to reward that excellence - their raw mark of 95 will get scaled to 98, while someone who got a raw 25 will get scaled to 50 and pass. They'll end up with the same degree at the end.
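To put numbers on the scaling, here's the one linear scale-up consistent with the two marks quoted above. This is my own reconstruction, not the department's actual formula:

```python
def scale(raw: float) -> float:
    """Linear scale-up fitted to the two data points in the text:
    a raw 25 becomes 50 (a pass), a raw 95 becomes 98."""
    return 50 + (raw - 25) * (98 - 50) / (95 - 25)

# A 70-point gap in raw marks shrinks to a 48-point gap after scaling,
# and nearly everyone clears the pass mark.
for raw in (25, 60, 95):
    print(raw, "->", round(scale(raw), 1))
```

Under this map, a mediocre raw 60 comes out at 74 - comfortably above a pass - which is exactly how the excellence at the top gets compressed away.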
Imagine a degree that prepares you for the reality of professional software development. What would it entail?
How could I be allowed to graduate without being forcibly exposed to source control systems?
I did use one for one subject, but that was entirely voluntary. We tried to get it hosted on the university servers, but the administrative hassle was so great that we just ran it off one team member's home machine.
Nowadays, source control is both important and complicated enough that students need to be taught about it.
It's important because if a team isn't using it, it probably should be. If you were straight out of uni, trying to find your way in a crazy working world, and there was no source control system in place, would you do anything about it? Would you know that it's not right? Did you have a bout of anxiety when performing your first checkin on commercial code? What if it's not right? What if it breaks the build? What if I damage someone else's file and they get mad? What if?
How does branching work? What's the difference between deleting and obliterating a file? What source control system should I recommend for a particular situation, and why? How do I detect if someone is tampering with a codebase?
These are all crucial for any professional software developer, and very easy to address at an educational level.
When I left university, I went to work for the Honeywell Software Centre. They had the most wonderful bugtracking system I have ever worked with. You'd come in in the morning and check your bug summary; the bugs would be prioritised for you. You'd complete some work and close the bugs, your manager would be notified, and he'd assign you more. He was doing the same thing with his own manager above him. Thus, Steve Yegge's "bugtracker as workqueue" vision is realised. The software was great, the procedures were great, and they were universally accepted by all staff. Hence, a successful system.
Since that was my first and only exposure to bugtracking, I assumed that all development shops were like that. Since then, I've been exposed to a raft of poorly implemented bugtracking systems. The biggest problem is simply that people don't use them, even if ordered to. Close behind is that people don't know how to file useful bugs.
Students should have some exposure to bugtrackers - particularly since many will begin in a testing role. One week of explanation, a couple of small assignments where they file bugs. Easy.
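The "bugtracker as workqueue" idea is simple enough to sketch. This is a toy model - not any real tracker's API - but it captures the workflow described above: bugs are filed with priorities, and each developer always picks up the most urgent open one:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Bug:
    priority: int                       # 1 = most urgent
    bug_id: int = field(compare=False)
    summary: str = field(compare=False)

class WorkQueue:
    """Toy 'bugtracker as workqueue': a priority queue of open bugs."""
    def __init__(self):
        self._open = []

    def file(self, bug: Bug):
        heapq.heappush(self._open, bug)

    def next_bug(self) -> Bug:
        # Each morning, take the highest-priority open bug.
        return heapq.heappop(self._open)

q = WorkQueue()
q.file(Bug(2, 101, "crash on empty input"))
q.file(Bug(1, 102, "data loss on save"))
q.file(Bug(3, 103, "typo in about box"))

first = q.next_bug()
print(first.bug_id, first.summary)  # the data-loss bug comes out first
```

A real tracker layers assignment, notification and history on top, but the prioritised queue is the heart of it - and it's small enough to teach in a week.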
I love nightly builds. I sleep better at night knowing that at a moment's notice, I can release a fairly up-to-date version of my code.
In order for automated builds to work, you need an effective source control system. You also need to be able to build in one step. Thus, the ability to produce automated builds speaks positively of the rest of your process.
You can keep your team synchronised easily and lose less time when someone checks in a bad change. You can add in automated testing and move your focus from firefighting to incremental, stable change. You can increase the build frequency and move towards continuous integration zen. Nightly builds aren't the goal, but they are a crucial milestone. Once you're at that point, Everything Will Be OK.
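A one-step build driver can be sketched in a few lines. The step commands below are placeholders that only print - substitute real update, build and test invocations - but the structure is the point: run the steps in order, and any failing step fails the whole build:

```python
import subprocess
import sys

# Each step is a (name, argv) pair. These argvs are stand-ins that only
# print; replace them with your real update/build/test commands.
STEPS = [
    ("update sources", [sys.executable, "-c", "print('pretend: update')"]),
    ("build",          [sys.executable, "-c", "print('pretend: build')"]),
    ("run tests",      [sys.executable, "-c", "print('pretend: test')"]),
]

def nightly(steps) -> bool:
    """Run each step in order; the first failure fails the whole build,
    which is exactly the signal a nightly build exists to give."""
    for name, argv in steps:
        if subprocess.run(argv).returncode != 0:
            print(f"BUILD FAILED at step: {name}", file=sys.stderr)
            return False
    print("BUILD OK")
    return True

ok = nightly(STEPS)  # from cron, you'd exit nonzero on failure
```

Once a script like this runs unattended every night, adding automated tests or increasing the frequency towards continuous integration is an incremental change, not a rewrite.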
So why don't students know what they're working towards? In a degree titled Software Engineering, why are we still being taught to use the waterfall methodology? Why aren't we learning the major foundations of 'software engineering'? Why aren't we shown a better way to do things, so that students know what things could be like and try to effect some change?
Somehow, we're producing university graduates who cannot explain an opinion, justify a decision, or speak in front of a group.
It boggles my mind that any sort of professional can operate without being able to communicate effectively, whether in documents, by email, or in person.
On a daily basis, I'm asked to explain complicated concepts to nontechnical people, give opinions and justify decisions that I've made. This isn't just something that's nice to have - it's an essential part of the job. Ever seen management's eyes glaze over while a coder explains the depths of their code and their decisions? It's not just about avoiding boredom - it's an efficiency issue.
Time estimation and management
What task are developers asked to do several times each week that reliably makes them cringe? Giving estimates! It's a game you can't win. If you give a long estimate, management will ask you to reduce it. If you give a short estimate, you'll look bad when you miss it.
My experience has been that estimates are reasonably accurate for the case where things go smoothly. Every so often - say, once a week - an issue crops up that sucks up huge amounts of time. There's a bug in your compiler, or something just refuses to work, or performance is unacceptable. These are the things that blow out deadlines badly, not development progressing 20% slower than expected.
I remember doing some trivial time estimation exercises at university, but the essential step that was missing was actually using the estimates. It's all well and good to write a fake project plan that says Task B will take two weeks, but that does nobody any good. Give students some large projects. Have them estimate in a way that might actually work - by speccing out the work down to the hour. Then try to meet the estimates. If you wanted to be harsh you could use the difference between estimated time and actual time to determine the final mark, but that would probably encourage cheating.
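The made-up numbers below show the shape of the problem described above: three tasks that land near their estimates, and one blowout that by itself more than doubles the schedule:

```python
# (task, estimated hours, actual hours) -- invented numbers, but they show
# why deadlines blow out: it isn't uniform 20% slippage, it's one time sink.
tasks = [
    ("login form",    4, 5),
    ("report export", 6, 7),
    ("db migration",  8, 9),
    ("compiler bug",  2, 30),   # the once-a-week time sink
]

estimated = sum(e for _, e, _ in tasks)
actual = sum(a for _, _, a in tasks)
print(f"estimated {estimated}h, actual {actual}h")

# Remove the single blowout and the same plan looks perfectly respectable:
normal = [(n, e, a) for n, e, a in tasks if n != "compiler bug"]
est_n = sum(e for _, e, _ in normal)
act_n = sum(a for _, _, a in normal)
print(f"without the blowout: {est_n}h estimated, {act_n}h actual")
```

Comparing columns like these against their own estimates is exactly the exercise students never get: writing the plan is easy, living with it is the lesson.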
I'm still astounded that so many people can't manage their own time effectively. I think this might be part of the success of Agile methodologies - people have no system for tracking and prioritising their own work queue, so Agile gives them one. Suddenly, they're getting the important stuff done and there are fewer complaints about jobs not being done. Who'd have thought it?
At some point, everyone's going to need to spec out a project and then assign tasks. Most graduates won't need to do that immediately - but it's good to have some experience and some thinking before the first project goes belly-up. Form groups of four students. Give them assignments that are way too large for any individual to complete in time. Ideally, they should be too large for even the team to complete in time. One person is the leader and has to assign work to the others. Every three weeks, the leader changes and they all start a new assignment.
I remember complaining about deadlines a lot at university. We had an overwhelmingly large amount of work to do. Things only got worse - instead of a large amount of work, there's now an infinite amount of work to do, and some of it needs to be done yesterday. So why not teach students how to deal with crazy deadlines?
I propose a largish assignment, announced in the lecture and by email, where the deadline is *today*. Not at the lecture? Not being at meetings doesn't make the deadline go away in reality! For every day that you take to complete the assignment, you lose 20%. You need correctness checks on the assignment submissions - just like reality - or people will submit crap. And you need to make it worth a decent chunk of the final grade, or people won't take it seriously.
Most programmers - especially new recruits - will be working on someone else's code. It'll be big, and it'll be poorly written, and there'll be a significant amount of time just spent comprehending it. Have an assignment where students have to make modifications to an existing body of code. Add a new syntactic construct to gcc, for example.
As an extension, make the existing code garbage and difficult to understand - preferably in a language that the student hasn't seen before. Or combine the two - student A writes a program in an unfamiliar language. Student B, chosen at random, modifies the program to perform something new. As a bonus task, critique the program and suggest ways in which it could be improved.
I'd love to see two lecturers give students conflicting information on the same assignment. This won't go down well in a university environment - predictable marking schemes are considered sacrosanct - but it's extremely commonplace in the real world. Different people give you different instructions, priorities and deadlines all of the time. You need to resolve these differences and make sure everyone (or everyone important) is happy.
Change the marking scheme of an assignment after it's submitted. Customers change their minds all of the time.
Make students encounter a bug in something they take as perfect - the compiler or the operating system. Such bugs rarely appear, but you need to be confident in your diagnosis when one does.
Give an assignment with conflicting requirements - students either need to negotiate their way out of it or come up with a compromise.
Computer Science theory
There's no escaping it. Just keep it relevant. People can - and will need to - learn domain-specific theory as they encounter it. University can't teach for every situation that a developer will encounter.
Just code, code, code. Experience is the best teacher. Expose the students to as many different problems as possible. Once you've solved a class of problem, you're much better equipped next time it comes up.