Engineering a Better Course: Lessons from the Astronauts Who Steered Us Home

Why the human spark to improve is worth protecting
Normally, I give myself a week to prepare these posts — time to sit, let ideas simmer, and see what bubbles up once the froth of the first draft has settled. But today the news of Jim Lovell’s passing hit like a sudden course correction, and my original plan for the week was promptly jettisoned somewhere between the launch pad and the second paragraph.
There are certain stories you become aware of that feel almost stitched into the lining of your mind. For me, a visit to the Kennedy Space Center in Florida sparked an interest in the NASA era of the ’60s, ’70s and beyond. Those scratchy radio calls from space, the crew-cut engineers hunched over consoles, the spacecraft that looked more shed than shuttle. And, at the centre of my favourite story of all, a calm Midwesterner named Jim Lovell.
Apollo 13 is the kind of story that, if you pitched it now, would be sent back for “believability issues.” An oxygen tank explodes 200,000 miles from home, aboard a spacecraft whose computers had less processing power than a digital watch. Plans are ripped up, not by choice but because physics demands it. Yet somehow, with ingenuity, teamwork, and a very handy roll of duct tape, the crew make it back to Earth in one piece.
For all its drama, Apollo 13 is not a story about heroics in the Hollywood sense. It’s about what happens when people know their craft, respect each other’s skills, and are trusted to solve problems. It’s about using what you’ve got — however meagre — to close the gap between “this is broken” and “this works well enough to save our lives.”
Which brings me, somewhat reluctantly, to today. I have my own engineering problem to wrestle with. Nothing so dramatic as life support systems failing in space — I’m not currently orbiting anything more dangerous than my own coffee mug — but the challenge comes with its own ethical knots. The kind where the technical fix is possible, possibly even straightforward, but the “should we?” hangs in the air like an unanswered radio call.
It’s a reminder that engineering isn’t just about doing things. It’s also about deciding which things ought to be done, and which ought to remain on the drawing board, however clever they are. In the rush to solve problems, we can forget that why we solve them matters just as much as how. And sometimes the right answer isn’t a new design at all, but a decision to stop, think, and make sure the solution serves more than the spreadsheet.
This is where the Apollo lesson still matters. The team at Mission Control weren’t just trying to get three men home; they were doing so in a way that didn’t compromise their principles. There was no question of cutting corners that might save time but cost lives later. Every decision passed through a quiet, unspoken filter: is this the right thing to do, not just the fastest?
Today’s engineering challenges are different — climate change, renewable energy, AI — but the ethical compass is still the same. We need to keep trusting people to think creatively under pressure, but we also need to give them the moral framework to know when “yes we can” must be followed by “but should we?”
And perhaps, if we zoom out from our own industries and apply this thinking to society as a whole, we might chart a better course collectively. Imagine policy built with Apollo-style focus: evidence-led, resourceful, with a refusal to compromise long-term safety for short-term gain. Imagine technology rolled out with as much care for unintended consequences as for market share. Imagine infrastructure decisions made with the same patient problem-solving that turned a damaged spacecraft into a lifeboat. The problems facing our planet are vast, but so is our capacity to solve them — provided we steer with both skill and conscience.
So I find myself, like Lovell, staring at the controls and weighing the options. I have the tools, I have the know-how, and yet I pause — not because I don’t know the answer, but because I want to be certain the answer is one I can defend, years from now, when the dust has settled and someone asks: “Was that the right course?”
Jim Lovell taught us many things, but perhaps the most important is that steering home is never just about navigation. It’s about judgement. It’s about knowing that the course you set is not just the quickest, but the one you can live with once you land.
Here’s to engineering better courses — technically sound, ethically sure, and with enough room for duct tape, teamwork, and the occasional moral pit stop along the way.