Huh, I've always understood that quote very differently, with emphasis on "premature" ... not as in, "don't optimize" but more as in "don't optimize before you've understood the problem" ... or, as a CS professor of mine said "Make it work first, THEN make it work fast" ...
And if you know in advance that a function will be in the critical path, and it needs to perform some operation on N items, and N will be large, it’s not premature to consider the speed of that loop.
Another thought: many (most?) of these "rules" predate widespread distributed computing. I don't think Knuth had in mind a loop that reads from a database at 100ms per call.
I've seen people write some really head-shaking code that makes remote calls in a loop even though the calls don't actually depend on each other. I wonder to what extent they are thinking "don't bother with optimization / speed for now".
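A rough sketch of the pattern I mean (the function and endpoint names here are made up, it's just the shape of it): the calls don't depend on each other, yet they're awaited one at a time.

    import asyncio

    # Hypothetical stand-in for any remote call that takes ~100ms
    # and does not depend on the other calls in the loop.
    async def fetch_profile(client, user_id):
        return await client.get(f"/profiles/{user_id}")

    async def slow_way(client, user_ids):
        # N sequential round trips: roughly N * 100ms in total.
        results = []
        for uid in user_ids:
            results.append(await fetch_profile(client, uid))
        return results

    async def faster_way(client, user_ids):
        # The calls are independent, so issue them concurrently;
        # total time is roughly one round trip, not N of them.
        return await asyncio.gather(
            *(fetch_profile(client, uid) for uid in user_ids))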
But second, I'd remove "optimization" from consideration here. The code you're describing isn't slow, it's bad code that also happens to be slow. Don't write bad code, ever, if you can knowingly avoid it.
It's OK to write good, clear, slow code when correctness and understandability are more important than optimizing that particular bit. It's not OK to write boneheaded code.
(Exception: After you've written the working program, it turns out that you have all the information to make the query once in one part of the broader program, but don't have all the information to make it a second time until flow reaches another, decoupled part of the program. It may be the lesser evil to do that than rearrange the entire thing to pass all the necessary state around, although you're making a deal with the devil and pinky swearing never to add a 3rd call, then a 4th, then a 5th, then...)
If you really have a loop that is reading from a database at 100ms each time, that's not because of not having optimized it prematurely, that's just stupid.
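To be concrete (table and helper names invented, this is just the shape of the classic N+1 pattern versus one batched query):

    # Hypothetical sketch; `db.query` stands in for any database client.

    def n_plus_one(db, order_ids):
        # One ~100ms round trip per id: roughly N * 100ms.
        return [db.query("SELECT * FROM orders WHERE id = %s", (oid,))
                for oid in order_ids]

    def batched(db, order_ids):
        # A single round trip for all of them: roughly 100ms total.
        # (ANY(...) is the Postgres spelling; use IN (...) elsewhere.)
        return db.query("SELECT * FROM orders WHERE id = ANY(%s)",
                        (list(order_ids),))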
Got it. What about initiating an 800 MB image on a CPU-limited virtual machine that THEN hits a database, before responding to a user request on a 300ms round trip?
I think we need a new word to describe the average experience, stupidity doesn't fit.
Reminds me of this quote which I recently found and like:
> look, I'm sorry, but the rule is simple:
> if you made something 2x faster, you might have done something smart
> if you made something 100x faster, you definitely just stopped doing something stupid
And, I guess, context does matter. If you need to make 10 calls to gather up some info to generate something, but you only need to do this once a day, or once an hour, and the whole process takes a few seconds, that's fine; I can see the argument that just doing the calls one at a time, linearly, is simpler to write/read/maintain.
I've worked on optimizing modern slow code. Once you fix the few obvious bottlenecks, it turns out further optimization is very hard, because the rest of the time is spread across the whole codebase with no single hotspot, and it's all written in a slow language with no thought for performance.
From my understanding you still need to care about the algorithms and architecture. If N is sufficiently large, you should pick the O(N) algorithm over the O(N^2) one. But usually there is a tradeoff: simple code (or hiding something behind an abstraction) might be easier to understand and maintain but slower on large inputs, and vice versa. I would rather write code that will be easy to optimize if a bottleneck shows up than optimize it overaggressively up front.

Also, different code needs different kinds of optimization. Sometimes the code is IO-heavy (disk / DB or network); for that kind of code, planning the IO operations and caching matter more than optimizing raw CPU time. Sometimes the input is too small to have any significant performance effect, and, paradoxically, choosing smarter algorithms might even hurt performance (alongside maintainability). For example, for 10 - 100 items a simple linear scan over an array might be faster than an O(log n) binary search tree.

It's also well known that faster executable code (whether hand-written or machine-generated, high-level or machine code) usually has a larger size, mostly because it's more "unrolled", duplicated, and more complex when advanced algorithms are used. If you optimize for speed everywhere, the binary size tends to increase, causing more cache misses, which might hurt performance more than help. This is why some profiling is often needed for large software, rather than simply passing -O3.
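As a toy illustration of that small-N point (numbers are machine-dependent, and bisect over a sorted list is only standing in for a balanced tree here), this is the kind of thing worth measuring rather than assuming:

    import bisect
    import timeit

    # For ~50 items, a plain linear scan can compete with (or beat)
    # the O(log n) lookup, despite the worse big-O. Measure, don't assume.
    data = sorted(range(50))
    needle = 37

    linear = timeit.timeit(lambda: needle in data, number=100_000)
    binary = timeit.timeit(
        lambda: data[bisect.bisect_left(data, needle)] == needle,
        number=100_000)

    print(f"linear scan: {linear:.3f}s  binary search: {binary:.3f}s")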
Charles Eames said that "design depends largely on constraints".
If you know in advance, then one of the constraints on the design of that function is that it's on the critical path and that it has to meet the specification.
You're stating the obvious that you need to consider its implementation.
But that's not the same as "I have a function that is not on the critical path and the performance constraint is not the most important, but I'll still spend time on optimizing it instead of making it clear and easy to understand."
"We will optimize it later, we don't have time for that right now, and it seems to work fast enough for our needs."
"Later" never comes and all critical performance issues are either ignored, hot-patched externally with caches of various quality or just with more expensive hardware.
Plenty of people seem to understand it as, "don't even think about performance until someone has made a strong enough business case that the performance is sufficiently bad as to impact profits".
Sometimes, especially when it comes to distributed systems, going from a working solution to a fast working solution requires a full-blown redesign from scratch.
Well, you see, in corporate settings (at least in big tech), this is usually used as a justification to merge inefficient code ("we will optimize it later"). That "later" never comes: either the developers/management move on or the work item never gets prioritized. That is, until the bad software causes outages or customer churn. Then it is fixed and shown as high impact in your next promo packet.
1. I have seen too many "make it work first" projects that ended up an absolute shitshow that was notoriously difficult to do anything with. You can build the software right the first time.
2. The "fast" part is what I think too many people focus on, and in my experience the "THEN" part always misses resource utilization and other kinds of inefficiency that aren't necessarily about speed. I have seen absolute messes of software that run really fast.
I guess it's exactly the opposite for me ... I always hated using "normal" language with the computer.
I often quip that I became a programmer specifically to avoid having to use spoken language. I always twitch at the thought of using any voice-based assistant.
Thinking in systems and algorithms is more enjoyable than using human language when it comes to computers IMHO ...
> I often quip that I became a programmer specifically to avoid having to use spoken language. I always twitch at the thought of using any voice-based assistant.
You're one of those people who think that programming languages are structured and formal whereas, in contrast, natural language must be unstructured and lacking form? Going by the Chomsky hierarchy of formalisms, natural language sits somewhere between context-free and context-sensitive: https://en.wikipedia.org/wiki/Mildly_context-sensitive_gramm...
> Thinking in systems and algorithms is more enjoyable than using human language when it comes to computers IMHO ...
You don't think in "systems and algorithms" -- those are the outputs of your thinking.
I enjoy having a computer that allows me to create all kinds of things that weren't possible in 1993 ... mash together all kinds of audio, video, text ... put it in a backpack, bring it somewhere, perform on stage, with an $800 laptop. Amazing.
I'm one of those "Encarta kids" who dug through Encarta for nights on end while the parents were out, and still spend slow Sundays reading random Wikipedia articles.
Having the archives that have been created since 1993, whether Wikipedia, Youtube (to me still one of the most amazing music discovery tools I've ever encountered), Archive.org, Google Scholar, Zenodo, at my fingertips has probably widened my personal horizon beyond imagination. Not sure who I'd be without it.
So even sadder to see it all drown now in AI slop ...
I feel like you missed out on the best part of Napster - finding someone's stash of music you like, surrounded by things you've never heard of, and then exploring it. My memory swears you could leave someone a message, but that's a lifetime ago; I do know I connected with a few people who helped me absolutely get into metal music, and that's changed my life for the good forever.
Other than that, you'd go to a LAN party and find someone's file share of goodies, find again the things you were into, and now you had a new friend who probably liked things you never knew of, and now you two were sharing new things with each other on top of that.
It was really an age of connecting people and exploring the world for me, even as a young kid.
Oh I do remember Napster, but that was way after 1993 as well ;)
Either way, that still took ages to download, etc, so, it was less immediate. And somehow, I remember it more as a source for stuff that's already well-known ...
I'm always amazed how Youtube can be so many different things for different people ... It's true that it used to be better a few years back, but people still upload great content even if it's harder to find nowadays.
Also, music ... back in the '90s, if you were drawn to the obscure side of music, you'd read about it and could, at best, imagine what it was like, because your local record store didn't have it, the bigger store the next town over didn't have it, and IF anyone could order it, it was with a non-refundable down payment.
Nowadays, you can probably find it on YT, and that's great IMHO. My musical horizon would be so much more limited without it.
I recall sitting around all afternoon to tape Layla off the radio, during a repeat countdown, after hearing it the day before for the first time. The DJ cut in during the fade out with "Indeed..." and forty years later I still can't listen to that song without hearing him at the end.
My musical discoveries exploded with the internet, I can't imagine what I would have missed without it.
> I still can't listen to that song without hearing him at the end.
Something similar with me and "Another Day In Paradise". The first time I heard it was from a cassette my friend recorded from Dubai radio, accidentally prefixed with an intro by the radio host... And that intro still comes to mind whenever I hear the song.
>My musical discoveries exploded with the internet
We didn't get MTV until the late '80s in Australia and it only ran for a few hours late at night and didn't move to a dedicated cable channel until cable really took off here in the mid '90s.
There's nothing comparable to something like progarchives.com and similar in my experience of the '80s and early '90s. You had to combine muso friends, music store recommendations and random selections, magazines, artist and genre scheduling on Rage (better than MTV here) and you still barely scratched the surface.
I was recommending the playing of Neal Schon to a guitar-playing friend recently and we both observed that neither of us had even heard of Journey until well after their popularity had faded. That you could miss a massive US stadium rock act like that seems preposterous in this day and age.
I don't really have nostalgia for that, I prefer the immediacy honestly.
Nowadays people are captured by music differently, as they were captured by music differently before music could be mechanically or digitally reproduced.
For me in the 90s it was the satellite dish and VHS that opened up the world in terms of content, music channels, movies, etc, channels like Cartoon Network, MTV, Viva & Viva Zwei, and so on. And then the internet for me came in '97 or '98.
I've carried my laptop around in so many different bags over the years ... sling bags, tote bags, waterproof messenger backpacks, IKEA backpacks with laptop sleeve compartment, drawstring bags. I usually pick the bag depending on the occasion ...
bike ride in the rain? -> waterproof messenger backpack
downtown stroll to satisfy my inner hipster? -> tote bag
etc ...
All I know is that I'm a "single compartment" person ... I've always found that having a separate compartment for everything just adds excess weight and costs flexibility.
I'd give the movie prop a try for sure. Still looking for a decent source of Tyvek to take a shot at making my own bag (it's not super commonly used where I live).
I wouldn't agree with this all-or-nothing view that ignores public transport. Yes, plenty of people want to live in the city, so it's dense, but if you live a bit outside, you can hop on a local train and be in the city in 30 minutes.
Also, Paris is an extreme example. There's plenty of mid-sized cities (400k to 1 million or so) in Europe and presumably elsewhere where you can live in a quiet space, maybe even have access to a garden, and hop on the tram or your bike, and be downtown in 20 minutes, without parking lots.
So, you can definitely have both. These places exist.
That's right, in a small city you can do it. Think Brno. But those small cities don't get to be truly dystopian in car-centric societies, either.
And no one is going to build public transport in them in the US now - people all flock to megacities, these small places are all bleeding population, the population there is getting old, and tax sources are scarce.
Heh, I'd just call it practicing ... I mean, it's pretty much a very common way to practice for every serious musician I know (apart from practicing in a group setting, of course).
Woodshedding is definitely "just" practice, but at least in jazz it does have a certain connotation. Like, a single musician probably has a few different practice routines depending on their goals at the time, and woodshedding implies deep focus on a passage or a specific application of a technique, rather than, say, playing through whole songs or running scales or whatever they might also do another time.
It might also have to do with how programming was taught to a certain generation? At least from my own experience, when I was at university OOP and UML were all the rage, so we had to specify things through diagrams in fancy diagram editors, then generate the code from that, and then still re-write everything by hand because it never really worked out quite that well.
Well, the code generation was a mistake, but drawing diagrams is explicitly what the guy in this story did:
> He started by sitting at his desk and drawing a lot of diagrams. I was the project coordinator, so I used to drop in on him and ask how things were going. "Still designing," he'd say. He wanted the diagrams to look beautiful and symmetrical as well as capturing all the state information.