> In extreme cases a no-op build with make can easily get to 15+ seconds.

I have never seen cases that extreme, but my opinion on the matter is that this is a "build smell". If the Makefile has to resolve a DAG that large, developers have to reason about compile- and link-time interactions on the same scale. 100k source files all linked into a single executable is more complex than 10k source files split across 10 executables, with a handful (say, <100) of headers representing "public" APIs. If you have 100k source files and your developers haven't all killed themselves already, there must already be informal firewalls separating the various modules. Formalize them at an API level and split apart the builds, so that it's _impossible_ for anything outside of the API itself to trigger a full rebuild.
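A rough sketch of that kind of firewall in make (the module names, paths, and public/ convention here are all hypothetical): consumers' dependency lines reach only the published headers, never a module's internals.

    # Each module exposes a small public/ directory; everything else
    # in the module is private to its own build.
    CPPFLAGS += -Ilibfoo/public -Ilibbar/public

    # app links the modules' built artifacts and depends only on their
    # public headers, so edits to libfoo's internals can't touch app.o.
    app: app.o libfoo/libfoo.a libbar/libbar.a
    	$(CC) -o $@ $^

    app.o: app.c $(wildcard libfoo/public/*.h libbar/public/*.h)
    	$(CC) $(CPPFLAGS) -c -o $@ $<

The same idea scales up: make each module a fully separate build, and let its installed API be the only surface downstream builds can see.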



Typically this shows up in recursive make projects with lots of subprojects: it doesn't take that much time to stat every file in question, but reinvoking make sixteen times can be quite slow.

I don't deal with this by not using make; I deal with it by not writing recursive makefiles.
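For reference, the standard non-recursive shape is one top-level Makefile that includes a small fragment per directory, so a single make process sees the whole DAG and stats each file exactly once (the module.mk naming below is just one common convention):

    # Top-level Makefile (sketch); each directory contributes a
    # fragment instead of getting its own $(MAKE) -C invocation.
    SRCS :=
    include foo/module.mk
    include bar/module.mk

    OBJS := $(SRCS:.c=.o)

    app: $(OBJS)
    	$(CC) -o $@ $^

    %.o: %.c
    	$(CC) $(CPPFLAGS) -c -o $@ $<

    # where foo/module.mk just appends its sources:
    #   SRCS += foo/a.c foo/b.c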


> it doesn't take that much time to stat every file in question, but reinvoking make sixteen times can be quite slow.

Yes, reinvoking Make repeatedly tends to force redundant `stat` calls. But I have worked in environments where heavily templated code was hosted over a remote filesystem, and every `stat` call was something like 10msec. That adds up _extremely_ fast, even with non-recursive make. Ugh.
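The arithmetic is brutal: at 10 ms per stat, a no-op build that checks 10,000 files spends ~100 seconds doing nothing at all. If you want to see how many stats a build really issues, on Linux something like this works (a sketch; the exact syscall names in the stat family vary by platform and libc):

    # make -q does the up-to-date check without running any recipes;
    # strace -c prints per-syscall counts at the end.
    strace -f -c -e trace=stat,lstat,newfstatat,statx make -q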



