Abstraction to Granularity
Abstraction
Is it easier, in general, to learn a higher-level abstraction when you already know the lower level than to learn the lower level when you know the higher one? For example, is it easier to learn the C language when you know assembly than to learn assembly when you know C?
Anecdotally, speaking to your example, I learned both assembly and C at the same time in college. Neither was “harder” or “easier” than the other, just different. But having taken a logic design course beforehand set the stage for both. So knowing the abstraction of logic design made it easier to learn the abstraction of a computer language (or a handful of computer languages).
—AO
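
For readers who haven't seen both levels side by side, here is a small sketch of the gap in question: a C function with comments showing roughly what a compiler might emit for x86-64. The register choices and instruction selection are illustrative, not the output of any particular compiler.

    /* Sum an array of ints; comments show a plausible x86-64 translation. */
    #include <stddef.h>

    int sum(const int *a, size_t n)    /* args arrive in rdi and rsi   */
    {
        int total = 0;                 /*   xor  eax, eax              */
        for (size_t i = 0; i < n; i++) /*   xor  ecx, ecx              */
                                       /* L: cmp  rcx, rsi ; jae done  */
            total += a[i];             /*   add  eax, [rdi + rcx*4]    */
                                       /*   inc  rcx ; jmp L           */
        return total;                  /* done: ret                    */
    }

Whichever direction you learn in, the gap to bridge is the same one-to-many mapping: each C construct fans out into several instructions, and each instruction sequence has to be recognized as a single construct.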
Jigs
Do programmers believe that the sign of a good tool is that you need to build other tools around it to make it useful? I think the analogy might be to something like a router (of the woodworking kind), where getting the full benefits of the tool involves lots of jigs.
I think some programmers definitely do. I think the Emacs vs. vim vs. IDE debate tends to correlate with different positions on this, and similarly Linux vs. Mac. Do you enjoy customizing your environment, or getting things done?
Personally, I do not like having to constantly fiddle with things outside my domain of interest. So if I’m building a UI library, having to tinker with my build chain feels like a huge annoyance and distraction.
—TL
Truth
Is it crazy to think that a source control system should record the true history of a project? How do we even define true history?
In a later thread, KK says beliefs like this make you a “kook.”
Granularity
What is the right granularity at which to record the history of changes to a codebase? Compare it to undo in a text editor: somewhere between undoing one keystroke at a time and an undo that jumps all the way back to the last save point, there is an optimum. Where is the optimum for source-control commit granularity?
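
To make the editor analogy concrete, here is a minimal sketch of one answer: coalesce keystrokes that arrive in a burst into a single undo unit, so undo steps back a run of typing rather than one character or one whole save. Everything in it (the names, the 500 ms window) is an illustrative assumption, not something from the question.

    #include <stdio.h>
    #include <string.h>

    #define MAX_UNITS   128
    #define MAX_TEXT    256
    #define COALESCE_MS 500   /* merge edits closer together than this */

    typedef struct {
        char text[MAX_TEXT];  /* characters grouped into this undo unit */
        long last_ms;         /* timestamp of the unit's latest edit    */
    } UndoUnit;

    static UndoUnit stack[MAX_UNITS];
    static int top = 0;       /* number of units on the stack */

    /* Record one keystroke, coalescing it into the previous unit
       if it happened within the time window. */
    void record_key(char c, long now_ms) {
        if (top > 0 && now_ms - stack[top - 1].last_ms < COALESCE_MS) {
            UndoUnit *u = &stack[top - 1];
            size_t len = strlen(u->text);
            if (len + 1 < MAX_TEXT) {
                u->text[len] = c;
                u->text[len + 1] = '\0';
            }
            u->last_ms = now_ms;
        } else if (top < MAX_UNITS) {
            stack[top].text[0] = c;
            stack[top].text[1] = '\0';
            stack[top].last_ms = now_ms;
            top++;
        }
    }

    int main(void) {
        /* "hello" typed quickly, then a pause, then "world". */
        long t = 0;
        for (const char *p = "hello"; *p; p++) record_key(*p, t += 100);
        t += 2000;                                /* pause: starts a new unit */
        for (const char *p = "world"; *p; p++) record_key(*p, t += 100);

        printf("undo units: %d\n", top);          /* prints 2, not 10 */
        for (int i = 0; i < top; i++)
            printf("  unit %d: \"%s\"\n", i, stack[i].text);
        return 0;
    }

The commit-granularity analogue would be squashing a burst of work-in-progress commits into one logical change before it enters the permanent history.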