
Context, reconciliation, and performance

I believe the software industry needs a word to differentiate designing performant software from the inside (bottom up) versus from the outside (top down). There is a heavy focus on algorithms and data structures in the software engineering world, which is great if you are designing performant software from the inside out. But that approach only stays performant as long as there is one monolithic structure. As soon as there are competing components (or silos of functionality) that need to integrate with each other, the integration itself often becomes the performance bottleneck. And this phenomenon is unavoidable, because there is a limit to any human's working memory: large problems are broken down into smaller problems so that they can be grappled with more easily. But the choice of how things are isolated can incur a performance gain or loss just as significant in magnitude as choosing the right data structures and algorithms. It is this way because data structures and algorithms are precisely the tools used to reconcile state between independent components. If two components are designed as if each were opposed to the other's existence, then reconciling their states essentially requires copying the entire state from one to the other. Sometimes the reconciliation algorithm is even worse than copying, when copying is not an option (such as when the data tied to a component is too large to copy, e.g. when the component is a database).
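
To make the reconciliation point concrete, here is a minimal sketch in Python (all names are hypothetical, not taken from any particular system). The first approach copies the entire state on every reconciliation, because the two components expose nothing to each other but their full state; the second has the components agree on a shared change log, so reconciliation cost is proportional to the delta rather than to the whole state.

    # Hypothetical sketch of two integration styles between components.

    # Naive integration: the target only knows how to accept the source's
    # full state, so every reconciliation copies everything, no matter how
    # little actually changed.
    def reconcile_by_copy(source_state: dict, target) -> None:
        target.state = dict(source_state)  # cost proportional to the whole state

    # Integration designed around a shared change log: the source records
    # each mutation, and the follower replays only the entries it has not
    # yet seen.
    class ChangeLogComponent:
        def __init__(self):
            self.state = {}
            self.log = []          # append-only list of (key, value) mutations

        def set(self, key, value):
            self.state[key] = value
            self.log.append((key, value))

    class Follower:
        def __init__(self):
            self.state = {}
            self.cursor = 0        # how much of the source's log has been applied

        def reconcile(self, source: ChangeLogComponent) -> None:
            for key, value in source.log[self.cursor:]:
                self.state[key] = value
            self.cursor = len(source.log)   # cost proportional to the delta

    if __name__ == "__main__":
        a, b = ChangeLogComponent(), Follower()
        for i in range(1000):
            a.set(f"k{i}", i)
        b.reconcile(a)             # first sync pays for 1000 entries
        a.set("k3", -1)
        b.reconcile(a)             # later syncs pay only for what changed

The point is not the change log itself; it is that the cost of keeping the two components consistent was decided when their boundary was designed, before any algorithm inside either of them was chosen.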
There does not seem to be a word that measures how holistically a set of components is designed, or a way to talk about how intelligently a system is designed as a whole. Often a system that uses the simplest algorithms within its components (but integrates those components coherently) outperforms a system that uses the best algorithms within its components (but does not integrate them coherently). People think this all still falls within the realm of data structures and algorithms, because all state reconciliation is a data structure and algorithm problem. But what I am more interested in is a way of measuring how intelligently a system is designed holistically. The absence of extremely powerful algorithms is, in my opinion, a sign of good holistic design. A very intelligently designed system should have one core algorithm. Yet it is very hard to distill complex problems into simple mechanisms. There has to be a word for someone's ability to do this distillation, and it needs to be a common word in the software engineering industry. It is just as powerful a tool as choosing the right data structure or algorithm. It is the ability to hold many things in the mind at once. I would use the term "working memory", but that term could be overloaded by existing definitions. The term I am looking for should describe the ability to create handles on the many dimensions and aspects of a problem, so that one can swiftly mold the problem into a shape that can be solved in the simplest manner possible. It is a compressive and expansive dance.
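
As a rough illustration of the "one core algorithm" idea (a sketch with made-up names, not a prescription): if every component of a system is expressed as a reducer over the same event stream, the only algorithm the system really has is a single fold, and components never need bespoke pairwise integration code.

    # Hypothetical sketch: the whole system runs on one core algorithm,
    # a single fold over a shared event stream.
    from functools import reduce

    def fold(events, reducer, initial):
        # The one core algorithm: replay events through a reducer.
        return reduce(reducer, events, initial)

    # "Components" are plain reducers; adding one never touches the others.
    def inventory(state, event):
        kind, item, qty = event
        delta = qty if kind == "received" else -qty
        return {**state, item: state.get(item, 0) + delta}

    def audit_trail(state, event):
        return state + [event]

    if __name__ == "__main__":
        events = [
            ("received", "widget", 10),
            ("shipped", "widget", 3),
            ("received", "gadget", 5),
        ]
        print(fold(events, inventory, {}))      # {'widget': 7, 'gadget': 5}
        print(fold(events, audit_trail, []))    # the same stream, another view

Adding a new view of the data means adding a new reducer, not a new reconciliation path between existing components.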
Every talented software engineer has this substance in spades. They subconsciously break down complex problems into simple ones. Their designs allow components to integrate smoothly with each other because all friction was preemptively removed in the design process. And it is not easily observable outside of the final work (you can't ask someone to demonstrate this quality on demand, because the ability is essentially a neurological phenomenon). Bad software engineers don't even know of this substance's existence. They are blind to it. Or worse, they think they have a monopoly on it (and that people who don't follow in their footsteps have poor taste), yet their final work fails to leave an impression.
