Context, reconciliation, and performance

I believe the software industry needs a word to differentiate the design of performant software from the inside (or bottom up) versus the outside (or top down). There is a heavy focus on algorithms and data structures in the software engineering world, which is great if you are designing performant software from the inside out (bottom up). But this only stays performant so long as there is one monolithic structure. As soon as there are competing components (or silos of functionality) that need to integrate, the integration between the components often becomes the performance bottleneck. And this phenomenon is unavoidable because there is a limit to any human's working memory. Large problems are broken down into smaller problems so that they can be more easily grappled with. But the choice of how things are isolated can often incur a performance gain or loss that is just as significant in magnitude as choosing the right data structures and algorithms. And it is this way because data structures and algorithms are precisely the tools used to reconcile state between independent components. If two components are designed as though each were opposed to the other's existence, then one would essentially need to copy the entire state from one to the other to reconcile them. Sometimes the reconciliation algorithm is even worse than copying, when copying is not an option (such as when the data tied to a component is too large to copy, e.g. the component is a database).
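To make the reconciliation point concrete, here is a minimal sketch (in Python, with hypothetical component names of my own choosing) contrasting the two situations: a source component whose boundary was designed with its consumer in mind, exposing a change log the consumer can replay incrementally, versus one that only exposes a full snapshot, forcing the consumer to copy the entire state on every sync.

```python
# Minimal sketch with hypothetical names: two ways for component B to
# mirror component A's state.

class ChangeLogSource:
    """Component A, designed with its consumer in mind."""
    def __init__(self):
        self.state = {}
        self.log = []            # append-only list of (key, value) changes

    def put(self, key, value):
        self.state[key] = value
        self.log.append((key, value))


class IncrementalMirror:
    """Component B reconciles in O(changes since last sync)."""
    def __init__(self, source):
        self.source = source
        self.state = {}
        self.cursor = 0          # position reached in the source's change log

    def sync(self):
        for key, value in self.source.log[self.cursor:]:
            self.state[key] = value
        self.cursor = len(self.source.log)


class SnapshotOnlySource:
    """Component A, designed as if no other component existed."""
    def __init__(self):
        self.state = {}

    def put(self, key, value):
        self.state[key] = value

    def snapshot(self):
        return dict(self.state)  # the only integration point: a full copy


class CopyingMirror:
    """Component B has no choice but to reconcile in O(total state)."""
    def __init__(self, source):
        self.source = source
        self.state = {}

    def sync(self):
        self.state = self.source.snapshot()
```

The data structures on both sides are trivial; what differs is whether the boundary was shaped so that reconciliation work scales with what changed rather than with everything that exists.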
There does not seem to be a word that measures how holistically designed a set of components is. There does not seem to be a way to talk about how intelligently a system is designed as a whole. Often a system that uses the simplest algorithms within its components (but integrates the components coherently) outperforms a system that uses the best algorithms within its components (but does not integrate them coherently). People think this all still falls within the realm of data structures and algorithms, because all state reconciliation is a data structure and algorithm problem. But what I am more interested in is a way of measuring how intelligently a system is designed holistically. In my opinion, the absence of extremely powerful algorithms is a sign of good holistic design; a very intelligently designed system should have one core algorithm. Yet it is very hard to distill complex problems into simple mechanisms. There has to be a word to describe someone's ability to do this distillation, and it needs to be a common word in the software engineering industry. It is just as powerful a tool as choosing the right data structure or algorithm. It is the ability to hold many things in the mind at once. I would use the term "working memory", but that term is already overloaded by existing definitions. The term I am looking for should describe the ability to create handles on the many dimensions and aspects of a problem, so one can swiftly mold the problem into a shape that can be solved in the simplest manner possible. It is a compressive and expansive dance.
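As a concrete (and entirely hypothetical) illustration of that claim: suppose one component produces events that already arrive in time order, and another component needs the k most recent events. If the boundary between them preserves the ordering, the consumer only needs the simplest possible "algorithm", a slice off the end. If the boundary discards the ordering, the consumer needs a genuinely clever algorithm, a heap-based top-k over the full stream, and still does more total work.

```python
import heapq
import random

# Hypothetical illustration: a trivial algorithm behind a coherent boundary
# versus a clever algorithm behind an incoherent one.

events = list(range(1_000_000))   # already in time order at the producer

def latest_k_coherent(ordered_events, k):
    # Coherent integration: the boundary preserves the producer's ordering,
    # so the consumer's work is a slice: O(k).
    return ordered_events[-k:]

def latest_k_incoherent(unordered_events, k):
    # Incoherent integration: the boundary hands over an unordered dump,
    # so the consumer needs heapq.nlargest: O(n log k), on top of shipping
    # all n events across the boundary in the first place.
    return sorted(heapq.nlargest(k, unordered_events))

shuffled = events[:]
random.shuffle(shuffled)          # what a careless boundary effectively does

assert latest_k_coherent(events, 5) == latest_k_incoherent(shuffled, 5)
```

The second consumer uses the textbook-optimal algorithm for its situation; the first barely has an algorithm at all, and it wins because the system was shaped so that it never had a hard problem to solve.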
Every talented software engineer has this substance in spades. They subconsciously break down complex problems into simple ones. Their designs allow components to integrate smoothly with each other because all friction was preemptively removed in the design process. And it is not something that is easily observable outside of the final work (e.g. you can't ask someone to demonstrate the quality on the spot, because the ability is essentially a neurological phenomenon). Bad software engineers don't even know of this substance's existence. They are blind to it. Or worse, they think they have a monopoly on it (and that people who don't follow in their footsteps have poor taste), yet their final work fails to leave an impression.
