3 points by rocketnia 1712 days ago

I like this whole discussion, akkartik and shader. You've both made a lot of excellent points, and I found myself nodding along to one comment only to nod along to the next as well.

Right now I have a lot to say about the math analogy in particular. (The rest of the discussion has been great too.)

Mathematics has a lot in common with even the messy parts of programming.

Mathematics involves a lot of computation, and not necessarily of the digital-computer kind or the arithmetic kind. If a human reader has to look up the definition of a word they don't know, they're effectively performing a manual lambda-calculus substitution step. Many popular concepts in math are subject to transitive closure, which gives them indirect consequences hidden away on non-obvious reasoning paths. Many topics in math have to do with metareasoning and higher-order reasoning. Altogether, the effort it takes a human reader to understand a mathematical claim can involve a lot of the same things that, on a computer, we'd call program execution.
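
To make that substitution analogy concrete, here's a minimal sketch in Python. (The term encoding and the substitute function are my own invention for illustration, and capture-avoidance is left out to keep it short.)

    # Minimal sketch, not a real proof assistant: one by-hand
    # beta-reduction step. Terms are tagged tuples; substitution
    # under a binder that could capture variables is not handled.

    def substitute(term, name, value):
        """Replace free occurrences of `name` in `term` with `value`.
        Term shapes: ('var', x), ('lam', x, body), ('app', f, a)."""
        kind = term[0]
        if kind == 'var':
            return value if term[1] == name else term
        if kind == 'lam':
            # A lambda that rebinds `name` shadows it, so stop here.
            if term[1] == name:
                return term
            return ('lam', term[1], substitute(term[2], name, value))
        if kind == 'app':
            return ('app', substitute(term[1], name, value),
                           substitute(term[2], name, value))
        raise ValueError('unknown term: %r' % (term,))

    # ((lambda x. x x) y)  beta-reduces to  (y y)
    fn = ('lam', 'x', ('app', ('var', 'x'), ('var', 'x')))
    arg = ('var', 'y')
    print(substitute(fn[2], fn[1], arg))
    # -> ('app', ('var', 'y'), ('var', 'y'))

Inlining each unfamiliar definition while reading a theorem is this same operation, just performed in the reader's head.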

As much as people might not like to admit it, mathematical theorems don't always take the form of the precise "don't lie" abstractions shader is describing. When a proof has a flaw in it, people still make use of the theorem, either by conjecturing that it's true anyway for some as-yet-undiscovered reason, or by explaining why the flaws in the proof don't matter in this context. In domains where it makes sense to change the mathematical foundations (e.g. deciding to use a different logic or a different set theory), a lot of the bread-and-butter theorems and concepts of mathematics can end up having flaws in them, but instead of coining new names for all those theorems and concepts, it's easier to use the same names and merely describe all the little patches that are necessary for them to work. So I think mathematics makes use of its share of abstraction-breaking techniques, techniques a software engineer might simulate with some combination of dynamic scope, side effects, preprocessing passes, code-walking, aspect-oriented weaving, dependency injection, or something like that. (This is mostly visible to people who are trying to make the math precise enough that a computer can verify or assist with it.)
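
Here's a hedged sketch of that "keep the name, describe the patches" move, simulated in Python by rebinding a name. (legacy_norm and its flaw are hypothetical, invented for illustration; a real system might instead use any of the techniques listed above.)

    # Hypothetical example: a widely cited "theorem" with a flawed
    # "proof" keeps its name, and users install a patch around it.

    import math

    def legacy_norm(xs):
        """Euclidean norm -- but the intermediate squares overflow
        to inf for very large components, the flaw in the 'proof'."""
        return math.sqrt(sum(x * x for x in xs))

    _unpatched_norm = legacy_norm

    def legacy_norm(xs):  # same name, now with the patch applied
        # Rescale by the largest component so the squares stay
        # finite, then defer to the original definition.
        m = max((abs(x) for x in xs), default=0.0)
        if m == 0.0:
            return 0.0
        return m * _unpatched_norm([x / m for x in xs])

    print(legacy_norm([3e200, 4e200]))      # 5e+200
    print(_unpatched_norm([3e200, 4e200]))  # inf -- the unpatched flaw

Everyone keeps calling it legacy_norm, and the patch is just part of the folklore about how to use it safely.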

Of course, math is not quite the same as software engineering. Unlike software, math is written primarily for humans to understand, and it only incidentally has computational aspects. This influences the kind of BS that's possible with math, both for better and for worse.

- For the worse: Once a mathematical argument goes on for a bit too long, humans rarely have the diligence to require that every single part of that reasoning makes sense; they're content to give leeway to parts they already feel they understand clearly. Some popular points of leeway end up serving as the foundations for a lot of mathematics, and we might call those the "syscalls" of math. Of course, every paradox of barbers and liars and time travel and infinity and whatnot reveals that humans are stubbornly hospitable to inconsistent ideas, and software engineering shows how frequently that leeway leaves room for bugs. (The barber case is sketched formally just after this list.)

- For the better: Since humans are in the loop when it comes to reading and sharing mathematical results, the kind of BS that confuses and dismays people has some trouble thriving. If the effort it takes to apply a mathematical concept is too full of hacks and spaghetti, people probably won't find it to be their favorite concept and won't share it with each other. (Of course, there seem to be some concepts which make a lot of sense once people get to know them, but still seem to require a rather circuitous route to learn about. In this way, people can end up being enthusiastic about parts of math that look, from the outside, to be full of nope.)
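
To spell out the barber case promised above: formally it isn't a lingering mystery but a two-line refutation, sketched here in LaTeX notation.

    % Suppose a barber b shaves exactly those who do not shave themselves:
    \forall x \; \bigl( \mathrm{shaves}(b, x) \leftrightarrow \neg \mathrm{shaves}(x, x) \bigr)
    % Instantiating x := b makes the sentence equivalent to its own negation:
    \mathrm{shaves}(b, b) \leftrightarrow \neg \mathrm{shaves}(b, b)
    % Contradiction, so no such barber exists -- yet the informal story
    % still sounds coherent to most readers, which is exactly the leeway
    % at issue.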

Considering all that mess, math nevertheless has a reputation for leakless abstractions, and that reputation is well deserved. "The study and development of leakless abstractions" would be a fitting definition of mathematics. The mess comes from the fact that humans are the ones discussing, developing, identifying examples of, and using the abstractions.

Likewise, even if software engineering deals with a big mess of leaky abstractions a lot of the time, the leakless ones are an important part of the design space. Unlike hardware, software code is a mathematically precise chunk of data, and the ways we transform it and compile it are easily a mathematical topic with lots of room for leakless abstractions. The reason (and perhaps the only essential reason) for the mess is that humans are the ones discussing, developing, making hardware for, and using the software.
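
To ground the "code is data" point, here's a small Python sketch that treats a program as a precise data structure and applies a behavior-preserving rewrite to it. (The particular rewrite rule is just an example I picked, and ast.unparse needs Python 3.9+.)

    # Parse a function into a syntax tree, transform the tree, and
    # compile the result -- a tiny version of what compilers do.

    import ast

    source = "def double(x):\n    return x + x\n"
    tree = ast.parse(source)

    class AddToMul(ast.NodeTransformer):
        """Rewrite `x + x` into `2 * x`."""
        def visit_BinOp(self, node):
            self.generic_visit(node)
            if (isinstance(node.op, ast.Add)
                    and isinstance(node.left, ast.Name)
                    and isinstance(node.right, ast.Name)
                    and node.left.id == node.right.id):
                return ast.BinOp(left=ast.Constant(2), op=ast.Mult(),
                                 right=node.right)
            return node

    new_tree = ast.fix_missing_locations(AddToMul().visit(tree))
    print(ast.unparse(new_tree))  # def double(x): return 2 * x
    namespace = {}
    exec(compile(new_tree, "<transformed>", "exec"), namespace)
    print(namespace["double"](21))  # 42

For integer inputs, x + x and 2 * x denote the same function, so the rewrite is leakless in exactly the mathematical sense.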

While it's clear that math and software are two worlds with notable differences -- distinguished at least by the presence of computer hardware that lets a user get meaningful value out of software they don't know how to maintain -- I believe software could very well develop a popular perception as a world of Platonic forms, the same kind of perception math has. It's not that far-fetched for people today to say, "Obviously, as soon as you put an algorithm on a device and execute it, it's not the same algorithm." What if someday people say nothing we build or do can be a "true" algorithm because an algorithm is a Platonic concept that our world can only approximate?

Is that the right perception for software? Well... is it the right perception for mathematics? I think the perception doesn't matter that much one way or the other. Everyday software can have leakless abstractions of the same kind everyday mathematics is known for, and many of math's abstractions are actually riddled with holes in ways software engineers might find familiar.