Community and Technology (July 2018)

Economics is the fundamental tension between two poles: community and technology.

Technology is synonymous with capital, in both senses of the term: capital as machinery and capital as abstraction (money, financial derivatives, etc.).

Machinery is that which delimits itself from the world by controlling the flow of energy that passes through it.

It's true that organisms do this too, but not to the degree that machines do. Computation is the idealization of machinery: energy controlled so tightly that all difference is eliminated except that between 0 and 1.

Currency works as a similar simplifier: the value of everything is reduced to a single number. I've said earlier that computation is zero-dimensional; money is one-dimensional.

Unlike machinery, which makes an implicit cut between system and environment, community is a gestalt: phenomena like sharing and trust don't consist of things that belong strictly to an individual but act more like a field in which people act according to a certain dynamic.

I've railed against dualism many times in the past, but ultimately dualism isn't a false ontology, just an artificial one. Natural science, technology, and civil society all rely on an implicit dualism because our bodies are the limit to how much we can intimately know: we create abstractions in order to save space.

Descartes' error was in mistaking this entente for some kind of absolute truth.

But none of this is to say that community is obsolete, let alone irrelevant. Institutions get devoured by corruption for the same reason that machines break down: the real world inevitably corrodes the abstract.

What the modernists of the early 20th century didn't understand is that our machines, systems, and theories need a constant influx of human ingenuity in the same way that humans can't survive without gut bacteria.

But human ingenuity can't flourish if our humanity is taken away. The logic of technology is such that, left to its own devices, it will wring every drop of efficiency out of its surrounding ecosystem, leaving no room for the kinds of activities that make no sense to technology but are vital to the illegible knowledge needed for building and renewing technology.

Computers have to have their inputs and outputs connected to something in order to do anything. The doomsday scenario of a superpowered AI building paperclips at any and all costs is inseparable from the idea of making society overly dependent on any kind of system. The risk is very real, but most AI doomsayers don't understand that AI is just a variation on a theme.

The argument made by Thomas Piketty that unfettered markets cause returns on investment to outpace actual economic growth is getting at something real, but in a confused way.

Returns simply can't outpace growth in the long run, because there eventually wouldn't be enough wealth to go around. Financial crises are the inevitable result, and, perhaps because of the inevitable finagling that institutions do, war is the only phenomenon on record that actually flattens wealth inequality.
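A toy calculation (my own illustration, not Piketty's actual model, and with made-up rates) shows why returns can't outpace growth forever: if capital compounds at rate r while total wealth grows at rate g < r, capital's implied share of all wealth eventually exceeds 100%, an impossibility that has to resolve as a crisis or a reset.

```python
def capital_share(r, g, initial_share, years):
    """Capital's share of total wealth after compounding.

    Toy model with hypothetical parameters: capital grows at rate r,
    total wealth at rate g. The share is capped at 1.0, since capital
    can't exceed the wealth that exists; hitting the cap marks the
    point where the trajectory becomes unsustainable.
    """
    capital = initial_share
    total = 1.0
    for _ in range(years):
        capital *= 1 + r
        total *= 1 + g
    return min(capital / total, 1.0)

# With r = 5%, g = 2%, and capital starting at 30% of wealth,
# the share climbs steadily and hits the 100% ceiling within a century.
for years in (0, 30, 100):
    print(years, round(capital_share(0.05, 0.02, 0.3, years), 3))
```

The numbers are arbitrary; the point is only that any r > g makes the share a growing exponential, so the divergence is structural rather than a matter of individual behavior.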

But it's a tempting mistake to think of this as the greed of the elites (NB: I'm not denying that people get greedy): the bank accounts of individuals are just where the capital parks itself; the real phenomenon is systemic overoptimization.

The Death and Life of Great American Cities shows us some more concrete examples of the need for community. Gigantic planned parks and speculative high-rises inevitably lead to desolation when there's a lack of the low-level everyday interactions that tie a community together. The runaway financial feedback loops that drive people out of neighborhoods eventually leave behind inert husks of things that served narrow, inhuman purposes.

Technology is a vital force, but left completely unregulated by its host ecosystem, it inevitably optimizes itself to the point of collapse, and in many cases crushes a lot of people when it falls over.