< Soundtrack: Human Fly - The Cramps >
To Rent The Land, You Must Create A Forest
- the Dongshi Forest District Department (src)
True freedom lies where a man receives his nourishment and preservation, and that is in the use of the earth.
Note: I wrote a first draft of this in approx. January of 2017, and have since simply edited and clarified some of the ideas. It seems important to not upset the implicit chronology of the posts on this blog too badly. Additionally, the majority of the thoughts expressed in this essay originally came up in conversation with Skinner Layne, who it is necessary to credit here.
I keep getting into inconvenient conversations with people about why I'm against the idea of a basic income. I bluntly told someone while leaving a party a couple of nights ago that "implementing it would increase the likelihood of a genocide occurring between the cultures initially benefitted by its implementation." From what I can tell, my conclusion is the exact opposite of the one I generally see believed by other people in my life. In the short term they're right: there will be far fewer people starving on the streets. They obviously shouldn't be there, starving on the street; the whole point is to reduce suffering overall.
Yet I think they're missing something. It seems that, around me, most people's model of the long term is roughly the same as my model of the medium term. There's a general bias against long-termism as a strategy for effectively doing good, if only because direct experience necessarily influences individuals more than their capacity to simulate hypotheticals does. Thus, individuals generally want to satisfy that direct experience alone, rather than satisfy it while simultaneously meeting the base complexity necessary for achieving their long-term goals. There's also a whole other school of thought, overworking oneself, which assumes one should satisfy only the base complexity for the largest of long-term future goals, but that inevitably bankrupts the individual in the short term. This leads to a situation where overworking must become performative in order to requisition enough resources to continue with any hope of reaching medium-term goals, as otherwise the loss of agentic capacity from self-injury is simply too high to continue.
Anyways, the central assumption behind basic income is that it would provide an adequate social safety net, allowing individuals to avoid starving to death after being outmoded by automated labor. There is, of course, the inherently anti-Protestant other school of thought which holds that one should not have to work to live, but that is not being taken terribly seriously by the mainstream, even if it is clearly more ethical by many compelling metrics. The core notion is that we must subsidize humanity, as we are going to be able to fundamentally outmode each and every one of the individual functionalities of humanity with mechanistic replacements. This context-free disintegration of the components of humanity into human-built automation is supposedly going to outcompete humans on the terms that humans have attempted to set in the current market economy, leading to absurd scenarios like Bill Gates talking about intensively taxing machine labor to fund such a subsidization scheme. Admittedly, I'm a game designer, and thus the kind of person who finds such income taxes absurd: disincentivizing increases in technological capacity because they might be used badly seems massively more difficult than just telling people the old Parkerian adage that with great power comes great responsibility. It seems that it's going to be easier to get at least a segment of humans to wield automation in a largely pro-social way than to ban the construction of tools. Should a segment of humans be willing to wield automation in a pro-social way, they will leave the rest of humanity behind, and they will hold the freedom to decide what to do with those who did not choose to embrace the new order ordained by the new automation technology. Additionally, it's not as though anyone besides the most aggressive primitivists is suggesting that automation technology is by definition a net negative; it's technology, like any other.
The narrative that we should create a basic income is certainly better than the assumption that we have to create jobs, as the latter is simply a way of marking costs as gains while ignoring the benefits from any gains we've taken.
With all that out of the way, to the point of why I'm pretty sure basic income isn't the right strategy for dealing with all of this. Beyond the impossibility of the ban-automation narrative, the prohibition of dominant economic strategies results in a segmentation of the political body. This leads to a scenario where, from the perspective of the body as a whole, some segment of the population is interpreted as a cancer on society and thus deserving of removal. This casus belli may emerge regardless of whether or not the larger societal body is able to have compassion for the circumstances that placed that population in a position of parasitism. It does not matter why the parasitism emerged; rather, the parasitism itself provides the narrative for the casus belli. It does not even matter whether actual parasitism is going on: provided one can deploy sufficiently socially proven information, the social system will grant the authority to do violence against the allegedly parasitic population.
Has this not already transpired in recent centuries? Denied the provision of forty acres and a mule, the African Americans of these United States have been demonized as abusers of welfare systems. This is clearly fraudulent. The African American community was simply denied the investment needed to guarantee physical autonomy from the whims of a market defined by their former masters' ability to continuously offer predatory terms of trade, and from a sociocultural environment defined by legitimized lynchings and other terror campaigns. Similarly, can we not triangulate the scenario of the African Americans against that of the lost-cause Southerners who traveled the Oregon Trail to homesteads, and by extension to autonomy, securing middle-class status? From this, can we not conclude that the key to autonomy is simply autonomy itself, defined by one's relationship to one's environment? While a welfare system can provide necessary stop-gap assistance to people in need, the American welfare-queen narrative seems to show that individuals who use charitable infrastructure designed for exactly the kind of unpleasant scenario they have found themselves in will be spuriously accused of abuse for using the very infrastructure put in place to offer them needed relief.
Unless a given population owns or is meaningfully integrated with a means of production, grants of cash rather than capital are effectively a kind of trap. The spending of that money will simply exacerbate the trends that placed the population at such a disadvantage in the first place, akin to the company store that is never incentivized to grant its employees proper autonomy so long as it pays in company scrip. This, coupled with the creation of political divides that will likely lead to violence, seems like justification enough for opposing basic income as it is currently presented. Because basic income does not grant any degree of real, meaningful control to the populations receiving resources, it is never in the interests of the governing body distributing those resources to give more than the level of subsistence necessary to avoid a loss of status from letting the recipients starve. I would instead support a basic land grant program, perhaps based on a trial period of stewardship during which one demonstrates the ability to gain a degree of economic autonomy from the use of said land. Provided that one gains such autonomy, one becomes increasingly immune to abuse by the powers that be.
This is not to say that monetary investment in the disadvantaged sectors of the population is not situationally worthy, nor to deny that there is something worthy in systems like unemployment insurance that provide a safety net through the mitigation of risk. The key is developing a system that integrates humanity into its own systems of production, rather than lining up an eventual conflict. It's worth noting that the Luddites had the gall and panache to actively destroy predatory machines, and that the Diggers had a fundamental connection to the soil. Both of those movements took their own subjective perspective of their value and their values, in the sense of their map of the world, compared it to their economic status, and determined that it wasn't worth playing by the rules of the dominant culture. They would not be tricked by bread and circuses, nor by basic income. While I am sure that both movements lacked the imagination to bind technology to humanist purposes, and in the process would have produced suboptimal or even unpleasant worlds, their rejection of abstract capital in favor of the creation of a concrete environment is a consistent and reasonable position. The Diggers had the intelligence to try to design a program of life that might last intergenerationally, rather than trying to freeze history through violence like the Luddites, which is always a losing proposition. One must sympathize with, but not emulate, those who are left behind by time and just want everything to stay as it is; of course, no man steps in the same river twice.
This is the core question I want to shed light on in this debate: do we want to create a world where humans are able to exist in an environment that isn't toxic to their existence, or do we want to give up, assume that our technology is fundamentally anti-human, and try to get on with our lives while we can still have them, dooming ourselves to violent confrontation a few generations down the road? To rent the land, you must create a forest. In order to hold territory, you must make that territory tolerable for you. One must select for an environment whose various interdependent component parts select for a more ideal version of one's self; the feedback process therein makes the kind of progress that doesn't leave you behind. I'm sure that if we'd consciously and competently implemented this strategy earlier we wouldn't be in this mess now, as we are currently dealing with the after-effects of all of the previous economic automation crises, with perhaps the habit of applying Keynesian overclocking to national finance as the last major element that became integral to our structure without ever being used according to a robustly safe doctrine. The creation of a world where technology is not toxic to its creators is a monumentally difficult task, far more difficult than simply taxing robots, yet it is a task we are likely capable of completing if we can avoid making the same mistakes we made previously at lower levels of technology. The exact strategy for this is a subject for another time.