A photo and art journal of my efforts to inspire creativity and universal love by means of an Experimental Collaborative Art Rebellion.
Sunday, March 26, 2017
Thag and Gorgar
A Response to Bill Joy's "Why the Future Doesn't Need Us"
I seem to recall a similar conversation between two people squatting outside of a cave just prior to the dawn of civilization, arguing about the long-term effects of chipping flint rocks into smaller, more efficient arrowheads.
A long time ago (spoken, for some reason, with British accents)
Gorgar: Thag, have you considered that if you actually manage to invent these so-called "Arrowheads" of yours, humanity may consequently become all too powerful, possibly going on to invent "farming," thereby destroying the rainforests and other vital ecosystems, murdering countless species in the process, and, I might add, eventually overpopulating the world, creating a hyper-commercialized, superabundant yet spiritually diminutive and perverted race of hideously obese personages who are too weak and lazy to hunt a Mastodon across the frozen tundra, or to live in right ruddy-good leaky, dripping, dark, dank and filthy caves like good people, and finally creating a Hegelian one-world Totalitarian Republic, only to engulf the entire planet in a cataclysmic, self-replicating, automatonic "Gray Goo" which would be certain to destroy all life on earth? I mean, have you taken all of this into account before you simply forge ahead with implementation of this, this, thingie? Oooo, it's pointy.
Thag: I realize that, Gorgar, but right now I'm more worried about the saber-toothed tiger trying to track down our kids. (Shouts) Thag Junior! Stop playing with that "Fire" and go hide in the cave! Hurry up! Good lad... You see, of course, Gorgar, that we won't last long as squishy pink snacks for giant hairy clawed and fanged predators if I don't finish this up...
Gorgar: Dreadfully good point, Thag, but still, I'm worried about the long-term consequences...
Thag: Well, Gorgar, I share your concern, but as I've said, I'm afraid we have more pressing matters at hand and will have to trust that our descendants will evolve huge enough brains to figure out how to deal with the situation at the time.
Gorgar: Yes, I see what you mean, Thag, but don't you agree that at a certain point highly optimized automation will exceed our descendants' ability to cognize and maintain the technological infrastructure... eventually the machines will simply have to control the systems because they will be too complex for any organically based brain to comprehend, and then, THEN I say, our descendants will become slaves to the "machine"... that's my concern, you see. The thing that quite troubles me the most is the idea that a tiny technocratic elite will subdue the vast majority of humanity with economic and military coercion supported by mind- and body-controlling nanotechnologies. Well now, that's a nettlesome thought, isn't it?
Thag: I worry about that too, Gorgar, but I'm not sure what we can do to prevent it. Besides, I think there actually is a larger issue at stake. Eventually that asteroid, you can see it next to that star if you squint [note: eyesight was better in those days], is going to wobble past that little moon over there [note: Europa] and its trajectory [note: grasp of astrophysics was better in those days] is going to send it on a direct intercept with the earth... by then, hopefully, our descendants will have invented space machinery that can turn it aside before it destroys everything again... you remember what happened last time?
Gorgar: Oh yes, that was a bloody awful mess, now wasn't it?! (winces) [note: memory was better in those days]
Thag: Right. So my point is that despite the dangers, we really just need to keep growing and evolving and implementing, because we won't survive at all if we hide in these caves and reject the technological advancement of the smaller flint arrowheads forever. If the saber-tooth doesn't get us today, the asteroid will get us tomorrow, you see.
Gorgar: Hmmmm... well, what if we put some sort of cap on the advancement of technology... for instance, perhaps we might keep inventing new technologies until we get to a certain point... like self-replicating automatons or cellular nanobots? Then we make a strict policy to stop there.
Thag: That might not be a bad idea, Gorgar; however, I suspect that we will not be able to do that, for two reasons. Firstly, at what point would we decide to stop? And how do we know that it would be the right point? What technologies hide just around the corner from the last one, technologies we have not predicted, that will turn out to be necessary in the long run? You have to remember that each time we make an advance it creates new problems, and then we advance another step to solve them. The second issue is that attempting to put a cap on advancement would in all probability require the creation of the very sort of totalitarian regime you are worried about preventing in the first place, since it would require such an organization to monitor human behavior closely enough to be assured that all such restrictions were adhered to by all scientists and local and regional governmental bodies.
Gorgar: Ah yes, quite right. But to your first argument, at some point won't we hit the law of diminishing returns and wind up with technologies that produce greater risks than their benefits confer?
Thag: Why, yes, Gorgar, we might; however, that is where you have to remember that if we proceed with technological advancement the system will not be a fixed pie as it is now, but an expanding pie. Every time we advance, the opportunities also advance... eventually we will colonize the planets and the system will adjust to include new worlds. In that way the risks at any given point in the system will again be proportionally less than the overall benefits, as we need to advance technologically in order for the pie to expand.
Gorgar: Hmmm... well, I suppose I might buy that argument, but it just strikes me as risky, Thag.
Thag: There are no guarantees, Gorgar. At any point in the system we can wipe ourselves out. Most likely sheer greed and stupidity would be the cause. During each phase of technological advancement there will be inherent risks to the entire system. However, also consider that at any time a stellar incident could wipe out all life on the earth anyway. If we do not advance, then we will only have one planet to live on, and when this one goes, well, so will everything our descendants might build and create. All of that wonderful art and poetry and music (uh... of the future, of course) will vanish from the universe forever. That's what I'm worried about... pass me that oak shaft, will you?
Gorgar: I suppose you are right. Anyway, as to your second argument, what really troubles me there is that these flint arrowheads are going to doom our descendants to a miserable gray existence in which the vast majority will become more and more superfluous as the means of production are ever increasingly put in the hands of machines. There will be more and more people, since we are just basically hot to trot for the ladies, naturally, but eventually we will hit the point where there are too few necessary jobs left as the automation takes over and does everything for everyone. I imagine the result will be a grim sort of hell on earth: either the elite will exterminate the excess humanity, or, in the best case scenario, they will 'compassionately' produce massive shambles of poverty pierced by tiny pinpricks of extraordinary opulence and sickening decadence, all pervaded by a syrupy mix of perversion and localized violence. That is the future that I fear.
Thag: Well said, Gorgar. Your insight into human nature is as keen as the Plesiosaurus' eye. Perhaps we face a choice between a society of elite technogogs controlling the masses with nanobots, propaganda and ridiculous distractions, and the complete extinction of the human race, and quite possibly of life itself.
Gorgar: Hmmm... I just have to believe there is a better alternative than these.
Thag: Right! How about this: our descendants evolve a generally higher sense of morality and compassion over time; overall they help as many as they can while still advancing to colonize the worlds, forming a fairly stable, fairly decent form of government that works toward the reasonably achievable good of as many as possible, providing legal protections against totalitarianism and other forms of slavery, and they get lucky by not screwing up and destroying the world with the accidental or intentional release of the Gray Goo?
Gorgar: Hmmm, well, Thag, I guess that will just have to do, but it is still risky. What do you think the chances of things going that way actually are, given what we know about human nature?
Thag: Well, hard to say, Gorgar, but I'd give it at least a 50/50 chance.
Gorgar: Those are disturbing odds, my friend. Good luck with it all.
Written March 20, 2001 by ArtRebel