An exploration into accelerationist extremism and its logical conclusions

Below I’m going to get into some very pragmatic behavior modifications that could follow logically from accelerationist beliefs.

Advances in AI and biotechnology make it seem likely there will be substantial medical breakthroughs in our lifetime. However, I do not think these breakthroughs, which will include genetic modification of children and live therapies, will be distributed to society. More than that, I think these developments will be kept secret.

Why?

  1. Playing god is very dangerous and unpopular.
  2. Mainstream religions (Islam and Christianity) would likely have extreme moral objections, some of which could result in violence.
  3. Human testing poses extreme moral and legal challenges
  4. Democratic countries or majoritarian systems will not willingly allow a handful of individuals to functionally become gods, à la the immortals in Altered Carbon
  5. Limited adoption would lead to social unrest (i.e. why are these people allowed to extend their lives while we are not?). This becomes even more severe when applied to offspring (think Gattaca)

From this premise we can derive a few likely conclusions:

  1. There will be a limited group of people who get access to these technologies early on
  2. These individuals will need to develop a high degree of trust with one another
  3. The entire process seems likely to happen in international waters
  4. Access to this group of people, and the technology described above will require vast amounts of wealth but also social signaling, political connections, and trust

For convenience, let’s introduce the idea of the Ark. The Ark is the group of individuals who get access to extreme life extension and synthetic biological enhancements (which could include integration between the brain and AI-based interfaces) before the rest of society.

We can intuit traits about the Ark:

  1. It will be technology adjacent
  2. It will be libertarian and not rooted in socialist or majoritarian ideology
  3. It needs to exist at a supra-sovereign level
  4. It will likely have transhumanist, non-anthropocentric values, as the entire premise is transcending humanity
  5. Pseudo-religious elements are necessary to preserve ‘in-group’ signaling, which is valuable because many religious groups or governments would be directly adversarial to the Ark and its members (personal safety issues)

Merely generating wealth, therefore, won’t be enough to join the Ark; wealth has to be generated in the right way.

Once you have a belief in something like the Ark, things get very clear. Your two goals are:

  1. Generate vast amounts of capital via techno-libertarian, supra-sovereign mechanisms that appeal to other transhumanists and enrich like-minded peers
  2. Avoid sacrificing your health to do so, as sacrificing health would be strong negative signaling

Pragmatically - this requires a complete rethinking of human priorities.

  1. Investing time in having children or mating is likely not smart before joining the Ark, or at least before becoming aware of in-embryo options which you can’t possibly be aware of now but which are likely already at play. This would, of course, require a wife or significant other who was bought into this idea a priori, which is pretty unlikely to arise spontaneously, so it’s best not to waste your time or others’ time looking generically.
  2. The amount of time required to sleep enough, exercise, diet effectively, and also generate extreme amounts of wealth/resources to join the Ark (or be of use to it) is so high as to require cutting out virtually all non-essential activities. Wealth is likely an inadequate “ticket” - i.e. random oligarchs won’t be able to just buy in.
  3. Outsiders are potentially dangerous. Ark beliefs are fairly extreme and will likely require cultivating high degrees of trust with insiders. If outsiders do not share these beliefs, they could endanger the group because of the high level of controversy surrounding various implementation details. Thus, extreme ‘in-group’ thinking is beneficial. Someone who expresses ideas or thoughts explicitly against the Ark essentially has to be ostracized.

We should probably specify this more thoroughly. Someone is likely a foe of the Ark if they:

  1. Express a belief that biology should not be meddled with
  2. Believe death is good or noble
  3. Believe that humans should not evolve
  4. Actively criticize accelerationism
  5. Have loyalty to an entity which would make life difficult for Ark members on the basis of being part of the Ark

Time should be allocated to people who:

  1. Actively believe in the Ark, its feasibility, desirability and its underlying ideological requirements
  2. Are likely to advance the health/wealth/influence requirements to reach the Ark

This exploration perhaps yields a set of questions rather than prescriptions:

  1. Would I rather skip this entire montage, live my life, and simply die?
  2. Is there a better or more direct way to join the Ark than generic wealth generation plus proximity, such as joining a company likely to get proximate to the outcome?
  3. If the Ark were possible, how long would it be before results became clear? How long before severe danger would manifest for Ark members, and from this perspective, to what extent is anonymity important or desirable? Is it even possible given the constraints?
  4. What governments would potentially protect Ark members? Is a sovereign actor ideal, or would something closer to a “Network State” work better?
  5. Does the Ark already exist? How long has it existed?

The thought exercise, for me - has been clarifying. It’s pointless to pretend to be a normie if your beliefs are extreme enough. You’re not doing anyone any favors by lying about what is important to you. Not your friends, potential coworkers, or mates.

If you believe in the Ark, and don’t reject its premise, it’s an all or nothing bet. There can be no hedges or half measures. Due to the extreme risk and controversy of the topic - it really is going to be one of those things where you’re either in, or you’re out. Seeing things clearly is a prerequisite to move forward.