grundar t1_irs1w9b wrote

> by 2050 we’ll need to remove and store 5-16 billion tons per year.

This is the key point.

Direct air capture of CO2 is far too small-scale to be an excuse to slow down emissions reduction; that's not even on the table.

What DAC is useful for is:

  1. Offsetting hard-to-decarbonize edge cases.
  2. Reducing atmospheric CO2 to minimize overshoot.

To accomplish either of those goals, though, DAC will need to be deployed at massive scale by mid-century, and research shows it takes decades to scale an industry to that level of operations. Scaling up a large industry by 10x takes ~15 years, and fewer than two of those 15-year periods remain before 2050.
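The back-of-the-envelope arithmetic can be sketched as follows. The current-capacity figure and the 15-years-per-10x rate are illustrative assumptions (real-world global DAC capacity today is on the order of tens of kilotons per year), so treat this as a rough sanity check rather than a forecast:

```python
import math

# Illustrative assumptions (not from the comment above):
current_gt_per_year = 1e-5      # assumed current global DAC capacity, Gt CO2/yr (~10 kt/yr)
target_gt_per_year = 5.0        # low end of the 5-16 Gt/yr mid-century need
years_available = 2050 - 2022   # ~28 years
years_per_10x = 15              # assumed time to scale a large industry by 10x

# How many 10x scale-ups are needed vs. how many fit before 2050
tenx_steps_needed = math.log10(target_gt_per_year / current_gt_per_year)
tenx_steps_possible = years_available / years_per_10x

print(f"10x steps needed:           {tenx_steps_needed:.1f}")   # ~5.7
print(f"10x steps possible by 2050: {tenx_steps_possible:.1f}") # ~1.9
```

Under these assumptions, roughly six orders of magnitude of growth are needed but only about two 15-year scaling periods remain, which is the gap motivating early deployment.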

So the point of working on DAC now is not to have an excuse to delay emissions reductions; nobody serious is proposing that.
The point of working on DAC now is so it will be available at the scale needed in mid-century.

Mitigation at this scale takes decades of preparation.


grundar t1_ird670q wrote

> > For example, regulated capitalism
>
> You are talking like we are in the early 90s and we have a free market.

You'll note that I never once said "free market". I used the phrase "regulated capitalism" very deliberately, as history has shown us it tends to deliver better social outcomes than unregulated capitalism.

> Now the real picture is 2 investment funds own the world and all new companies are created with investments from them.

If you feel that is accurate, I would encourage you to learn more about the world.


grundar t1_irc1tm4 wrote

> Also consider that resilience in the nowadays world means order, violence and punishment.

Not necessarily.

For example, regulated capitalism has a certain amount of resilience to greed by way of co-opting it. One way to satisfy greed under regulated capitalism is to capture market share by making a better and/or cheaper product, which in turn offers benefits to the rest of society. In that way, an anti-social impulse (greed) can be co-opted into providing a pro-social outcome (improved goods for others).

It's by no means perfectly resilient, of course -- greed can and does lead to significant anti-social results under regulated capitalism -- but that does provide one example of a way in which resilience to bad behavior can be a result of system design rather than coercion through force.


grundar t1_iranrf0 wrote

> If their percentage of society becomes dangerously large then people will vote what to do or the system will naturally collapse and be replaced with something else.

i.e., if one of your foundational assumptions is wrong then the system won't work.

Put another way: the system won't work unless your underlying assumptions about human behavior turn out to be true. That is, unfortunately, a key flaw that has doomed many utopian visions in the past, and will likely continue to do so in the future.

One of the most important features a socioeconomic system can have is resilience to human misbehavior. A great many systems would work if humans behaved in just the right pro-social ways, but history has shown us that it's very naive to expect that from large (10M+) spread-out (country+) societies. One of the great successes of modern systems such as democracy and regulated capitalism is that they are fairly resilient in the face of bad actors.

Take, for example, democracy vs. dictatorship. At its best, a dictatorship can be amazing -- decisions are made quickly, efficient solutions are deployed, waste is minimized -- but in reality bad behavior tends toward the "Dictator's Trap": fear leads to poor information, which leads to poor decisions, which leads to poor outcomes; at its worst, dictatorship results in genocide and collapse. Democracy, by contrast, has much less variance -- it can never be as efficient as an enlightened dictatorship, but it will also never sink to the depths of a corrupt or murderous dictatorship, and history has shown that on average democracy tends to give better results.

So while it's certainly interesting to consider alternative social, political, and economic systems -- and while there are almost certainly better ones out there that we haven't tried yet -- those systems have to be resilient to human misbehavior to be even remotely realistic.
