DukkyDrake

DukkyDrake t1_iwiz5m3 wrote

Tachyum Prodigy Offers 128 AI Exaflops for Slovakia’s €70M World Fastest AI Supercomputer

>The datacenter footprint of the Air-Cooled Prodigy Supercomputer is 107 racks of computer nodes and 10 racks of storage. The Liquid-Cooled Prodigy-based Supercomputer is 48 racks of computer nodes and 10 racks of storage. Future deployment of a modular supercomputer on the first generation Tachyum Prodigy platform is scalable up to 4 DP exaflop and 1 AI Zetaflop at less than 70 megawatts for only 500 million EUR.

>The Prodigy-enabled Slovakian supercomputer would be in AI 7x more powerful than the NVIDIA Eos, which is anticipated to provide 18.4 exaflops, and over 25x more powerful than the Fugaku Supercomputer, which is currently the world’s fastest. Tachyum rack-based solutions offer comparatively more powerful performance than Tesla Dojo and the Cineca MARCONI100 computing systems, which are ranked among the largest and most powerful supercomputers today.
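As a rough sanity check on those headline numbers, here is a minimal back-of-envelope sketch using only the figures quoted above ("AI exaflops" are the vendor's own mixed-precision marketing figures, not measured benchmarks):

```python
# Back-of-envelope check on the quoted Tachyum figures (vendor claims, not benchmarks).
prodigy_ai_exaflops = 128        # Slovakian Prodigy system (from the quote)
eos_ai_exaflops = 18.4           # NVIDIA Eos, anticipated AI performance (from the quote)
print(prodigy_ai_exaflops / eos_ai_exaflops)    # ~6.96, i.e. the claimed "7x"

# Claimed future scale-up: 1 AI zettaflop (= 1000 AI exaflops) at <70 MW for 500M EUR.
scaleup_ai_exaflops = 1000.0
print(scaleup_ai_exaflops / 70)   # ~14.3 AI exaflops per megawatt
print(scaleup_ai_exaflops / 500)  # 2.0 AI exaflops per million EUR

# The initial 70M EUR system works out to a similar cost efficiency.
print(prodigy_ai_exaflops / 70)   # ~1.83 AI exaflops per million EUR
```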

I even found an older reference to the project in a 2020 article: "Slovakia aims to build world's fastest AI supercomputer", Technology News, by Nick Flaherty.

2

DukkyDrake t1_iw330uz wrote

>What if the future doesn’t turn out the way you think it will?

It probably won't. You do what people have always done since time immemorial: adapt to an ever-changing world and survive.

I expect something towards the bad end of the spectrum in the US through midcentury. If longevity escape velocity is achieved much before then, the bad case could be perpetual.

>The Economics of Automation: What Does Our Machine Future Look Like?

1

DukkyDrake t1_iujqbym wrote

Hope this doesn't inspire George Church to start working on his idea to solve global warming by making ocean bacteria immune to bacteriophages. The idea being that if ocean bacteria aren't regularly exterminated by bacteriophages, they would quickly eat up the dissolved CO2 in the oceans.

This was one of the most terrifying things I'd ever heard of at the time.

6

DukkyDrake t1_iu8tuu4 wrote

How about AGI dismantles Mercury and manufactures quadrillions of von Neumann berserkers to inhibit the rise of other intelligent species in the universe.

1

DukkyDrake t1_itnwmz5 wrote

> The main problem with this is that you are assuming all change is bad.

No, I just never worry about the good cases. The good case is the default state; one need only concern oneself with the bad cases.

>I don’t see how you’re jumping from fast technological progress to society stopping.

How can you not see that pathway? The biggest one is that fast technological progress creates a superintelligent agent and it accidentally kills everyone.

>which is whether it is more likely to be positive or negative for society

One cannot predict what the world looks like after the singularity, hence the name.
One can theorize about the kinds of tech the average person could get their hands on, just about anything permitted by physics, and what they would do with it. It would take just one person to make that choice.

>historically the industrial revolutions and things of that nature which are a type of scaled down singularity have been extremely positive for society

Did those past events, which played out over decades, provide each human on earth access to superhuman competence and labor? No.

There is no point considering any good when superhuman competence and labor could enable an endless number of maximally bad events. Some prankster is bound to create that suitcase containing 50 billion flying insect bots, 200 microns in size, each carrying a 100-nanogram payload of botulism toxin.

1

DukkyDrake t1_itnn0dn wrote

When I say societal changes, I refer to the large societal-level structures that enable society to function. If the supply chain for foodstuffs stops working, how long do you think you could survive after the shelves at your local supermarket go empty? That sort of thing.

If changes occur in days or weeks rather than on a decadal or century timescale, parts of your society could stop functioning before the tech that permits some workers to walk away reaches everyone. That's why fast societal-level changes destabilize society and endanger its ability to support its population. It's almost always bad; the only sorts of things that cause fast societal-level changes are usually actual revolutions or wars. The Industrial Revolution wasn't quick, it played out over the course of almost a century. Collapse all those changes down to a week or two and you might be closer to the mark.

That's also why there can't be a UBI before AI replaces most jobs; the desperate low-wage workers could just walk away, resulting in civilizational collapse.

>I think that society will change at whatever rate is necessary,

Society isn't a monolithic thing; it's a bunch of individuals doing their own thing. It's structured in such a way that there are only a few degrees of freedom that allow one to survive easily: basically, get a job, earn money, buy food and shelter. Increase those degrees of freedom with tech that allows survival independent of society and most people would walk away. Tech never spreads uniformly and instantaneously. You need society to survive until you reach a state where you no longer do, and transitioning from one state to the next is fraught with peril.

>this says nothing of the rate that social progress will move

The Technological Singularity has everything to do with that rate of change in human society, else no one would really care. The Singularity isn't a future point where companies are simply selling fancy new consumer products faster and faster; this event won't be under the control of baseline humans.

>is a hypothetical future point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization

> "centered on the accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue"—John von Neumann

>The concept and the term "singularity" were popularized by Vernor Vinge in his 1993 essay The Coming Technological Singularity, in which he wrote that it would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate.

1

DukkyDrake t1_itktg3q wrote

I'm saying it's the rate of societal change that is usually bad, nothing to do with specific technologies. When it comes to specific tech being bad post-singularity, history has no lessons to teach. A smartphone being good or bad doesn't quite relate to every random person on the planet getting access to tech that could end human civilization. Better smartphones or whatever are no good to anyone if they aren't alive to enjoy them. It's completely unreasonable to equate the singularity with anything that came before.

1

DukkyDrake t1_itj280d wrote

I think you need to brush up on singularity theory; you're comparing very different things and timelines.

I bet you don't have a nuke in your basement. What if you could, and it would cost you no effort or money?

Ex:

>Molecular manufacturing raises the possibility of horrifically effective weapons. As an example, the smallest insect is about 200 microns; this creates a plausible size estimate for a nanotech-built antipersonnel weapon capable of seeking and injecting toxin into unprotected humans. The human lethal dose of botulism toxin is about 100 nanograms, or about 1/100 the volume of the weapon. As many as 50 billion toxin-carrying devices—theoretically enough to kill every human on earth—could be packed into a single suitcase. Guns of all sizes would be far more powerful, and their bullets could be self-guided. Aerospace hardware would be far lighter and higher performance; built with minimal or no metal, it would be much harder to spot on radar. Embedded computers would allow remote activation of any weapon, and more compact power handling would allow greatly improved robotics.
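The internal arithmetic of that passage roughly checks out; here is a minimal sketch where the ~1 g/cm³ toxin density is my own assumption and everything else comes from the quote:

```python
# Rough arithmetic check on the quoted CRN nanoweapon figures.
lethal_dose_g = 100e-9       # 100 nanograms per person (from the quote)
devices = 50e9               # 50 billion devices per suitcase (from the quote)
device_size_um = 200.0       # smallest-insect length scale, 200 microns (from the quote)

# Total payload: 50e9 devices * 100 ng = 5 kg of toxin, i.e. 50 billion lethal doses,
# comfortably more than one per living human.
print(devices * lethal_dose_g / 1000)            # 5.0 kg

# Payload vs. device volume: 100 ng at ~1 g/cm^3 occupies ~1e5 cubic microns,
# versus ~8e6 cubic microns for a 200-micron cube -- the same order as the quoted "1/100".
toxin_volume_um3 = lethal_dose_g / 1.0 * 1e12    # grams -> cm^3 -> cubic microns
device_volume_um3 = device_size_um ** 3
print(toxin_volume_um3 / device_volume_um3)      # ~0.0125
```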

0

DukkyDrake t1_itglbj1 wrote

>The technological singularity—or simply the singularity—is a hypothetical future point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization.

There is no way to predict what the world would look like after the singularity event, but any system that quickly undergoes uncontrollable changes has historically been almost universally bad in the short run for anything that depends on that system for its survival.

If the singularity is fueled by a superintelligent monolithic agent, extinction of all life could be the result.

19

DukkyDrake t1_it9uood wrote

>The Economics of Automation: What Does Our Machine Future Look Like?

It depends on AGI. It will probably be the bad outcome if AGI is the size of a building and runs on gigawatts of electricity. If it runs on commodity hardware that a group of western professionals can pool their funds and acquire, it could be the good case, or a Mad Max Beyond Thunderdome-level worst case.

1

DukkyDrake t1_istxsyf wrote

Physical products will never scale as fast as digital ones.

What happens to prices when there is an oversupply? The capital that creates production capacity isn't deployed because its owners are humanitarians; they expect a profit. It takes controlled production and supply to maintain markets. If prices collapse, production will follow, even if that production costs very little. Unless every individual has their own pet AGI, I expect the same forces and dynamics to be in play in society.
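As a toy illustration of that dynamic (a minimal sketch with made-up numbers, not a real market model): once supply outruns demand, price falls toward marginal cost and producers cut output even though producing more is cheap.

```python
# Toy price/production loop illustrating the oversupply dynamic described above.
# All numbers are made up for illustration.
price = 100.0          # starting price per unit
marginal_cost = 5.0    # automated production is cheap, but not free
demand = 1000.0        # units buyers want at any reasonable price
supply = 1000.0        # current production capacity

for step in range(5):
    supply *= 1.5                               # capacity scales up quickly
    glut = max(supply - demand, 0.0) / demand   # fractional oversupply
    price = max(price * (1 - 0.5 * glut), marginal_cost)
    if price <= marginal_cost:
        supply = demand                         # producers cut output rather than sell at a loss
    print(f"step {step}: supply={supply:.0f}, price={price:.2f}")
```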

1