Frumpagumpus t1_j1wv2by wrote

ok but we're on reddit, and also, I mean the people talking about it are the ones implementing it (i'm no bigwig but I do have a pull request or two (to mostly unimportant projects) / am an api consumer/implementer)

speaking of implementing actual stuff with the api, nothing I plan on in the near future would really have an ethical dimension, i don't think. though I could see possibly doing stuff in like a 5+ yr timeframe where I might pause for a second lol.

another thought is that legislation is mostly written by staffers (from what I know of the USA system) and they might be here talking about this stuff...

1

Frumpagumpus t1_j1wdtft wrote

> I want to try and safely and ethically consider the societal application of AI,

hypotheticals have already been considered ad nauseam, i take this to imply you would advocate for some sort of pause, which I don't think is either possible or desirable.

I am just pointing out it's all been rehashed over and over already

1

Frumpagumpus t1_j1wbsed wrote

lesswrong has been spilling copious amounts of ink on this topic for like 2 decades. talking is done (well, actually we are talking about it more than ever and bringing a lot of new people into the conversation), doing is now. what do you want us to do, consult 5 yr olds to see what they think?

in many ways (not all ways) technological progress has stagnated for years up until this point.

1

Frumpagumpus t1_j1mqw9q wrote

thanks for the compliment, merry xmas,

to me, principle is to rule as theory is to implementation

agents traverse space, but an agent doesn't have the ability to traverse all of space; also, some parts of space will end an agent, and some traversals are not fair or logical
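
(to make that concrete, here's a rough python sketch of the metaphor; the grid, the lethal cell, and the moves are all made up for illustration)

```python
# toy model: an agent traverses a space it cannot fully traverse,
# and some parts of that space end it

LETHAL = {(2, 2)}        # parts of space that end the agent
BOUNDS = range(0, 5)     # the agent can't leave this region

def step(position, move):
    """Apply one move; return the new position, or None if the agent ends."""
    x, y = position
    dx, dy = move
    nx, ny = x + dx, y + dy
    if nx not in BOUNDS or ny not in BOUNDS:
        return position          # can't traverse all of space: the move is refused
    if (nx, ny) in LETHAL:
        return None              # some parts of space end the agent
    return (nx, ny)

# usage: a traversal that runs into both kinds of constraint
pos = (0, 0)
for move in [(1, 0), (1, 0), (0, 1), (0, 1)]:
    pos = step(pos, move)
    if pos is None:
        print("agent ended")
        break
    print(pos)
```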

1

Frumpagumpus t1_j1mn38y wrote

> ambiguous

> Instead focus on principles that all stakeholders can agree upon.

idk, even in math the ZFC axiom set is not universally used, and some basic axioms like the axiom of choice are considered controversial, and that's about as low level/universal as you could possibly get
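
(for reference, the axiom of choice in its usual form, nothing exotic, just the standard statement that every family of nonempty sets admits a choice function)

```latex
% Axiom of Choice: every family X of nonempty sets admits a choice function f
\forall X \,\Big[\, \varnothing \notin X \;\Rightarrow\;
  \exists f\colon X \to \textstyle\bigcup X \;\;
  \forall A \in X \;\big( f(A) \in A \big) \,\Big]
```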

1

Frumpagumpus t1_j1kftsb wrote

since I am an obsessive autist, i hope u don't mind if i circle back to this. let me rephrase:

I think the selfish gene hypothesis is kind of like saying the purpose of a computer virus is to replicate some snippet of assembly code it compiles to. I mean yes, it does that, but the purpose of the virus is probably better described as "steal your bank password"

it's not a perfect analogy, because biology is actually more complicated and has more layers of abstraction, and there is more indirection and competition between the goals of the layers (e.g. maybe something like dna -> rna -> proteins -> bioelectrical and chemical signaling environment -> collections of cells -> organs -> organism -> population -> ecosystem). a similar hierarchy in a computer might be like processor -> assembly code -> thread -> daemon/service -> operating system -> network (but a computer is more deterministic, aligned, and straightforward than a biological system). just cuz something is at the lowest level doesn't mean it gets the final say on what the purpose of the whole is

1

Frumpagumpus t1_j1ju1hw wrote

i think there are a lot of aspects of it you can anticipate even if it is incredibly alien, you might not be right about everything but you will probably be right about some things.

it's quite possible to speculate about cultural changes that might arise from near-lightspeed communication, thinking at 5 gigahertz, brain encryption, software brain cloning, software brain merging, even though those are all alien scenarios to our current daily lives; you can draw boundaries around the space of possibilities and you might be right about them.

i have no idea what comes after a dyson swarm, but I doubt it will be finished (assuming no ai ruin, which you can also speculate about) sooner than a hundred and fifty years from now, maybe even 500 years, which is pretty far out.

4

Frumpagumpus t1_j1h3l82 wrote

i think dawkins' selfish gene hypothesis is mostly wrong.

biological systems are, in programming terms, function factories and not functions themselves.

They don't have discrete goals, just constraints. They amble along in a higher dimensional "goal space".
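
(rough python sketch of what I mean by "function factory"; the genome/trait/energy-budget names are invented for illustration, not actual biology)

```python
# a function factory returns behavior shaped by a recipe plus an environment,
# rather than encoding one fixed goal up front

def make_organism(genome):
    """Factory: returns a behavior function shaped by `genome`."""
    def behave(environment):
        # behavior depends on both the recipe and the context; a constraint
        # (here, a crude energy budget) bounds it rather than pinning it to
        # one discrete goal
        energy = environment.get("energy", 0)
        return [trait for trait in genome if trait_cost(trait) <= energy]
    return behave

def trait_cost(trait):
    # hypothetical stand-in cost: pretend longer trait names are costlier
    return len(trait)

# usage: the same factory yields different behavior in different environments
organism = make_organism(["forage", "signal", "reproduce"])
print(organism({"energy": 7}))   # ['forage', 'signal']
print(organism({"energy": 12}))  # all three traits fit the budget
```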

but yes I'm sure there will be some better scissor statements.

similar to your worry, but what I would be more worried about, as far as existential risks go, is someone using a programming AI to develop a family of viruses that almost simultaneously encrypt all computer memory on the planet lol.

1

Frumpagumpus t1_j1h30ii wrote

a war is one way to characterize it; i see it more as communities forming, enclaves forming, inevitably, as a result of the communication constraint that distance creates.

even in a microprocessor different parts of the chip will have their own memories, registers, caches.

1

Frumpagumpus t1_j1h1pyn wrote

i read everything there, i posted half of it lol. Did you read the extreme views? The bot is just suggesting a slightly more interesting version of secular humanism (minus anthropocentrism).

Minus that take, your point is fair, but uh, yea, that exact same thing is gonna happen with the other religions and whatnot

people gonna people

I'm an ex-christian, but my problem was never with the social structure of the churches i grew up in, it was that their axioms were just plain wrong and led to ridiculousness, and they couldn't let them go. I would even describe some subset of the practices that constitute Christianity as basically good - singing together, being at least a bit cautious around sex (they take it too far obviously, unless you are in like a unitarian universalist church), volunteering, prayer even sort of, forgiveness (which can be under-emphasized), etc.

There was always a little bit of spice that churches had that hackerspaces lacked: at least a little bit of shared ideology, a communal intent and willingness to set aside individual self-interest (typically at a hackerspace there will be a couple of ppl slaving away selflessly, but not a communal culture of it, and e.g. with the sex thing, there's a reason churches are way more popular w/ women than hackerspaces: singles groups make the rest of the church a safer space)

(ye, i'll probably end up with some splinter group considering how popular polyamory is among rationalists lol, which, i'm not a monogamist, i'm more of a transcend-the-animal-urge-ist lol)

also there are groups like sunday assembly or unitarian universalists, but they are too watered down; they don't have a true shared eschatology or goal or vision

not to mention chatgpt or gpt4 could do a way better job than a pastor in so many ways.

1

Frumpagumpus t1_iz15rex wrote

the basic gist is there's a feedback loop where you create an intelligence which makes a smarter intelligence.
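
(a toy sketch of that loop in python; the improvement factor and generation count are made-up numbers, the point is just the compounding)

```python
def takeoff(initial=1.0, improvement_factor=1.1, generations=50):
    """Each generation of 'intelligence' builds the next a bit better than itself."""
    level = initial
    history = [level]
    for _ in range(generations):
        level *= improvement_factor   # the compounding is the whole argument;
        history.append(level)         # the exact numbers are not
    return history

levels = takeoff()
print(f"after 50 generations: {levels[-1]:.0f}x the starting intelligence")  # ~117x
```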

it's hard to say how fast it will happen. but one thing to keep in mind is that the difference between how fast computers think vs how fast humans think is like comparing the speed of the superhero the Flash to a regular human.
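
(very rough back-of-envelope on that speed gap; treating a clock cycle as a "thought tick" is a huge simplification, and the 200 Hz neuron number is a generous ballpark, not a measurement)

```python
neuron_firing_rate_hz = 200      # ballpark upper end for sustained biological firing
cpu_clock_hz = 5e9               # a 5 GHz core

speedup = cpu_clock_hz / neuron_firing_rate_hz
print(f"raw tick-rate ratio: ~{speedup:,.0f}x")            # ~25,000,000x

# at that ratio, one wall-clock second would span this much subjective time
subjective_years = speedup / (60 * 60 * 24 * 365)
print(f"one real second ~ {subjective_years:.1f} subjective years")  # ~0.8 years
```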

4

Frumpagumpus t1_iyjopcp wrote

i was arguing in favor of choosing manual labor careers in another thread, but imo, if you are already a programmer, i would try and use AI to do cool stuff right now. i just wouldn't start a cs degree right now lol. well, actually i would, but only if i was financially secure and not trying to get income lol (because i think computer science is philosophically important).

i think of the takeoff i have in mind as mild, but i think most people would think of it as fairly extreme XD e.g. 15 yrs from now almost all labor is automated and the AI is building seasteads to launch from to begin construction of its dyson sphere swarm.

1

Frumpagumpus t1_iy4i893 wrote

not just AI, i use an operating system called nixos which gives you somewhat unprecedented levels of control over building your operating system. or rather, it makes that level of control far more accessible than it had been before (when you would rely exclusively on your distro's package maintainers to build your software). i think nixos will only get more widespread in industry, semi-usurping docker (in some of its roles) in some environments. probably there are other examples (maybe yubikey/password managers somewhat?).

for example, i had never recompiled my kernel myself before using it.

2