
MrZwink t1_j20tvw0 wrote

A transistor is a semiconductor. It moves electrons through a semipermeable barrier. This is an interaction at a quantum level. The smaller you make them, the more prone to quantum tunneling they become. So no, a transistor is not just on or off; 0.001% of the time it's both or neither.

There are safeguards in place in computers to check for random bit flips because they're needed. It's called hashing.
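A minimal sketch of the idea in Python, using SHA-256 from hashlib as the hash (the payload and the single-bit "error" are just illustrative; real hardware more often uses parity or ECC, but the detect-by-recomputing principle is the same):

```python
import hashlib

def checksum(data: bytes) -> str:
    """Return the SHA-256 digest of the data as a hex string."""
    return hashlib.sha256(data).hexdigest()

# Store the hash alongside the data when it is written.
original = b"some payload"
stored_hash = checksum(original)

# Simulate a random bit flip in the first byte.
corrupted = bytes([original[0] ^ 0x01]) + original[1:]

# Re-hashing reveals the flip: the digests no longer match.
print(checksum(original) == stored_hash)   # True
print(checksum(corrupted) == stored_hash)  # False
```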

I'm not saying computers don't work. I'm saying a computer is a machine that processes information on a quantum level, and it is impossible to separate the computer from its quantum interactions. Heisenberg's uncertainty principle applies whether you want it or not.

You cannot build a house without bricks.

1

MrZwink t1_j20m791 wrote

I love how you say that like it's a surprise. I know I'm correct. I'm probably getting a lot of downvotes because I really don't like Kaku's unsubstantiated blabbering. And it shows.

While you're right about one atom, a brain is more than that. You could copy the entire brain, but anything in "active" memory would be destroyed. There are millions of quantum interactions ongoing at any one time. But then this is essentially the same problem: you have to choose between measuring the state or the interactions.

2

MrZwink t1_j1z3xhu wrote

Can't be done: Heisenberg's uncertainty principle. Mr. Kaku knows this. He just likes to fantasize in the media.

You cannot make a copy of a brain, because you would need to copy all the particles and their interactions at the same time to "image" it. But one cannot measure where a particle is and what it is doing at the same time, because the measurement disturbs the interactions.
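For reference, the relation being invoked is usually stated as a trade-off between the uncertainties in position and momentum:

$$\Delta x \, \Delta p \geq \frac{\hbar}{2}$$

That is, the more precisely you pin down where a particle is ($\Delta x$ small), the less precisely you can know its momentum ($\Delta p$ large), and vice versa.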

This is something inherent to quantum mechanics, not a solvable issue that can be overcome.

−3

MrZwink t1_j1op1bl wrote

Turkish. Santa is an Americanized version of Saint Nicholas, a Catholic bishop from Turkey, who is celebrated in a national children's holiday that was brought over by Dutch settlers to the United States and then Americanized by the Coca-Cola Company.

0

MrZwink t1_j0yvwb5 wrote

I completely agree. I don't think, however, that "watch out" is an ample warning, and we shouldn't pin people for not attaining a certain level. It's okay that 60% of society never attains university level; they don't need to. The true question is what we will do with the mouths we need to feed when these people are no longer "needed" in the workforce.

Right now, if you don't work, you're out on the streets.

1

MrZwink t1_j0xq5oi wrote

The difference now is that computers will exceed human capabilities at specific tasks.

Why would you need a Japanese translator when you have an AI that can pass the Japanese university entrance exams? (While 60% of Japanese people never attain that level.)

Why would you need a human driver when you have an AI that does the same with fewer accidents?

Artists specialise in one field. The AI can draw anything from Dalí to Picasso to Mondriaan to Banksy.

This isn't like the 1900s, where textile mills created work for technicians, or the 2000s, where computers created work for programmers. Guess what: the AI can already program. It can solve math problems humans can't. The AI can train itself and maintain itself, and with enough training it'll be able to manage projects, do finance, etc., etc.

And while there might initially be maintenance jobs, the AI will eventually be able to do those as well. As things stand now, AI will automate 95 out of 100 jobs by 2065.

29

MrZwink t1_iziexmn wrote

They find correlation, not causation.

This means they have notorious difficulty with queries that make no sense. A good example is Galactica, Facebook's scientific-paper AI: ask it for the benefits of eating crushed glass and it tries to answer. It doesn't notice the question is flawed; it just tries to find data that correlates to the query and makes stuff up.

It's an open question whether we will ever be able to teach AI common sense.

6