
BobbyThrowaway6969 t1_j6p2not wrote

Double precision is the black sheep of the family. It was mostly thrown in for convenience. Consumer GPUs skimp on double precision because what do you care whether a vertex is a millionth of a pixel off or a billionth? Graphics has no real use for doubles, so why make the chip more expensive to produce?

Compute/scientific programming might need it, but that's not what the general public buys a GPU for.
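
To put a number on "who cares about a millionth of a pixel", here's a minimal C++ sketch (variable names are just made up for illustration) of how many digits each type actually keeps:

```cpp
#include <cstdio>

int main() {
    // A vertex position in pixels, nudged by a tiny sub-pixel offset.
    float  pos_f = 1920.0f + 0.0000001f;  // single precision: ~7 significant digits
    double pos_d = 1920.0  + 0.0000001;   // double precision: ~15-16 significant digits

    printf("float : %.10f\n", pos_f);  // the offset is lost -> 1920.0000000000
    printf("double: %.10f\n", pos_d);  // the offset survives -> 1920.0000001000
    return 0;
}
```

Single precision can't even hold that ten-millionth-of-a-pixel nudge, and it makes zero visible difference on screen, which is exactly the point: consumer cards spend the transistor budget on more float units instead.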

3

BobbyThrowaway6969 t1_j6p131z wrote

I left "1+1 math problems at the same time" pretty vague on purpose. Math in my analogy isn't referring to processor arithmetic, it refers to "stuff" a processor can do. They don't all have to be on the same task. Some can handle vertices while others handle pixels.

>they work on the exact same kind of problems the CPU does.

They can do arithmetic the same way, sure, but you wouldn't exactly expect to be able to communicate with a mouse & keyboard using one of the cores in a GPU.

The instruction set for a GPU (based around arithmetic) is definitely nothing like the instruction set of a CPU lol. That's what I meant by 2nd grader vs mathematician.

3

BobbyThrowaway6969 t1_j6mehn5 wrote

It takes energy to fuse atoms together, but the act of fusing them also releases energy. In other words, you have to break a few eggs to make an omelette.

As long as the energy coming out of fusion is higher than the energy needed to do it, a star can exist happily.

For all the elements before iron, this is the case: more energy comes out than goes in.

Iron, however, is the first element that takes MORE energy to fuse than it gives back, so the star isn't so happy anymore. It now has to use a lot of eggs to make a rather sh***y omelette.
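
If you want the slightly less eggy version, the energy balance of fusing two nuclei A and B into C is the standard Q-value bookkeeping (just a sketch of the usual formula, nothing specific to any one star):

```latex
Q = \left[\, m(A) + m(B) - m(C) \,\right] c^{2}
```

Binding energy per nucleon climbs to roughly 8.8 MeV around iron-56/nickel-62 and then falls, so below iron Q is positive (energy out), while fusing iron and heavier nuclei makes Q negative: the star pays for its own omelette.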

1

BobbyThrowaway6969 t1_j6lg0ft wrote

The CPU is a mathematician that sits in the attic working on a new theory.

The GPU is hundreds of thousands of 2nd graders working on 1+1 math all at the same time.

These days the CPU is more like 8 mathematicians sitting in the attic, but you get the point.

They're both suited for different jobs.

The CPU could update the picture that you see on the display, but that's grunt work.

Edit: I don't mean the cores in a GPU are stupid, just that their instruction set isn't as complex & versatile as a CPU's.
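
If you want to see the analogy in actual code, here's a minimal CUDA sketch (kernel name and sizes invented for illustration): each GPU thread is one 2nd grader doing a single 1+1-style sum, and the CPU just hands out the worksheets and collects them.

```cuda
#include <cstdio>

// Each GPU thread does one trivial addition -- that's its whole job.
__global__ void addOne(const int* in, int* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i] + 1;
}

int main() {
    const int n = 1 << 20;  // ~a million tiny problems
    int *in, *out;
    cudaMallocManaged(&in,  n * sizeof(int));
    cudaMallocManaged(&out, n * sizeof(int));
    for (int i = 0; i < n; ++i) in[i] = 1;

    // Launch enough threads that every element gets its own "2nd grader".
    addOne<<<(n + 255) / 256, 256>>>(in, out, n);
    cudaDeviceSynchronize();

    printf("out[0] = %d, out[%d] = %d\n", out[0], n - 1, out[n - 1]);  // 2 and 2
    cudaFree(in);
    cudaFree(out);
    return 0;
}
```

A CPU would grind through that same loop a handful of elements at a time; the GPU wins because the work is a million independent, dead-simple sums, which is roughly what shading pixels looks like.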

663

BobbyThrowaway6969 t1_j6h3k8t wrote

The digit 4 in decimal form doesn't actually exist inside the computer. The computer either draws pixels in the shape of a 4 on the screen, or there's a 4 printed on your keyboard; those are the only places "4" exists like that. The moment you press the 4 key, it's already in binary form: your keyboard sends the binary scancode for that key to the computer.
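
A tiny C++ sketch of that last part (just illustrative, not how a keyboard driver actually works): even the character '4' is nothing but a binary number to the machine.

```cpp
#include <cstdio>

int main() {
    char key   = '4';          // the *character* 4 -- not the quantity four
    int  value = key - '0';    // recover the quantity four from the character code

    printf("character code : %d\n", key);        // 52 (the ASCII code for '4')
    printf("binary form    : ");
    for (int bit = 7; bit >= 0; --bit)
        printf("%d", (key >> bit) & 1);          // prints 00110100
    printf("\nnumeric value  : %d\n", value);    // 4
    return 0;
}
```

The physical key actually sends yet another binary number (a scancode for that key position), which the OS maps to the character code above; at no point does a decimal "4" travel anywhere.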

2