SurroundSwimming3494 t1_j3g9vpv wrote

I'm almost positive he meant generating movies eventually, not later this month, because there is no way, even considering the progress AI has made in recent years, that full-blown movies are already possible.

Of course, we'll know later this month, but I'm extremely skeptical, to say the least.

56

TheSecretAgenda t1_j3gapb4 wrote

I could certainly see animated movies being ready very soon, and photorealistic live action in a couple of years.

13

SurroundSwimming3494 t1_j3gbpfi wrote

>I could certainly see animated movies being ready very soon.

Maybe soon, but not weeks soon.

13

Yuli-Ban OP t1_j3j9mtd wrote

That's not how image generation works. Animation and live action are not any more or less difficult than each other for a diffusion model. If anything, live action has extraordinarily more data to learn from than animation: with animation you have many different art styles, each with only a limited number of frames (though popular, widely used styles like the standard anime look, beanmouth, rubberhose, etc. naturally outweigh the more niche and unique styles), whereas there are quite literally thousands of years' worth of live-action video uploaded to YouTube alone.
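To make that point concrete, here is a minimal, hypothetical sketch of one diffusion denoising step in PyTorch. The `TinyDenoiser` stand-in and the toy schedule value are illustrative assumptions, not any real video model; the point is only that the model operates on pixel tensors and never branches on whether a frame is animated or live action.

```python
import torch
import torch.nn as nn

class TinyDenoiser(nn.Module):
    """Stand-in for the U-Net a real diffusion model would use."""
    def __init__(self, channels: int = 3):
        super().__init__()
        self.net = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, noisy_frame: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # A real model conditions on the timestep t; this toy one ignores it.
        return self.net(noisy_frame)

def estimate_clean_frame(model, x_t, t, alpha_bar_t):
    """One reverse-diffusion step: predict the noise, then back out the clean frame."""
    predicted_noise = model(x_t, t)
    return (x_t - (1 - alpha_bar_t).sqrt() * predicted_noise) / alpha_bar_t.sqrt()

# Whether this tensor came from an anime frame or a live-action frame is
# invisible to the model; only the pixel statistics differ.
frame = torch.rand(1, 3, 64, 64)                 # any frame, any art style
x_t = frame + 0.1 * torch.randn_like(frame)      # noised version of the frame
model = TinyDenoiser()
x_0 = estimate_clean_frame(model, x_t, torch.tensor([10]), torch.tensor(0.9))
print(x_0.shape)  # torch.Size([1, 3, 64, 64])
```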

2

Equivalent-Ice-7274 t1_j3orum8 wrote

Do you think this AI-generated movie technology could be used for animating robots?

1

Gab1024 t1_j3h3vgj wrote

He did say in the interview that he thinks fully generated movies with real people will be there in 1 or 2 years.

8

DukkyDrake t1_j3ho2r8 wrote

VR is next.

>HyperReel enables "6 Degree-of-Freedom video"... It runs 18 frames-per-second at megapixel resolution on an @NVIDIA RTX 3090, using only vanilla PyTorch.

HyperReel
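For a sense of what a figure like "18 frames-per-second at megapixel resolution... using only vanilla PyTorch" means in practice, here is a rough benchmarking sketch. The `render_frame` placeholder is a hypothetical stand-in, not HyperReel's code; it only marks where a real 6-DoF renderer would go.

```python
import time
import torch

def render_frame(camera_pose: torch.Tensor, height: int = 1024, width: int = 1024) -> torch.Tensor:
    # Placeholder for a real 6-DoF renderer; just returns a dummy megapixel image.
    return torch.zeros(3, height, width, device=camera_pose.device)

device = "cuda" if torch.cuda.is_available() else "cpu"
pose = torch.eye(4, device=device)  # camera pose (3 rotations + 3 translations = 6 DoF)

n_frames = 100
if device == "cuda":
    torch.cuda.synchronize()  # make sure prior GPU work doesn't pollute the timing
start = time.perf_counter()
for _ in range(n_frames):
    _ = render_frame(pose)
if device == "cuda":
    torch.cuda.synchronize()  # wait for all rendering to actually finish
elapsed = time.perf_counter() - start
print(f"{n_frames / elapsed:.1f} frames per second")
```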

7

DreamsOfCyber t1_j3gvl7p wrote

We've already seen AI-generated videos from a couple of months ago, so I wouldn't be surprised if we could make shorter films, albeit very incoherent ones. NVIDIA, I believe, did showcase an AI that could take instructions (how to move the camera, where to look, etc.) and generate a video from them, and sure, the video itself was messy, but it DID follow the instructions almost perfectly.

5

starstruckmon t1_j3gwfqu wrote

That's definitely what he meant. We'd be lucky to have Deep Floyd released by the end of the month.

1

enilea t1_j3gzs7m wrote

I did see months ago that they were planning an animation module, but I assume it's going to be short, GIF-like content, like the stuff we've seen until now.

1

drizel t1_j3joaf2 wrote

I think he said they're up to 30 generations per second this month, so essentially real-time rendering (faster than the 24 frames per second of standard film).

1