Submitted by Mr_Hu-Man t3_y131bf in singularity
4e_65_6f t1_irv74gd wrote
GPT-3 uses sequences to 'predict' what word comes next.
You could probably train it to predict the weather by feeding it a database of sequences of weather events; it should then output the event most likely to happen next, based on past patterns.
In theory, this principle should work for anything, as long as your database accurately describes the events as an understandable sequence of text.
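The idea above can be sketched with a toy next-event predictor. This is only a bigram frequency model, not a GPT-style transformer, and the weather events are an invented example dataset; but it illustrates the same principle of predicting the most likely next item in a sequence from past reference:

```python
from collections import Counter, defaultdict

# Hypothetical toy dataset: a sequence of daily weather events.
events = ["sunny", "sunny", "rainy", "rainy", "sunny", "cloudy",
          "rainy", "rainy", "sunny", "sunny", "cloudy", "rainy"]

# Count how often each event follows another (a bigram model --
# a vastly simpler stand-in for GPT-3's next-token prediction).
transitions = defaultdict(Counter)
for prev, nxt in zip(events, events[1:]):
    transitions[prev][nxt] += 1

def predict_next(event):
    """Return the most frequently observed next event, or None if unseen."""
    counts = transitions[event]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("cloudy"))  # in this toy data, "cloudy" was always followed by "rainy"
```

Scaling this up is exactly where it breaks down: a real language model learns far richer patterns than raw co-occurrence counts, but both remain limited by how well the training data actually captures the dynamics of the system.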
Mr_Hu-Man OP t1_irv7m3h wrote
This is what I’m getting at: could we start extremely simple and scale up from there until we have a future-prediction tool?
footurist t1_irwffnf wrote
If you're thinking of going towards capabilities even remotely approaching Laplace's demon (even just for tiny chunks of the universe, like the weather of city X), then sadly (or not?) that kind of certainty is way too computationally expensive and requires datasets no one can assemble.
However, much weaker variants may be possible; I don't know enough to say.
SPOILER
>!That said, in the tv show Devs they got it to work, lol.!<