
MrMunchkin t1_j7rb14a wrote

Yikes, there's too much to unpack here, but I think what you're referencing are the images created from the archive. Are you familiar with the 3 stages of the pipeline?

Remember, too, that JWST has 10 detectors, and the SSR is limited to only ~65 GB, so much of the processing is done on board to reduce data excess. Tons more info can be found here: https://jwst-docs.stsci.edu/jwst-general-support/jwst-data-volume-and-data-excess

More info on the data pipeline can be found here: https://jwst-docs.stsci.edu/jwst-science-calibration-pipeline-overview/stages-of-jwst-data-processing#:~:text=The%20processing%20of%20JWST%20data%20goes%20through%203,%28slope%29%20images.%20Stage%202%20calibrates%20the%20slope%20images.

Also keep in mind that JWST takes thousands of exposures across its instruments. That data accumulates in the SSR and is streamed down to Earth every 12 hours or so.
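As a rough sanity check on those numbers (a back-of-envelope sketch, assuming the full ~65 GB recorder is dumped each 12-hour pass, with decimal gigabytes):

```python
# Back-of-envelope: if the ~65 GB solid-state recorder (SSR) fills up
# and is downlinked roughly every 12 hours, what average data rate
# does that imply? (Figures taken from the comment above.)
ssr_bytes = 65e9             # ~65 GB SSR capacity (assumed decimal GB)
dump_interval_s = 12 * 3600  # ~12 h between downlink passes

avg_rate_bps = ssr_bytes * 8 / dump_interval_s
print(f"Implied average downlink rate: {avg_rate_bps / 1e6:.1f} Mbit/s")
```

That works out to about 12 Mbit/s averaged over the day, comfortably within JWST's ~28 Mbit/s Ka-band downlink, which is why onboard processing plus scheduled contacts can keep up with the data volume.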


axialintellectual t1_j7rmtl1 wrote

> there's too much to unpack here

Well, no, there really isn't. You say Webb produces data 'without intervention by a human', and 'a huge amount of findings [are] produced by an algorithm'. That's a really weird way of putting it, because the vast majority of Webb time is obtained by individual projects designed to look at specific things, with dedicated analysis plans. Of course there's a non-negligible amount of bycatch, so to speak, but that's not what I read in your comment.
