regular_modern_girl t1_iscvnc2 wrote

A lot of culture-bound syndromes are also partly based on this phenomenon, particularly those concerning some type of perceived witchcraft.

One of the more peculiar examples is Koro, aka “penis panic”, which has been reported in a number of different cultures across the world. It basically involves men becoming so paranoid that a witch is making their genitals shrink and/or stop working properly that they actually become delusional and hallucinate their penises retracting into their bodies (to some extent the retraction may even be real, since anxiety can cause testicular/penile retraction; the difference is that in reality it obviously isn’t permanent).

The funny thing about culture-bound syndromes is that not only can (fear of) witchcraft cause them, but appropriate magic charms or spells can actually be effective in treating them (via the regular placebo effect, obviously).

13

regular_modern_girl t1_isbetxk wrote

To clarify, I guess what I meant by “distinguish” is whether different isotopes behave fundamentally differently as far as biochemistry is concerned (apart from the obvious difference that some are radioactive and give off ionizing radiation that destroys biological macromolecules, etc.), in the way that different elements do, and the only clear example of that seems to be the hydrogen isotopes. Apart from deuterium, it seems that as long as an isotope isn’t radioactive, it can be used biologically just the same as the more common isotope of that element (assuming a biochemically-relevant element here, obviously) without causing any problems.

Although tbf, I guess the comment that started this thread was asking if the human body (in particular the thyroid gland) “knew the difference” between different isotopes of iodine, and what I’m sort of asking here is: do C3 plants take in more or less of a given isotope than C4 plants because something is making a distinction between the isotopes (regardless of whether or not there’s a specific “purpose” to taking in more of one than the other), or is it just a side effect of getting carbon from CO2 in the air versus carbonic acid/carbonate in the water?

1

regular_modern_girl t1_isbb3dq wrote

Is that really biological systems distinguishing, though, or is that just human researchers looking at isotope ratios and using them to determine where a given element in a biochemical context came from? Like do plants that use C3 photosynthesis have a different ratio than C4 plants because something about the process causes them to preferentially take in one carbon isotope over another, or is it just because (as you said) they’re getting their carbon from different sources?
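
To make the “ratio” part concrete: those comparisons are usually expressed in δ13C (“delta”) notation relative to a reference standard, rather than as raw ratios. A minimal sketch of that calculation (the standard ratio and the plant values below are rough, illustrative numbers, not measurements from any particular study):

```python
# Rough sketch of delta-13C notation for carbon isotope ratios.
# VPDB_13C_12C and the example ratios are approximate, for illustration only.

VPDB_13C_12C = 0.011180  # approximate 13C/12C ratio of the VPDB reference standard

def delta13C(sample_ratio):
    """Convert a raw 13C/12C ratio into per-mil delta notation."""
    return (sample_ratio / VPDB_13C_12C - 1) * 1000

# Typical published leaf-tissue values are roughly -28 per mil for C3 plants
# and -14 per mil for C4 plants (approximate ballpark figures).
examples = {
    "C3 plant (approx.)": 0.010867,  # works out to roughly -28 per mil
    "C4 plant (approx.)": 0.011023,  # works out to roughly -14 per mil
}

for name, ratio in examples.items():
    print(f"{name}: delta13C = {delta13C(ratio):+.1f} per mil")
```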

1

regular_modern_girl t1_is6e6zn wrote

From my knowledge of biochemistry (and chemistry in general), this is correct: there isn’t any difference in how biological systems handle different isotopes of a given element, with the notable exception of the hydrogen isotopes, which are apparently different enough from each other atomically that they have notably different effects on biochemistry. In particular, an organism given only heavy water in lieu of regular water will eventually become “deuterated”—meaning that the majority of its hydrogen-1 is replaced by hydrogen-2, or deuterium—and suffer a variety of symptoms strikingly reminiscent of radiation poisoning before eventually dying, even though deuterium is not itself radioactive. This apparently happens because the added mass of deuterium throws off a bunch of key cellular processes on a molecular level enough that basic enzymes don’t work correctly. Note that this doesn’t happen just from consuming small amounts of heavy water, which is actually considered non-toxic on its own, so long as most of the water you’re consuming has H-1. But this is the one unique exception to my knowledge (at least this is what my chemistry professor claimed).

EDIT: I looked more into this, and it does indeed appear to be the case. There have actually been studies with oxygen-18 (which, like hydrogen-2 relative to hydrogen-1, is a rarer, heavier, but non-radioactive isotope, making up something like 0.2% of Earth’s oxygen) showing that aerobic organisms can safely breathe it exclusively with no ill effects, unlike replacing an organism’s water intake with heavy water. So essentially, biology really doesn’t distinguish between isotopes, and it usually doesn’t matter unless it’s a heavier isotope of hydrogen, or a given isotope is giving off ionizing radiation.
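
To put a rough number on why hydrogen is the special case: kinetic isotope effects scale with the relative mass difference between isotopes, and that difference is in a completely different league for hydrogen than for anything else biochemically relevant. A quick back-of-the-envelope comparison (approximate atomic masses, in daltons):

```python
# Back-of-the-envelope look at why deuterium is the odd one out: the *relative*
# mass jump between isotopes is what drives kinetic isotope effects, and for
# hydrogen it is enormous compared with heavier elements.

isotope_pairs = {
    "H-1 -> H-2 (deuterium)": (1.008, 2.014),
    "C-12 -> C-13":           (12.000, 13.003),
    "N-14 -> N-15":           (14.003, 15.000),
    "O-16 -> O-18":           (15.995, 17.999),
}

for label, (light, heavy) in isotope_pairs.items():
    increase = (heavy - light) / light * 100
    print(f"{label}: ~{increase:.0f}% heavier")

# Deuterium comes out ~100% heavier than ordinary hydrogen, while the heavier
# isotopes of C, N, and O are only ~7-13% heavier, which is roughly why only
# hydrogen substitution noticeably perturbs enzyme kinetics.
```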

8

regular_modern_girl t1_is42e64 wrote

They might have meant “saltiest seawater in the world”, but yeah, there are endorheic lakes or pools that are far more saline than any part of the ocean.

The average salinity of the ocean is about 3.5% iirc; the southern portion of the Great Salt Lake is about 5%, while the northern portion is as much as 20% (they’ve been separated by a railroad causeway since the 1950s, hence the drastic difference in water chemistry); the Dead Sea is about 30%; Lake Assal in Djibouti is about 35%; and Don Juan Pond in Antarctica (the most saline known body of water on Earth, unless you count the concentrated brine pools that sometimes form deep in the ocean) has been measured at over 40% iirc.
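
If percentages feel abstract, here’s the same comparison converted into grams of dissolved salt per kilogram of water, using the rough figures above (they’re my approximations, not precise measurements):

```python
# Convert the rough salinity percentages above into grams of dissolved salt
# per kilogram of water, just to make the comparison more tangible.
# (Approximate figures from the comment, not precise measurements.)

salinity_percent = {
    "Average ocean":               3.5,
    "Great Salt Lake (south arm)": 5,
    "Great Salt Lake (north arm)": 20,
    "Dead Sea":                    30,
    "Lake Assal":                  35,
    "Don Juan Pond":               40,
}

for body, pct in salinity_percent.items():
    grams_per_kg = pct * 10  # 1% by mass = 10 g of salt per kg of water
    print(f"{body}: ~{pct}% = about {grams_per_kg:.0f} g/kg")
```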

2

regular_modern_girl t1_is3sxqp wrote

Just to be charitable to the user who’s arguing with you, I suppose it’s maybe possible they’ve seen pictures of methane clathrate deposits on the seafloor, which do look a lot like ice and are sometimes even (erroneously) referred to as “methane ice”. I could see how someone might see photos of a methane cold seep in NatGeo or something, see people on boats topside holding samples of what looks like ice, and get confused by the text referring to it as an ice-like substance at the bottom of the ocean.

However, methane clathrate is obviously not actually ice (or at least it’s not usually classed as one of the forms of water ice and is really kind of its own thing chemically), and I might be assuming too much good faith here. In any event, this other user needs to admit that they’re probably not going to win an argument about basic physical properties of seawater with an actual expert on the physical properties of oceans.

1

regular_modern_girl t1_is3mrx1 wrote

Ice-Ih (the only form of water ice that occurs naturally in terrestrial conditions) is a weird solid in that it’s actually significantly less dense than its liquid phase, due to peculiarities of its crystal structure.

This actually leads to a number of peculiarities when it comes to ice (the common earthly form of it, at least), such as that it floats on liquid water, that it takes up more volume than the same mass of liquid water, and that higher pressures actually generally melt it by lowering its melting point (the exact opposite of how most crystalline solids behave).
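
The pressure-melting point is just the standard Clausius–Clapeyron sign argument applied to ice-Ih (sketching it loosely here, not a rigorous derivation):

```latex
% Along the solid-liquid boundary, the Clausius-Clapeyron relation gives the
% slope of the melting curve. For ice Ih the solid is LESS dense than the
% liquid, so the volume change on melting is negative, and the slope flips
% sign: raising the pressure pushes the melting temperature DOWN, the
% opposite of the usual case where the solid is denser than the liquid.
\[
  \frac{dP}{dT} = \frac{L_{\mathrm{fus}}}{T\,\Delta v},
  \qquad
  \Delta v = v_{\mathrm{liq}} - v_{\mathrm{sol}} < 0
  \;\Rightarrow\;
  \frac{dT}{dP} < 0 .
\]
```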

If we were talking about almost any substance other than water here (or talking about very different conditions than Earth) then what you’re saying here would be largely correct, but water (and especially ice-Ih) is just really weird like this.

EDIT: I forgot that ice-Ic (the cubic crystal form of ice-I) is hypothesized to occur naturally in tiny amounts in the upper atmosphere, and trace amounts of ice-VII (one of the high-pressure variants) have been found as natural inclusions in diamonds as a user below just informed me, but obviously neither of these things are really relevant to the argument at hand. Ice-Ih is still the only one we encounter in daily life, and the only one to occur in substantial quantities in nature here on Earth. The full water ice “zoo” that we’ve managed to synthesize in lab conditions up to this point consists of something like 20 or so different crystalline forms (I think depending somewhat on how exactly you distinguish some of the structures), as well as non-crystalline amorphous ice (which has a disordered molecular structure like glass, occurs in very low pressure conditions, and might actually be the most common form of water across the universe).

1

regular_modern_girl t1_is2xubu wrote

>As a result, whilst it’s possible to form dense waters at the surface of the ocean (which can sink), there is no process that can reduce the density of the resulting deep waters, and thereby bring them back to the surface.

Is this why those mini-brine lakes form on the seafloor in some areas? Like will the densest, most saline waters end up all coalescing together deep in the ocean until the salt content is so concentrated that it approaches saturation? Or are those brine pools the result of something geological instead? (I know that they’re associated with methane cold seeps.)

3

regular_modern_girl t1_is2rpvn wrote

The only vertebrates I know of that can survive freezing completely solid are ectotherms, and I’ve also never heard of any bird or mammal species being able to withstand it, so it might just be that organisms that have evolved to operate with a constant core temperature aren’t able to survive the extreme cold leading up to freezing, even besides the problem of ice crystals damaging cells? But I don’t know, that’s mostly just a guess and it may just be coincidental that no endotherms have evolved this ability.

Interestingly, there has been some evidence to suggest that critically-injured trauma patients can sometimes be kept just barely alive long enough to be saved by being cooled down to very low temperatures in a controlled setting, as this basically slows down a lot of physiological processes in a way that essentially buys doctors time to do what they need to. The procedure, called EPR (Emergency Preservation and Resuscitation), is still experimental, and it’s obviously a far cry from complete freezing, but it is something.
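
Roughly how much time cooling buys can be ballparked with the usual Q10 temperature coefficient for metabolic rates (the Q10 values and the ~10 °C target temperature below are illustrative assumptions on my part, not figures from the EPR trial itself):

```python
# Very rough illustration of why cooling "buys time": many physiological
# reaction rates scale with temperature via a Q10 coefficient (rate roughly
# doubles or triples per 10 degC rise, and conversely falls when cooled).
# The Q10 values and target temperature here are illustrative assumptions.

def relative_rate(t_cold_c, t_normal_c=37.0, q10=2.5):
    """Approximate metabolic rate at t_cold_c relative to normal body temperature."""
    return q10 ** ((t_cold_c - t_normal_c) / 10.0)

for q10 in (2.0, 3.0):
    frac = relative_rate(10.0, q10=q10)  # assume profound cooling to ~10 degC
    print(f"Q10 = {q10}: metabolism ~ {frac * 100:.0f}% of normal "
          f"(roughly {1 / frac:.0f}x more time before the same oxygen debt accrues)")
```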

10

regular_modern_girl t1_is2q7ew wrote

This is mostly the same thing with other cryptobiotic states as well, like in organisms that can survive complete desiccation. Often the key chemical preservative there is actually a sugar called trehalose iirc, which at sufficiently high concentrations helps to keep cellular structures intact even in the near-complete absence of moisture, such that the inner processes of the cells are basically “frozen in time” when they dry out, and therefore able to spring back to life once they’re rehydrated.

Again, this only really works with small organisms below a certain level of anatomical complexity, and I’m sure there are certain cell or tissue types that just don’t respond well to this kind of preservation, but apparently it’s part of what allows tardigrades to enter their famously nigh-indestructible “tun” state, and is also found as an adaptation in some desert-dwelling insects and in the eggs of a number of aquatic creatures that have evolved to weather extended periods of desiccation (sometimes very extended; brine shrimp eggs from literally thousands of years ago dug up in the Bonneville Salt Flats of Utah have been found to still be viable).

Even though there are several reasons it probably would never be suitable for allowing an entire human to be basically mummified and then brought back to life, trehalose has seen a lot of use as a preservative for blood or tissue samples, making it so they can be completely dried out and then reconstituted as needed (like apparently dried blood samples preserved with trehalose will even retain the distinctive vivid red of fresh oxygenated blood, rather than the dull rusty brown we usually associate with old, heavily-oxidized, dried blood).

4

regular_modern_girl t1_irpqze2 wrote

Strictly-speaking, no, because all memories are to some extent false. Although there are still many unknowns when it comes to how exactly memories are stored, there’s no evidence that fully-constructed experiences are stored anywhere in your brain and preserved exactly as they were first perceived; instead, your memories are most likely just a bunch of disparate sensory and linguistic associations that get reconstructed into what we consciously perceive as a coherent memory each time we recollect them. Because of this, memories are by definition constantly being rebuilt, blanks are always having to be filled in, and to some extent, it’s likely that the more you recollect a given memory, the less it has to do with the reality of the original experience and the more it’s a fabrication of your own mind.

Again, this is probably all up for debate to varying degrees, as there are different theories for exactly how information gets committed to memory and pulled up again later, but I’d say it’s a more common position than not among both neurologists and psychologists that memories in general are always more fabricated than genuine.

Of course, in cases of pure confabulation, or where a completely false memory is induced by outside suggestion, it can sometimes be circumstantially possible to recognize the break with reality, but that’s really dependent on context and isn’t always (or even often) possible. As a whole, there’s no general way of making a distinction, as no memory is fully accurate.

3

regular_modern_girl t1_irg9ndg wrote

yeah I was gonna say, I think the exact causes of ASD and autism in general are pretty controversial and somewhat unsettled, particularly the exact way genetics plays a role and which genes are most likely to be responsible. So unless there’s been some very recent major breakthrough, “30% of ASD cases are caused by de novo mutations” seems like a really bold claim, considering that would make de novo mutations one of the most significant contributing causes by far; at least last I heard, the exact genetic causes of autism were not that settled (and heritable genes were more often thought to play a large role in at least certain types of cases). The former, more conservative claim seems a lot easier to swallow.

4

regular_modern_girl t1_ir6k8r2 wrote

This is a single-celled green alga rather than a bacterium, but it still fits the spirit of OP’s question as an example of high concentrations of a microbe in rain: the rare but officially-recorded phenomenon of red rain or “blood rain” seems very likely to be the result of an unusual bloom of the microalga Trentepohlia in storm clouds, or at least that seems most likely from studies of recent instances of the phenomenon in southern India. Exactly why this occurs is not understood, although it has been recorded multiple times in the same geographic areas.

There are, of course, anecdotal reports of “blood rains” in various parts of the world going back to antiquity, which could conceivably be due to the same alga or a similar one, but since those instances generally fall more into the realm of folklore and mythology at a certain point, I’ll hold off on speculating about them.

EDIT: since the above link is just a news article, here’s an actual paper on the subject. Just wanted to make sure I had something truly scientific in here, since this is a subject that got taken up by ufologists and conspiracy theorists when it was first being reported in the ’00s, and there was all sorts of wild speculation about the cells in the rain being extraterrestrial in origin (with no real basis, of course), so I figured detailed published research on the Trentepohlia theory from a reputable journal was needed. Also, it seems that most likely it isn’t an algal bloom in storm clouds so much as the cells being swept up by storms from aquatic environments on the surface.

13

regular_modern_girl t1_ir2u0pa wrote

Generally-speaking, in recent times it has been done by taking animal models (usually mice, as they reproduce fast and actually share the vast majority of their genome with us), creating “knockout” forms of them in which one or more stretches of DNA that code for a protein (which is actually all that a gene is) are “switched off”, and then seeing how this affects the animal.

I actually once got to meet Mario Capecchi, the researcher who pioneered “knockout mice”, and won a Nobel Prize for it.

Prior to this innovation (which I believe only dates back to around 1990 or so), biologists were mostly in the dark about which genes related to which traits, and it was really mostly guesswork. Genetics used to focus more on Mendelian heredity (which, if you ever learned any biology in school, you were likely introduced to), which connects observably inherited traits with a sort of theoretical entity called an “allele” (which doesn’t necessarily correspond neatly to a single gene), and then attempts to sort of reverse engineer the rules for how that trait is passed on (whether it is recessive or dominant, etc.). Since most traits actually have more than one gene involved (because literally all a gene does is act as a molecular blueprint for a single protein, that’s it), sometimes in fairly complex ways, this way of looking at genetics is obviously very imprecise and rudimentary, and it’s mind-boggling how much genetics has advanced just in the past twenty years (it really is almost like going from abacuses to electronic supercomputers in just a couple of decades). Thirty years ago we hadn’t even sequenced a whole human genome; now we can pin down certain traits to particular stretches of DNA, and then even selectively alter that DNA if we want (or put it in other organisms, etc.).
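
For anyone who never ran into it in school, the single-locus Mendelian picture I’m contrasting against looks roughly like this (a toy Punnett-square sketch; real traits usually involve many genes and don’t behave this cleanly):

```python
# Minimal sketch of the classic single-locus Mendelian model (one "gene",
# two alleles, complete dominance) -- the simple picture that real
# multi-gene traits usually don't follow.

from collections import Counter
from itertools import product

def cross(parent1, parent2):
    """Enumerate offspring genotypes of a single-locus cross (a Punnett square)."""
    offspring = ["".join(sorted(a + b)) for a, b in product(parent1, parent2)]
    return Counter(offspring)

# Cross two heterozygotes: 'A' is the dominant allele, 'a' the recessive one.
genotypes = cross("Aa", "Aa")
print("Genotype ratio:", dict(genotypes))   # {'AA': 1, 'Aa': 2, 'aa': 1}

# With complete dominance, any genotype containing 'A' shows the dominant
# trait, giving the familiar 3:1 phenotype ratio.
phenotypes = Counter("dominant" if "A" in g else "recessive"
                     for g in genotypes.elements())
print("Phenotype ratio:", dict(phenotypes))  # {'dominant': 3, 'recessive': 1}
```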

EDIT: There are other methods as well, but they tend to be more indirect and imprecise. Mutant forms of some simpler organisms like C. elegans roundworms and E. coli bacteria have been studied and selectively bred for a while to try to disentangle which stretches of their DNA the mutations might lie in (with bacteria especially, there’s generally less DNA to look at, so this is more feasible than with organisms that have huge genomes like us or, surprisingly, a lot of plants; seriously, plants are so much more genetically complex than you might expect, in part because a lot of them require a ton of different enzymes for complex biosynthetic pathways of various organic chemicals, and enzymes are proteins, and therefore each is tied to a particular gene).

6

regular_modern_girl t1_iqub47l wrote

I think it depends on a lot of factors. In general, I believe that adipose tissue (fat) retains heat better than a lot of other tissues, so I think it depends somewhat on an organism’s bodily composition. Ectotherms do need circulation to distribute warmth around to all their tissues, and a bigger body means that process is naturally going to take somewhat longer. This is part of why crocodilians (which in many cases are really big animals) are all in aquatic sit-and-wait predatory niches in their surviving forms: they can be huge but still relatively sluggish and spend a lot of time basking, because they just need enough energy to strike out hard when the moment is right; they don’t need to pursue prey, etc. There used to be fully-terrestrial predatory lineages of crocodilians, but they were in general not quite as big and were built very differently, being generally lean and long-legged for pursuing land-based prey (they were probably a lot like big monitor lizards such as Komodo dragons in terms of the niche they occupied and how they hunted, and they were probably similarly limited to being diurnal hunters in hot climates).

Like I know that, at least for big predatory dinosaurs, the assumption that they were ectothermic like living reptiles was long a source of confusion, to the point where (prior to the theory that dinosaurs were probably mostly or entirely warm-blooded) a number of paleontologists put forward the idea that dinosaurs like T. rex were more likely scavengers than active hunters, because it was just too hard to imagine them being able to relentlessly pursue prey with that kind of bulk if they were cold-blooded.

Clearly, ectothermy isn’t a complete barrier against getting really big, as plenty of very large, presumably cold-blooded creatures have existed over time (I believe most Mesozoic marine reptiles like ichthyosaurs and mosasaurs are assumed to have been ectotherms, since they came from older diapsid lineages that hadn’t evolved endothermy—in fact, mosasaurs seem to have literally been gigantic marine lizards, related to monitors—and they were still able to get pretty massive in some cases, although still not quite as big as the largest whales). But I also think it says something that the very biggest animals ever to live all seem to have been warm-blooded, and what it says is definitely not that endothermy itself is a barrier to great size; quite the opposite, in fact.

1

regular_modern_girl t1_iqt6ye0 wrote

I don’t know what you mean by “megafauna back then may have been pseudo-warm blooded”, partly because I don’t know what “pseudo-warm blooded” means. For the most part, an animal is either endothermic or it isn’t (unless you’re talking about a few special cases like bumblebees, which sometimes use movement to raise their temperature but are overall still ectothermic, and to my knowledge that is only really a thing in a few small invertebrates). Dinosaurs are believed to have all been endothermic, although their body temperatures may have averaged somewhat lower than those of most mammals (their modern bird descendants, by contrast, actually run hotter than most mammals). But if you’re talking about past mammalian megafauna (or even present ones to some degree), the suggestion that they may not have been truly endothermic seems pretty baseless: the range of average body temperatures in living mammals doesn’t really correlate with size very much (there are both large and small animals with relatively high temperatures, and large and small ones with relatively low temperatures). Adaptations to different climates and other physiological factors are much more important. Larger animals with high body temperatures do require a more energy-rich diet to sustain the metabolic requirements, but to my knowledge that’s the main limitation. And all living mammals are true endotherms that generate body heat metabolically, even if there is some variance there.

Whales are true endotherms, and as far as we can tell, blue whales are actually the biggest that animals in general have ever gotten (at least in terms of mass). A much bigger constraint on size is gravity (which is why aquatic animals have always grown the largest), followed by respiratory and circulatory efficiency (essentially, it takes special physiology to ensure that oxygenated blood can reach all the tissues it needs to in a big animal, and again, gravity is also a factor here when an animal is terrestrial, which is why it’s thought that the complex bird-like respiratory systems of dinosaurs may have played a role in them getting so big on land, in addition to the higher oxygen content of the atmosphere during the Mesozoic).

While terrestrial mammals never got as big as dinosaurs, they did once get a lot bigger than African elephants do today, such as the gigantic rhinoceros relative Indricotherium, and there’s absolutely no reason to believe any of these animals weren’t true endotherms (we obviously can’t know from fossils exactly how their body temperature stacked up to their smaller living relatives, but again, it’s not likely to have been nearly as important a factor as many other things).

EDIT: also, as someone pointed out above, getting really big is actually in some respects harder for ectotherms, as it takes longer to warm a very large body in the sun, so they need to spend far more time basking the bigger they get. This is part of why the idea of cold-blooded giant dinosaurs came in for criticism, because a cold-blooded Tyrannosaurus, for instance, would waste basically all its time soaking up heat from the sun rather than hunting, and would just be far too plodding and sluggish to be an efficient apex predator.
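
The basking problem is basically the square-cube law: sunlight is absorbed over the body’s surface, but the mass that has to be warmed grows with volume. A toy, spherical-animal-level illustration (the radii are made-up stand-ins, not actual animal measurements):

```python
# Toy square-cube-law illustration of why big ectotherms bask so long:
# sunlight is absorbed across the surface area, but the mass to be warmed
# grows with volume, so surface area per unit volume shrinks as size goes up.
# (Spherical-animal approximation, radii purely illustrative.)

import math

def surface_to_volume(radius_m):
    area = 4 * math.pi * radius_m ** 2
    volume = (4 / 3) * math.pi * radius_m ** 3
    return area / volume  # equals 3 / radius for a sphere

for label, r in [("small lizard (~5 cm radius)", 0.05),
                 ("crocodile-sized (~0.5 m radius)", 0.5),
                 ("large dinosaur-sized (~2 m radius)", 2.0)]:
    print(f"{label}: surface/volume ~ {surface_to_volume(r):.1f} per metre")

# The ratio falls as 3/r, so a body ~40x the radius has ~40x less
# sun-collecting surface per unit of mass it needs to heat.
```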

EDIT 2: my bad, it seems that there are also some fish with a physiological means of generating some of their own heat, to the point of sometimes being referred to as “endothermic” informally, but they do so through different means than birds/dinosaurs or mammals do, and this also still doesn’t relate to any past land megafauna.

6

regular_modern_girl t1_iqsllh5 wrote

Ants get around by a combination of pheromones (chemical signals) and a special internal sensory system that functions almost like a natural pedometer (in that—if I’m remembering this correctly—it essentially “unwinds” when an ant takes steps in one direction and then “rewinds” when it takes steps in the opposite direction, so worker ants can essentially sense whether they are moving in the correct direction relative to their original path, and in combination with pheromone “breadcrumb trails” can almost always retrace their exact steps back to the colony like clockwork). Of course, there are many ways these navigation systems can be thrown off, like a human moving an ant some distance from its original path, or cutting off one joint of each of its legs so that each step its internal “pedometer” counts covers less ground than it should (this is how the system was originally experimentally observed: some ants had their legs shortened and others artificially lengthened, and it would cause them to either stop short of the colony or walk too far, very consistently. Not the nicest of experiments if you’re someone who feels bad for ants, but it did provide some valuable insights).
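
For the curious, the “pedometer” idea is basically path integration (dead reckoning), and a toy version of it is easy to sketch (the stride length and step counts here are made-up numbers, purely illustrative):

```python
# Toy model of the ant "pedometer" (path integration / dead reckoning):
# keep a running tally of steps taken along each heading, and the summed
# vector tells you which way home is. Stride length and step counts are
# made-up numbers, purely for illustration.

import math

STRIDE_M = 0.005  # assumed stride length of ~5 mm per step (made up)

def home_vector(legs):
    """legs: list of (heading_degrees, step_count) segments walked outbound.
    Returns (distance_home_m, heading_home_degrees), with 0 deg = east, 90 deg = north."""
    x = sum(steps * STRIDE_M * math.cos(math.radians(h)) for h, steps in legs)
    y = sum(steps * STRIDE_M * math.sin(math.radians(h)) for h, steps in legs)
    # Home is simply the negative of the accumulated outbound vector.
    return math.hypot(x, y), math.degrees(math.atan2(-y, -x)) % 360

# Outbound trip: 400 steps east, 300 steps north, 100 steps west (all made up).
outbound = [(0, 400), (90, 300), (180, 100)]
dist, heading = home_vector(outbound)
print(f"Head {heading:.0f} degrees for about {dist:.2f} m to get back to the nest")
```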

So to answer your question, no, if you were to move the ant far enough off course, it would not be able to find its colony again, or at least would most likely have great difficulty doing so.

And ants definitely do not join other colonies. Ants are an example of a eusocial organism, meaning essentially that they live as a group of closely-related individuals functioning as a cohesive collective, with only select members of the collective having reproductive privileges (the queen and the male drones), and the rest being strictly non-reproductive. It’s a social structure seen in ants, bees, wasps (although not in all species of any of these), termites, some beetles, some marine shrimp, and (possibly most bizarrely) naked mole rats and one other species of mole rat (the only known eusocial vertebrates), among others.

Eusocial colonies are all the children of (usually) a single reproductive female “queen” and a small number of reproductive males (which in eusocial insects tend to be short-lived and literally exist solely to fertilize eggs and produce workers, as well as often being children of the queen themselves), and therefore are all genetically very close; they act less like individuals and more like the cells of a single “superorganism”, since the workers can’t reproduce and have literally no other reason for existing except serving their colony’s needs. A worker ant can’t join another colony any more than one of your white blood cells could be put into another person’s body without being rejected and destroyed by the other person’s immune system; ant colonies are a package deal, and all other colonies are seen as competitors for the same resources, to be attacked on contact.

4