No, speaking as a former glaciologist, that's not likely to be true. The parent poster is correct. The Younger Dryas featured multiple advances and retreats, which is inconsistent with an impact. Multiple impacts over many centuries just aren't statistically likely, and we would see better evidence of them.
Instead, the most probable cause is freshwater pulses coming out of the melting Laurentide ice sheet changing circulation in the Atlantic. This is actually quite well supported.
The Younger Dryas was also confined to parts of the northern hemisphere, mainly at higher latitudes. A large enough impact to significantly alter climate would see the resulting particulates mix globally in the atmosphere, resulting in more uniform cooling.
Instead, the fact that it was confined to parts of the northern hemisphere also supports the freshwater pulse hypothesis, since that is only a change in the transport of heat from the tropics to the poles, not a change in total heat.
The fact that freshwater pulses from the Laurentide ice sheet caused so much cooling is actually a concern with modern climate change. It's possible that something similar could happen again with the melting of the Greenland Ice Sheet, at least in a weakened form (the Laurentide Ice Sheet dwarfed the GIS).
A report was published in Nature in spring of this year showing multiple lines of evidence for impact-related effects in the YDB layer in southern Chile:
"In the most extensive investigation south of the equator, we report on a ~12,800-year-old sequence at Pilauco, Chile (~40°S), that exhibits peak YD boundary concentrations of platinum, gold, high-temperature iron- and chromium-rich spherules, and native iron particles rarely found in nature. A major peak in charcoal abundance marks an intense biomass-burning episode, synchronous with dramatic changes in vegetation, including a high-disturbance regime, seasonality in precipitation, and warmer conditions. This is anti-phased with northern-hemispheric cooling at the YD onset, whose rapidity suggests atmospheric linkage. The sudden disappearance of megafaunal remains and dung fungi in the YDB layer at Pilauco correlates with megafaunal extinctions across the Americas. The Pilauco record appears consistent with YDB impact evidence found at sites on four continents."
That just means that there was an impact at around the same time. There have been numerous impacts like this claimed to be the cause of the Younger Dryas in the past. But just because there was an impact doesn't mean it had anything to do with the YD.
The Atlantic meridional overturning circulation (AMOC) shutting down from a freshwater pulse is much more parsimonious, and is exactly what we would expect from the physics of the Laurentide melt going into the Atlantic.
Impacts still don't explain the multiple advances and retreats during the Younger Dryas, but it is obvious why those would occur under the AMOC shutdown explanation:
Melting causes the AMOC to shut down, causing northern hemisphere cooling. The cooling reduces melting, inducing the AMOC to start back up, which increases melting once again.
I accept part of what you're saying, which I believe is that the Laurentide melt was probably happening anyway, with the associated effects on the AMOC. Glacial oscillations are clearly their own cycle.
But the YD impact scenario hardly seems irrelevant to the YD. I'm curious what other impacts you're aware of that you deem irrelevant.
The best candidate I'm aware of for the YD impact would possibly/probably be the Hiawatha crater [1], which is 31 km across after the impactor likely bored through a km or so of ice. That puts it among the largest impacts on Earth since the Chesapeake/Siberian impacts around 35 Mya (if not larger still, once punching through the ice is taken into account, e.g. approaching Chicxulub) [2]. It's not solidly dated yet, but according to the authors the evidence at the site is consistent with an impact during the Pleistocene, and they even say it may still be hot, despite being packed in ice!
If we're talking about the same event, with evidence across 24-53 sites on 4 continents including continent-wide fires, an impact winter, and floral and megafaunal extinctions, then I'd be hard-pressed to see how this doesn't have to do with the main climate event of that period; that's the definition of what makes the YD significant: the drastic floral changes (e.g. of Dryas octopetala) at the boundary. The impact seems a more likely candidate cause for the abrupt global changes.
Or, put another way, I'd turn the question back to you: how did changes in the AMOC during the YD cause the characteristic extinctions and charcoal residue observed in southern Chile at the YDB?
A contemporaneous global inferno and impact winter seems too much to chalk up to coincidence. That's where I see parsimony break down for a simple Laurentide melting/AMOC change being responsible for the YDB.
Vitamin D deficiency is actually really common, especially among dark-skinned people living at higher latitudes. Most office workers could do with a bit more UV.
I probably wouldn't worry about it unless you have really large arrays. It's still probably pretty insignificant compared to the time spent updating the DOM.
Check out Funkia List [0], a persistent O(log n) random-access list implementation. I use it pretty extensively when I'm writing TypeScript.
It actually can beat native arrays on concat and push, often by very wide margins, while also being immutable. These kinds of operations are actually much easier to optimize if you can assume immutability.
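Here's roughly what using it looks like (going from memory of the funkia/list API, so treat this as a sketch and double-check the docs):

    import * as L from "list"; // funkia's list package on npm, iirc

    const a = L.list(1, 2, 3);
    const b = L.list(4, 5, 6);

    // concat copies neither input; the result shares structure with both
    const c = L.concat(a, b);

    // "push" is append: returns a new list and leaves `a` untouched
    const d = L.append(99, a);

    L.nth(5, c);   // 6
    L.length(a);   // still 3 -- a was never mutated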
Modern persistent tree-based data structures usually have O(log32 n) random access, which is essentially O(1) for all practical purposes. With a mutable array of references, random access follows one pointer, while these structures usually follow fewer than 5. Sequential access is usually O(1).
These kinds of persistent data structures are already the standard data structures in Scala and Clojure, and they are fast enough for the vast majority of non-numerical purposes. In typical access patterns, they are faster than mutable arrays.
Sequential access is O(1) just like mutable arrays. These structures really are fast and practical. The benefits of immutability are enormous. That's why they are so popular.
Edit: It's true that O(log32 n) and O(log2 n) are the same asymptotically, but the constant matters at practical data sizes. For example, with a billion elements, log2(n) is about 30 while log32(n) is about 6: that's 6 pointer hops per access instead of 30.
Sequential access in an array has caching advantages because the memory is contiguous. For a linked structure, that is often not true; linked structures mostly suck at sequential access. Extreme random access can actually work against arrays for the same reason.
There is a good reason why the base of the log is often not even mentioned. Logarithmic is logarithmic, no matter what base you buy by investing extra space.
log2(n)/log32(n) is always 5, which highlights why this is such a meaningless comparison. O(log n) (basically) refers to a constant multiple c*log(n), by which logic O(log2 n) = O(5 log32 n) = O(log32 n) = O(log n). It's all the same.
Case in point: if your algorithm takes log32(n) operations but each op takes 5x longer, that's exactly the same as log2(n). This is true for any value of n, not just large values.
The problem is making concat fast given the original array still exists and is mutable.
In the major JS engines, string concat is essentially O(1) because strings are immutable (and they flatten them out as appropriate according to whatever heuristics make sense).
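The trick is roughly a rope (V8 calls them cons strings): concat just records the two halves and defers any copying. A toy TypeScript sketch of the idea, not what any engine literally does:

    // Toy cons-string: O(1) concat by deferring the copy.
    type Str = string | { left: Str; right: Str };

    function concatStr(a: Str, b: Str): Str {
      return { left: a, right: b }; // no characters copied here
    }

    // Flatten only when someone actually needs the characters.
    function flatten(s: Str): string {
      if (typeof s === "string") return s;
      return flatten(s.left) + flatten(s.right);
    }

    const s = concatStr(concatStr("foo", "bar"), "baz");
    flatten(s); // "foobarbaz"

    // Only safe because a string can never change out from under
    // the nodes that reference it.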
But for array concatenation in JS I can do

    a.concat(b)

followed by any of

    a[0] = something
    delete a[1]
    a.push(something)
    a.length++

etc.
To make this particular array's concat fast would impose a significant perf cost on all other arrays, even those that aren't involved in a concat.
These persistent structures use what's called structural sharing. The simplest persistent structure is just a regular linked list. While random access is O(n), inserting or deleting at the head is O(1). Since the list is never mutated, we can insert an element at the head just by taking the new element and pointing it at the original list. The original list is not modified, nor is it copied.
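A minimal sketch of that cons-list idea in TypeScript:

    // Persistent singly-linked list: prepending is O(1) and never
    // touches the original, because nodes are shared, not copied.
    type List<A> = null | { readonly head: A; readonly tail: List<A> };

    function prepend<A>(head: A, tail: List<A>): List<A> {
      return { head, tail }; // new node points at the old list as-is
    }

    const xs = prepend(2, prepend(3, null)); // [2, 3]
    const ys = prepend(1, xs);               // [1, 2, 3]
    // xs is still [2, 3]; ys reuses xs's nodes rather than copying them.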
More sophisticated persistent structures like Funkia List have O(log32n) access, which is basically constant time. This makes them better general-purpose data structures than mutable arrays.
I've lived in Toledo, and I've heard the Twin Cities get colder than Fairbanks in the winter. They have frozen lakes; the only thing they don't have is polar bears.
I grew up in northern Iowa, so the Twin Cities were our closest major metropolitan area. My mom's family is from 2 hours north of them. My wife's parents both grew up even further north. It gets cold, but once it gets below 0F it all feels roughly the same. At that point there is no humidity in the air, which is what really makes you cold to your bones.
I think the biggest thing people have to get used to is that there is roughly a 120F temperature swing from January to August.
One winter I was vacationing in Beijing, Changchun, and Harbin. Beijing was about -10C, cold I guess. Changchun was about -20C. I took a bus to Harbin and saw the temperature fall to -40C (they have digital thermometers at the front of the long-distance buses in NE China!). Anyway, there is a huge difference between -10C and -40C; when I got back to Beijing, it felt fairly balmy.
Yea I wouldn't recommend doing anything other than hopping between buildings or vehicles when it gets down to -40 (same in both C and F, which I think is neat)
-10C is 14F, which I agree is a pretty nice day if you're properly outfitted.
You can actually do stuff at -40C, just wear a lot of clothes and don't expect your digital camera to last very long when you're trying to take pictures of carved ice sculptures and ice bars, etc...
Honestly, the coldest winters I've spent were in southern China. Even though it was 5 to 10C out, having no indoor heating really sucks and will beat you down quickly.
Residents don't sleep much, so this doesn't surprise me at all.
Related, the residency process needs to be massively reformed. The flimsy justification for making people responsible for human lives work 80 hours a week is usually that the long hours help them learn faster. But that's really BS. The brain has an incredibly hard time forming new memories when sleep deprived.
I know a couple of nurses who often talk about how spaced-out residents are at night. They'll page them to consult on something, usually waking them up. The residents will usually just blearily agree with whatever the nurse was planning to do, so they aren't really providing meaningful physician supervision.
The pioneers of the modern residency system were fueled by copious amounts of cocaine & other stimulants. Perhaps we need to reevaluate the requirements & expectations we place on medical residents today. It's essentially a form of professional hazing, and I personally know many surgical residents who are literally operating on people today while absurdly sleep deprived (through no fault of their own, just the insane hours of their program).
That said, if whoever's operating on me is running on 3 hours of sleep I'd rather they be hopped up on stimulants than not...
One time I met the head of a union, and he told me many tough jobs (mainly mining) used to be performed while on drugs that aren't available anymore, causing all kinds of issues for the workers. I wonder if there was truth in that statement, and whether there's any formal research on this.
As I recall, the CIA did a fair amount of research on this, up to and including implanted cortical stimulators, which created some serious backlash.
Getting prescribed "on/off" switches in pill form is a well-known thing in the military. You just can't do combat air patrol over remote areas of Asia without some uppers. This quickly spread from the pilots (mid-grade officers) and other flight crew to folks who have to go halfway around the world routinely (flag officers).
It's an active topic of conversation; some do, some don't. As a military physician who was a line officer and has been through 5 years of graduate medical education (internship + residency), and now occasionally has to do those round-the-world trips, it's not clear to me that there's an obvious right answer, in policy or per person. We (leaders and followers alike) expect leaders to function at the outer limits of human capacity, and have for a long time. I can tell you this: it doesn't get easier with age. A fairly common definition of success as a leader is proving your ability to take on more responsibility, so the more you do, the harder it gets. So avoid starting early.
As a former surgical resident, I can reluctantly confirm. I've had many conversations half-asleep where I was afterwards slightly uncertain whether I had actually had a conversation with someone or was dreaming.
That said, those situations generally concern relatively safe decisions, like minor painkiller dosage adjustments.
If what the nurse calls about is sufficiently serious, adrenaline kicks in - and that thing can get you going really fast.
Actually, I wouldn't be surprised if one of the most expensive (in a telomeric sense) parts of residency is exactly that: the mobilization from a near-zombie sleep state to hypervigilance within seconds.
Naturally, residency can and should be organized better than it is, but there are also reasons why things are fundamentally organized the way they are. From a resident's perspective, on-call time primarily buys you time for elective surgeries - which is where the real learning happens.
What are some good reasons that things are organized this way?
I've heard that long shifts help with continuity, as fewer doctors need to pass information about the same patient, reducing communication overhead a bit. Anything else?
If that's all, it seems the drawbacks in terms of risks to care quality, as well as to residents' learning and long-term health, may be greater than the benefits.
I think 'needlesurgeon's answer elsewhere in this thread addresses this well, especially the point that it is essentially a numbers game: it takes a certain patient population size to provide sufficient volume and diversity of cases per year to educate a certain number of surgeons over a certain span of years. You could make on-call easier by spreading this over more surgeons-in-training, but then it would take almost twice as long for them to get the same experience. The problem is that the duration of a normal career isn't really that long compared to the time it takes to master a surgical field. If you work really hard and progress quickly, you may be top-notch in your field for maybe 5 years before your skills start to decline.
Also, the surgeons at the top of their fields are incredibly important to the field as a whole, as they are the ones who inform all other surgeons through a kind of cascade of consultations.
The point about continuity is right. Hand-overs always mean some degree of information loss, especially for non-verbal information. One of the most important clues indicating the need for surgery can be the character of stomach pains on manual examination, for example. If the same surgeon does the examination at intervals of a few hours, he or she may be able to detect subtle signs of deterioration that a new surgeon would not.
> resident’s [...] long-term health
Oh. Well. Haha. When an anesthesiologist colleague of mine committed suicide at one point, the only thing we were told at the morning briefing was that the planned surgeries of the day would regrettably not start exactly on time.
The learning-curve argument boils down to "yes, long-term sleep-deprived surgeons actually learn faster than rested ones, unlike the rest of the population."
Patient deaths rise during residency programs, meaning that we’re not only burning out the residents but we’re also killing random civilians for the sake of the residency program.
I'm eagerly awaiting their sparse matrix support. It's unbelievable that the entire JVM ecosystem doesn't have a single comprehensive, production-quality sparse matrix library [0]. This is one of the big things keeping my machine learning in Python.
Yeah, Neanderthal is great (I'm a Clojure user). It had support for structured sparse matrices (like Toeplitz) last I checked, but not general CSC/CSR matrices.
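For anyone who hasn't run into the format, general CSR is just three flat arrays, which makes the gap in the JVM ecosystem all the more surprising. A minimal TypeScript sketch for illustration:

    // Compressed Sparse Row (CSR): three arrays describe the whole matrix.
    interface CSR {
      values: number[];  // nonzero entries, row by row
      colIdx: number[];  // column index of each entry in `values`
      rowPtr: number[];  // row i occupies values[rowPtr[i] .. rowPtr[i+1])
    }

    // The 2x3 matrix [[5, 0, 0], [0, 0, 7]] becomes:
    const m: CSR = { values: [5, 7], colIdx: [0, 2], rowPtr: [0, 1, 2] };

    // Reading entry (row, col) scans only that row's nonzeros:
    function at(m: CSR, row: number, col: number): number {
      for (let k = m.rowPtr[row]; k < m.rowPtr[row + 1]; k++) {
        if (m.colIdx[k] === col) return m.values[k];
      }
      return 0;
    }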
It's really easy to do this with pure functional programming in impure languages like Scala. You can use arbitrary impure code within lazy IO values; outside, the functional purity makes reasoning easy. If, as recommended, there is only a single point in your program where the IO values are actually executed, then the execution order can be reasoned about statically.
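The same pattern is easy to hand-roll in TypeScript if you want to see it without Scala's machinery; a minimal sketch (libraries like fp-ts have real, battle-tested versions):

    // An IO<A> is a description of an effect, not the effect itself.
    type IO<A> = () => A;

    const log = (msg: string): IO<void> => () => console.log(msg);

    // Composing IO values is pure: nothing runs yet.
    const chain = <A, B>(io: IO<A>, f: (a: A) => IO<B>): IO<B> =>
      () => f(io())();

    const program: IO<void> = chain(log("first"), () => log("second"));

    // The single point where effects actually execute:
    program();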