
If you are able to pay off your credit card balance each month, you can earn rewards points on every purchase, which is money back in your pocket. Even if it seems like a tiny amount, in the long run you are leaving money on the table by using a debit card instead of a credit card for most of your purchases.

The big caveat is that you MUST pay the balance off each month to avoid paying interest; otherwise you are losing money by using a credit card.


At least in Chile, it’s common to have 3 months of interest-free payments offered by the banks, and some stores offer up to 24 months for cards issued by certain banks.

It really helps with one-off costly purchases, like a new device or some stuff for home/garden. I never miss card payments and keep my finances organized, avoiding purchases I can’t pay off monthly.


>If students expect that the university will provide accommodation, the university needs enough staff to run a small city and all associated services.

No doubt you need admin staff to help accommodate students' learning needs, but I've come around to thinking that universities should change the parameters around testing and give every student the opportunity to use "accommodations" rather than making them prove a disability. Everyone is being granted the same degree; if a significant number of the students in your program need accommodations, like extra time on an exam, why not grant them to everybody who asks? Or better yet, just give everyone the time they need. It seems silly to me that you need to prove your need before you can get things like extra time. I think it should be opened up to everybody.


I think you mean that the US has the longest continuously running democracy, not just government [1]. I think this is true, and I agree with you that while the constitution is not perfect, it is still an incredibly impressive document.

[1] https://www.weforum.org/agenda/2019/08/countries-are-the-wor...


I find that list very questionable. I'm Swedish, and while yes, we sorta got general voting rights for men in 1911, we didn't get truly general voting rights for men until 1918, and for women until 1919.

Before that, however, you could vote if you had enough capital. Is that democracy? That list says it isn't, while saying that America, in which minorities and women could not vote, was a democracy. That seems like a line drawn specifically to be able to say America is the oldest democracy, which is very disingenuous.

If you consider minorities and women to be people then America didn't become a true democracy until 1965.


From the text:

"This document is dedicated to the memory of the late Professor Charles Lochmuller (right) of Duke University. Dr. Lochmuller was a good guy, a natural comic, and an eminent scientist.

Here I approach biochemistry in a new (I believe) way. It is tradition, starting with Lehninger's first Biochemistry textbook and continuing in essentially all subsequent biochemistry textbooks, to teach about each type of biopolymer in isolation from the others. Protein, DNA, RNA and carbohydrate are described in distinct, well-separated chapters as unrelated chemical phenomena.

In Part 2 of this document I present DNA, RNA, polypeptide, and polysaccharide in the context of their common attributes. Rather than focusing exclusively on the differences (amino acid side chains, nucleic acid bases, etc), I focus on the profound universal properties (self-complementarity, emergence, etc) that unite biopolymers. In my view only by learning about biopolymers in context of each other can one hope to achieve a reasonable understanding of them."


"We found no evidence of increasing risk with a larger area of total tattooed body surface."

Without a dose response, I'm inclined to believe that the increase in lymphoma seen in people with tattoos has more to do with confounding factors than with the ink or the act of getting a needle poked into your skin. I would think that controlling for all confounders in a study like this would be exceptionally difficult.

That said, I'm pretty sure that at least some inks do contain known carcinogens [1].

[1] https://tattoo.iarc.who.int/background/


This is a Swedish study, so it might be possible to use the population registry to contact siblings of the cancer patients and ask about traits like tattooing; their health data would already be in the Swedish system and linkable. This would control for a lot of the relevant confounders.


Though not those related to people's choice to get tattoos.


Tattoos are so common these days that I'm not sure you can say much about a person just because they have tattoos.


Any given person, for sure. In aggregate though I bet you'd find correlations.


Not sure you could in Sweden. Stats [1] show 21% tattoo prevalence in Sweden compared to 12% in the EU in general.

[1] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10163470/


they are statistically cooler


Which is a strong confounding factor ...

"Sharing traits common in motor cycle gangs increases risk of cancer - new study"


No, it would control for a lot of those factors. Things like impulsivity, sexuality, sensation-seeking etc. All very heritable or family-level. If you ever find any correlations reported for tattoos between family members, it's gonna look like everything else: steeply increasing with relatedness.


Even a correlation with the amount of ink could be a lifestyle confound. I'm pretty sure that the population that has a small tattoo differs from the one with large parts covered. Indeed, it is hard to find a cause.


Yes. Also, the survey response rate was the biggest difference between groups (54% vs 47%), which could easily explain the observed differences. The confidence intervals cross 1.0 for nearly all reported IRR values.

For those who don't know how to interpret medical evidence, this study is very weak.
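
For intuition, here's roughly how an incidence rate ratio and its 95% CI are computed. This is a sketch with made-up counts, not numbers from the study:

    import math

    # Hypothetical counts for illustration only (not the study's data)
    cases_tat, py_tat = 30, 10_000   # lymphoma cases, person-years (tattooed)
    cases_ref, py_ref = 25, 10_000   # lymphoma cases, person-years (non-tattooed)

    irr = (cases_tat / py_tat) / (cases_ref / py_ref)
    se_log_irr = math.sqrt(1 / cases_tat + 1 / cases_ref)  # SE of log(IRR)
    ci_lo = math.exp(math.log(irr) - 1.96 * se_log_irr)
    ci_hi = math.exp(math.log(irr) + 1.96 * se_log_irr)
    print(f"IRR = {irr:.2f}, 95% CI [{ci_lo:.2f}, {ci_hi:.2f}]")
    # -> IRR = 1.20, 95% CI [0.71, 2.04]: the interval spans 1.0,
    #    so the data are compatible with no difference at all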


Those response rates are fairly awful with two groups that are markedly different. Seems very likely that they'd self-select on the face of it, especially if they knew what the research question was.


Tattoos basically shift your immune system into overdrive: https://www.youtube.com/watch?v=nGggU-Cxhv0


Indeed. It's not the ink content that led Am J Clin Pathol. 2014;142(1):99-103 to say:

"The mean age of death for tattooed persons was 39 years, compared with 53 years for non-tattooed persons (P = .0001). There was a significant contribution of negative messages in tattoos associated with non-natural death (P = .0088) but not with natural death."


I'm not sure "people with negative msgs in tattoos died 14 years earlier" sheds light on TFA for me.

TFA has a more direct, physical concern: it starts from a well-known fact, that tattoo ink ends up in lymph nodes, and does a statistical analysis showing a significant result in lymphoma occurrence.

I think people with negative tattoos dying younger reduces the number of people with tattoos who get lymphoma, as they have fewer ink-in-lymph-nodes years.


It shows the existence of some very strong confounding mechanisms.


There's certainly plenty of those! :)

I doubt they intended to communicate something that general, and if they did, I doubt they meant to pick one that would reinforce the conclusion.


Yeah, totally agree. That the size or number of tattoos doesn't increase risk makes no sense; it's somewhat like claiming that whether you smoke one cigarette or 20 a day, the risk is the same. If the latter were true, it would more likely indicate that there is some other commonality in that group increasing the risk.

Also, the slicing and dicing (“11 more than the index year” and so on) is multiple hypothesis testing on the face of it; I wonder if they adjusted for that.
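
A quick back-of-the-envelope on why unadjusted slicing is risky (a sketch assuming 20 independent subgroup tests, all with no real effect):

    # Family-wise error rate: chance of at least one "significant" result
    # across k independent tests at alpha = 0.05 when every null is true
    alpha, k = 0.05, 20
    print(1 - (1 - alpha) ** k)  # ~0.64: a spurious hit is more likely than not
    # A crude correction is Bonferroni: test each slice at alpha / k instead
    print(alpha / k)             # 0.0025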


I'm stoked to see an article shedding some light on this! Yes, best practices for how we treat concussions are shifting, and some people are not getting the best advice post-concussion.

The discussion about who to blame is a little naive, IMO. Doctors do not learn everything they need to know to be doctors in medical school; largely, they learn the foundations of medicine they need to be successful in a residency. Residency is where they learn what they need to know about treating patients in their specialty. The author's primary care physician was very likely taught about concussions in residency, before these new guidelines came out, and simply has not updated their practice to be in line with them. Science changes, and I'm sure it is hard to stay on top of each new thing that comes out, but at the end of the day it is the physician's job to stay on top of new developments.


Lately, I've come to the conclusion that WFH tends to benefit older devs with more experience at the expense of their younger colleagues. If you are established/experienced and already have a strong network in your field, working from home is great because it gives you more flexibility and fewer distractions from your work. If you're a junior dev, however, working from home creates a higher barrier to asking questions and learning from more experienced folks on your team, and it also negatively impacts your ability to network and meet people in your field.

I'm sure this is super situation-dependent, but on the whole I think it negatively impacts junior devs. I would love to hear other people's thoughts.


Yes and no.

I personally don't see a lot of difference between pairing remotely and sitting next to each other. As someone with a non-standard keyboard and layout, I actually find remote pairing a lot easier, because I can watch while doing some doc reading/research/tickets on the side.

If you are more hands-off and don't engage with your younger colleagues, then yes, absolutely.

I don't really know how you would just magically absorb knowledge without wasting all your time if you listened to everything that everyone in your physical vicinity says (including cursing, talking about sportsball results, etc.). But maybe people who are good at listening while doing something else see this differently.

What I do miss are the coffee break conversations, hopefully also with people from other teams and not only your own, but I think this can be remedied by going to the office occasionally; it's certainly not something I need on a daily basis.


Here's the meat of the post:

"To study this, Koff and his collaborators around the world are using existing samples, stored in biobanks, and sequencing vast numbers of crucial lymphocytes, known as B and T cells, each of which contains a unique receptor to recognise parts of viruses, bacteria, or tell-tale signs of cancer cells. Koff’s aim is to create an atlas of these receptors, and feed that information into an AI model to predict what the complete repertoire of B and T cells might look like in young, otherwise healthy individuals, what changes as we age, and what we might be able to do to modulate it."

This seems worthwhile to me, but I'm skeptical of an AI's ability to predict biology completely. This is (very loosely) sort of like an AlphaFold for antibodies; however, we already know that AlphaFold does not predict unique protein domains well. Maybe this will improve with time, and with more protein structures, but fundamentally I don't see how AI could predict the existence of something novel... Similarly, I don't see how this project could accurately map all of the possibilities for B/T cells.


Here's the central finding:

"Here is a simple experiment we did one summer afternoon in Kamilche Point, Washington. We compared several forms of organically grown shiitake mushrooms, which had starting level of 100 IU/100 grams. We compared the vitamin D levels of three sets of mushrooms, all from the same crop. The first was grown and dried indoors. The second set was dried outdoors in the sunlight with their gills facing down. The third set of mushrooms was dried outdoors in the sunlight with their gills facing upwards for full sun exposure. The most vitamin D was found in shiitake dried with gills up that were exposed to sunlight for two days, six hours per day. The vitamin D levels in these mushrooms soared from 100 IU/100 grams to nearly 46,000 IU/100 grams (see chart). Their stems, though, produced very little vitamin D, only about 900 IU. Notably, vitamin D levels dropped on the third day, probably due to over-exposure to UV."

It's worth noting that this is Paul Stamets' work. He's definitely worth a Google search if you've never heard of him and are into this sort of thing; super interesting guy. He is not educated as a scientist though so I usually take most of his stuff with a grain of salt.


Interestingly enough, I did a deep dive on mushrooms and vitamin D about a month ago. You can easily find a lot of papers demonstrating the effect of sunlight on vitamin D in mushrooms, so don't take the unofficial training as a knock (we should always be skeptical, but verification exists).

It's all about UV light. Some producers are now using UV to help increase the vitamin D content of their mushrooms (along with killing bacteria), but mushrooms can also quickly lose it in the fridge. So just set them out on the windowsill 15-90 minutes before you use them; the more the better. Surprisingly, this even works for dried mushrooms, so just build the habit. If you don't, you basically shouldn't expect to get meaningful amounts of vitamin D from your mushrooms. The effects are that large.

In addition to this, make sure when you cook, that you start your mushrooms dry. This isn't a vitamin D thing, but most people cook their mushrooms wrong. They have a lot of water in them to begin with. Start dry, then when you add things they will soak up surrounding flavors and have a better texture (people's main concern), being more meaty than spongy/slimy.


> In addition to this, make sure when you cook, that you start your mushrooms dry. This isn't a vitamin D thing, but most people cook their mushrooms wrong. They have a lot of water in them to begin with. Start dry, then when you add things they will soak up surrounding flavors and have a better texture (people's main concern), being more meaty than spongy/slimy.

This is how they do it in Chinese soups, but honestly I haven't personally seen other cultures use dried mushrooms.


I suspect they mean "dry" as in "not damp from washing" not "dry" as in "dehydrated". Lots of people wash their mushrooms, which can lead to them cooking down soggy.


You're closer, but I specifically mean to take your fresh mushrooms and throw them into the skillet without any oil (or water). Using dehydrated mushrooms in soups is kinda a shortcut to this.

Though the above comment mentioned Chinese soups and we should recognize that they often use wood ear, which is a different class of mushrooms than most westerners are used to. Wood ear are jellies whereas most western cuisine uses mushrooms with caps and stems (cremini, chanterelles, morels). Still, in general I recommend dry sauteing mushrooms before adding them to the rest of your dish.

More detailed cooking explanation:

Mushrooms are mostly water and air (i.e. a sponge), which is why traditional French cooking methods tell you not to wash your mushrooms but to clean them dry. So, recognizing that they are mostly water, really we want to draw the water out, heat them (breaking down cell walls), and then rehydrate them with something more flavorful. Most people fuck up on the first step by using too high a heat or by using something that the mushroom absorbs (oil or water). You start on low heat, the mushrooms release their water, and you then bring the heat up to evaporate that water (now actually cooking the mushrooms).

After that, your options change and you need to decide if you want to just brown the mushrooms with oil and seasoning or add them to a sauce or soup. The former will maintain a lot of mushroom flavor along with that nice fat and seasoning; the latter will absorb your sauce, adding texture and umami. Either way, you generally reduce the heat here.

But I cannot stress enough that mushrooms are sponges and you have to cook under this paradigm. It's like the difference between making french toast with bread straight from the loaf and bread that has been left out overnight (or is a bit stale). Though maybe this is a bad comparison, because a lot of people don't use stale bread for french toast either, despite that being a major part of its history.


I usually look at all the gunk on the mushrooms, and knowing they're grown in a chicken poop mixture, wash them anyway. Probably not better than a good wipe but I can't help myself.


Rinsing them is absolutely fine. The "food science" recommendation around cooking mushrooms has changed in the last few years. Many cooks are seeing good results by starting to sauté the mushrooms in water and only cooking with oil once the water has boiled off. This lets the air pockets collapse and much of the moisture evaporate, so you end up needing significantly less fat. I've switched to this method myself, and I've been very happy with the results.

https://www.americastestkitchen.com/cooksillustrated/article...

https://www.youtube.com/watch?v=OPJmJdStvwI


This is a really cool idea, thanks for sharing it. I can’t wait to try it.


Might well depend on the species as to which conditions are most favorable, but AIUI most of your farmed mushrooms are grown on other-than-poop substrates: grain, straw, wood and wood pellets, coffee grounds, shredded coconut husk.

Probably still a good idea to clean them off, though: none of those other things are particularly tasty :)


> knowing they're grown in a chicken poop

Most mushrooms you eat aren't coprophilous fungi. You're probably thinking of some variants of magic mushrooms (not all magic mushrooms grow on dung).

The mushrooms you're eating grow from the dirt. If you're into mushrooms, you'll probably also have mushrooms that grow on trees. But unless you're looking to trip, you're not getting poop shrooms.


Commercially grown mushrooms are usually grown on a pasteurised medium so anything attached to the mushroom should be free of pathogens.


Many Southeast Asian cultures dry mushrooms. I would suppose, purely based upon the scarcity of winter foods, that northern cultures (Korea through Siberia, the Altai, Central Asia, Russia and Nordics) would also.


Is this a problem a lot of people have, the slimy mushroom thing? I don't do anything special or careful at all with mushrooms; I rinse them off, maybe halfheartedly salad-spin them, chop them up and put them in a hot pan with some fat. I've never had a slimy one.


I used to carefully clean my mushrooms with a brush to keep them dry so they wouldn’t end up slimy. But someone told me that was a myth and I decided to try just quickly rinsing them instead. Never resulted in slimy mushrooms, and I was shocked at how much crap ended up in the rinse water.


My current belief is that there's nothing you can do to a mushroom that a hot pan won't fix.


Parboiling is often a critical step for certain mushroom species to remove toxins. But that's probably only relevant if you're foraging outside your local grocer.


Tbh, it mostly comes from people using too much oil. I have a longer description of my preferred cooking method above, though I should add that I still love them when cooked the normal way.


Glass blocks UV radiation, I believe, so you'll need to lay them outside to get sunlight directly.


Glass blocks UV-B and UV-C, but not UV-A (unless you have a film). Many mushrooms can be enriched with UV-A, but UV-B is best. So yes, I agree: better to place them outside in direct sun (or with the window open). But building the habit is the harder part for people, tbh.


Also transparent polymers will generally break down with prolonged UV exposure.

Alternatively to leaving them exposed outdoors, you can use quartz.


Glass doesn't block UV-A, so that might not be the case; I don't know which type of UV rays the mushrooms need to produce vitamin D, however.


“He is not educated as a scientist though so I usually take most of his stuff with a grain of salt.”

As someone who was educated as a scientist, you should reevaluate this opinion.

There are plenty of brilliant professional scientists. There are also a lot of idiots in science.

There are also a lot of brilliant people who decided that the scientific path was too restrictive or outright false, so they (brilliantly) invested their own scientific life force externally.

Having a science degree doesn't make one less prone to mistakes. If anything, it makes people more likely to fudge things to get ahead.


> As someone who was educated as a scientist, you should reevaluate this opinion.

I would take a lot of science with a grain of salt too. Look at the recent peanut butter allergy studies, for example. The original 2016 study used 10 Jewish children in Britain vs. 10 Jewish children in Israel. This study caused policy change around the world: Australia went from under 25% to over 80% of people feeding their children peanut butter under the age of 1. It had no discernible effect on peanut butter allergies at all. They've since followed it up with a larger study, 100 Jewish kids in Britain vs. 100 Jewish kids in Israel, to prove out their original theory.


https://directorsblog.nih.gov/2017/01/10/peanut-allergy-earl...

“That trial, involving hundreds of babies under a year old at high risk for developing peanut allergy, established that kids could be protected by regularly eating a popular peanut butter-flavored Israeli snack called Bamba. A follow-up study later showed those kids remained allergy-free even after avoiding peanuts for a year.

Under the new recommendations, published simultaneously in six journals including the Journal of Allergy and Clinical Immunology, all infants who don’t already test positive for a peanut allergy are encouraged to eat peanut-enriched foods soon after they’ve tried a few other solid foods.”

Early exposure seems to be the recommended path.


I get that it's the recommended path; it's been recommended since that first study in Australia and has had zero effect on peanut allergy rates, even though the number of people feeding peanut butter to their children under 1 has increased significantly.


Honest Q: is that true? I mean, (a) since this study in 2015 or so, there's been no decline in the prevalence of peanut allergies in the relevant populations? And (b) people are in fact subjecting their children to disciplined exposure in significant (enough) quantities?


Huh? I thought very recently that link was established as being real?


Please post a link to the study.


May I ask why you've included the study participants' religion even though it doesn't appear in the studies you've cited? (afaik)


From the study[1]: "Several years ago, we found that the risk of the development of peanut allergy was 10 times as high among Jewish children in the United Kingdom as it was in Israeli children of similar ancestry. This observation correlated with a striking difference in the time at which peanuts are introduced in the diet in these countries: in the United Kingdom infants typically do not consume peanut-based foods in the first year of life, whereas in Israel, peanut-based foods are usually introduced in the diet when infants are approximately 7 months of age"

So, the difference in real-world outcomes amongst Jewish populations in different countries inspired this study.

[1]: https://www.nejm.org/doi/10.1056/NEJMoa1414850?url_ver=Z39.8...


The person in question is definitely a good example of someone you SHOULD take with more than a grain of salt. From what I could gather (mainly from /r/mycology), he's not really well respected in his field, and seems to be more of a salesman who draws premature conclusions and evangelises them. The Netflix documentary with him also had my "this sounds too good to be true" sensors going off more or less constantly.


Do you have any logical reason to not trust him, or is it simply “I have a feeling”?

I don't know, I can't say. But scientists in academia and in national labs are salesmen for themselves and their research too, so I think you're possibly giving him too hard a time compared to them.


Stamets strikes me as a well-intentioned enthusiast who is more scientific than most. He did a ton of work on the actual techniques used today in a lot of commercial production, and he's a really interesting guy. It's also evident that he has sampled a little too much of his own more 'exotic' product.


In the documentary (I'm assuming Fantastic Fungi), he came off as one of those types that are first and foremost making money off it, but also believes some of what he's saying. I honestly just wanted that documentary to just be a Planet Earth style thing with mushroom footage, and less talking heads.


We've bought an oyster mushroom kit from him, as well as some soil inoculum. From an ecological standpoint I wonder how genetically diverse his inoculum strains are, and whether they can crowd out native strains of fungi.


I've read most of his books/publications; his work is good (and generally reproducible), but he is somewhat prone to grand statements.

Thing is, a lot of labs largely ignore fungi because they are fucking awkward to work with.


I am open to the idea that someone from a background outside of academia could come up with some really brilliant scientific work. I just know that I am not capable of discerning what is solid science in fields different from my own so I typically rely on signals like an academic background (although not exclusively) to help me gauge the trustworthiness of a claim.


Stamets makes some extraordinary health claims about mushrooms that afaik have never been clinically tested (ex. anticancer and antiviral applications). I think a grain of salt is appropriate in this case - but he could be entirely right!


> He is not educated as a scientist though so I usually take most of his stuff with a grain of salt.

Credentialing doesn't (and shouldn't) matter for science. The science should stand on its own as long as you follow the scientific method, document your assumptions and results, and do work that is reproducible. Bonus points for good peer review. We should reserve the "grains of salt" for people who are sloppy or have been provably bad actors. </soapbox from someone who is "educated as a scientist">


I actually appreciate Paul Stamets' wealth of knowledge and his presentation of facts; I often learn a lot, and his holistic approach is something most universities would find impossible to implement as curricula. While not a classically trained scientist, he is an erudite scholar of the highest caliber, and his continual contributions to science and to furthering our understanding of the natural world correlate more closely with the inceptive motivations of science than what is found in the modern-day ivory tower.


For someone not educated as a scientist, he has a remarkably solid history of academic publications. Pretty inspiring actually.


> The vitamin D levels in these mushrooms soared from 100 IU/100 grams to nearly 46,000 IU/100 grams

Isn't 46,000 IU well into the dangerous range? This procedure sounds like an easy way to poison your liver.

The number I found says 4000 IU is a recommended max. You could say that you should only eat <10 grams then, but how much variability is there in the amount?
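
Back-of-the-envelope on serving size, taking the article's peak figure at face value and assuming it applies to the mushrooms as eaten:

    iu_per_100g = 46_000   # reported peak vitamin D level
    daily_max_iu = 4_000   # the recommended max mentioned above
    safe_grams = daily_max_iu / iu_per_100g * 100
    print(f"{safe_grams:.1f} g")  # ~8.7 g of these mushrooms per day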


According to this, at least, 5k to 50k IU daily appears to be safe:

https://pubmed.ncbi.nlm.nih.gov/30611908/#:~:text=In%20summa....

Not sure what happens if you eat 10 mushrooms though


Per 100g was the amount, I think. You'd need to eat a decent quantity at that rate.


In many countries, the standard of care for low vitamin D is a bolus dose of 60K IU once a month, which is perfectly fine. It's only a problem if you were eating this much every day.


When it's from food the tolerance is much higher. It's a bigger danger when taking supplements.


Is it? Interesting.


The "100 IU/100 grams" starting point is with fresh mushrooms, before they were dried. It's not clear to me from the article whether "46,000 IU/100 grams" is per 100g of the fresh mushroom input or per 100g of the dried mushroom output. Mushrooms are ~80-90% water so it makes a big difference.


It makes a big difference, but even if you removed 90% of the fresh mushroom's weight without changing vitamin D levels, you'd still only have 1000 IU / 100 grams.
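
To make that arithmetic explicit (a sketch assuming 90% water loss and no change in total vitamin D during drying):

    fresh_iu_per_100g = 100                    # article's starting level, fresh weight
    dried_g_per_100g_fresh = 100 * (1 - 0.90)  # 10 g of dried output per 100 g fresh
    print(fresh_iu_per_100g / dried_g_per_100g_fresh * 100)  # 1,000 IU/100 g dried
    # Still ~46x short of the reported 46,000, so drying alone can't explain the jump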


Sounds so simple... But how can one check vitamin D IUs for oneself? What machine is used, and how does it work?

Or is this a public relations piece from the mushroom marketing board?



Thanks. If that's it, it doesn't sound that amazing or accurate:

> A nonspecific test merely indicates that any of several in a category of substances is present. Although a nonspecific test could statistically suggest the identity of the substance, this could lead to false positive identification. However, the high temperatures (300°C) used in the GC-MS injection port (and oven) can result in thermal degradation of injected molecules,[3] thus resulting in the measurement of degradation products instead of the actual molecule(s) of interest.

from the wiki


He posted fake news on Twitter about traces of life on Mars.


It's interesting that the vitamin D level dropped when overexposed to UV. I wonder what caused the drop.


It's unstable under overexposure to UV light:

https://www.sysrevpharm.org/articles/the-effect-of-temperatu...


I think vitamin D is generated by a reaction of a precursor sterol with UVB.

UVA is rather destructive, so I would assume that's what's going on: too much UV-A on a part of the shroom that is by design shielded from direct exposure is basically breaking everything down.

That’s my .02.


perhaps UV?


The article's explanation for the origin of this supergene seems similarly complex. Relying on there being some previous version of the ant that is not clonal is a step too far into speculation for me to consider that hypothesis too seriously. Or maybe I'm missing something? Maybe an entomologist would think differently!


I'm no entomologist, but I'm sure there was some previous version of these ants that was not clonal. The last common ancestor of all ants was not clonal.

