I started on a new charcoal illustration yesterday. Typically I’ve only done portraiture in charcoal; being able to do faces that look like human faces, much less resemble specific people, has been a major goal for basically my entire life. But I’ve finally gotten to a place where I feel my faces mostly look like faces in the way I hope. I want to try some higher-concept stuff.
My approach is to find reference images, print them in b&w, and then cut+paste them together in a collage. From there, I draw the collage. Right now the one I’m working on is called “make it make sense,” a phrase that keeps coming to mind (for personal reasons I don’t feel like discussing). I’m shooting for a sorta symbolic/collage look even within the drawing, rather than attempting a realistic reproduction of a particular piece.
One thing I find fascinating about working with charcoal in a technical, structured way is that it feels like watching a drawing come into focus. I start out with broad areas of light/dark and slowly render out details here and there. The very last thing I do is add highlights, which invariably takes things from flat to popping out. Frankly, I never get very rendered — which is pretty common — but I hope my rendering will be more mature this time, I guess? I’ll definitely post it later.
Also, as I mentioned yesterday, I’m trying out separating Sara Reads the Feed posts by content areas. Today we’re looking at medical/health-related news — you into it?
Raw milk lovers want to drink bird flu
Over the various SRF posts, I’ve especially been looking at bird flu. Anyone who hasn’t totally memory-holed the ongoing spread of COVID-19 is probably actively coping with horrible feelings from one pandemic; it’s fair to say that closely watching bird flu springs from my desire to be less surprised by another pandemic.
I don’t really know how to explain the people who insist on drinking raw milk (potentially infected with bird flu) in the hope that it will inoculate them against it. (Gizmodo via Quartz) Apparently raw milk advocates think that the bird flu stuff is just fear-mongering. (Ars Technica)
I guess it shows how bad health education is. It doesn’t feel like community understanding of epidemiology improved dramatically in the wake of COVID — if anything, a new level of politicization may have made it worse.
Incredibly, the surging popularity of raw milk seems to be directly related to the detection of bird flu in unpasteurized dairy products and a mistaken belief that being exposed to the virus will be beneficial to humans. […]
Social media platforms like TikTok and Twitter have plenty of anti-science activists extolling the virtues of raw milk, and those influencers have seemed to only gain traction since bird flu was first detected in American dairy cows on March 25.
My 13yo, who is interested in immunology, speculated that drinking pasteurized milk with dead bird flu virus particles might serve to give us some immunity. But these people are seeking to drink live virus. I don’t know if my 13yo’s guesses are on the mark, but it seems safe to say that drinking live virus is just how you end up with the virus.
This is coming from the country where people really think that the MMR vaccine will give children autism, so I shouldn’t be surprised, and yet.
Environmental drivers of rising disease rates
It was only a couple days ago that I mentioned seeing more articles about biodiversity loss.
NPR has a new article about “global change drivers,” specifically highlighting biodiversity.
“We look for general patterns because if they hold true, they might apply to humans,” said Carlson. “Even if these are findings that apply to bats and rodents and primates, but not necessarily us, it’s still bad for us if bats and rodents are sicker, he says, in part because those diseases might jump to us.”
For all these species, biodiversity loss emerged as the biggest factor in increasing infectious disease risk, followed by the introduction of new species, climate change and, to a smaller extent, chemical pollution.
Basically, the more biodiversity we have around, rare and uncommon animals included, the better: diseases spread when they can easily access a lot of hosts.
There are some other surprising results in this article, too.
Surprisingly, habitat loss — which is a major cause of biodiversity decline — was associated with a decrease in infectious disease outcomes.
The rapid pace of urbanization likely explains this counterintuitive result, Rohr says. When a grassland or forest is bulldozed for human development, most of the plants and animals are wiped out – along with their disease-causing parasites. Urban areas also tend to have better sanitation and access to health care, which could also account for the surprising result, too.
Still, the lack of an effect of habitat loss is somewhat surprising, given scientists have drawn clear links between deforestation and increased risk of diseases like Ebola.
They note that climate change plays the biggest role in zoonotic disease (sicknesses jumping from animals to humans), so all in all, it’s a really big nuanced picture.
Early cancer detection through blood proteins
One cool thing about the COVID-19 pandemic is that it accelerated research funding, and it feels like we’re seeing a lot of the benefits in cancer research. I keep coming across news about novel cancer treatments. On a more personal note, my pitbull got mast cell cancer a few months ago, and then a single injection cured it. They just stuck the tumor with a needle, filled it with some kind of medication, and the whole thing fell off.
Though King is a young dog, cancer is one of those things that becomes inevitable the longer an organism lives, thanks to the weirdo nature of cell division. So any advancements in this area are enormous for everyone.
The Guardian talks about potentially being able to detect cancer seven years earlier by looking at blood proteins.
The study, funded by Cancer Research UK and published in Nature Communications, also found 107 proteins associated with cancers diagnosed more than seven years after the patient’s blood sample was collected and 182 proteins that were strongly associated with a cancer diagnosis within three years.
The authors concluded that some of these proteins could be used to detect cancer much earlier and potentially provide new treatment options, though further research was needed.
Early detection means more time to use these fancy new treatments, so let’s hope this research bears fruit!
The oft-untold history of calorie counting
I make no secret about my personal history with an eating disorder. My long-time “favorite” method of weight control is calorie-focused. So I was really interested in this Smithsonian Mag article about the historic individuals who pioneered calorie counting as a thing.
I’m going to nitpick the end of this long essay with my own experiences as sole citation, so take it with a grain of salt.
What most disappoints me is how this article just ends up promoting the diet fad du jour (Ozempic et al).
The vast majority of calorie-restricting diets have been shown to fail in the long run and in fact often result in a weight regain beyond the starting weight. Numerous studies over recent decades have shown that taking in calories and burning them (that is, eating and exercising) are not separate processes but are instead intimately related in a complex dance: Cutting calories triggers a cascade of hormonal reactions that increase hunger and fatigue while slowing metabolism, making it more difficult to lose weight. One research analysis in the journal Public Health Nutrition describes attempts to achieve and maintain a calorie deficit as “practically and biologically implausible.” New weight-loss drugs such as Ozempic appear to interrupt that cascade, by manipulating hormones in the gut and the brain to decrease appetite. […]
The error in the “calories in-calories out” equation may boil down to this: Human bodies are not coal-burning machines, and food is not coal. Rather, the body and food are both vastly more complex, and they interact in complicated ways that have evolved in humans over eons.
The research analysis linked doesn’t support the implications of the article overall. The idea that eating at a calorie deficit doesn’t lead to weight loss is simply a myth. It’s in denial of the first law of thermodynamics.
The work required to determine your calorie deficit and adhere to it can be confusing, though.
Yes, you will get hungrier, but you will still lose weight if you maintain the deficit nonetheless. You will also need to eat fewer calories to remain at a deficit as you lose weight; adipose tissue takes calories to maintain, so you need to eat less at 150 lbs than at 220 lbs. Go ask your local anorexic for more details.
Likewise, you will not gain weight from nowhere. It’s not magic. Food provides a certain amount of energy to the body (with calories as the measurement of energy), and your body uses a certain amount of energy; if you do not supply your body with excess energy, it cannot be stored as fat.
Numerous factors inherent in foods affect how many calories are actually retained in the body, and whether those calories are stored as fat or, for instance, burned for energy or used to build tissue and muscle. Highly processed carbohydrates break down almost instantly in the body, prompting insulin release and fat storage; protein breaks down slowly and requires more energy to do so, essentially “using up” some of its calories just in digestion. Some foods, including certain types of nuts, have considerably fewer calories when measured in the body than they have in lab tests. And food when raw yields fewer calories in digestion than the same food cooked. These anomalies are just the tip of the iceberg.
The number of calories you absorb from food is already taken into account on nutrition labels. Differences from changes in preparation are minute, well within the margin of error.
The macronutrient composition of calories (protein vs carbs vs fat) mostly changes the difficulty of eating at a caloric deficit: more fat and protein make it easier because you’re less hungry. Being more hungry doesn’t make your body hang onto calories, though.
The fact is that most people are really bad at accurately logging how much they eat, and they generally overestimate how many calories are burned via activity. I treated myself like a science experiment, in conjunction with internet strangers doing exactly the same thing, for about fifteen years straight; all of us found that accurate counting completely works. It’s extremely predictable. It also requires a lot of rigor.
If you just pad out your calorie counts a little (add 20-50 calories here or there), and if you don’t add extra calories for casual exercise (like the elliptical at the gym), AND if you reduce daily calories as you lose weight, it all works fine. Do you see why a lot of people fail at it, though? It’s like living a math problem. It appeals to the anxious and obsessive and disordered.
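To make the “living a math problem” part concrete, here’s a minimal energy-balance sketch in Python. The numbers (roughly 11 maintenance kcal per lb of body weight per day, and about 3,500 kcal per lb of fat) are common rules of thumb, not figures from the article, and the model deliberately ignores the hormonal adaptation discussed above; it only illustrates why a fixed intake becomes a smaller deficit as weight drops, which is exactly why you have to keep reducing daily calories.

```python
# Toy energy-balance model. All constants are illustrative rules of
# thumb, not medical guidance.
KCAL_PER_LB_TDEE = 11.0   # assumed maintenance kcal per lb of body weight per day
KCAL_PER_LB_FAT = 3500.0  # rough energy stored in a pound of body fat

def simulate(start_lbs: float, intake_kcal: float, days: int) -> float:
    """Simulate daily weight change at a fixed calorie intake."""
    weight = start_lbs
    for _ in range(days):
        tdee = weight * KCAL_PER_LB_TDEE  # maintenance shrinks as weight drops
        deficit = tdee - intake_kcal      # negative would mean a surplus
        weight -= deficit / KCAL_PER_LB_FAT
    return weight

# At 220 lbs, assumed maintenance is 2420 kcal/day, so 1900 kcal/day
# starts as a ~520 kcal deficit. Ninety days later the deficit is
# smaller (lower weight means lower maintenance), so loss slows down
# unless intake is reduced again.
after_90 = simulate(220, 1900, 90)
```

Under these assumptions the model settles toward the weight where intake equals maintenance (1900 / 11 ≈ 173 lbs), which is the arithmetic behind “reduce daily calories as you lose weight.”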
Where calorie counting fails is how it creates unnatural behavior in humans. We are not machines who should weigh and log every bite. Eating behaviors are emotional, cultural, and social. Quantifying our food is bad for us.
If you regain your weight after you stop counting calories, it’s because you’re eating more. And your natural behaviors (like listening to your appetite) have been interrupted by these unnatural ones.
The article concludes with a picture of an Ozempic box.
Ozempic and similar drugs, first prescribed to regulate diabetes, have reshaped the debate around losing weight through will-power alone.
The debate is mostly reshaped by an excess of advertising dollars. There’s a lot of nauseating discourse framing these expensive drugs as the cure for a problem, when you’re going to have the exact same issue you have with calorie counting: once you stop doing the thing (counting calories/taking the shots), you’ll regain the weight. Possibly with more, because you haven’t actually altered your behaviors in a meaningful way.
When I was hospitalized for my eating disorder, we learned about eating mindfully. We were counseled on ways to reconnect with our appetites, eat to satiation, and feed ourselves when hungry. It was all behavioral. It came with therapy. That’s how you get healthy on an individual level. Disappointing to see Smithsonian Mag promoting Ozempic.
On a tangential note: I think, in the future, we’re going to see how different neurotypes affect appetite dramatically (dopamine deprivation leads to more snacking, like with ADHD). We’re also going to see how limited time and money for food preparation (because the work environment is hostile) will make people eat more junk. We will see more and more how this is less a medical problem and more a social problem. Maybe. Can companies profit off of fixing society? No? Then maybe not.