The meaning of raw, cultured, grass fed, freshly churned butter.

Any idea what "real" cultured butter really is?

If you happen to live in Europe, you can easily buy really delicious and truly cultured butter. Lucky you.

But if you live in the U.S., you're out of luck unless you happen to live in one of the states that allow raw milk products. And even then, it's hard to come by and you have to be prepared to pay hefty prices, as it should be. You get what you pay for, right?

You may be surprised to know that in North America, even butter sold as cultured is almost never the real thing. You can thank our federal government for that, along with our broken food system. (Hmm, or are they one and the same?)

Anyway, unless you've been to Europe, live in a state where raw dairy products are legal to sell, or make your own (if you have access to a cow or fresh raw cream), chances are you've never truly tasted the amazing flavors of real raw, cultured, grass fed butter.

I just made a fresh batch of cultured butter. I'm fortunate to get to milk one of the farm's cows where we live. Her name is Mathilda and she's a very sweet Jersey cow.

Right after milking her, I filter the milk and let it rest until the cream rises to the top. (That's what real milk is supposed to do.) Then I skim the cream off the top with a small ladle and put it into a bowl to rest at room temperature until it gets very thick on its own. This is the natural process of fermentation. This step may take anywhere from half a day to two days, depending on the weather, the milk itself and how cultured I want the butter.

Once the cream is as cultured as I like it, I taste it to make sure it's sweet and ever so slightly tangy. If for some reason it has gone bad (which rarely happens), you'll know. This is all about your senses: you can see mold, you can smell an off scent and you can feel an unpleasant sliminess. If none of those signs are present, then just to be sure, take a tiny taste to confirm it's tangy rather than bitter.

Then I whip or churn it until it becomes butter. Once the fat solids separate, they leave behind a white liquid: buttermilk. This is the byproduct of cultured cream turned into butter -- and it's what real buttermilk is -- completely different from the stuff sold as "buttermilk" in the stores.

Contrary to popular belief, or what we have forgotten since widespread use of pasteurization (just a little more than one hundred years ago), dairy products made from raw milk are safe to eat. Like anything else when it comes to food, you have to practice good hygiene with the cow and make sure the machine and all the equipment and bottles you use are clean.

It's ironic that virtually all the incidents of salmonella and E. coli come from meat, chicken, fish, eggs and vegetables. Why? Because almost any food can cause these illnesses, not just raw dairy. Yet year after year there are outbreaks of these illnesses, and they're nearly always the result of pretty much anything but raw dairy.

So the excuse to pasteurize and homogenize dairy is simply a convenience for the dairy industry. It enables them to produce milk from unhealthy cows in dirty conditions at a massive scale and get away with it. 

For example, since every year there are outbreaks of salmonella and E. coli from meat, poultry and vegetables, what would happen if, to prevent those outbreaks, the government passed laws requiring that meat, chicken, fish and vegetables be cooked before being sold? Well, that's sort of what's done to milk. 

By the way, even though the government and the dairy industry have programmed us in North America to be terrified of raw milk dairy, what makes raw milk safe and easy to consume is the culturing process, because it helps reduce pathogens. Fermentation is what naturally makes dairy not only safe and delicious to eat but also very nutritious, precisely because of the microbes in it, which are so essential for good health. And have been for thousands of years.

This is easy to illustrate. If you leave a quart of raw milk and a quart of pasteurized milk on the counter unrefrigerated, in a day or two the raw milk will become a delicious clabber -- and, if you strain off the whey, a delicious farmer's cheese. The pasteurized milk just becomes horrible: spoiled and illness-inducing. 

The process of fermentation or culturing actually consumes the lactose in the milk while making it teem with beneficial microbes, or good bacteria. This is the reason why people who suffer from lactose intolerance can usually eat cultured dairy products made with raw milk and not get sick.

Like most of the food-related health problems we suffer now, but which people didn't suffer from much in the past, the problems with milk are the result of radical industrialization. 

Extreme industrialization transforms entire categories of food into something incompatible with human biology. Then we blame the food generally. Milk -- and for that matter grains, meat and other foods -- are not "bad for you." The industrialized version is bad for you.

Healthy milk starts with a healthy cow raised outside on pasture grass. Then you get the milk and do what I describe above.

But how can people not get sick or become lactose intolerant when milk comes from factory farms where cows have no room to move and live miserable lives while being fed toxic, pesticide-laden feed? These cows never even see sunlight or breathe fresh air, and they're pumped with nasty hormones to make them produce more milk. They often develop horrible infections and cancerous tumors and are then given antibiotics.

And, of course, all of that is passed on to the milk they produce. So the milk is heated to the point that everything bad and good is killed. And it's often homogenized and bleached, to make matters worse. Our bodies don't even know how to deal with this dairy when we consume it, so we end up with lactose intolerance, among a host of other problems that can result from consuming these products. I know, it sounds horrible, but that's because it really is horrible.

These foods are created to be good products rather than good food for consumers -- in other words, good at lasting through transportation and on store shelves.

This is one of the many reasons that cow dairy -- and pasteurized, homogenized dairy in general -- is not part of the Spartan Diet. But, for now, since we have access to Mathilda, we are making our own fresh and aged cheeses, yogurt, kefir, butter and other dairy products the way they were always made: with clean, raw, pastured milk, the old world way.

So what's one to do if one has no way of getting access to raw dairy products from pasture-raised cows? Avoid pasteurized dairy as if your life depends on it. 


First persimmons of the season

I enjoyed my first fuyu persimmon of the season just last week. I added it as an ingredient in a green leafy salad. Its crunchiness and sweetness balanced nicely with the softness of the lettuce as well as the tartness of the citrus dressing I made. 

I also love hachiya persimmons, which taste like honey with an intricate hint of mango, nectarine, apricot and honeydew melon. Ancient Greeks called them the "fruit of the gods" or "divine fruit." And they do taste divine.

Like pumpkins, persimmons have a beautiful bright but deep orange color. They're a true berry and are in season from October to December. Many local farmers' markets sell persimmons abundantly during this time. 

Though there used to be hundreds of persimmon varieties, the most common ones sold in the U.S. are the Hachiya and the Fuyu. I enjoy both kinds but the former is my favorite. The Fuyus are usually eaten hard, since they're not astringent. You can cut them into wedges like an apple (with peel and all) but they can also be eaten when they're soft. 

Hachiyas, on the other hand, must be eaten soft. When Hachiyas are hard, it means they’re unripe and therefore astringent. Never try to eat a hard Hachiya. You would be unpleasantly surprised by an extreme feeling of dryness, bitterness and numbness in your mouth because of the high levels of tannins. 

Persimmons are underappreciated in the United States, especially the Hachiya variety. I believe the reason is that, when ripe, they have what you could call a "slimy" and "mushy" texture. People who didn't grow up eating tropical fruit with such characteristics can have a hard time acquiring the taste. 

Hachiyas are usually sold unripe or hard, but they'll eventually ripen (in one to three weeks). If your patience is being tried, place the hard Hachiyas in a paper bag with apples or bananas, which release ethylene gas and speed up the ripening process. The persimmons will get very soft and delicate to handle (like a balloon filled with water). 

Ripe Hachiyas look almost translucent. And when you cut one in half, it will expose the jelly-like flesh, which is very slick -- sort of like custard. Select Hachiyas that have a deep orange color with beautiful glossy skin. The black color patches some may have are just sun spots -- they’re okay to eat. I like to cut them in half crosswise and simply scoop out the inside with a spoon. Hachiyas are great for adding to dressings and baked goods, including cakes and fruit breads. Fully ripe Hachiyas should be stored in the refrigerator or freezer for later use in baking, or for eating frozen like a sorbet. 

Fuyu persimmons are shaped like regular tomatoes and have a golden orange color. The fuyu can be eaten like an apple, skin and all, but the calyx, or top, must be removed. If you like fruit in your green salads, fuyus are great for that, and I also love them in fruit salads. They really add wonderful sweetness. 

Whether you prefer the fuyus or the hachiyas, these two persimmon varieties each have their own wonderful qualities and unique nutrients to offer. The soft hachiya is lower in calories and higher in vitamin C. But the fuyus offer more potassium, calcium and protein. The moral of the story: Learn to enjoy both of them.

 

Hachiya persimmons.

No, the Government Does Not Test Food for Safety

American consumers generally believe that if a food is on the shelf at the supermarket, the ingredients in that product must have been tested by the FDA for safety. If it’s there, it must be OK, right? 

It turns out that such a belief is false. 

Food products may contain any of 10,000 or so "additives" -- often chemical colorings, preservatives, antioxidants, stabilizers, gelling agents, thickeners and so on -- that have been approved by authorities tasked with protecting the health of consumers (and a thousand or so that have been neither approved nor rejected).

This approval process is a charade, according to two new studies -- one published in the Journal of the American Medical Association (JAMA) and another by the Pew Charitable Trusts. 

The Pew study found that 54% of the chemicals added to food have never been tested for safety. Even the most basic toxicology testing has not been conducted on 88% of the chemicals deemed of "elevated concern" for reproductive and developmental health. 

The FDA also does not require serious testing to be done on packaging, even though in recent years a wide range of endocrine disruptors (hormone destabilizers) in packaging has been linked to serious health problems. 

And of course, combinations of chemicals are neither tested nor required to be tested -- yet these cocktails of untested chemical combinations are exactly what consumers are ingesting when they eat packaged foods. 

The way it works is that companies wanting to sell a new chemical as a food additive submit a proposal to FDA panels for review. These proposals contain assurances, put together by the companies themselves, about the safety of the ingredient. 

The panels then either ask follow-up questions or simply approve the chemical based on what the manufacturer has claimed. (The majority are approved without question.) 

So who is sitting on these FDA panels? Who are the people deciding whether to approve or reject the chemicals we eat? 

The JAMA study found that “an astonishing 100% of the members of 290 expert panels included in [FDA] review worked directly or indirectly for the companies that manufactured the additive in question.”

The company wishing to profit from an additive ingredient tests it for safety and makes its case to a panel made up of people who will also profit from the sale of that ingredient; then the additive is inevitably approved without any oversight, second opinion or independent testing. 

The study also determined that about one thousand additives are in the food supply without any FDA knowledge or review. 

So let’s review the facts about the approval process for additives: 

* The companies that make and sell chemical additives do whatever safety testing is done. There is no independent testing. 

* Those same companies choose whether or not to submit new chemicals for review. Neither the FDA nor the consumer has any idea which unsubmitted chemicals are in the foods. 

* More than half of new chemical additives are not even tested by the company. They are not tested by anyone. 

* Proposals for new additives are submitted to review committees, and most are approved based solely on the claims of the manufacturer. 

* Every single person who sits on FDA approval committees for additives works for the chemical additives industry. 

The bottom line for food consumers is that industrial food giants can and do put just about any chemicals or other additives into your food, and there is no government monitoring, testing or oversight. 

Sodium Diacetate

Sorry, Paleo People: Grains Are Part of the Human Diet

There are many versions of the modern Paleo diet, all supposedly based on a partial or simulated version of the diet of humans during the Paleolithic era (starting about 2.5 million years ago and ending about 10,000 years ago with the advent of agriculture). All these variants share an opposition to the consumption of grains, such as barley, wheat, rice, quinoa, kasha, oats, millet, amaranth, corn, sorghum, rye and triticale. 

That anti-grain stance is based on the belief that since Paleolithic man didn't eat grains, we shouldn't either.

Archeology is now proving that Paleolithic man, in fact, ate grains. The entire premise of the Paleo diet's anti-grain stance is false.

Paleo diet fans are right about one thing, though: Industrial bread and industrial grain consumption play a large role in the health crisis. But it's the industrial version of grain consumption -- the monoculture of mutated modern wheat, eaten in high quantities and unfermented -- that causes health problems, not grains per se.

In fact, strong evidence has recently emerged that humans and pre-human ancestors have been eating grasses and grass-like plants for about 4 million years, which eventually led people to focus on the seeds of those grasses in the form of grains. 

How did this misunderstanding happen? Archeological evidence is skewed toward materials that survive the centuries, such as stone, bone and other hard objects. Soft materials (such as grains) don't survive unless hard objects were used to process them. Even then, actual food residues are unlikely to be detectable millennia later.

Fortunately, advancing technology is enabling us to figure out what ancient peoples really ate without relying exclusively on surviving bone and tools. 

When the Paleo concept was first popularized in 1975 by Walter L. Voegtlin, and even when Loren Cordain published his influential book The Paleo Diet in 2002, there was little material evidence for Paleolithic grain consumption. That lack of evidence, combined with an absence of grain in the diets of today's remaining hunter-gatherer groups, led to the belief that grain consumption was not part of the Paleolithic diet.

The oldest evidence we have for the domestication of grains is about 10,500 years ago. But the direct evidence for the processing of wild grains for food goes back much earlier than domestication.

Mortars and pestles with actual grains embedded in the pores were found in Israel dating back 23,000 years, according to a 2004 Proceedings of the National Academy of Sciences paper. Note that the grains processed were wild barley and possibly wild wheat. This is direct, unambiguous evidence that humans were eating grains deep into the Upper Paleolithic era, and 13,000 years before the end of the Paleolithic era and the beginning of domesticated grains, agriculture and civilization.

A paper published in Proceedings of the National Academy of Sciences details the new discoveries of Paleolithic-era flour residues on 30,000-year-old grinding stones found in Italy, Russia and the Czech Republic. The grain residues are from a wild species of cattail and the grains of a grass called Brachypodium, which both offer a nutritional package comparable to wheat and barley.

Archeologists published a paper in the December 2009 issue of Science unveiling their discovery in Mozambique of stone tools bearing thousands of wild grain residues dated to 105,000 years ago -- during the Middle Paleolithic. The grain was sorghum, an ancestor of the modern sorghum used even today in porridges, breads and beer.

Some Paleo diet advocates claim that while there is evidence of sorghum processing, there is no evidence that the practice was widespread or that the grain was sprouted and cooked in a way that made it nutritionally usable -- in fact, the dating shows usage of the grain well before the development of pottery. 

This is true: There is no evidence of widespread use or cooking. It's also true that there is no evidence against it. We simply don't know. 

It's easy to imagine how Paleolithic man might have processed grains for food. Essene bread, for example, is made by sprouting grains, mashing them, forming them into flat patties and cooking them on rocks in the sun, or on hot rocks from a fire. It's easy to sprout grains -- in fact, it's hard to keep them from sprouting without airtight containers or waterproof roofs.

Before the development of pottery, gourds were used for cooking, food storage and carrying. Fill a gourd with water, drop in hot rocks from a fire, and the water boils. Adding meat, vegetation and grains to that boiling water would make the most nutritious meal and the most efficient use of available foods. It would enable the extraction of nutrition from the marrow and creases of bones, soften root vegetables and improve the digestibility of foods like leaves. In other words, such cooking methods would be necessary to benefit not only from grains but from a wide variety of other foods as well. 

Other early Neolithic methods for cooking grains, which we know about from ancient writings including the Old Testament -- such as baking primitive bread on hot rocks in the sun -- would also have been available to Paleolithic people. 

It's also interesting to speculate on fermentation of grains, something practiced by nearly all traditional cultures. If Paleolithic people gathered excess grains and carried them, the question is not whether they fermented them, but how they could have prevented them from fermenting.

None of these technologies -- sun-cooking, hot-rock frying and gourd-based boiling -- would leave a trace for archaeologists after 100,000 years. 

The Paleo Diet belief that grain was consumed only as a cultivated crop, rather than wild, also fails the history test.

The grain we now call wild rice was a central part of the diets and cultures of the Ojibwa peoples of Canada and the United States, and an important food of the Algonquin, Dakota, Winnebago, Sioux, Fox and many other tribes through trade. There was even a tribe called the Menominee, or "Wild Rice People."

Native American and First Nation gatherers harvested this grain by canoe, in a method prescribed by tribal law, for at least 600 years while they were hunter-gatherers. The cereal was instrumental in enabling the Ojibwa people to survive incredibly harsh northern winters, an annual feat that shocked early French explorers. 

Today, most wild rice you can buy in the store is grown in paddies in California. However, the Ojibwa still harvest wild rice in canoes, and you can buy it from them on the Internet.

So now we can say it: Archeology has proved that grains were part of the Paleolithic diet. The anti-grain stance of modern Paleo dieters is based on incomplete archeology.

And it's time for Paleo diet fans to cave-man up, admit the error and start eating healthy, whole ancient grains.

Barley.

Industrial food supply massively contaminated with 'superbugs.'

Consumer Reports tested ground turkey from a wide range of retail stores and found that 90% of samples were contaminated with "superbugs" -- antibiotic-resistant bacteria.

In addition to those highly dangerous bacteria, 90 percent of the turkey tested "contained at least one of five strains of bacteria, including fecal bacteria and types that cause food poisoning, such as salmonella and staphylococcus aureus." 

Turkey labeled with "no antibiotics," "organic," or "raised without antibiotics" also contained bacteria, but those were less likely to be antibiotic-resistant superbugs.

Earlier this month, the National Antimicrobial Resistance Monitoring System released a report that found more than half of samples of ground turkey, pork chops and ground beef bought in US supermarkets contained antibiotic-resistant superbugs.

The National Antimicrobial Resistance Monitoring System is a group jointly formed by the Food and Drug Administration, the Agriculture Department and the Centers for Disease Control and Prevention. 

The study's percentage of contaminated samples is alarming in part because it's a huge increase over past findings -- the problem is growing fast. 

The contamination of the food supply with disease-causing bacteria that can't be treated with our strongest antibiotics is caused by the widespread use of antibiotics in livestock to make them bigger and also to enable them to survive in cramped, unhealthy conditions without dying of the diseases that spread in such an environment. (Almost 80 percent of all antibiotics sold in the United States are used in animal agriculture.)

The bottom line is that consumers buy meat based on price, and antibiotics make it cheaper. 

The take-away: Overwhelming marketing, packaging and propaganda have convinced everyone that highly industrialized food is clean and safe and that it's been tested and approved.

The truth is the opposite: Industrialized food is generally filthy, dangerous and, by the way, environmentally damaging -- and there is no big government agency testing or inspecting your food before you get it.

Also: Cheap food isn't cheap. Consumers pay far more in other ways than they save at the checkout counter. 

Both the safety and cheapness of industrial foods are delusions.

The Spartan Diet rejects all industrialized food, opting instead for post-industrially produced food and wild fish, game and fowl.


New Discoveries

Excess protein linked to development of Parkinson's disease http://j.mp/VJyouy

New concern raised over nanoparticles in food: http://j.mp/VG1oDI 

Phthalates, found in most plastic containers, have anti-androgenic effects and may disrupt fat and carb metabolism. http://j.mp/UTS166 

Binge drinking appears to cause inflammation in the brain region that oversees metabolic signaling http://j.mp/UJux3i

 

Roasted tomatoes, onions and garlic. 

New Discoveries

New research reveals how antibiotics produce changes in the microbial and metabolic patterns of the gut. http://j.mp/VWTNQZ 

Bees need good gut microbes to stay healthy, too: http://j.mp/WRC6SI 

The diet of actual Paleolithic man was higher in carbs and lower in fat than that of modern #PaleoDiet fans: http://j.mp/TIxLUi 

Food labeled and sold as organic often isn’t http://j.mp/VBwgFN 

Tomatoes may protect from depression http://j.mp/VMzBBU 

Saturated fats tied to falling sperm counts in Danes: study http://j.mp/11aHozr

 

Oh, cheer up and eat some cherry tomatoes!