Mebendazole

In a 2024 laboratory study, scientists treated human colon cancer cells with the anti-worm drug mebendazole and then measured how many cells died after 48 hours.

Using a special test that separates living cells from dying cells, they found that 78% (±12%) of the cancer cells were pushed into apoptosis, which is the cell’s natural self-destruction process.

This result was highly statistically significant (P = 0.0001), meaning it was very unlikely to be due to chance. In simple terms, this shows that mebendazole didn’t just slow the cancer — it actively forced most of the cancer cells to shut down and die in the lab.
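
For readers who want a feel for what a result like “78% (±12%) with P = 0.0001” actually means, here is a minimal, illustrative permutation test in Python. Every number in it is a made-up assumption for the sketch (the 100-cell group sizes and the 12% background apoptosis rate in untreated cells), not the study’s actual data, and the paper’s own statistical method may well differ.

```python
# Illustrative only: hypothetical counts, NOT the study's actual data.
import random

# Suppose 78 of 100 treated cells were apoptotic vs 12 of 100 controls.
treated_hits, n_treated = 78, 100
control_hits, n_control = 12, 100
observed_gap = treated_hits / n_treated - control_hits / n_control

# Permutation test: pool all cells, reshuffle the "treated" label at
# random, and count how often a gap at least this large appears by
# pure chance (a one-sided test).
hits = treated_hits + control_hits
pool = [1] * hits + [0] * (n_treated + n_control - hits)

extreme = 0
trials = 100_000  # takes a few seconds in plain Python
for _ in range(trials):
    random.shuffle(pool)
    fake_gap = (sum(pool[:n_treated]) / n_treated
                - sum(pool[n_treated:]) / n_control)
    if fake_gap >= observed_gap:
        extreme += 1

print(f"Estimated P ≈ {extreme / trials:.5f}")
```

With a gap this large, essentially no random shuffle ever reproduces it, so the estimated P comes out near zero. That is all a headline figure like P = 0.0001 claims: the observed difference between treated and untreated cells is very unlikely to be chance.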

PMID: 37837472

Finish reading: https://pubmed.ncbi.nlm.nih.gov/37837472/

Astrid Lindgren

Sweden, 1941. A mother sits beside her daughter’s bed. The girl is burning with fever, slipping in and out of delirium. “Tell me a story,” she whispers.
“About what?” the mother asks.
“Tell me about Pippi Longstocking.”
Astrid Lindgren had absolutely no idea what that meant. Her daughter Karin had just invented a name out of thin air. But Astrid started talking anyway—making it up as she went.
She described a girl with bright red pigtails and mismatched stockings. A girl so strong she could lift a horse. A girl who lived alone in a house called Villa Villekulla with a monkey and a horse, with no parents to tell her what to do. A girl who ate candy for breakfast, slept with her feet on the pillow, and told adults “no” whenever she felt like it.
Karin loved her. Astrid kept inventing more Pippi stories every time her daughter asked.
A few years later, Astrid slipped on ice and injured her ankle. Bedridden and bored, she decided to write down all the Pippi stories as a birthday present for Karin. Then she thought: maybe I should try to publish this.
Publishers rejected it immediately.
The character was too wild. Too disrespectful. Too inappropriate. This was 1944 Sweden, where children’s books were about obedient boys and girls learning moral lessons. Pippi Longstocking was pure chaos—a child living without adult supervision, lying when it suited her, defying teachers, physically throwing policemen out of windows, refusing to go to school or follow any rules.
Critics would later call the book dangerous, warning it would teach children to misbehave.
But in 1945, one publisher—Rabén & Sjögren—took a chance. They published Pippi Longstocking.
Children went absolutely wild for it.
Finally, here was a character who represented everything they weren’t allowed to be. Loud. Messy. Free. Independent. Pippi had adventures on her own terms, made her own decisions, and treated adults as equals rather than authorities to be feared.
Some adults were horrified. But other adults—and millions of children—saw something revolutionary: a story that treated children as intelligent, capable people deserving of respect and autonomy.
Astrid kept writing. She created Karlsson-on-the-Roof, Emil of Lönneberga, Ronia the Robber’s Daughter. All of her characters questioned authority, trusted their own judgment, and had rich emotional lives. Astrid never wrote down to children. She didn’t simplify their feelings or pretend life was always happy. Her books dealt with loneliness, fear, injustice, even death—but always with respect for children’s ability to understand complex emotions.
Her books began reshaping how Swedish culture understood childhood itself.
By the 1970s, Astrid Lindgren wasn’t just Sweden’s most beloved children’s author—she was a cultural icon with real political power.
In 1976, she wrote a satirical fairy tale called “Pomperipossa in Monismania,” published in Sweden’s largest newspaper. It mocked the country’s absurd tax system, describing a children’s author being taxed at over 100% of her income.
The piece exploded into national conversation. It sparked fierce debate about tax policy. The Social Democratic government, which had ruled Sweden for over 40 years, lost the election shortly after—partly because of the tax debate Astrid’s satire had triggered.
She’d proven her voice could move mountains.
And she decided to use that power for something that mattered even more than taxes.
In the late 1970s, Astrid turned her full attention to a brutal reality that everyone in Sweden simply accepted as normal: hitting children was legal.
Parents spanked. Sweden’s schools had already banned the cane back in 1958, but in the home it was called “discipline,” not abuse. It was how things had always been done.
Astrid Lindgren believed it was violence against the most defenseless people in society. And she believed it had to stop.
She began speaking everywhere—newspapers, television, public speeches, interviews. She wrote articles. She appeared on national programs. She used every ounce of her fame to argue one simple point: hitting children teaches them that violence is acceptable. Physical punishment doesn’t create better behavior—it creates fear, shame, and the lesson that might makes right.
Sweden listened to her.
In 1979, Sweden became the first country in the entire world to legally ban corporal punishment of children.
Parents could no longer legally hit their children. The law didn’t criminalize parents, but it established an absolute principle: children have the right to protection from violence, even from their own parents.
It was revolutionary. No country had ever done this before.
And Astrid Lindgren’s advocacy was absolutely crucial to making it happen.
She didn’t stop there. She campaigned for animal rights, environmental protection, and humane treatment of farm animals. She used her platform to push Sweden toward becoming a more compassionate society—for children, for animals, for anyone vulnerable.
Astrid continued writing into her eighties. She published over 100 books translated into more than 100 languages. Pippi Longstocking became a global icon—a symbol of childhood independence and joy recognized on every continent.
When Astrid Lindgren died in 2002 at age 94, Sweden mourned her like a beloved national grandmother. The Swedish royal family attended her funeral. Thousands lined the streets. The ceremony was broadcast live across the nation.
But her real legacy was what she changed.
Sweden’s 1979 ban on corporal punishment influenced the entire world. Today, more than 60 countries have followed Sweden’s lead and outlawed hitting children. That number grows every year.
And countless millions of children grew up reading about Pippi, Emil, Ronia, and Karlsson—characters who showed them that being a child didn’t mean being powerless, voiceless, or less important than adults.
Think about what Astrid Lindgren actually accomplished.
She created Pippi Longstocking in 1941 to entertain her sick daughter. That girl with red pigtails and superhuman strength became one of the most recognized characters in children’s literature worldwide.
But Astrid’s real achievement was understanding that if you’re going to write stories where children have dignity, you have to fight to build a world where they actually do.
She wrote books that respected children. Then she helped create laws that protected them.
Sweden became the first country to write that respect into law.
Because one author believed children deserved better—and refused to stay quiet until the world agreed.
Astrid Lindgren proved that respecting children wasn’t just good storytelling. It was good policy. It was justice. It was necessary.
And it started with a feverish little girl asking her mother to tell her about a character with a funny name.
That’s how revolutions begin.

When the Vaccinated Body Becomes the Broadcast Tower: The Shedding Paradox

Covid Jab Created Harm Factory

This explains why the unjabbed also need to detox the spike protein.

Story at a Glance

The paradigm shift: What we call “contagion” may not require pathogens at all. Cells under stress naturally broadcast molecular signals via extracellular vesicles—biological packets that can transfer information between organisms and create the illusion of infectious transmission.

The mRNA revolution: COVID-19 vaccines have transformed human cells into producers of spike-bearing exosomes that circulate for months, appear in all body fluids, and carry pharmacologically induced signals throughout the population. This is biological broadcasting at an unprecedented scale.

The amplification crisis: Self-amplifying RNA vaccines now multiply this process exponentially, creating replicating genetic instructions that generate vast quantities of synthetic biological signals—potentially turning each injection into a self-perpetuating broadcast system.

The regulatory void: No authority has investigated whether these vesicles influence unvaccinated individuals, despite widespread reports of symptoms following intimate exposure. We have deployed a global biotechnology without understanding its most basic consequence: whether it alters biological communication between humans.

The central revelation: Billions of people may now be involuntary broadcasters of pharmaceutical signals, fundamentally changing the biological information environment of our species.

Finish reading: https://sayerji.substack.com/p/when-the-vaccinated-body-becomes

Here are my two offerings to potentially put a body on the road to recovery:
https://www.healthelicious.com.au/NutriBlast-Anti-Spike.html

https://www.healthelicious.com.au/NutriBlast_DNA_Heart_Mitochondria.html

Neil Diamond

He walked away from a pre-med degree for a $50-a-week songwriting job to chase an impossible dream—and wrote the song that would make stadiums sing for 60 years.

Brooklyn, 1960.

Neil Diamond sat in his NYU dorm room, supposedly studying for his pre-med finals. His parents—children of humble Jewish immigrants who’d sacrificed everything—were counting on him to become a doctor. Security. Stability. The American Dream.

But Neil couldn’t focus on anatomy textbooks. His mind kept drifting to the melody he’d been humming all week. His fingers kept reaching for his guitar instead of his stethoscope.

That night, he made a choice that terrified him.

He dropped out of NYU. Walked away from the scholarship. Left behind his parents’ dreams and his own guaranteed future.

For what? A job writing songs at Sunbeam Music Publishing for $50 a week.

His parents were devastated. His friends thought he was crazy. He had no backup plan, no connections, no certainty that he’d ever make it.

For six years, he lived on hope and stubbornness. Writing songs nobody wanted. Playing gigs nobody attended. Wondering if he’d made the biggest mistake of his life.

Then 1966 happened.

A song he’d written—”I’m a Believer”—became one of the biggest hits of the decade. Not for him, but for The Monkees. Suddenly, the kid from Brooklyn who’d gambled everything was being played on every radio in America.

But Neil wasn’t done.

He wanted people to hear HIS voice telling HIS stories. So he kept writing. “Solitary Man.” “Cherry, Cherry.” “Cracklin’ Rosie.”

And then, in 1969, he wrote eight simple words that would become bigger than he ever imagined:

“Sweet Caroline… good times never seemed so good.”

Nobody knows for certain who Caroline really was. Some say Caroline Kennedy. Others say it was about his wife. Neil himself has changed the story over the years, almost like he knew the song needed to belong to everyone, not just to him.

Because that’s exactly what happened.

“Sweet Caroline” became the song couples slow-danced to at weddings. The song crowds screamed at baseball games. The song that brought together complete strangers in bars, concert halls, and living rooms across the world.

For over five decades, Neil Diamond gave us the soundtrack to our lives. More than 130 million records sold. A legacy that touched four generations.

In 2018, Parkinson’s disease forced him off the touring stage—the place where he’d felt most alive for 50 years.

He could have disappeared quietly. Retired in peace.

Instead, he keeps writing. Keeps creating. Keeps proving that the fire that made a 20-year-old drop out of medical school never really goes out.

The kid who risked everything on a dream didn’t just make it.

He made us all believe that impossible dreams are worth chasing.

Because sometimes, the biggest risk isn’t following your heart.

It’s spending your whole life wondering what would’ve happened if you had.

Brigadier General Theodore Roosevelt Jr.

June 6, 1944.

As the landing craft approached Utah Beach, Brigadier General Theodore Roosevelt Jr. gripped his cane and checked his pistol.

He was fifty-six years old. His heart was failing. Arthritis had crippled his joints from old World War I wounds. Every step hurt.

He wasn’t supposed to be there.

But he had insisted—three times—on going ashore with the first wave of troops. His commanding officer, Major General Raymond “Tubby” Barton, had rejected the request twice. Too dangerous. Too risky. No place for a general.

Roosevelt wrote a letter. Seven bullet points. The last one: “I personally know both officers and men of these advance units and believe that it will steady them to know that I am with them.”

Barton relented.

And so Theodore Roosevelt Jr.—eldest son of President Theodore Roosevelt, veteran of World War I, twice wounded, gassed nearly to blindness—became the only general officer to storm the beaches of Normandy in the first wave.

This wasn’t ancient history. This was June 6, 1944.

The ramp dropped. German guns opened fire. Bullets slapped the water. Artillery shells screamed overhead. Men scrambled onto the sand, some falling before they took three steps.

Roosevelt stepped off the boat, leaning on his cane, carrying only a .45 caliber pistol.

One of his men later recalled: “General Theodore Roosevelt was standing there waving his cane and giving out instructions as only he could do. If we were afraid of the enemy, we were more afraid of him and could not have stopped on the beach had we wanted to.”

Within minutes, Roosevelt realized something was wrong.

The strong tidal currents had pushed the landing craft off course. They’d landed nearly a mile south of their target. The wrong beach. The wrong exits. The whole invasion plan suddenly useless.

Men looked around in confusion. Officers checked maps. The Germans kept firing.

This was the moment that could turn the invasion into a massacre.

Roosevelt calmly surveyed the shoreline. Studied the terrain. Made a decision.

Then he gave one of the most famous orders in D-Day history:

“We’ll start the war from right here!”

For the next four hours, Theodore Roosevelt Jr. stood on that beach under relentless enemy fire, reorganizing units as they came ashore, directing tanks, pointing regiments toward their new objectives. His cane tapping in the sand. His voice steady. His presence unshakable.

A mortar shell landed near him. He looked annoyed. Brushed the sand off his uniform. Kept moving.

Another soldier described seeing him “with a cane in one hand, a map in the other, walking around as if he was looking over some real estate.”

He limped back and forth to the landing craft—back and forth, back and forth—personally greeting each arriving unit, making sure the men kept moving off the beach and inland. The Germans couldn’t figure out what this limping officer with the cane was doing. Neither could they hit him.

By nightfall, Utah Beach was secure. Of the five D-Day landing beaches, Utah had the fewest casualties—fewer than 200, compared with more than 2,000 at Omaha Beach just miles away.

Commanders credited Roosevelt’s leadership under fire for the success.

Theodore Roosevelt Jr. had been preparing for that day his entire life.

Born September 13, 1887, at the family estate in Oyster Bay, New York, he was the eldest son of Theodore Roosevelt—the larger-than-life president, war hero, and force of nature. Growing up in that shadow was impossible. Meeting that standard seemed even harder.

But Ted tried.

In World War I, he’d been among the first American soldiers to reach France. He fought at the Battle of Cantigny. Got gassed. Got shot. Led his men with such dedication that he bought every soldier in his battalion new combat boots with his own money. He was promoted to lieutenant colonel and awarded the Distinguished Service Cross.

Then, in July 1918, his youngest brother Quentin—a pilot—was shot down and killed over France.

Ted never fully recovered from that loss.

When World War II began, Theodore Roosevelt Jr. was in his fifties. Broken down. Worn out. He could have stayed home. Taken a desk job. No one would have blamed him.

Instead, he fought his way back into combat command. He led troops in North Africa. Sicily. Italy. Multiple amphibious assaults before Normandy.

And on D-Day, when commanders tried to keep him off that beach, he refused.

“The first men to hit the beach should see the general right there with them.”

After Utah Beach, General Omar Bradley—who commanded all American ground forces in Normandy—called Roosevelt’s actions “the bravest thing I ever saw.”

General George Patton agreed. Days later, Patton wrote to his wife: “He was one of the bravest men I ever knew.”

On July 11, 1944—thirty-five days after D-Day—General Eisenhower approved Roosevelt’s promotion to major general and gave him command of the 90th Infantry Division.

Roosevelt never got the news.

That same day, he spent hours talking with his son, Captain Quentin Roosevelt II, who had also landed at Normandy on D-Day—the only father-son pair to come ashore together on June 6, 1944.

Around 10:00 p.m., Roosevelt was stricken with chest pains.

Medical help arrived. But his heart had taken all it could take.

At midnight on July 12, 1944—five weeks after leading men onto Utah Beach—Theodore Roosevelt Jr. died in his sleep.

He was fifty-six years old.

Generals Bradley, Patton, and Barton served as honorary pallbearers. Roosevelt was initially buried at Sainte-Mère-Église.

In September 1944, he was posthumously awarded the Medal of Honor. When President Roosevelt handed the medal to Ted’s widow, Eleanor, he said, “His father would have been proudest.”

After the war, Roosevelt’s body was moved to the Normandy American Cemetery at Colleville-sur-Mer—the rows of white crosses overlooking Omaha Beach.

And there’s where the story takes its final, heartbreaking turn.

In 1955, the family made a request: Could Quentin Roosevelt—Ted’s younger brother, killed in World War I, buried in France since 1918—be moved to rest beside his brother?

Permission was granted.

Quentin’s remains were exhumed from Chamery, where he’d been buried near the spot his plane crashed thirty-seven years earlier, and reinterred beside Ted.

Two sons of a president. Two brothers. Two wars. Reunited in foreign soil.

Quentin remains the only World War I soldier buried in that World War II cemetery.

Today, at the Normandy American Cemetery, among the 9,388 white marble crosses and Stars of David, two headstones stand side by side:

THEODORE ROOSEVELT JR.
BRIGADIER GENERAL
MEDAL OF HONOR

QUENTIN ROOSEVELT
SECOND LIEUTENANT
WORLD WAR I

The tide still rolls over Utah Beach. The sand looks the same. Tourists walk where soldiers died.

And somewhere in that vast field of white crosses, two brothers rest together—sons of a president who believed in duty, service, and leading from the front.

Some men lead by orders.

Some lead by rank.

Theodore Roosevelt Jr. led by example—cane in hand, heart failing, utterly unflinching.

He didn’t have to be there.

But he refused to lead from anywhere else.

Understanding Butyrate — The Key to Optimal Health and Well-Being

  • Butyrate is a short-chain fatty acid produced by gut bacteria when they ferment fiber, serving as the primary energy source for colon cells and maintaining gut barrier strength
  • Healthy butyrate levels support weight management, blood sugar control, and brain health, with studies linking butyrate-producing bacteria to reduced Alzheimer’s risk and lower cancer risk
  • A diverse diet rich in various fiber sources, including fruits, vegetables, and whole grains, promotes butyrate production, but increases should be gradual if your gut health is compromised
  • The gut barrier weakens with insufficient butyrate, allowing undigested food, bacteria, and toxins to enter your bloodstream, triggering systemic inflammation and widespread health problems
  • Gradually increasing fiber intake and reducing exposure to mitochondrial toxins supports cellular energy, gut health, and beneficial gut microbes, enhancing butyrate production and overall health

    https://articles.mercola.com/sites/articles/archive/2025/12/08/understanding-butyrate.aspx

Dinner for one: The unexpected health risk no one’s talking about

What if there was a simple daily habit that could dramatically impact your health as you age—and it has nothing to do with exercise, supplements, or expensive treatments? New research reveals this hidden factor: eating alone.

A massive study tracking 80,000 older adults across 12 countries uncovered a startling pattern. Those who frequently dined solo faced significantly higher risks of poor nutrition, dangerous weight loss, and physical frailty compared to their socially dining counterparts.

The lonely eaters consumed fewer fruits, vegetables, and protein-rich foods essential for maintaining muscle health and physical function. But here’s the fascinating twist: it wasn’t just about the food itself—the social environment fundamentally changed how and what people ate.

The solution might be beautifully simple. Community meal programs, regular family dinners, or even striking up conversations at local cafés could transform health outcomes. Sometimes the most powerful medicine doesn’t come from a pharmacy—it comes from sharing a plate with another human being.

From a newsletter by Sarah Otto at goodnesslover.com.

(Tom: Two correlated findings I have seen that are probably related effects are that married people live longer than single people, and that those with more social interactions live longer than those with fewer.

Personally, I have found it more difficult to shop and cook for two than for five, and even more so to shop and cook for one than for two.

While you have to buy more and spend more time prepping when catering for more people, there is far more reason to do so, and you can be far less concerned about buying too much of a perishable in case it goes off before it gets consumed.)