Brigadier General Theodore Roosevelt Jr.

June 6, 1944.

As the landing craft approached Utah Beach, Brigadier General Theodore Roosevelt Jr. gripped his cane and checked his pistol.

He was fifty-six years old. His heart was failing. Arthritis had crippled his joints from old World War I wounds. Every step hurt.

He wasn’t supposed to be there.

But he had insisted—three times—on going ashore with the first wave of troops. His commanding officer, Major General Raymond “Tubby” Barton, had rejected the request twice. Too dangerous. Too risky. No place for a general.

Roosevelt wrote a letter. Seven bullet points. The last one: “I personally know both officers and men of these advance units and believe that it will steady them to know that I am with them.”

Barton relented.

And so Theodore Roosevelt Jr.—eldest son of President Theodore Roosevelt, veteran of World War I, twice wounded, gassed nearly to blindness—became the only general officer to storm the beaches of Normandy in the first wave.
This wasn’t ancient history. This was June 6, 1944.

The ramp dropped. German guns opened fire. Bullets slapped the water. Artillery shells screamed overhead. Men scrambled onto the sand, some falling before they took three steps.

Roosevelt stepped off the boat, leaning on his cane, carrying only a .45 caliber pistol.

One of his men later recalled: “General Theodore Roosevelt was standing there waving his cane and giving out instructions as only he could do. If we were afraid of the enemy, we were more afraid of him and could not have stopped on the beach had we wanted to.”

Within minutes, Roosevelt realized something was wrong.
The strong tidal currents had pushed the landing craft off course. They’d landed nearly a mile south of their target. The wrong beach. The wrong exits. The whole invasion plan suddenly useless.

Men looked around in confusion. Officers checked maps. The Germans kept firing.

This was the moment that could turn the invasion into a massacre.

Roosevelt calmly surveyed the shoreline. Studied the terrain. Made a decision.

Then he gave one of the most famous orders in D-Day history:

“We’ll start the war from right here!”

For the next four hours, Theodore Roosevelt Jr. stood on that beach under relentless enemy fire, reorganizing units as they came ashore, directing tanks, pointing regiments toward their new objectives. His cane tapping in the sand. His voice steady. His presence unshakable.

A mortar shell landed near him. He looked annoyed. Brushed the sand off his uniform. Kept moving.

Another soldier described seeing him “with a cane in one hand, a map in the other, walking around as if he was looking over some real estate.”

He limped back and forth to the landing craft—back and forth, back and forth—personally greeting each arriving unit, making sure the men kept moving off the beach and inland. The Germans couldn’t figure out what this limping officer with the cane was doing. Neither could they hit him.

By nightfall, Utah Beach was secure. Of the five D-Day landing beaches, Utah had the fewest casualties—fewer than 200 dead compared to over 2,000 at Omaha Beach just miles away.

Commanders credited Roosevelt’s leadership under fire for the success.

Theodore Roosevelt Jr. had been preparing for that day his entire life.

Born September 13, 1887, at the family estate in Oyster Bay, New York, he was the eldest son of Theodore Roosevelt—the larger-than-life president, war hero, and force of nature. Growing up in that shadow was impossible. Meeting that standard seemed even harder.

But Ted tried.

In World War I, he’d been among the first American soldiers to reach France. He fought at the Battle of Cantigny. Got gassed. Got shot. Led his men with such dedication that he bought every soldier in his battalion new combat boots with his own money. He was promoted to lieutenant colonel and awarded the Distinguished Service Cross.

Then, in July 1918, his youngest brother Quentin—a pilot—was shot down and killed over France.

Ted never fully recovered from that loss.

When World War II began, Theodore Roosevelt Jr. was in his fifties. Broken down. Worn out. He could have stayed home. Taken a desk job. No one would have blamed him.

Instead, he fought his way back into combat command. He led troops in North Africa. Sicily. Italy. Four amphibious assaults before Normandy.

And on D-Day, when commanders tried to keep him off that beach, he refused.

“The first men to hit the beach should see the general right there with them.”

After Utah Beach, General Omar Bradley—who commanded all American ground forces in Normandy—called Roosevelt’s actions “the bravest thing I ever saw.”

General George Patton agreed. Days later, Patton wrote to his wife: “He was one of the bravest men I ever knew.”

On July 11, 1944—thirty-six days after D-Day—General Eisenhower approved Roosevelt’s promotion to major general and gave him command of the 90th Infantry Division.

Roosevelt never got the news.

That same day, he spent hours talking with his son, Captain Quentin Roosevelt II, who had also landed at Normandy on D-Day—the only father-son pair to come ashore together on June 6, 1944.

Around 10:00 p.m., Roosevelt was stricken with chest pains.
Medical help arrived. But his heart had taken all it could take.

At midnight on July 12, 1944—five weeks after leading men onto Utah Beach—Theodore Roosevelt Jr. died in his sleep.
He was fifty-six years old.

Generals Bradley, Patton, and Barton served as honorary pallbearers. Roosevelt was initially buried at Sainte-Mère-Église.

In September 1944, he was posthumously awarded the Medal of Honor. When President Roosevelt handed the medal to Ted’s widow, Eleanor, he said, “His father would have been proudest.”

After the war, Roosevelt’s body was moved to the Normandy American Cemetery at Colleville-sur-Mer—the rows of white crosses overlooking Omaha Beach.

And there’s where the story takes its final, heartbreaking turn.

In 1955, the family made a request: Could Quentin Roosevelt—Ted’s younger brother, killed in World War I, buried in France since 1918—be moved to rest beside his brother?

Permission was granted.

Quentin’s remains were exhumed from Chamery, where he’d been buried near the spot his plane crashed thirty-seven years earlier, and reinterred beside Ted.

Two sons of a president. Two brothers. Two wars. Reunited in foreign soil.

Quentin remains the only World War I soldier buried in that World War II cemetery.

Today, at the Normandy American Cemetery, among the 9,388 white marble crosses and Stars of David, two headstones stand side by side:

THEODORE ROOSEVELT JR.
BRIGADIER GENERAL
MEDAL OF HONOR

QUENTIN ROOSEVELT
SECOND LIEUTENANT
WORLD WAR I

The tide still rolls over Utah Beach. The sand looks the same. Tourists walk where soldiers died.

And somewhere in that vast field of white crosses, two brothers rest together—sons of a president who believed in duty, service, and leading from the front.

Some men lead by orders.

Some lead by rank.

Theodore Roosevelt Jr. led by example—cane in hand, heart failing, utterly unflinching.

He didn’t have to be there.

But he refused to lead from anywhere else.

George Lucas

Hollywood executives laughed when he asked for the toy rights. Then he became richer than all of them combined.
A near-fatal car crash set him on the path to changing cinema forever.
George Lucas was 18 years old.
Three days before high school graduation, his Fiat got crushed by a Chevy Impala at an intersection.
The impact threw him from the car. His seatbelt snapped. That malfunction saved his life.
He should have died. Doctors didn’t know if he’d make it.
He spent weeks in the hospital. Had to watch graduation from a bed.
Everything changed after that.
Lucas stopped racing cars. Started thinking about what he actually wanted to do with the second chance he’d been given.
He decided to make films.
Everyone said it was a waste.
“You barely graduated high school.”
“You’re not connected to Hollywood.”
“Film school is for dreamers.”
He didn’t listen.
Went to USC film school. Made student films that caught attention. Got a scholarship from Warner Bros.
His first feature film, THX 1138, flopped. Studio hated it. Cut it against his wishes. Lost money.
Then American Graffiti became a hit, returning more than a hundred times its tiny budget.
But Lucas had a bigger idea. A space opera. Something nobody had ever seen before.
He shopped Star Wars to every major studio.
Universal passed.
United Artists passed.
Disney passed.
Everybody passed.
They said it was too weird. Too expensive. Too risky.
“Space movies don’t sell.”
“The script is confusing.”
“Nobody wants to see robots and aliens.”
Finally, 20th Century Fox took a chance. But they didn’t believe in it either.
Here’s where Lucas did something nobody understood at the time.
Instead of negotiating for a bigger directing fee, he asked for something else.
The merchandising rights. And the rights to any sequels.
The studio laughed. Merchandising? From a weird space movie? Sure, take it.
They thought they were getting a deal. Paying Lucas less upfront for rights they considered worthless.
That decision made George Lucas a billionaire.
Star Wars opened in 1977. Fewer than forty theaters.
Lines wrapped around blocks. People saw it ten, twenty, fifty times.
It became the highest-grossing film in history at that point.
The toys alone generated billions. Action figures, lunchboxes, video games, books, theme parks.
All because Lucas believed in something nobody else could see.
But he wasn’t done.
He built Industrial Light and Magic because no special effects company could do what he needed. It has since created effects for many of the biggest blockbusters in history.
He built Skywalker Sound. Changed how movies sound.
He built Lucasfilm into an empire.
In 2012, he sold it to Disney for over $4 billion.
Then gave most of it away to education.
Today, the Star Wars franchise has generated tens of billions of dollars across films, merchandise, streaming, and theme parks.
All because a kid who almost died in a car crash decided to chase an idea everyone said was impossible.
What dream are you abandoning because the first few studios said no?
What rights are you giving away because you don’t see their future value?
Lucas nearly died at 18. His first film flopped. Every major studio rejected his biggest idea.
He took less money upfront because he believed in what he was building.
He created technology that didn’t exist because he needed it for his vision.
He proved that the people who reject you don’t get to define you.
Your near-miss might be your wake-up call.
Your rejection letters might be proof you’re onto something.
Your “worthless” idea might be worth billions.
Stop letting studios, investors, and doubters write your story.
Start thinking like George Lucas.
Take the rights everyone else thinks are worthless.
Build what doesn’t exist yet.
And never let “no” be the end of the conversation.
Sometimes the biggest wins come from the deals nobody else wanted.
Because when everyone underestimates you, you get to keep everything.
Think Big.

Question Everything

Fake News

by Jeff Thomas

The average person in the First World receives more information than he would if he lived in a Second or Third World country. In many countries of the world, the very idea of twenty-four-hour television news coverage would be unthinkable, yet many Westerners feel that, without this constant input, they would be woefully uninformed.

Not surprising, then, that the average First Worlder feels that he understands current events better than those elsewhere in the world. But, as in other things, quality and quantity are not the same.

The average news programme features a commentator who provides “the news,” or at least that portion of events that the network deems worthy to be presented. In addition, it is presented from the political slant of the controllers of the network. But we are reassured that the reporting is “balanced,” in a portion of the programme that features a panel of “experts.”

Customarily, the panel consists of the moderator plus two pundits who share his political slant and a pundit who has an opposing slant. All are paid by the network for their contributions. The moderator will ask a question on a current issue, and an argument will ensue for a few minutes. Generally, no real conclusion is reached—neither side accedes to the other. The moderator then moves on to another question.

So, the network has aired the issues of the day, and we have received a balanced view that may inform our own opinions.

Or have we?

Shortcomings

In actual fact, there are significant shortcomings in this type of presentation:

The scope of coverage is extremely narrow. Only select facets of each issue are discussed.

Generally, the discussion reveals precious little actual insight and, in fact, only the standard opposing liberal and conservative positions are discussed, implying that the viewer must choose one or the other to adopt as his own opinion.

On a programme that is liberally oriented, the one conservative pundit on the panel is made to look foolish by the three liberal pundits, ensuring that the liberal viewer’s beliefs are reaffirmed. (The reverse is true on a conservative news programme.)

Each issue facet that is addressed is repeated many times in the course of the day, then extended for as many days, weeks, or months as the issue remains current. The “message,” therefore, is repeated virtually as often as an advert for a brand of laundry powder.

So, what is the net effect of such news reportage? Has the viewer become well-informed?

In actual fact, not at all. What he has become is well-indoctrinated.

A liberal will be inclined to regularly watch a liberal news channel, which will result in the continual reaffirmation of his liberal views. A conservative will, in turn, regularly watch a conservative news channel, which will result in the continual reaffirmation of his conservative views.

Many viewers will agree that this is so, yet not recognise that, essentially, they are being programmed to simply absorb information. Along the way, their inclination to actually question and think for themselves is being eroded.

Alternate Possibilities

The proof of this is that those who have been programmed tend to react with anger when they encounter a Nigel Farage or a Ron Paul, who might well challenge them to consider a third option—an interpretation beyond the narrow conservative and liberal views of events. In truth, on any issue, there exists a wide field of alternate possibilities.

By contrast, it is not uncommon for people outside the First World to have better instincts when encountering a news item. If they do not receive the BBC, Fox News, or CNN, they are likely, when learning of a political event, to think through, on their own, what the event means to them.

As they are not pre-programmed to follow one narrow line of reasoning or another, they are open to a broad range of possibilities. Each individual, based upon his personal experience, is likely to draw a different conclusion and, through discourse with others, is likely to continue to update his opinion each time he receives a new viewpoint.

As a result, it is not uncommon for those who are not “plugged-in” to be not only more open-minded, but more imaginative in their considerations, even when they are less educated and less “informed” than those in the First World.

Whilst those who do not receive the regular barrage that is the norm in the First World are no more intelligent than their European or American counterparts, their views are more often the result of personal objective reasoning and common sense and are often more insightful.

Those in First World countries often point with pride at the advanced technology that allows them a greater volume of news than the rest of the world customarily receives.

Further, they are likely to take pride in their belief that the two opposing views that are presented indicate that they live in a “free” country, where dissent is encouraged.

Unfortunately, what is encouraged is one of two views—either the liberal view or the conservative view. Other views are discouraged.

The liberal view espouses that a powerful liberal government is necessary to control the greed of capitalists, taxing and regulating them as much as possible to limit their ability to victimise the poorer classes.

The conservative view espouses that a powerful conservative government is needed to control the liberals, who threaten to create chaos and moral collapse through such efforts as gay rights, legalised abortion, etc.

What these two dogmatic concepts have in common is that a powerful government is needed.

Each group, therefore, seeks the increase in the power of its group of legislators to overpower the opposing group. This ensures that, regardless of whether the present government is dominated by liberals or conservatives, the one certainty will be that the government will be powerful.

When seen in this light, if the television viewer were to click the remote back and forth regularly from the liberal channel to the conservative channel, he would begin to see a strong similarity between the two.

It’s easy for any viewer to question the opposition group, to consider them disingenuous—the bearers of false information. It is far more difficult to question the pundits who are on our own “team,” to ask ourselves if they, also, are disingenuous.

This is especially difficult when it’s three to one—when three commentators share our political view and all say the same thing to the odd-man-out on the panel. In such a situation, the hardest task is to question our own team, who are clearly succeeding at beating down the odd-man-out.
Evolution of Indoctrination

In bygone eras, the kings of old would tell their minions what to believe and the minions would then either accept or reject the information received. They would rely on their own experience and reasoning powers to inform them.

Later, a better method evolved: the use of media to indoctrinate the populace with government-generated propaganda (think: Josef Goebbels or Uncle Joe Stalin).

Today, a far more effective method exists—one that retains the repetition of the latter method but helps to eliminate the open-ended field of alternate points of view. It does so by providing a choice between “View A” and “View B.”

In a democracy, there is always an “A” and a “B.” This illusion of choice is infinitely more effective in helping the populace to believe that they have been able to choose their leaders and their points of view.

In the modern method, when voting, regardless of what choice the individual makes, he is voting for an all-powerful government. (Whether it calls itself a conservative one or a liberal one is incidental.)

Likewise, through the modern media, when the viewer absorbs what is presented as discourse, regardless of whether he chooses View A or View B, he is endorsing an all-powerful government.

Two Solutions

One solution to avoid being brainwashed by the dogmatic messaging of the media is to simply avoid watching the news. But this is difficult to do, as our associates and neighbours are watching it every day and will want to discuss with us what they have been taught.

The other choice is to question everything.

To consider not only that the event being discussed may be falsely reported, but that the message being provided by the pundits may be consciously planned for our consumption.

This is difficult to do at first but can eventually become habit. If so, the likelihood of being led down the garden path by the powers-that-be may be greatly diminished.

Developing your own view may, in the coming years, be vital to your well-being.

Source: https://internationalman.com/articles/question-everything/

Snowman

In February 1956, Harry deLeyer arrived late to a horse auction in Pennsylvania.

The auction was over. The valuable horses were gone. The only animals left were the ones nobody wanted—skinny, used-up horses being loaded onto a truck bound for the slaughterhouse in Northport.

Harry was a 28-year-old Dutch immigrant who taught riding at a private school on Long Island. He needed quiet horses for his beginner students. Nothing fancy. Just something safe.

Then he saw him.

A gray gelding, eight years old, filthy and covered in scars from years pulling an Amish plow. The owner warned Harry against buying him. “He’s not sound. He has a hole in his shoulder from the plow harness.”

Harry looked at the horse anyway.

Wide body. Calm demeanor. Intelligent eyes. Good legs despite everything.

“How much?”

“Eighty dollars.”

Harry paid it. The horse stepped off the slaughter truck and into history.

His daughter named him Snowman.

For a few months, Snowman was exactly what Harry needed—a gentle lesson horse the children loved. So gentle, in fact, that Harry eventually sold him to a local doctor for double what he’d paid.

The doctor took Snowman home.

Snowman had other plans.

The next morning, Snowman was back in Harry’s barn.

The doctor took him home again. Built higher fences.
Snowman jumped them. Came back.

Five-foot fences. The horse who’d spent his life pulling a plow was clearing five-foot fences like they were nothing.
Harry stared at this $80 plow horse and saw something nobody else had seen.

Maybe this horse could jump.

In 1958—exactly two years after Harry pulled him off that slaughter truck—Snowman and Harry deLeyer walked into Madison Square Garden.

They were competing against America’s elite show jumpers. Horses with perfect bloodlines. Horses worth tens of thousands of dollars. Horses owned by millionaires who’d never looked at a plow, much less pulled one.

Snowman was still that wide, plain gray gelding.

Still had scars on his shoulder.

Still had the thick neck and powerful hindquarters of a working farm horse.

He won.

Not just won—dominated. The AHSA Horse of the Year. The Professional Horsemen’s Association championship. The National Horse Show championship. Show jumping’s triple crown.

The press went wild. LIFE Magazine called it “the greatest ‘nags-to-riches’ story since Black Beauty.”

They called Snowman “The Cinderella Horse.”

In 1959, they did it again.
Back to Madison Square Garden. Back against the blue-blood horses and their millionaire owners.

Snowman won again. Horse of the Year. Again.

The crowd couldn’t get enough of them. This immigrant riding instructor and his $80 rescue horse, beating horses that cost more than houses.

Snowman jumped obstacles up to seven feet, two inches high. He jumped over other horses. He jumped with a care and precision that made it look easy, even when it wasn’t.

And here’s the part that made people love him even more: the same horse who cleared seven-foot jumps on Saturday could lead a child around the ring on Sunday. Snowman could win an open jumper championship in the morning and a leadline class in the afternoon.

He was called “the people’s horse.”

Snowman and Harry traveled the world. They appeared on television shows. Johnny Carson. National broadcasts. Snowman became as famous as any human athlete.

Secretariat wouldn’t be born for another decade. But people compared Snowman to Seabiscuit—another long-shot champion who’d captured America’s heart in darker times.

The Cold War was raging. The country was anxious. And here was this story: an immigrant and a rescue horse, proving that being born into nothing didn’t mean you were worth nothing.

That the $80 horse could beat the $30,000 horses.

That where you came from mattered less than where you were willing to go.

Snowman competed until 1969.

His final performance was at Madison Square Garden, where it had all started. He was 21 years old—elderly for a show jumper. The crowd gave him a standing ovation. They sang “Auld Lang Syne.”

He retired to Harry’s farm in Virginia, where he lived peacefully for five more years.

Children still came to see him. To touch the horse who’d become a legend. To feed carrots to the champion who’d once been hours away from becoming dog food.

Snowman died on September 24, 1974.

He was 26 years old. Kidney failure.

Harry deLeyer kept teaching, kept training, kept competing. He never found another horse like Snowman. Nobody did.

In 1992—eighteen years after Snowman’s death—the horse was inducted into the Show Jumping Hall of Fame.

In 2011, author Elizabeth Letts wrote “The Eighty-Dollar Champion: Snowman, the Horse That Inspired a Nation.” It became a #1 New York Times bestseller.

In 2015, when Harry was 86 years old, a documentary premiered: “Harry & Snowman.” For the first time, Harry told the whole story himself. His childhood in Nazi-occupied Netherlands, where his family hid Jews in a secret cellar beneath the barn. His immigration to America with nothing. His late arrival at that Pennsylvania auction.

That gray horse on the slaughter truck.

Eighty dollars.

That’s what it cost to save Snowman’s life.

It’s also what it cost to prove that champions aren’t always born in fancy stables with perfect bloodlines.

Sometimes they’re born in Amish fields, pulling plows until their shoulders scar.

Sometimes they’re saved by immigrants who arrive late to auctions and see something nobody else saw.

Sometimes the longest long shot becomes the surest thing.

Harry deLeyer died on June 25, 2021, at age 93.

The obituaries called him many things: riding instructor, champion, immigrant, hero.

But the title he probably loved most was simple.
Snowman’s rider.

The man who paid $80 for a plow horse and got a friend, a champion, and a story that would last forever.

You Are Valuable, Unique and Important

“What each must seek in his life never was on land or sea. It is something out of his own unique potentiality for experience, something that never has been and never could have been experienced by anyone else.”
Joseph Campbell – Author (1904 – 1987)

I have collected some ideas that will help you work out your basic purpose in life based on your personality, what you like doing, and what you are good at. Check them out:
How To Work Out Your Basic Purpose
https://www.tomgrimshaw.com/tomsblog/?p=37862

Dr Brian May

In 1970, a 23-year-old physics student at Imperial College London was deep into his doctoral research on cosmic dust when he faced an impossible choice.
Brian May had spent three years studying the zodiacal dust cloud—the faint glow of sunlight reflecting off tiny particles scattered throughout the solar system. He’d built his own equipment, collected data, analyzed measurements, and was making genuine progress toward his PhD in astrophysics.
But he was also the guitarist for a rock band that was starting to gain serious attention.
The band was called Queen. They’d just signed a record deal. Tours were being planned. The opportunity was real, immediate, and unlikely to wait while May finished his academic work.
Standing at that crossroads, May made a decision that would leave a question unanswered for 36 years: he chose the guitar over the telescope.
Queen’s rise was meteoric. By the mid-1970s, they were one of the biggest bands in the world. “Bohemian Rhapsody” became one of rock’s most iconic songs. May’s guitar work—his distinctive tone created using a homemade guitar called the Red Special—became instantly recognizable. Albums sold millions. Stadiums filled with fans singing along to “We Will Rock You” and “We Are the Champions.”
May’s academic work sat unfinished, his thesis incomplete, his research abandoned but never quite forgotten.
For most people, that would have been the end of the story. A promising academic career sacrificed for rock stardom—a trade-off that millions would gladly make. The PhD simply wasn’t meant to be.
But Brian May wasn’t most people.
Even as Queen dominated the rock world throughout the 1970s and 80s, May maintained his interest in astronomy and astrophysics. He read scientific journals. He attended lectures when touring schedules allowed. He stayed connected to the academic world he’d left behind, following developments in his field, watching as technology advanced and understanding of the solar system deepened.
His thesis supervisor, Professor Michael Rowan-Robinson, had told him decades earlier: “You can always come back and finish.”
May had never forgotten those words.
In 2006, more than three decades after walking away from Imperial College to tour with Queen, Brian May decided it was time.
He contacted Professor Rowan-Robinson, who was still at Imperial College and still remembered his former student who’d left to become a rock star. They discussed whether it was feasible to complete the work May had started in 1970.
The challenge was significant. Astrophysics had advanced enormously in 36 years. The technology May had used for his original observations was obsolete. The data he’d collected was valuable but incomplete by modern standards. Simply picking up where he left off wouldn’t work—he’d need to update his research, incorporate decades of new discoveries, and meet current academic standards.
But the core of his original work remained valid. His observations of the zodiacal dust cloud were still relevant. His research questions were still meaningful. And Rowan-Robinson was willing to supervise him to completion.
May threw himself into the work with the same intensity he’d brought to Queen’s music.
While still maintaining his music career—performing with Queen + Paul Rodgers and working on various projects—May carved out time to update his thesis. He revisited his original data from the early 1970s. He studied the decades of subsequent research on zodiacal dust. He incorporated modern measurements and refined his analysis using contemporary techniques.
The thesis he ultimately submitted was titled “A Survey of Radial Velocities in the Zodiacal Dust Cloud.” It examined the motion of dust particles in the plane of the solar system, work that contributed to understanding how dust behaves in space—research relevant to everything from asteroid studies to the formation of planetary systems.
In August 2007, Imperial College London awarded Brian May a PhD in astrophysics.
Not an honorary degree—universities frequently give those to celebrities and donors without requiring actual academic work. This was a real PhD, earned through genuine research, peer review, and the same rigorous standards applied to any doctoral candidate.
The examination was conducted by experts in the field who evaluated his work on its scientific merits, not his fame as a guitarist. The thesis had to withstand the same scrutiny any astrophysics PhD would face. May had to defend his research, answer technical questions, and demonstrate mastery of his subject.
He passed.
At age 60, Brian May—rock legend, guitarist whose solos had been heard by hundreds of millions—became Dr. Brian May, astrophysicist.
The accomplishment made headlines around the world, but not because a celebrity had purchased a credential or received an honorary title. It made news because it was genuinely remarkable: a world-famous musician had returned to complete legitimate academic work abandoned 36 years earlier, proving that it’s never too late to finish what you started.
The story resonated because it defied easy categorization. We’re used to dividing people into categories: artists versus scientists, creative types versus analytical minds, rock stars versus academics. Brian May refused to fit into any single box.
He’d always been both.
As a child, May had been fascinated by the night sky. He built telescopes with his father. He studied physics and mathematics not because he had to, but because he loved understanding how the universe worked. When he got to Imperial College—one of the world’s top science universities—he excelled academically while also playing guitar in bands.
The guitar he played, the legendary Red Special, was itself a fusion of science and art. May and his father had built it by hand when Brian was a teenager, using materials including parts of an old fireplace mantel, motorcycle springs, and knitting needles. Every design choice was carefully calculated for acoustic properties and tonal qualities. The result was an instrument with a unique sound that would become part of rock history.
That blend of scientific thinking and artistic creativity defined everything May did. His guitar solos were technically complex but emotionally powerful. His approach to music was both intuitive and analytical. He didn’t see science and art as opposites—to him, they were different expressions of the same curiosity about the world.
Earning the PhD wasn’t about proving anything to critics or adding credentials to his resume. May didn’t need the degree for career advancement—he was already one of the most successful musicians in history. He pursued it because the unfinished work bothered him, because he’d always wondered what conclusions his research would reach, because he valued knowledge for its own sake.
After earning his PhD, May didn’t treat it as a culmination but as a beginning. He became increasingly active in science advocacy and public education about astronomy. He served as Chancellor of Liverpool John Moores University for several years. He co-founded Asteroid Day, an annual event raising awareness about asteroid impacts. He collaborated with NASA on various projects, including creating stereoscopic images from the New Horizons mission to Pluto.
He published books combining his interests, including academic books about stereoscopy and popular books about astronomy illustrated with historic 3D photographs. He gave lectures at universities worldwide, speaking about both his astrophysics research and the intersection of science and creativity.
And he continued making music, because he never had to choose between being a scientist and being an artist—he was always both.
The 36-year gap in his academic career became part of his story, not a failure but proof that paths don’t have to be linear. You can start something, set it aside for a valid reason, and come back to it decades later if it still matters to you.
That message resonated far beyond the worlds of rock music and astrophysics. Students who’d left school to work could see that returning was possible. People who’d abandoned dreams for practical reasons found encouragement. Anyone who’d ever felt they had to choose between two passions saw an example of someone who ultimately refused to choose.
When May received his doctorate, he joked in interviews that his thesis was “the world’s longest delayed homework assignment.” But beneath the humor was a serious point: intellectual curiosity doesn’t expire. Knowledge you once pursued remains valuable even if you step away from it. And completing something you started, even decades later, brings its own satisfaction independent of external recognition.
The story of Dr. Brian May, astrophysicist and rock legend, stands as a reminder that human beings are not meant to fit into single categories. We can contain multitudes. We can excel in completely different domains. We can be both the person shredding guitar solos in front of 80,000 fans and the person quietly analyzing data about cosmic dust.
In fact, the same qualities that made May an exceptional musician—attention to detail, pattern recognition, creative problem-solving, dedication to craft—translated directly to his scientific work. The disciplines weren’t as separate as they seemed.
Today, when astrophysicists discuss zodiacal dust or musicians analyze Brian May’s guitar technique, they’re talking about the same person—someone who proved that you don’t have to choose between passion and profession, between art and science, between finishing what you started and embracing new opportunities.
You can have both. It might just take 36 years.
But as Dr. Brian May demonstrated: some things are worth coming back to finish, no matter how long the journey takes.

Albert Battel


On that day, the rules of war were broken. For one shocking, unbelievable moment, the unthinkable happened: German soldiers aimed their rifles directly at the notorious SS. The Nazi regime was suddenly fighting itself.
In the middle of World War II, a strange and tense standoff took place on a bridge in Przemyśl, Poland.
At the center of this conflict was a 51-year-old lawyer turned army officer named Albert Battel.
He was wearing the wrong uniform for a hero. But on that day, he decided that saving lives was more important than following orders.
The Jewish quarter of Przemyśl had been closed off with barbed wire for a long time. The people inside were terrified.
Everyone knew that when the SS trucks arrived, it meant “resettlement”—a polite word the Nazis used for deportation to death camps.
In July 1942, the order came down. The SS was coming to empty the ghetto.
Albert Battel was a Wehrmacht (regular army) officer stationed in the town.
He wasn’t a young, hot-headed soldier. He was a middle-aged man who had lived a quiet life practicing law before the war.
But when he heard the SS was coming to take the Jewish workers and their families, something inside him refused to accept it.
As the SS convoy roared toward the bridge over the River San, which was the only entrance to the ghetto, they found the way blocked.
Battel had ordered his own soldiers to lower the barrier.
When the SS commander demanded to pass, Battel refused. He didn’t have permission from his superiors.
He didn’t have orders from Berlin. He simply stood his ground. The situation became incredibly dangerous. The SS threatened him, but Battel played his final card.
He ordered his machine-gunners to aim their weapons. He told the SS that if they tried to cross the bridge, his men would open fire.
It was a moment of total silence. German soldiers aiming at German police. The SS commander, realizing Battel was serious, backed down. The trucks turned around.
Blocking the bridge was only the first step. Battel knew the SS would come back eventually. He needed to act fast.
He took his own military trucks and drove straight into the Jewish ghetto. He wasn’t there to arrest people; he was there to save them.
He knocked on doors and told families to grab what they could.
Using a loophole in the rules, he claimed these people were “essential” to the war effort. He loaded up to 100 Jewish families, men, women, and children alike, into the army trucks.
He drove them out of the ghetto and into the safety of the local military barracks.
For that day, and the days that followed, those families were safe under the protection of the Wehrmacht.
News of what happened reached the highest levels of the Nazi government. Heinrich Himmler, the head of the SS, was furious. He ordered an investigation into Battel.
Himmler wrote a note in Battel’s file, promising to have him arrested and expelled from the Nazi party the moment the war was over.
Battel was eventually removed from his command and forced into early retirement, officially due to heart problems.
He lost his career and his reputation among his peers.
Albert Battel survived the war and died in 1952 in West Germany. At the time of his death, he was largely forgotten.
He never wrote a book about his actions or bragged about standing up to the SS.
However, the people he saved did not forget.
Years later, survivors began to tell the story of the officer who blocked the bridge.
In 1981, Yad Vashem (The World Holocaust Remembrance Center) recognized Albert Battel as Righteous Among the Nations.
Albert Battel’s story teaches us a powerful lesson about courage.
Courage is not about being fearless. Battel was likely terrified of being shot for treason.
He was operating inside a system built on total, terrifying obedience. In Nazi Germany, the principle was rigid: Befehl ist Befehl (an order is an order), and questioning authority meant execution. Yet, even within that suffocating system, Battel found the tiny, crucial space to rebel.
His action shatters every excuse used to justify inaction during the war. It proves that the final, most powerful authority belongs not to the general, the state, or the uniform, but to the individual conscience.
In a time of darkness, one man stopped a convoy of death simply by saying, “Not today.”
Even when the entire world is screaming at you to conform, the choice between simple obedience and fundamental decency remains entirely, beautifully, and terrifyingly yours.
Battel showed that even in the worst circumstances, we always have a choice between doing what we are told and doing what is right.
We Are Human Angels
We are the authors of ‘We Are Human Angels,’ the book that has spread a new vision of the human experience and has been spontaneously translated into 14 languages by readers.
We hope our writing sparks something in you!

Edwin Moses


Edwin Moses walked onto a dusty Atlanta track in 1976, set down his physics notebook beside the hurdles, and told his coach he was going to rewrite the rhythm of the race by taking thirteen steps between hurdles, a pattern every expert insisted was physically impossible, instead of the standard fourteen.
He was an engineering student, not a sports prodigy. He had no athletic scholarship. He had no elite training. He ran workouts alone because there was no hurdling coach for him. Moses studied film like a scientist. He measured stride angles. He calculated force and drag. He scribbled equations on scrap paper and taped them above his dorm room desk. He believed the event followed predictable laws of motion and he could break them if he learned the math.
The experiment worked on his very first try. Moses glided through ten hurdles with control that startled teammates. The longer stride meant fewer adjustments. Fewer adjustments meant no hesitation. By spring he won the Olympic Trials. By summer he was in Montreal wearing United States colors that had been a fantasy months earlier.
Then he ran the four hundred hurdles faster than any human in history.
He did not stop there. After Montreal he built a training schedule with the same logic he once used for lab work. He timed every run. He tracked heart rate, fatigue, and oxygen levels. He studied race film frame by frame to eliminate wasted motion. When he returned to competition, he began a streak the world still struggles to comprehend.
Nine years. One hundred twenty-two consecutive wins. No false starts. No collapses. No excuses.
Competitors tried to match his stride pattern. They failed. Coaches tried to decode his rhythm. They failed. Reporters waited for arrogance. Moses gave them none. He spoke about discipline, spacing, timing, and respect for the craft. He believed mastery came from patience and relentless analysis, not talent alone.
Even at the Los Angeles Olympics in 1984, with the weight of a nation on his shoulders, he lined up with the same quiet expression. He ran his race, held the thirteen step pattern, and crossed the finish with daylight between him and the field.
Edwin Moses did not dominate through force.
He used clarity, intelligence, and perfect rhythm to turn an event into a system only he understood.