In full disclosure, I am an investment advisor, and I practice what I preach. I have a method that I believe works, but as every compliance officer will remind you, past performance is no guarantee of future performance. At the end of this podcast, you will hear the full disclaimer, and I encourage you to listen carefully. But let me set the stage for where I am coming from. I favor large, well-capitalized companies over small speculative ones. I focus on megatrends, not on flipping stocks, day trading, or chasing the latest meme stock making headlines. I am old-fashioned, old-school, and unapologetically traditional in my approach.
My emphasis is on income over growth, because the people I work with are not 22-year-old TikTok traders. Nearly every one of my clients is over the age of fifty, many are fully retired, and the average client is probably somewhere between their mid-seventies and early eighties. That perspective matters. When you are living on the results of decades of hard work, your primary concern is cash flow, preservation, and stability—not wild speculation.
Because of that, I have no problem calling things the way I see them, even when it ruffles feathers. In this discussion, you are going to see that I am willing to take on the medical industry, the insurance industry, Big Pharma, and any other entrenched interest that profits at the expense of the public. If you are ready for a frank, fact-driven conversation about health, medicine, industry, and how these forces intersect with your financial life, buckle up—this is going to be a good one.
From a historical standpoint, there is no doubt that human longevity has been driven more by better data, environmental awareness, and fundamental public-health advances than by any single modern medical breakthrough. Disease, famine, unsanitary conditions, and the lack of emergency medical care once shaped the very structure of society. The most profound improvements in public health came not from miracle cures but from the basics: sanitation, clean running water, the recognition of basic nutritional needs, and the simple but powerful use of vitamins. Together, these radically altered survival rates and extended lifespans across populations.
Consider polio as an example. Statistically, between 95 and 99 percent of those infected were asymptomatic—meaning they carried the virus but showed no outward signs of illness. They did not develop paralysis, fever, or the visible markers often associated with polio, yet they were counted in the broad measures of infection. This reality created a significant divide between perception and actual risk. The majority experienced no symptoms, but the minority who did suffer visible and lasting effects shaped the public’s understanding and fueled fear. The same dynamic resurfaced with COVID-19. A large percentage of individuals who tested positive were asymptomatic, experiencing no meaningful illness or impairment. Yet fear—driven by daily case counts, political decisions, and media amplification—overrode the statistical reality. Public policy was shaped less by the weight of evidence and more by the amplification of fear, much as it had been during the polio era.
Another dimension of the polio story is less frequently discussed: the role of vaccination itself in producing harm. Many individuals suffered adverse effects from early polio vaccination campaigns. In certain cases, recipients of the vaccine not only developed illness but contracted polio as a direct result of the vaccination process. This was most notably exposed in the Cutter Incident of 1955, when batches of improperly inactivated vaccine produced by Cutter Laboratories led to thousands of cases of vaccine-induced polio. Some children were permanently paralyzed, and others died. When the numbers are compared, the sobering reality emerges: for a measurable period, the number of individuals harmed or infected by the vaccination process itself rivaled or even exceeded the number who would otherwise have been naturally infected and symptomatic. While the intention was prevention, the outcome was a tragic inversion—the cure, in some cases, became the cause.
The Ford Administration’s swine flu program in 1976 reinforces the pattern. After the death of a soldier at Fort Dix, New Jersey, health officials feared a repeat of the 1918 pandemic. President Gerald Ford authorized an ambitious plan to vaccinate every American. Within months, more than forty million people received the shot. But reports surfaced of Guillain–Barré Syndrome, a serious neurological disorder, developing in vaccine recipients. Although the numbers were small in relative terms, the pattern was undeniable, and the program was halted in December of that year. The anticipated pandemic never arrived, but the damage was done. Litigation followed, and Congress had already moved to indemnify vaccine manufacturers under the National Swine Flu Immunization Program Act. Liability shifted from industry to government, and taxpayers ultimately paid tens of millions in settlements. This episode set a lasting precedent: pharmaceutical companies would participate in rapid or large-scale vaccination campaigns only if shielded from responsibility. The corporate shield was reinforced, and public trust was shaken when harm came not from the disease itself but from the intervention meant to prevent it.
When we step back from these episodes, a broader pattern emerges. It is not simply about one disease or one administration’s failure. It is about the origins and trajectory of the very institutions that came to dominate American public health. From their inception, these organizations were not neutral arbiters of science. They were born out of crisis, funded through political channels, and quickly intertwined with corporate interests that shaped both their direction and their credibility.
The Centers for Disease Control and Prevention, for example, grew out of the wartime Malaria Control in War Areas program and was formally established in 1946 as the Communicable Disease Center, with the mission of stopping malaria from spreading through the American South and undermining productivity. Its first major initiative was not a medical breakthrough—it was a chemical campaign. In partnership with Monsanto, the CDC promoted and deployed DDT, spraying vast swaths of land and exposing generations to compounds later tied to cancer and other long-term health issues. Few people realize that the CDC’s Atlanta headquarters sits on land sold to it for a token ten dollars by Emory University, in a deal backed by Robert Woodruff, the longtime head of Coca-Cola. Fewer still are aware that a Freedom of Information Act request uncovered emails showing CDC officials coordinating with Coca-Cola executives to suppress damaging information about sugar.
The same story repeats itself with the Environmental Protection Agency. In documents unsealed during litigation over glyphosate, a senior EPA official was quoted telling a Monsanto executive that if he could kill another agency’s review of the chemical’s risks, he deserved a medal. This was not regulation—it was collusion. And the pattern is consistent across agencies. The FDA, CDC, EPA, NIH—all developed hand in hand with the industries they were supposed to oversee. Their budgets, their buildings, and often their intellectual frameworks were underwritten by the same corporations whose products they regulated. Oversight became a closed loop: industry produced, regulators approved, industry profited, and regulators enjoyed influence, funding, and revolving-door career opportunities.
Now let us go back and talk about sugar. For all the attention given to oil barons and railroads, one of the most powerful forces in shaping both American law and American health has been sugar. In the late nineteenth century, the Sugar Trust, later reorganized as the American Sugar Refining Company, controlled at its peak roughly 98 percent of the nation’s refined sugar. That dominance helped drive passage of the Sherman Antitrust Act of 1890, and the sugar trust was the target of the first Sherman Act case to reach the Supreme Court. People assume antitrust began with oil or railroads, but sugar was there at the start. The law was called “antitrust” because, in those days, the modern holding company did not yet exist. The era’s great business combinations were structured as trusts—...