. . . and it's not their genes either. – Dr. C

Archive for November, 2021

Zombie Theory – Part 1

Part 1 – A Balancing Act

 “The body of man has in itself blood, phlegm, yellow bile and black bile; these make up the nature of this body, and through these he feels pain or enjoys health.  Now he enjoys the most perfect health when these elements are duly proportioned to one another in respect of compounding, power and bulk, and when they are perfectly mingled.”  – Hippocrates, On the Nature of Man, circa 5th century BC


21st century schools of medicine teach their graduates that human behavior – what you and I do in our own bodies in the next moment of our lives – is understood by microscopically studying invented and invisible anatomical brain centers, each center equipped with miraculous behavioral functions.  Students of psychiatry are instructed that undesirable behavior (psychiatry’s symptoms) is caused by a chemical imbalance affecting the brain centers.  Treatment, then, is to prescribe laboratory chemicals to intermingle with natural body chemicals that are, somehow, out of balance.  Worth noting, the Psychiatric Medical Model (PMM: pronounced “pim”) offers no definition, or explanation, of balance.

Practitioners of PMM believe you and I are victims of our excessive or insufficient chemicals – usually serotonin or dopamine – as well as our faulty “neuro-connectors” that negatively impact the brain centers where all behavior is neatly stored.  In turn, psychiatric patients are taught by doctors that undesirable and life-altering behaviors are caused by a disease, or disorder, or deficiency, or delay, or disability, or derangement, or disturbance, or a dysfunction (the 8 D’s of PMM).  Patients, in this view, simply suffer the consequences of these brain flaws.  “It’s not you,” the doctor confidently instructs the patient, “it’s your disorder that’s causing you to experience the undesirable behavior.”  Hence the title:  The Zombie Theory.

Humankind has witnessed miraculous progress in medical science during the past 100 years.  Thanks to new discoveries, new medicines and new procedures, we have benefitted greatly when it comes to our physical well-being.  Especially in modern countries, we live longer, our quality of life is better, and our physical maladies are better treated.  Our flesh, blood and bones have never been in better hands.  However, when it comes to human conduct, medicine has failed miserably throughout human history, up to and including today, without exception.

Medicine as Art [i]
Twenty-five hundred years ago, the Hippocratic School of Medicine began to understand medicine as an art form, and healers as practicing artists. Disease was no longer divine wrath, nor healing a gift from God.  Over time, heavenly punishments and gifts were replaced with the idea of cause and effect, providing the underpinnings for this burgeoning artform to become a rational science in the physical world.  Along the way, medical ethics and standards of care were crafted, both keystones of the ancient Greeks that continue to guide modern day medicine.  Most important, three primary conditions were identified and interconnected:  the disease, the patient and the healer. 

Healers were trained in the Seven Natural Factors in this holistic system of medicine: The Four Elements, The Four Humors, The Four Temperaments, The Four Faculties, The Vital Principles, The Organs and Parts and The Forces.  The underlying theory was simple.  When there is balance and harmony within the Seven Natural Factors, there is health.  When they are not in balance, there is dysfunction and disease.  When any one of the Seven Natural Factors ceases to function, there is death.  Balance then, as it is now, was the pursuit.

For the first time detailed experiments were conceived and conducted, data was collected, results were assessed and treatments were formalized.  Six healing pathways emerged, from mild to severely invasive.  The first treatment of choice?  Diet, considered the gentlest and safest path to restore balance.  The second path in this formal progression focused on altering the patient’s lifestyle and hygiene habits that were causing the imbalance of body humors (fluids).  Only when the first two treatments were found to be ineffective did the healer select the third treatment path:  medicine.  Chemical concoctions with professed healing powers were dispensed in the form of supplements, potions, tonics and herbs.  These first three paths were self-administered, under the guidance of a healer.

Paths four, five and six were administered by the healer.  The first of these, physiotherapy, included heat treatments to induce sweating, massages with medicated oils, and a variety of muscle, bone and body manipulations designed to release trapped toxins.  Physiotherapy was often a preparative stage for detoxification, the fifth path of treatment.  Common purification methods included emetics (to induce vomiting), enemas, diuretics and, frequently, bloodletting.  Only when the first five paths failed did the healer turn to surgery, the last treatment path.  Surgery was seen as the most invasive and, except for immediate trauma and other emergencies, the last resort for the trained healer.

Also historic, diseases were systematically classified according to similarities and differences as the disciplines of etiology and pathology began to emerge.  The goal was to ensure the healer’s diagnosis and choice of treatment was supported by fact-based information.  As momentous, individualized treatment was a core value.  “It’s more important,” professed Hippocrates in one of his famous aphorisms, “to know what kind of person has a disease than what kind of disease a person has.”  Healers were trained to understand individual patients as living in a dynamic relationship with their environment.  This new art form treated the patient, not just the disease.

What About Madness?
For epochs before this new medicine, ancient civilizations viewed madness as punishment from an angry God for divine trespasses.  The healers of 10,000 years ago treated their patients with music, prayer, charms, spells and other incantations.  This new school of medicine declared madness to be the result of natural occurrences in the brain, centered around the four essential humors.  To treat madness, patients would routinely be bled from the forehead, or from a large vein in the arm, leg or rectum, to draw away corrupted fluids from the brain and bring the body back into balance.

Thus, Greek medicine began with two interrelated principles.  The first is to provide the body its natural beneficial cravings:  a wholesome diet, healthy habits, adequate exercise, and sufficient rest and sleep.  The second is to cleanse the body of wastes and pathogenic matter inside and out, creating a healthy body balance.  Though form and fashion may be different, modern medicine embraces these same basic principles.  Unfortunately, as you will see, so does PMM.

Primum Non Nocere – “First, do no harm” [ii]
The Hippocratic Oath is the first expression of medical ethics in human history and it remains a rite of passage for medical graduates around the globe.  This oath reminds the healer to be aware of the possible harm that can occur from any kind of intervention.  “Practice two things in your dealings with disease,” reiterated Thomas Inman, a 19th century Liverpool surgeon, “either help or do not harm the patient.”  Nonetheless, history has many examples of this oath being violated, and much harm done. 

Here’s the longest lasting. 

Bloodletting
For more than two millennia the treatment of choice for healers around the world was bloodletting, mainly because of its versatility.  In addition to madness, this medical technique was prescribed for acne, asthma, cancer, cholera, coma, convulsions, diabetes, epilepsy, gangrene, gout, herpes, indigestion, jaundice, leprosy, ophthalmia, plague, pneumonia, scurvy, smallpox, stroke, tetanus, tuberculosis – and a hundred more conditions, including heartbreak.

The Talmud was cited by healers to proclaim the most beneficial days and times of the month to use this procedure.  Christian healers gave guidance to their followers by declaring the specific Saints’ Days favorable for this medical technique.  Islamic healers too heralded bloodletting, particularly for fevers.  During medieval times bleeding charts were common, designating specific bleeding sites on the body.  The vein in the right hand, for example, was bled for liver problems, the vein in the left hand for spleen problems.  “Do-it-yourself” instructions were created and distributed worldwide. 

Bloodletting was in its heyday during the Middle Ages, prescribed by healers as both a curative and preventative medical intervention.  The actual procedure was often done by a trained barber-surgeon, the red and white barber pole symbolizing blood and bandages.  There were a variety of techniques too.  A phlebotomy occurred when blood was drawn from the larger external veins, an arteriotomy when it was drawn from arteries, usually at the temple.  Some healers used a scarificator, a specially crafted tool cast in a brass case that enclosed a spring-loaded mechanism with blades of steel.  Leeches were commonly used too.

Bloodletting theory – the science of how it “works” – was based on two ideas:  (1) blood did not circulate and would stagnate in the extremities; (2) removing it restored humoral balance, helping the body fight off illness and return to health.  The more severe the disease, the more blood that needed to be depleted, fevers requiring the largest amount of drainage.  Importantly, six hundred years after this medical art form was born, Galen, a philosopher-physician from the Roman Empire, revitalized, reinvented and “rebooted” Hippocratic humoralism as a meticulously detailed, rational, technique-focused medical theory that retained its popularity in cultures around the world for another seventeen centuries.

The End of Bloodletting    
The 19th century was revolutionary for medical science.  During the first half a British chemist, Humphry Davy, discovered the anesthetic properties of nitrous oxide, a French doctor, René Laennec, invented the stethoscope, and James Blundell, a British obstetrician, performed the first successful blood transfusion.  In the 1840s Crawford Long, an American surgeon, used ether for the first time, and a Hungarian doctor, Ignaz Semmelweis, discovered that disinfecting the hands of medics, midwives and nurses drastically reduced the incidence of death from childbed fever, which was killing nearly a third of infected mothers.

The second half of the century was equally impressive.  Joseph Lister, a British surgeon, introduced phenol to clean wounds and to sterilize surgical instruments.  Louis Pasteur published the Germ Theory of Disease in 1870 and within 12 years his labs produced vaccines for chicken cholera, anthrax and rabies.  The first Nobel Prize in Medicine was awarded to German physiologist Emil von Behring for developing serum therapy against diphtheria and tetanus.  X-rays were discovered by German physicist Wilhelm Röntgen, earning him the Nobel Prize in Physics.  Finally, in 1897, a German pharmaceutical company, Bayer AG, created a new wonder-drug:  aspirin.  Within two years aspirin was a global phenomenon.

The successes of this maturing art of medicine spelled the end of the ancient and barbarous practice of bloodletting.  Centuries-old theories and traditions, as well as healer and patient testimonials, could not stand up to the new and emerging regimen found in medical science.  By the end of the 19th century bloodletting was nearly extinguished worldwide (though not completely![iii]).

In its place was a new awakening:  modern medical science. 

NEXT:  Part 2:  The Era of Medical “Experimentalism”

Life is short; and the art long. – Hippocrates


Endnotes

[i] History of Greek Medicine: http://www.greekmedicine.net/b_p/Standards_of_Health.html

[ii] The Hippocratic Oath does not include the words “First, do no harm.”  The oath is nearly 400 words and certainly includes the sentiment.  The actual quote is attributed to a Parisian pathologist and clinician Auguste François Chomel (1788–1858).  Please see the entire oath here:  https://en.wikipedia.org/wiki/Hippocratic_Oath

[iii] Bloodletting is still used for a few conditions such as polycythemia, haemochromatosis, and porphyria cutanea tarda, while leeches are still used in plastic surgery, replantation and other reconstructive surgery, and very rarely for other specific indications.

Zombie Theory – Part 2

Part 2:  The Era of Medical Experimentalism

“The reproducibility of published experiments is the foundation of science. No reproducibility – no science.”  – Moshe Pritsker, Ph.D., CEO of JoVE


By the turn of the 20th century medical science had fully embraced empiricism – the philosophy that knowledge is built from rational experiments perceived by our senses.  Proof rather than deduction or revelation was the new measuring stick.  Experiments were designed, theories created, measurements taken, successes heralded, experimenters often rewarded with fame and fortune.  As important, empiricism brought with it the process by which all modern science is evaluated:  the scientific method.  The formality and rigor of this process was transformational in science.  It’s worth a quick review.

The Scientific Method
The 5-step scientific method is simple to describe, and difficult to implement – and that is the point of this exacting process.  The technique is designed to create empirical evidence – sometimes referred to as sense experience – utilizing the tools of observation and experiment.  Results must be measurable in the physical world.  When done as designed, the method provides quantifiable observations to the scientist – the facts of an experiment.  In turn, the scientist provides an explanation of the facts – the theory of an experiment. 

Step 1 of the scientific method requires the scientist to ask a question about nature, make detailed observations and gather information.  In Step 2 the scientist forms a hypothesis (theory) about the observations and creates specific predictions.  Next, in Step 3, the scientist tests the predictions with a detailed, observable, quantifiable experiment.  Step 4 requires the scientist to analyze the data, to draw conclusions, and to accept, reject or modify the hypothesis.  Finally, and most importantly, Step 5 compels the scientist to provide step-by-step directions to duplicate the experiment, and a new scientist must independently reproduce the experiment and find the same results before any knowledge can be proclaimed.

Turn-of-the-century medics must have been truly inspired.  For the first time they could listen to – and see – telltale signs of health inside a living body.  They could anesthetize their patients prior to surgery, and they used sterilized instruments in a disinfected operating room, with blood transfusions available as needed.  As important, given the overwhelming success of Pasteur’s germ theory, new hypotheses were being introduced at a fast pace, each theory looking for other likely germs that were the root cause of so much human suffering.

Thus, 20th century medical experimentalism launched with new tools, a new paradigm and a multitude of exciting projects.  To kick off this new era, two medical devices were revamped during the last decade of the 19th century, setting the stage for the incredible 100 years to follow:  the microscope and the culture dish.

Let There Be Light
August Köhler[i] was a student of zoology, botany, mineralogy, physics, and chemistry in late 19th century Germany.  As a young, post-graduate staff member at Carl Zeiss AG (an optical systems manufacturer) he developed a new technique, dubbed Köhler Illumination.  Köhler’s invention produced even lighting across the field of view and greatly enhanced the contrast of the light microscope.  During the next 45 years Köhler contributed to numerous other innovations, including fluorescence microscopy and grid illumination, a method used in the treatment of tumors.

Around the same time, Julius Richard Petri was working for the Imperial Health Office in Berlin.  Lab scientists were uniformly frustrated.  In order to observe cultures through a microscope the cover had to be removed, exposing the bacteria to contaminants like dust, hair, and human breath.  Petri had the simple idea of placing a slightly larger clear glass dish upside down over the culture dish to protect it from the external environment and, according to one science writer, “changed medical history.”[ii]  Petri moved on to work in a lab in Germany for the rest of his career, where he published nearly 150 papers about the spread of diseases.

“Magic Bullets”
Another German, Paul Ehrlich[iii], coined the term “chemotherapy” in 1900.  Ehrlich theorized toxic compounds could be created to selectively target a variety of disease-causing organisms.  He predicted future chemists would produce substances to seek out these disease-causing agents, dubbing these new substances “magic bullets.”  He was right.  Magic bullets and other microscope observations began to materialize in science labs around the world.  By 1901 blood types were discovered by Austrian Karl Landsteiner, in 1906 Frederick Hopkins discovered vitamins in England, and a Canadian, Sir Frederick Banting, discovered insulin in 1921.

It was a banner century for another magic bullet:  the vaccine.  The most celebrated was Jonas Salk’s polio vaccine.  Once introduced in the United States (some may remember the March of Dimes immunization campaign in the early 1950s) the annual number of polio cases fell from 35,000 in 1953 to 5,600 by 1957.  By 1961 only 161 cases were recorded in the United States.  Medical science also gave us vaccines for bacterial meningitis, chickenpox, Haemophilus influenzae, hepatitis A, hepatitis B, Japanese encephalitis, measles, mumps, papillomavirus, pneumococcus, rotavirus, rubella, tetanus, typhoid, tick encephalitis, whooping cough and yellow fever – saving and changing the lives of millions of people.

The Century’s Preeminent Magic Bullet – Penicillin
Before antibiotics (lit. against-life), 90% of children with bacterial meningitis died, strep throat was often fatal, and even minor infections would often lead to serious illness and death.  Then in 1928, Sir Alexander Fleming[iv], a Scottish biologist and pharmacologist, made a fortuitous discovery from a discarded Petri dish.  The mold that had contaminated an experiment turned out to contain a powerful antibiotic:  penicillin.  This one discovery, and the analogues to follow, has saved hundreds of millions of lives around the world.  Fleming also predicted science would find many new “bacteria killers.”  He was right too.  Today there are thousands of antibiotics, more created every year. 

More “Magic Bullets”
Here is a selection of other magic bullets discovered and invented during the 20th century (there are many others):
• Arsphenamine – syphilis drug (1910)
• Nitrogen mustard – 1st cancer drug (1946)
• Acetaminophen (1948)
• Tetracycline (1955)
• Oral contraception – “the pill” (1960)
• Propranolol – 1st beta blocker (1962)
• Cyclosporine – immunosuppressant (1970)
• Lovastatin (Mevacor) – 1st statin (1987)

Procedures
There were an amazing number of new medical procedures created by modern medicine over these 100 years too.  Here’s a list of some of the “firsts”:
• Electrocardiogram (1903)
• Stereotactic surgery (1908)
• Laparoscopy (1910)
• Electroencephalogram (1929)
• Dialysis machine (1943)
• Heart-lung machine (1953)
• Ultrasound (1953)
• Kidney transplant (1954)
• Pacemaker (1958)
• “Test tube baby” (1959)
• Liver transplant (1963)
• Lung transplant (1963)
• Pancreas transplant (1966)
• Heart transplant (1967)
• MRI (1971)
• CAT scan (1971)
• Insulin pump (1972)
• Laser eye surgery (1973)
• Liposuction (1974)
• Heart-lung transplant (1981)
• Surgical robot (1985)

Mankind has been the beneficiary of these creations, and we gratefully acknowledge and salute medical science for its wondrous contributions, inclusive of all medical specialties – save one.

Again – What About Madness?
Medical scientists addressing madness contributed to this otherwise spectacular century with four magic bullets of their own during the first 50 years of the century, each an unmitigated disaster.  The first three are collectively called Shock Therapies and include Deep Sleep Therapy, Convulsive Therapy, and Insulin Shock Therapy.  The fourth is Psychosurgery.  Here’s a review.

Deep Sleep Therapy (DST)
Jakob Klaesi, a Swiss psychiatrist, re-popularized DST in 1920 (after two failed attempts earlier in the century), using Somnifen (a sedative) on his schizophrenia patients.  For the next 20 years Klaesi and his colleagues dominated the mental health hospital circuit in Zurich using DST and found worldwide fame among colleagues, despite high mortality rates and never-ending doubts about efficacy.  Undeterred, many eminent psychiatrists of the time promoted DST, including William Sargant of Great Britain:

“All sorts of treatment can be given while the patient is kept sleeping, including a variety of drugs. . . the patient does not know how long he has been asleep, or what treatment, even including ECT, he has been given. . . a new exciting beginning in psychiatry and the possibility of a treatment era such as followed the introduction of anesthesia in surgery.”

The Australian Chelmsford scandal of 1983 finally put an end to this toxic procedure.  Dr. Harry Bailey was in charge of Chelmsford Private Hospital in Australia, where DST was the primary treatment for madness.  Over sixteen years, 27 deaths were directly connected to DST, with another 24 reports of suicide in the same year patients received treatment.  Facing condemnation from families, the general public and the government, Bailey committed suicide in 1985.[i]  The scandal brought about new, stringent laws and regulations regarding psychiatric care in Australia.[ii]

Insulin Shock Therapy (or Insulin Coma Therapy)[iii]

Dr. Manfred Sakel was a young doctor in Vienna in 1928 when he was tasked with reducing the unpleasant withdrawal symptoms of opiates.  Experimenting with a newly discovered pancreatic hormone – insulin – he unexpectedly found a large dose would cause his patients to go into a stupor.  Once they recovered, he noted, his patients were less argumentative, less hostile, and less aggressive.  Thus, Insulin Shock Therapy (IST) was born.  For the next 30 years IST was the go-to method for hundreds of thousands of mental health patients.  IST doctors proudly proclaimed an “80 per cent cure rate for schizophrenia.”[iv]

The actual procedure was intense.  Insulin injections were administered six days a week for two months or more as the daily dose was gradually increased until hour-long comas were produced.  Seizures before or during the coma were common, as were hypoglycemic aftershocks.  Often patients were subjected to electric shock therapy while comatose.  Given the profuse cases of brain damage – and an estimated mortality rate ranging from 1% to 5%[v] – IST fell out of use in the United States, and nearly everywhere else, by the 1970s.[vi]

Convulsive Therapy
Convulsive therapy took hold quickly.  In 1934 Ladislas J. Meduna, a Hungarian neuropsychiatrist known as the “father of convulsive therapy,” used metrazol (a stimulant) to induce seizures in patients with schizophrenia and epilepsy.  By 1937, the first international meeting on convulsive therapy was convened in Switzerland, and by 1940 metrazol-convulsive therapy was being used worldwide.  

Around the same time Ugo Cerletti, an Italian neuropsychiatrist, was using electric shocks to produce seizures in his animal experiments.  He noticed that when pigs were given an electric shock before being butchered, they were in an “anesthetized state.”  With his colleague Lucio Bini, he replaced metrazol and the other chemicals used to create convulsions with electricity.  As a bonus, they surmised, ECT brought about retrograde amnesia, so patients had no ill feelings about a treatment they could not remember.  Cheaper and more convenient, ECT replaced chemical-induced convulsive therapy and by 1940 was being used in England, Germany, Austria, and the United States.  (NOTE:  Cerletti and Bini were nominated, though not selected, for a Nobel Prize.)

There was a marked decline in the use of ECT from the 1950s to the 1970s because the public perceived the procedure as dangerous, inhumane and overused.[vii]  However, because ECT was convenient and cost-effective, mental health providers balked.  In 1985, the National Institute of Mental Health (NIMH) and the National Institutes of Health (NIH) convened a conference on ECT and concluded that, though controversial, ECT was effective for a narrow range of psychiatric disorders.  In 2001 the American Psychiatric Association expanded the role of ECT and, by 2017, ECT was covered by most insurance companies.  This incredibly cruel and torturous “treatment procedure” has gained popularity – again.[viii]

Psychosurgery
In 1935 António Egas Moniz, a Portuguese neurologist, used the term “leucotomy” (lobotomy) for the first time to describe a surgical operation that destroys brain tissue by extraction, burning, freezing, electrical current or radiation.  The objective is to sever the connections between the frontal lobes and deeper structures in the brain.  Approximately 40,000 lobotomies were performed in the United States alone from the 1930s to 1970.  By the way, the majority (nearly two thirds) were performed on women.[ix]

Use declined rapidly due to increased concern about deaths and brain damage caused by the operation, and to the introduction of neuroleptic (lit. nerve-seizing) drugs.  By the mid-1970s the use of psychosurgery had declined to about 100–150 operations a year, and it disappeared completely by the 1980s.  Remarkably, Moniz (and Walter Rudolf Hess) shared the Nobel Prize in 1949 for this discovery, though not without controversy that still exists in the scientific community today.[x]

What A Century
For medical scientists focused on physical human ailments, it was a stupendous century.  Life expectancy is approaching 80 years in the United States, up from 50 years at the beginning of the 20th century.  We are routinely treated with medicines and procedures for debilitating diseases that diminished, disfigured and killed our ancestors not that long ago.  As important, the consistency and precision provided by empiricism and the scientific method paid off for all medical specialties during those amazing 100 years – save one.

For medical scientists focused on madness it’s been one grotesque failure after another.  These scientists put us to sleep for weeks at a time, induced comas for months at a time, used chemicals and electricity to convulse us, and surgically destroyed our brain tissue, all in an effort to fix our brain diseases.  Along the way all of them ballyhooed their successes, using their special brand of science and patient testimonials to convince us of the medical necessity and efficacy of their magic bullets.

Then in 1950, as if a reprieve for past travesties, a fifth magic bullet appeared:  psychiatric drugs.  This new state-of-the-art medicine was primed to replace the first four fiascos.  Given the track record of PMM, it’s a wonder anyone took them seriously.

Unfortunately, nearly everyone did.

“Medicine men devised all manner of disabling methods – for three centuries – finally discovering drugs as an easy and efficient means of achieving disability.”

– David West Keirsey, Disable Madmen[i]

NEXT:  Part 3:  Thorazine to the Rescue

[i] (https://professorkeirsey.wordpress.com/2011/08/17/disable-madman-part-i/)

[i] You can read more about this at https://chelmsfordblog.wordpress.com/aftermath-of-the-scandal/

[ii] In her book First Half, Toni Lamond described her experience at Chelmsford:  I was given a semi-private room. On the way to it I saw several beds along the corridors with sleeping patients. The patient in the other bed in my room was also asleep. I thought nothing of it at the time. Although it was mid-morning, the stillness was eerie for a hospital that looked to be full to overflowing. I was given a handful of pills to take and the next thing I remember was Dr Bailey standing by the bed asking how I felt. I told him I’d had a good night’s sleep. He laughed and informed me it was ten days later and, what’s more, he had taken some weight off me. I was checked out of the hospital and this time noticed the other patients were still asleep or being taken to the bathroom while out on their feet.  https://en.wikipedia.org/wiki/Deep_sleep_therapy

[iii] https://en.wikipedia.org/wiki/Insulin_shock_therapy#Decline

[iv] https://en.wikipedia.org/wiki/Insulin_shock_therapy

[v] Ebaugh, FG (1943). “A review of the drastic shock therapies in the treatment of the psychoses”. Annals of Internal Medicine. 18 (3): 279–296. doi:10.7326/0003-4819-18-3-279.

[vi] In 1953, British psychiatrist Harold Bourne published “The Insulin Myth,” arguing there was no sound basis for believing that insulin counteracted the schizophrenic process.  He said treatment “worked” because patients were chosen for their good prognosis and were given special treatment.  Bourne submitted the article to the Journal of Mental Science.  After a 12-month delay, Bourne received a rejection, telling him to “get more experience.”  https://ipfs.io/ipfs/QmXoypizjW3WknFiJnKLwHCnL72vedxjQkDDP1mXWo6uco/wiki/Insulin_shock_therapy.html

[vii] Later on, the public’s negative perception of ECT was further tarnished by the movie One Flew Over the Cuckoo’s Nest.

[viii] The most egregious ECT provider in the United States?  Read about Pennsylvania’s Rotenberg Center:  https://doctorcima.com/2012/06/

[ix] https://en.wikipedia.org/wiki/Lobotomy.  In addition, in Japan the majority of lobotomies were performed on children with behavior problems.

[x] There have been calls in the early 21st century for the Nobel Foundation to rescind the prize it awarded to Moniz, characterizing the decision at the time as an astounding error in judgment.  To date, the foundation has declined to take action and has continued to defend the results of the procedure.

[i] https://en.wikipedia.org/wiki/August_K%C3%B6hler

[ii] How Julius Richard Petri’s Dishes Changed Medical History:  https://www.medicaldaily.com/how-julius-richard-petris-dishes-changed-medical-history-246396

[iii] Ehrlich shared the 1908 Nobel Prize in Physiology or Medicine with Élie Metchnikoff for their contributions to the field of immunology.

[iv] Fleming, Howard Florey and Ernst Boris Chain jointly shared the Nobel Prize in Medicine in 1945.

Zombie Theory – Part 3

WHO’s Counting
Within the last 70 years, with the assistance of a trillion-dollar worldwide pharmaceutical business and its partners in academia – and a willing populace – human beings are being drugged into “balance” to treat fictitious brain diseases in astronomical numbers.  In 2018, the World Health Organization (WHO) estimated 300 million people around the world have depression disorder, 60 million have bipolar disorder, and another 23 million have schizophrenia disorder and other psychoses[i] – all of them in need of psychiatric medication.  According to WHO, these numbers are likely significantly higher, as poorer countries have no way to record mental illness.  Leading the way – the United States.

The Journal of the American Medical Association (JAMA) reported in 2017 that more than 40 million adults were prescribed one or more psychiatric medications in America.[ii]  Race, you should know, is a factor.  One in five white adults, one in ten black adults, one in twelve Hispanic adults, and one in twenty Asian adults are prescribed psychiatric medications.  By gender, the difference is just as significant.  Nearly twice as many women (20.8%) are taking psychotropics as men (11.9%).  Age matters too.  About one in ten 18-39 year-olds are psychiatric patients, nearly one in five 40-59 year-olds, and, out in front by a wide margin, a solid one-fourth of adults between the ages of 60 and 85 are prescribed psychiatric medications.  By the way, why is “mental illness” dependent on race, gender and age?  How does the PMM scientist explain this?  They can’t, so they don’t.

Children
There are about 74 million children in the United States in 2021[iii].  Nearly 17 million are diagnosed with a brain disease.  The Centers for Disease Control and Prevention (CDC) reports that 6.1 million children have been diagnosed with ADHD disorder, 4.5 million with a behavior disorder, another 4.4 million with anxiety disorder, and 1.9 million with depression disorder[iv].  And why are nearly one in five children being drugged?  The American Academy of Child & Adolescent Psychiatry (AACAP) declares there are eleven psychiatric symptoms and disorders for which psychiatric medication may be prescribed for children.  The list includes bedwetting, anxiety, attention-deficit/hyperactivity disorder, obsessive-compulsive disorder, depression disorder, eating disorder, bipolar disorder, psychosis, autism spectrum disorders, severe aggression and sleep problems[v].

Toddler & Infants Too
From the New York Times, May 2014:
“About 15,000 American toddlers 2 or 3 years old, many on Medicaid, are being medicated for attention deficit hyperactivity disorder, according to data presented Friday by an official at the federal Centers for Disease Control and Prevention[vi].”

From Medical Daily, December 2015:
“The report shows that psychotropic drug prescriptions among babies nearly doubled in one year, from 13,000 prescriptions in 2013 to 20,000 in 2014, despite the lack of evidence that shows they are effective and safe for young children . . .  psychiatrists often prescribe these drugs . . . for behavioral issues like unusual aggression, temper tantrums, or lethargy[vii].”

What Are We Taking – and Why Are We Taking Them?
From PsychCentral, here are the top 25 psychiatric drugs sold in America – and the reasons we take them – in 2016[viii]:

[Chart:  the top 25 psychiatric drugs sold in America in 2016, and the conditions they treat]

Knot in the Mood
More than 338 million prescriptions were written just for anti-depressant medications in 2016, by far the winning diagnosis.  And depression isn’t as simple as you may think.  There are all kinds of depression, including atypical depression, bipolar I disorder, bipolar II disorder, catatonic depression, cyclothymia, depressive personality disorder, double depression disorder, dysthymia, melancholic depression, minor depressive disorder, postpartum depression, premenstrual dysphoric disorder, psychotic major depression, recurrent brief depression, and last, but not least, seasonal affective disorder, affectionately known as SAD.

Is there a common denominator for all of these chemicals?  Of course, and it’s easy to see.  All 25 chemicals address the same innate, unavoidable, uncomfortable, ever-changing, seemingly whimsical and sometimes-hard-to-shake-life-altering-human-experience:  mood.  And yes, that includes ADHD, including the effect ADHD has on the mood of others.  OK, you may notice, schizophrenia is about consciousness, not mood.  Nonetheless, the PMM provides treatment for schizophrenia and other aspects of consciousness with the same mood medications:  tranquilizers.  Anything else these chemicals have in common?  Yes, of course, and it’s easy to see too.  Twenty-two of them are central nervous system depressants (CNSD), and three of them are central nervous system stimulants (CNSS).  What’s that about, and why is it important?

Lost In the Shuffle
Rhône-Poulenc, a French pharmaceutical company, was developing antihistamines for nausea and allergies in the late 1940s.  Scientists noticed some chemical compounds exhibited unusual calming effects.  Dr. Henri Laborit, a French surgeon, called this effect artificial hibernation, and described it as “sedation without narcosis (unconsciousness).”  By 1951 Laborit was conducting clinical trials of the newest compound – chlorpromazine (CPZ) – for use as an anesthetic booster for surgery patients.  He proclaimed CPZ the “best drug to date” in calming and reducing shock during surgery.  Known among colleagues as Laborit’s drug[i], CPZ was released for use in the operating room by 1953.

Laborit was also a persistent advocate for clinical trials for psychiatric patients using this new wonder chemical.  His persistence paid off.  On January 19, 1952, for the first time, CPZ was administered to a manic patient named Jacque.  Jacque’s improvement was reported to be “dramatic.”  After three weeks – and 855 mg of CPZ – Jacque was released from the hospital. Word spread quickly about the “breakthrough.”

Dr. Pierre Deniker[ii], a French psychiatrist, ordered CPZ for a clinical trial at the Sainte-Anne Hospital Center in Paris.  His findings were even more dramatic.  Often doubling Laborit’s doses, Deniker found CPZ had much more than calming effects.  His patients showed “remarkable improvement in thinking and emotional behavior.”  By the end of 1952, Deniker had abandoned the old, ineffective, and harmful shock methods and begun to treat mental illness with CPZ.

Soon after, Smith, Kline & French Laboratories (today’s GlaxoSmithKline) purchased the rights to CPZ from Rhône-Poulenc.  By 1954, Smith, Kline & French had received FDA approval to market CPZ as Thorazine to treat schizophrenia.  The world’s first psychiatric medication was created – and marketed.  Advertisements and professionals soon were boasting how “Thorazine helps to keep more patients out of mental hospitals.”  Please remember, hospital beds were required because psychiatric patients needed time to recover from electrocution, the surgeon’s knife, or chemically induced, months-long comas.  While a chorus of public outcries about the inhumane treatment of psychiatric patients had already begun to empty the beds of these torture asylums, psychiatric scientists and drug company marketers at the time attributed this exodus to the “dramatic success of Thorazine.”

Then, in October 1955, Deniker’s Sainte-Anne Hospital Center convened the first international Thorazine (CPZ) conference.  Attendees included scientists from Austria, Belgium, Brazil, Canada, Cuba, France, Germany, Great Britain, Holland, Luxembourg, Peru, Portugal, Spain, Sweden, Switzerland, Turkey, the United States and Venezuela.  Soon to follow were thousands of papers from scientists around the world publicizing their own “dramatic successes” with Thorazine for an ever-widening variety of brain disorders, affecting millions of patients.  In 1957, Laborit, Deniker and Heinz Lehmann were awarded the prestigious Albert Lasker Award for their contributions to the clinical development and use of Thorazine – dubbed “the world’s first anti-psychotic medication[iii].”

During the 1950s and 1960s, pharmaceutical ads for Thorazine were ubiquitous.  Thorazine was prescribed for bursitis pain, cancer pain, emotional stress, anxiety, nausea and vomiting, “management of menopausal patients,” child behavior disorders, acute alcoholism, severe asthma, depression, hiccups, catatonic schizophrenia, schizoaffective conditions, epileptic clouded states, agitation in lobotomized patients, confusional states, senile psychoses, gastrointestinal disorders, psoriasis, and more[iv].  By 1964, fifty million people around the world had used Thorazine[v].  All this, from one calming tranquilizer.

In his book The Creation of Psychopharmacology (2002)[vi], David Healy, the renowned British psychiatrist, professor, scientist, author – and current director of an ECT clinic in Wales – proclaims the discovery of Thorazine as significant to medicine as the discovery of penicillin.  As important, he asserts, Thorazine was also the first profitable psychiatric medication for pharmaceutical companies.  He marks the convergence of these two events – a wonder treatment and profitability – as the genesis of what he terms “biological psychiatry,” and the 1980 publication of DSM-III as bonding psychiatry to the biological cause of mental illness, forever.  Healy also details the prodigious growth of pharmaceutical companies and their promotional strategies, including coordination with academia to find new mental illnesses, and to manufacture the medications to treat them.

There were huge profits in the making for this burgeoning “take-a-pill-for-it” market, and Big Pharma began to flourish.  By the end of the 1960s, pharmaceutical companies had created dozens of “new and improved” medications for a growing number of new mental illnesses.  By then, Thorazine was regarded as just another, less effective medication, now criticized by its competitors for its notorious side effects.  And what were these “new and improved” medications created by Big Pharma?  More tranquilizers.

Was the discovery of Thorazine really as significant as penicillin?  Yes, it was – if you are a proponent of the PMM.  Dr. Healy is, and he has company.  So is 94%[vii] of the general public and, presumably, 99+% of professionals.  However, if you are a PMM antagonist, then Thorazine was – nothing more and nothing less – the world’s first chemical tranquilizer, and a precursor to the hundreds of tranquilizers to follow.

A Lost Cause
Take a look at this chart of the top ten diagnosed brain disorders and their causes:

[Chart:  the top ten diagnosed brain disorders and their causes]

There are a total of 713 medications manufactured by drug companies for the top ten brain disorders, for which there are no known causes.  How is that possible?  By the way, these are just the top ten diagnoses.  You can see the entire list of Medications for Psychiatric Disorders at drugs.com[i].  Please be warned.  If you are looking for a cause for any of the brain disorders of PMM, you will be disappointed.  There are none.

So, do you wonder too?  What the hell are they treating?

Jacque to the Future
When Jacque took that first dose of CPZ in 1952, everything changed in psychology and psychiatry.  In just a few decades, psychological diagnoses became medical diagnoses, requiring medical oversight, medical solutions, and medical doctors to provide them.  Now, a psychiatric medical patient sees a medical doctor for psychiatric medication to address a brain disorder.  Behaviors once considered by psychology to be understandable responses to the challenges of life became symptoms to the doctors of the PMM, and the symptoms became evidence of the underlying disease, or disorder, or deficiency, or delay, or disability, or derangement, or disturbance, or dysfunction, or – and may these children forgive us some day – a handicap.

We are approaching 50 million men, women, children, toddlers and infants in the United States who are taking pills for brain disorders.  And business is booming.  Psychiatric medications – the majority tranquilizers, a handful of stimulants, and an occasional analgesic – now number in the thousands, with more created every year.  Not a single cure, and not a single cause for any of the ever-growing brain disorders of the PMM.

Did I mention business is booming?

Welcome to Zombieland
Psychiatry entered the last half of the 20th century on a high note, despite a horrendous track record for the first 50 years.  The PMM scientists had created a new, simple, humane, and easily administered treatment for mental illness, and by the 1960s Big Pharma was making money hand over fist.  Only one thing was missing to unify this marriage.  Healy’s “biological psychiatry” needed a coherent, science-based, peer-reviewed theory to explain how all these miracle drugs worked.

Well, they found one.  It’s about bad brain parts.  And a Zombie comes with it.

NEXT:   Part 4:  The Dope About Dopamine – and Other Ridiculous Notions

“When the doors to that dorm opened up a strange group of men would exit.  They would seem to be in a hurry, but unable to coordinate their movements.  Their heads would hang down and half expressions would ripple across their faces.  They would run their hands over their heads over and over, and open and close their mouths while sticking their thick tongues out.  Their gait was particularly peculiar, with stiff legs dragging their feet along, all the while seeming about to topple.  We called this the ‘thorazine shuffle.’”
– John Lash – Behind the Thorazine Shuffle, the Criminalization of Mental Illness (2012)

~~~~~~~~~~

NOTES
[i] Some of his colleagues referred to the effect as a “non-permanent, pharmacological lobotomy.”  https://en.wikipedia.org/wiki/Antipsychotic#History

[ii] Pierre Deniker Foundation – for research and prevention of mental illness
https://www.fondationpierredeniker.org/what-is-it

[iii] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2655089/

[iv] For a thorough review of thorazine advertisements see:  http://www.bonkersinstitute.org/medshow/thorazine.html

[v] https://en.wikipedia.org/wiki/Chlorpromazine#History

[vi] See a review of The Creation of Psychopharmacology @https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1279263/

[vii] https://www.cbsnews.com/news/americans-big-bang-evolution-ap-poll/

[i]  WHO – https://www.who.int/en/news-room/fact-sheets/detail/mental-disorders

[ii] https://jamanetwork.com/journals/jamainternalmedicine/fullarticle/2592697

[iii] ChildStats.gov:  https://www.childstats.gov/AMERICASCHILDREN/tables/pop1.asp

[iv]  CDC:  https://www.cdc.gov/childrensmentalhealth/data.html

[v]  American Academy of Child & Adolescent Psychiatry (AACAP) https://www.aacap.org/AACAP/Families_and_Youth/Facts_for_Families/FFF-Guide/Psychiatric-Medication-For-Children-And-Adolescents-Part-I-How-Medications-Are-Used-021.aspx

[vi] https://www.nytimes.com/2014/05/17/us/among-experts-scrutiny-of-attention-disorder-diagnoses-in-2-and-3-year-olds.html