Those readers familiar with the BBC (PBS in the US) series Call the Midwife may recall an episode in Season 6 that begins with an explosion at a dockside warehouse. There was no nearby water to clean burns or wounds and no first aid kit to dress them. One man died; another lost his sight. Tragic. The best that could be done was to persuade the local authorities, after an emotional coroner’s inquest, to provide water and first aid kits at all such locations. Nearly sixty years later, such incidents are rare, and the combination of trade unions, legislation and enabling regulations has raised consciousness of workplace safety to what some might call an excessive level (more on that later).
In different degrees and in different forms, workers’ compensation insurance has existed in the United Kingdom since the turn of the 20th century. Its most comprehensive codification came with the National Insurance (Industrial Injuries) Act 1965. This legislation was part of the general platform of laws and regulations that established what has come to be known as the Welfare State. The thinking behind this change of emphasis – ironically, one of its driving forces was to limit the liability that employers might face through tort actions – was a desire to move toward a state-sponsored insurance system rather than one built around the standards of the legal system.
The United Kingdom set the pace for the industrial world, triggering a similar approach in the United States, starting in 1910 when workers’ compensation laws were enacted in the state of New York. Mississippi brought up the rear in 1948. Federal workers of all ages were finally covered by 1962. Workers’ compensation has provided both a safety net for injured workers and a limitation of liability for employers. In 2013, workers’ benefits and employers’ costs totalled $88.5bn, or 1.37% of covered payroll. According to the NAIC, net premiums written for workers’ compensation represented just under 9% of total net premiums written in the property and casualty world.
The effect of this safety-net infrastructure is to increase the cost of doing business for all businesses. So it should. The cost of this layer of security is around 0.5% of GDP, while the burden of a workplace injury on an individual would, in many cases, be unbearable. Many approaches have been attempted to mitigate the potential for cost inflation where the workers who benefit do not bear the cost of coverage. The most effective has been to make the workplace safer. The Occupational Safety and Health Administration was established in 1971. Since its establishment, fatality and injury rates have dropped from 11 per 100 workers in 1972 to 3.6 per 100 workers in 2009, even as US employment doubled over that period.
In 1957 in Germany, and by 1960 in 46 other countries including the United Kingdom, the drug of choice for those seeking a non-barbiturate sedative – and, thanks to the work of an Australian doctor, for women suffering from morning sickness – was thalidomide. Tragically, the drug was also responsible for a devastating birth deformity called phocomelia, a condition in which children were born with shortened, missing or flipper-like limbs. By 1962, the drug was withdrawn. (The author’s mother had a prescription in her medicine cabinet – unopened).
Thanks to the efforts of Frances Kelsey, an FDA medical reviewer, the drug never made it to broad distribution in the US – and all as a result of her caution rather than (as some might have thought) the FDA’s protocols. At the time, clinical trials in the US required neither FDA approval nor oversight. Largely as a result of the thalidomide tragedy, the US passed the Kefauver-Harris Drug Amendments in 1962. This legislation required manufacturers to prove that drugs are both safe and effective before they are released. Drugs can now take from eight to twelve years to be approved.
In 1963, a vaccine was licensed for widespread use to immunise children against measles. A mumps vaccine followed in 1967 and a rubella vaccine in 1969; the three were combined into the single MMR vaccine in 1971. In 1960 there were, according to the CDC, 442,000 reported measles cases and 380 related deaths. After only 15 cases were reported between 1999 and 2001, and with more than 90% of school children having received two doses of the MMR vaccine, public health officials declared that measles was no longer endemic in the United States.
A 2004 analysis published in the Journal of Infectious Diseases estimated that, over a 40-year period, the MMR vaccine reduced direct costs by approximately 92% (or $3.5bn) and societal costs by approximately 97% (or $7.6bn) – quite an achievement for a humble vaccine.
In 1998, a UK physician, Andrew Wakefield, published a fraudulent paper claiming a link between the MMR vaccine and autism. Partly as a result of this paper – notwithstanding that its assertions have been proven false and the physician struck off the medical register – some celebrities in the US, notably Jenny McCarthy and Jim Carrey, have become anti-vaccine proponents.
The underlying argument questions the wisdom of accepting vaccination, with its supposed autism risk, when the incidence of measles, mumps and rubella is now so low. This reasoning has the potential to compromise the herd immunity of the population at large. Herd immunity is achieved at different thresholds for different diseases and is a phenomenon that can be calculated with a high degree of accuracy. So important is this protection that California Governor Jerry Brown signed legislation mandating vaccination and eliminating exemptions for religious and personal beliefs.
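The threshold arithmetic behind herd immunity can be sketched in a few lines. This is an illustrative calculation using the standard 1 − 1/R0 approximation, where R0 is a disease’s basic reproduction number; the R0 values used here are assumed textbook figures, not data from this article:

```python
# Herd immunity threshold: the share of a population that must be immune
# so that each infection causes, on average, fewer than one new infection.
# For a disease with basic reproduction number R0, the classic estimate
# is 1 - 1/R0.

def herd_immunity_threshold(r0: float) -> float:
    """Immune fraction needed to prevent sustained spread."""
    if r0 <= 1.0:
        return 0.0  # the disease already dies out without any immunity
    return 1.0 - 1.0 / r0

# Illustrative textbook R0 estimates (assumed, not from this article).
# Measles is highly contagious (R0 often quoted as 12-18), which is why
# its threshold sits above 90% and why two-dose MMR coverage matters.
for disease, r0 in [("measles", 15.0), ("mumps", 5.0), ("rubella", 6.0)]:
    print(f"{disease}: R0 = {r0:.0f} -> threshold = {herd_immunity_threshold(r0):.0%}")
```

With R0 of 15, the measles threshold comes out at roughly 93% – consistent with the greater-than-90% school vaccination coverage mentioned above.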
How Does This Connect to Regulation?
Almost sixty years after the groundbreaking advances that led to the near-eradication of certain contagious diseases, and roughly a century after the introduction of legislation that has dramatically improved workplace safety and compensation for workplace injuries, the current administration apparently believes that much of this regulation needs to be swept aside.
President Trump has signed executive orders triggering deregulation initiatives in the areas of energy, clean power, federal contracting, tax regulations on corporate inversions, Dodd-Frank and regulation itself. The President has also appointed the anti-vaccine activist Robert F Kennedy Jr to head a “Vaccine Safety” Commission. It is worth reflecting on whether, as with vaccination, there may be a level of herd immunity in the regulation of businesses and the workplace that is in danger of being compromised.
Businesses typically do not like government regulation. It slows things down and increases costs, at least in the short term. Workplace safety, as mentioned above, has improved measurably and substantially. As a consequence, ‘herd immunity’ is high and the risks of non-compliance seem low. Accordingly, the bias towards deregulation is strong. Certainly, while OSHA has done a great job in many areas, it is not without problems. The mandate from President Trump at the beginning of the year that no new regulation may be introduced without first eliminating two existing regulations seems to have had a chilling effect. The IRS, as an example, is paralysed in certain pressing regulatory efforts. Perhaps not a bad thing – provided a more coherent test is developed to screen the introduction of new regulations.
The FDA standard of proving safety and effectiveness before a drug is introduced may not translate completely, but the concepts of “do no harm” and of proving benefits seem sound. Before embracing President Trump’s deregulatory agenda with unbridled enthusiasm, a note of caution is appropriate – especially since, of tax reform, infrastructure spending and healthcare reform, deregulation seems to be the one item that stands the greatest chance of being implemented. It is to be hoped that a useful standard will evolve.