Clearly, the private market is not working well for health insurance or health care in the U.S. Costs are rapidly escalating in a system that is already the most expensive in the world, but that has mediocre to poor outcomes. Many private health insurance, pharmaceutical, and health care corporations are putting profits before patients.

Increasing premiums for health insurance and high drug prices (see my previous post on drug prices) are undermining efforts to control health care costs. The exorbitant and fast growing costs of U.S. health care are squeezing state and federal governments’ budgets, as well as employers and individuals. In many states’ budgets, increased costs for health care for poor families and seniors through Medicaid, as well as for employees and retirees, are eating up all the increases in revenue from economic growth – and then some. This means that without tax increases or other sources of increased revenue, states and the federal government are having to cut spending in other areas of their budgets.

Increasing costs for employees’ health insurance are hurting employers’ competitiveness with foreign companies and reducing their profitability. Some employers have dropped health insurance as an employee benefit, while others have increased the portion of health insurance premiums employees must pay or are offering insurance plans with less comprehensive coverage as well as higher deductibles and co-pays. As fewer employees get health insurance through their employers, the number of people in subsidized government health programs increases, further increasing costs for governments.

Individuals are not getting the health care they need because insurance is not making it accessible and affordable. Many people are suffering financial hardship and some file for bankruptcy because of the costs of health care.

The clear solution to these problems is to provide everyone access to what’s referred to as a “public option” or a Medicare-for-All type health insurance plan. This would be a government-run insurance pool, which is what Medicare is for seniors. When the Affordable Care Act (ACA) was being considered by Congress, a public option was initially included. In other words, a government-run insurance plan would have been offered by each of the ACA’s state-level health insurance marketplaces (aka exchanges) where people without health insurance buy coverage. The private insurers vehemently opposed the public option, and it was eventually dropped from the ACA legislation. They opposed it because they didn’t want the competition from a Medicare-type program that would be likely to expose their inefficiencies – despite, of course, these private corporations’ professed dedication to free markets and competition whenever any government regulation is proposed.

With problems in our privatized health care system becoming increasingly apparent, including a public option in the ACA exchanges is gaining support. [1] With some private health insurers abandoning the exchanges, it is projected that 7 states will have only one private insurer offering coverage. [2] In these states, having a public health insurance plan as an option would preserve competition. This would serve as a check on the sole private insurer, ensuring that its coverage and pricing remained competitive and that it didn’t exploit a monopoly situation.

More broadly, there have been numerous proposals over many years to allow anyone over 50 or 55 years old to “buy into” Medicare. In other words, although these individuals had not yet reached the normal Medicare eligibility age of 65, they would be allowed to pay an appropriate premium to buy health insurance as part of the Medicare insurance pool.

Senator Sanders, in his presidential campaign, highlighted his proposal for Medicare-for-All. This proposal would allow anyone to pay an appropriate premium to buy health insurance as part of a large, Medicare-like, government insurance pool. This proposal received broad and often enthusiastic support. [3]

A public option in the ACA exchanges or a Medicare-for-All option for everyone is the only way to realistically address the shortcomings of our privatized system of health care. By providing real competition for the private insurers, this would ensure the quality and affordability of health insurance. By giving the public option or Medicare-for-All insurance pools the right to negotiate with the pharmaceutical corporations over drug prices, prescription drug costs could be brought under control. (The Medicare drug benefit should also be changed to allow Medicare to negotiate drug prices.)

If we want quality and affordability in our health care system, a public option or Medicare-for-All program is essential as a check on the private corporations that currently dominate our health care system. Currently, a proposal in the U.S. Senate would add a public option to the ACA exchanges. It already has the support of over 30 Senators, including Senators Bernie Sanders (VT), Elizabeth Warren (MA), Jeff Merkley (OR), Charles Schumer (NY), Patty Murray (WA), and Dick Durbin (IL).

I encourage you to contact your U.S. Senators and other elected officials to tell them you support a public option under the Affordable Care Act specifically and a Medicare-for-All program in general. The for-profit health insurance, pharmaceutical, and health care corporations will fight tooth and nail to stop this competition. They will make huge campaign and lobbying expenditures to try to maintain their ability to manipulate our health care system to generate large profits and exorbitant executive compensation. Only a huge outcry and sustained pressure from the grassroots – from we the people – will get our policy makers to enact the significant reforms needed to create a health care system that delivers affordable, high quality care for all.

[1]       Willies, E., 8/28/16, “Recent headlines signal need for single-payer Medicare for All – now,” Daily Kos

[2]       Alonso-Zaldivar, R., 8/29/16, “Challenges mount for health law,” The Boston Globe from the Associated Press

[3]       Nichols, J., 9/16/16, “Make the public option a central focus of the 2016 campaign,” The Nation



The goals of health insurance are to provide affordable access to health care and to protect people from the catastrophic costs of serious health problems. The health insurance system in the U.S. is failing to meet these goals for many Americans.

The most recent and newsworthy issues with private health insurance are occurring in the so-called health insurance exchanges. These are state-level marketplaces created by the Affordable Care Act (ACA, aka Obamacare) where individuals without health insurance can buy coverage.

Many of the private health insurers offering policies through the exchanges are increasing the premiums they charge; some by as much as 62%. This is happening in part because some insurers initially set premiums unrealistically low in order to attract customers and gain market share. In addition, health care costs for those enrolling through the exchanges have been greater than some insurers estimated. [1]

As a result of these increased premiums, customers may switch to less expensive policies with less comprehensive coverage as well as higher deductible and co-payment amounts. This will increase the costs of health care for these customers, leaving some of them under-insured and vulnerable to financial hardship or bankruptcy if a major medical expense occurs.

Some insurers are terminating their participation in the exchanges, ostensibly because they aren’t making money on the policies they sell there. In the case of Aetna, however, it is apparently planning to withdraw from 11 of the 15 exchanges in which it participates in retaliation for the federal government’s opposition to its proposed merger with Humana. Both Aetna and Humana are among the 5 largest health insurers in the country. If this merger and another one (between Anthem and Cigna) that the government is blocking were approved, the top 5 health insurers would become 3 huge corporations. These are exactly the kinds of mergers that result in decreased competition, increased prices, and near-monopolistic power. (See my earlier blog post for more details.)

Overall, the health insurance corporations are raising premiums and cutting their participation in the exchanges to cut losses or increase profits. Profits are more important to them than meeting the goals of health insurance for customers.

Furthermore, private insurers are much less efficient than Medicare, the public health insurance program for our seniors. This is well documented. Medicare spends over 95% of its budget on actual health care, while private health insurers spend as little as 67% of premiums on actual health care. They use money from premiums to pay for advertising, profits, and executives’ compensation. To ensure a reasonable level of efficiency, the ACA requires health insurance policies offered on its exchanges to spend 80% of their premiums on actual health care – as opposed to administrative and corporate expenses.

Private health insurance simply doesn’t make sense from two key perspectives. First, health insurance and health care are not “markets” as defined by economics. Consumers don’t have clear, complete information about competing products, and they can’t and don’t effectively shop around for health insurance plans or health care services. When one needs a medical procedure, one doesn’t have the time, information, or capacity to shop around for the best combination of quality and price.

Second, the whole premise of insurance is that risk is shared among a large, random pool of people. With one large, random pool, the unpredictable nature of health care needs and costs is shared, and the financial hardship of a serious medical issue does not fall on one individual or family. However, our multiple private health insurers fragment the pool, and each one works to attract healthy people (who are less costly to serve) and avoid those who are sick. They do this through advertising, which therefore becomes a major expense, along with special perks like coverage for fitness center memberships. They also do it by denying payment so that sick customers get frustrated and leave. This is well documented in Medicare, where the private insurers that provide services under Medicare succeed at attracting the healthier seniors but then dump them back into the government insurance pool when they get sick.

In addition, the presence of multiple private health insurers also increases costs for doctors, hospitals, and other providers of health care. Each insurer has its own forms and procedures with which the providers have to cope.

Roughly 50 to 60 million adults struggle with health care bills each year, and the great majority of them have health insurance. This includes roughly 20% of the adult population under 65, the age of eligibility for automatic health insurance under Medicare. [2] Nearly 2 million Americans will file for bankruptcy this year in cases where unpaid medical bills are a major factor. Overwhelming health care bills are the number one reason for personal bankruptcy filings. [3]

More Americans have health insurance today than ever before thanks to the ACA, which has provided health insurance to 15 million people. However, because of the dysfunction of privatized health insurance, this has not significantly reduced financial hardship due to medical bills. Notably, the easiest way for health insurers to reduce costs and increase profits is to refuse to pay for health care services. Having a health insurer deny authorization or payment for care is something almost all Americans have experienced. In addition, health insurers are increasing premiums while reducing coverage and raising deductibles and co-payments.

Clearly, private health insurance is not meeting the goals of affordability and protection from financial hardship. My next post will present solutions to the problems of our privatized health care system.

[1]       Alonso-Zaldivar, R., 8/29/16, “Challenges mount for health law,” The Boston Globe from the Associated Press

[2]       Sanger-Katz, M., 1/5/16, “Even insured can face crushing medical debt, study finds,” The New York Times

[3]       Mangan, D., 6/25/13, “Medical bills are the biggest cause of US bankruptcies,” CNBC


A series of recent events have highlighted the problems with our privatized, for-profit health care system. First, there have been numerous cases of drug prices that have increased dramatically. I’ll discuss this topic in this post.

Second, health insurance corporations have been merging (and continue trying to merge) to create a few enormous corporations with monopolistic power, which leads to increases in health insurance costs. A similar pattern is occurring among health care providers, although it tends to be more regional than national. I’ll discuss these issues in my next post, followed by a post on solutions to the problems of our privatized health care system.

These recent events highlight that per capita health care spending in the U.S. continues to climb more rapidly than overall inflation. And they underscore that our health care spending is already exorbitant compared to every other country, while our health outcomes are worse.

The dramatic increase in the cost of EpiPens has been the most recent and perhaps most prominent of the extraordinary increases in drug prices, perhaps because of the EpiPen’s widespread usage and dramatic life-saving potential, especially for allergic reactions in children. The history here is that the EpiPen cost $50 in 2004. Mylan bought it in 2007 and began steadily increasing its price. It hit $250 in 2013 and then, in August, Mylan jumped the price to $600 – 12 times what it cost in 2004. By the way, the actual drug in an EpiPen costs about $1. [1]

The pharmaceutical corporations typically argue that their high drug prices are needed to pay for research and development. The validity of this argument is questionable at best and clearly false in many cases, such as the EpiPen case. A recent study found no evidence of a connection between drug research and development costs and prices. It concluded that drug prices are based on what the manufacturer can squeeze out of consumers and their insurance. [2]

For example, in August the price of Daraprim was raised to $750 per pill from $13.50. It had been $1 per pill in 2010. This is a 62-year-old drug that treats a life-threatening parasitic infection in babies and those with compromised immune systems, such as AIDS and cancer patients. In 2010, GlaxoSmithKline sold the drug to CorePharma, which quickly increased the price from $1 to around $10 per pill. In August, the drug was acquired by Turing Pharmaceuticals, a start-up run by a former hedge fund manager, and its price was immediately increased to $750 – 750 times its cost in 2010. [3] Turing is not a pharmaceutical company; it doesn’t do research and development. It is basically a hedge fund that buys the rights to drugs on which it believes it can dramatically increase prices to make a great return on its investment. Why the price increases? Greed coupled with a lack of regulation is the only answer.

Similarly, Rodelis Therapeutics bought Cycloserine, a drug to treat drug-resistant tuberculosis. It quickly increased the price per pill to $360 from about $17. Likewise, Valeant Pharmaceuticals acquired two heart drugs and more than doubled the price of one and quintupled the price of the other. This was on top of a quintupling of their prices in 2013 by the previous owner that had recently purchased them. So, overall their prices have jumped to 10 and 25 times what they were in 2013.

Per capita prescription drug spending in the U.S. is the highest in the world. U.S. drug spending is more than twice as high as the average for 19 other advanced countries and one-third higher than in the next most expensive countries, Canada and Germany.

Medicare, the huge health insurance plan for our seniors, is prohibited from negotiating with pharmaceutical corporations for lower drug prices. [4] This prohibition was written into the George W. Bush administration’s expansion of Medicare that added drug coverage, at the behest of the pharmaceutical corporations. Meanwhile, the Veterans Administration, many health insurers, and health care systems in other countries negotiate far lower prices for drugs than what Medicare ends up paying.

U.S. patent laws and other market protections slow the availability of less expensive, generic versions of drugs, thereby supporting high prices for brand name drugs here in the U.S. Brand name drugs (as opposed to generics) represent 10% of prescriptions but 72% of drug spending.

The pharmaceutical corporations also use multiple business strategies to limit competition so they can maintain high prices for their drugs. One strategy is to use what the pharmaceutical industry calls “controlled distribution.” This means that the drugs are not distributed through drugstores but only directly by the corporation. Therefore, companies that want to make and sell a generic version of the drug cannot get the samples they need to analyze, replicate, and test a generic version. Another strategy is to pay generic drug manufacturers not to make a generic version of a drug, even after its patent has expired. A third strategy is to make a minor modification to a drug, one that often has no functional impact, in order to obtain a patent extension based on the modification.

Dramatic increases in the prices of generic drugs (i.e., non-brand-name drugs that are no longer covered by a patent) are a relatively new phenomenon. Prices of generic drugs declined from 2006 to 2013. However, there are numerous examples of dramatic price increases over the past 3 years: [5]

  • Tetracycline (a common antibiotic): $0.06 to $4.60 per pill (77 times as expensive)
  • Amitriptyline (an antidepressant): $0.04 to $1.03 per pill (26 times)
  • Clobetasol (a prescription skin cream): $0.26 to $4.15 per gram (16 times)
  • Captopril (a blood pressure med): $0.11 to $0.91 per pill (8 times)
  • Digoxin (a heart med): $0.12 to $0.98 per pill (8 times)

Drug prices in the U.S. are not regulated or routinely negotiated as they are in other countries. Mergers of pharmaceutical corporations have reduced competition. Increasingly, the remaining large corporations have monopolistic power in the marketplace, and hence can increase prices more or less at will.

In California, the pharmaceutical industry, led by Merck and Pfizer, is spending over $80 million to defeat a ballot question that would limit state health plans to paying the discounted drug prices negotiated by the US Department of Veterans Affairs. Back in 2005, the pharmaceutical industry spent $135 million to defeat a ballot question that would have required it to provide discounted drugs for the poor. [6]

Perhaps not surprisingly, prescription drug costs represent the fastest growing portion of health care costs. Overall spending on prescription drugs has been growing at 10% per year, double the rate of increase of total health care spending, and roughly 5 times the rate of general inflation in the economy. Prescription drugs now account for 17% of all health care spending. [7]

[1]       Rosenthal, E., 9/2/16, “The lesson of EpiPens: Why drug prices spike, again and again,” The New York Times

[2]       Kesselheim, A.S., Avorn, J., & Sarparwari, A., 8/23/16, “The high cost of prescription drugs in the United States: Origins and prospects for reform,” The Journal of the American Medical Association

[3]       Pollack, A., 9/21/16, “Huge hikes in prices of drugs raise protests and questions,” The Boston Globe from The New York Times

[4]       Weisman, R., 8/24/16, “Exclusivity rule seen driving up drug costs,” The Boston Globe

[5]       McCluskey, P. D., 11/7/15, “The not-so-cheap alternative,” The Boston Globe

[6]       Robbins, R., 9/7/16, “A revolt against high drug prices,” The Boston Globe

[7]       Weisman, R., 8/24/16, see above


The low-wage business model of Walmart and McDonald’s, for example, is a choice – both by those corporations and by our policy makers. In the restaurant industry, restaurants in Seattle and San Francisco are paying their servers $13 per hour and doing fine. Costco successfully competes with Walmart, and In-N-Out Burger with McDonald’s, even though both eschew the low-wage business model of their competitors. [1]

Economists have a label for the behavior of corporations that rely on a low-wage business model where employees need public assistance to survive: it’s called “free riding.” It’s a free ride for the employer, as public assistance programs are subsidizing their payrolls. It’s anything but a free ride for taxpayers and the workers.

In the fast food industry, over half of employees are enrolled in at least one public assistance program. The estimated cost to taxpayers is $76 billion per year. Ironically, the taxes paid by high-wage businesses and their employees, including those competing with the likes of McDonald’s and Walmart, help to pay for the public benefits that subsidize the low wages of these parasitic corporations. Until recently, McDonald’s actually assisted its employees in signing up for public benefits – to the tune of $1.2 billion per year. Walmart employees are estimated to receive $6 billion per year in public assistance. By the way, in 2015 McDonald’s profit was $4.53 billion and Walmart’s was $130.2 billion.

Economic theory states that workers get paid what they are worth. Clearly, this is an oversimplification given the variations in pay that exist among employers within an industry, such as the fast food or restaurant industries. It is more accurate to say that workers get paid what they negotiate, and that some employers are friendlier negotiators than others. At the top end of the pay spectrum, some CEOs negotiate pay far above what they’re worth, while many ordinary workers get paid far less than they are worth because they don’t have the power to negotiate better pay.

The U.S. labor market has a dramatic imbalance of power. Unless a worker is a member of a union, he or she has little or no power to negotiate with an employer. The rate of union membership has fallen from roughly 1 in 3 private sector workers in 1979 to only about 1 in 10 today. Unions negotiate higher wages and benefits for union members and also, indirectly, for nonunion workers. This occurs for several reasons: union contracts set wage standards across whole industries, and strong unions prompt non-union employers to keep wages high in order to reduce turnover and discourage unionizing. The decline in union membership has resulted in reduced wages for both union and nonunion workers; it is estimated to be costing non-union workers $133 billion a year in lost wages. [2]

Individual workers lack bargaining power because there are relatively few employers and job openings but lots of workers looking for a job. Furthermore, a worker has an immediate need for income to pay for food and shelter, while most employers can leave a job unfilled for a while without suffering any great hardship. They can take the time to search for someone willing to take the job at whatever pay they offer.

Since 1980, employers have aggressively exploited this imbalance of power, while our federal government has stood aside and, in many ways, supported them in doing so. As a result, $1 trillion per year that used to go to workers now goes to executives and profits. Workers’ share of our economic output (gross domestic product, or GDP) has dropped from 50% to 43%.

There is truth to the argument that in very competitive, price-sensitive industries producers have to squeeze workers’ wages to remain in business. However, this is where the role of government and public policy is critical. If every producer in the industry is required to pay a minimum wage, then a floor is set and all producers are on a level playing field, but with workers getting better pay. Without a good minimum wage, the competition drives wages down to the point where workers are suffering and public subsidies are required.

Public policies and laws, as well as collective action (such as unions negotiating on workers’ behalf), regulate the marketplace and affect the balance of power among competing economic interests. A market economy cannot operate effectively without the rules put in place by policies and laws. They are not antithetical to capitalism; rather, they are essential for markets to function.

Rules are necessary to prevent cheating, such as regulation of weights and measures of goods sold, and to protect the health and safety of consumers and workers. Laws and court systems enforce contracts between parties for the exchange of goods and services for money. Rules are needed to prevent companies from gaining an unfair advantage by being a free rider or externalizing costs (i.e., shifting the costs to others such as by polluting public air and water or by paying such low wages that employees need taxpayer-funded support).

Our low-wage, parasite economy is a collective choice, made by corporations but allowed and abetted – and subsidized – by public policies enacted by elected officials. We, as voters, can change this by electing representatives who support:

  • Increasing the minimum wage,
  • Enforcing and strengthening laws that allow workers to bargain collectively through unions, and
  • Stopping the free riding and externalizing of costs by large, profitable corporations.

Increasing the minimum wage and strengthening unions are two key policies that would strengthen our economy and the middle class by reducing the prevalence of the low-wage business model of parasitic corporations. I encourage you to ask candidates where they stand on these issues and to vote for ones who support fair wages and bargaining power for workers.

[1]       Hanauer, N., Summer 2016, “Confronting the parasite economy,” The American Prospect

[2]       Rosenfeld, J., Denice, P., & Laird, J., 8/30/16, “Union decline lowers wages for nonunion workers,” Economic Policy Institute


The term “parasite economy” is being applied to employers whose business model is built on low-wage jobs. These corporations take more out of their employees and society than they put in; hence they are parasites. The low incomes of their workers mean that the workers can only survive with the support of the publicly-funded safety net, including subsidized food, housing, child care, and health insurance, as well as the Earned Income Tax Credit. [1] And to make matters worse, some of these corporations are ones that use loopholes in the tax code to avoid paying their fair share of taxes.

As Henry Ford realized 100 years ago, if you don’t pay your workers enough to buy the products you make, your business model will struggle to be sustainable. In 1914, Ford began paying his employees $5.00 a day, over twice the average wage in the auto industry. He also reduced the work day from 9 hours to 8 hours. Ford believed he would get higher quality work and less turnover as a result. He stated, “The owner, the employees, and the buying public are all one and the same, and unless an industry can so manage itself as to keep wages high and prices low it destroys itself, for otherwise it limits the number of its customers. One’s own employees ought to be one’s own best customers.” [2]

As Henry Ford acknowledged in the early 1900s, the U.S. economy is driven by consumers. About two-thirds of our economic activity today is consumer spending. However, low-wage workers have a very limited ability to purchase goods and services, either to support themselves and their families or to sustain our consumer economy. A strong middle class is essential to the vitality of our consumer economy.

Although some of our politicians deride those who use public assistance as “takers” (as contrasted with “makers”), the real “takers” in our economy and society are the low-wage paying corporations. These low-wage employers are subsidized by the tax dollars that pay for the public assistance programs their low-paid workers (and their families) rely on to survive. [3] This is corporate welfare and these corporations are truly “takers,” as opposed to “makers” who contribute to our economy and society. [4]

Low-wage corporations are parasites, making nice profits and typically paying high compensation to their executives while relying for their success on low pay and public subsidies for their workers. Walmart and McDonald’s are classic examples.

It is estimated that American taxpayers pay roughly $153 billion a year for public assistance programs that support low-wage workers and their families. Seventy-three percent – almost three out of every four – of the people who use public assistance programs live in families where at least one person is working. Forty-eight percent of home care workers rely on public assistance, along with 46% of those providing child care and 25% of part-time college faculty. [5]

A large part of the restaurant industry is a classic example of the parasite economy. The industry association, the National Restaurant Association, is a leading advocate for the low wages of the parasite economy. It has lobbied hard and is actively engaged in election campaigns in its efforts to keep industry wages low by opposing increases in the minimum wage and supporting the existence of an even lower, special minimum wage for tipped workers. The federal minimum wage for tipped workers – most restaurant employees – is $2.13 per hour and hasn’t been changed since 1991. The median wage for restaurant servers including tips is just $9.25 per hour. As a result, restaurant servers are three times as likely to be in poverty as the average worker.

The effects of moving to a low-wage business model were seen in the 2009 outsourcing of hotel housekeeping by Hyatt Hotels in the Boston area. Ninety-eight housekeepers were fired and replaced by contracted temp workers at half the pay, with no benefits, and with almost twice the workload. The fired housekeepers, some of whom had worked for Hyatt for 25 years, had had average pay of $17 per hour with good benefits. They were financially stable and appeared secure – able to pay their bills, support their children including with college costs, and help aging parents. Today, seven years later, the effects are still being felt by some of them, who have depleted their savings, defaulted on loans, and have poor credit ratings. Some have experienced high levels of stress and health consequences. Taxpayers had to provide unemployment benefits, as well as food, housing, and health care subsidies. [6]

The low-wage business model is pervasive in the U.S. today. Seventy-three million Americans (nearly a quarter of our population) live in working-poor households that are eligible for the Earned Income Tax Credit (EITC). This public program, the primary replacement for the “welfare as we know it” that President Clinton ended in 1996, provides subsidies to workers who are paid so poorly that they and their families cannot survive without public assistance. The federal government spent $57 billion on EITC benefits in 2014, and many states provided their own additional EITC benefits (roughly another $10 billion). Most of these workers – and you have to be working to qualify for this benefit – work for large, profitable corporations.

Between 2003 and 2013, wages (after adjusting for inflation) actually fell for the 70% of workers at the lower end of the U.S. income spectrum. Further contributing to the need for public assistance, fewer and fewer Americans have health insurance through their employers. As a result, working-poor families (as opposed to the unemployed) receive more than half of all federal and state public assistance. Beyond the EITC, public subsidies that go primarily to the working poor include ones for food and nutrition ($86 billion), child care ($71 billion), housing ($38 billion), and health insurance ($475 billion).

My next post will discuss why the parasite economy is so prevalent in the U.S. today and what we can and should do about it.

[1]       Hanauer, N., Summer 2016, “Confronting the parasite economy,” The American Prospect

[2]       Nilsson, J., 1/3/14, “Why did Henry Ford double his minimum wage?” The Saturday Evening Post

[3]       Hanauer, N., Summer 2016, see above

[4]       Johnson, J., 5/3/16, “McDonald’s, the corporate welfare moocher,” Common Dreams

[5]       Jacobs, K., 4/15/16, “Americans are spending $153 billion a year to subsidize low-wage workers,” The Washington Post

[6]       Boguslaw, J., & Trotter Davis, M., 9/5/16, “Lessons from the Hyatt 100,” The Boston Globe


Our mainstream media rarely present the numerous benefits of increasing the minimum wage. The benefits more than offset any negative effects and include:

  • Increased incomes for workers at and just above the minimum wage,
  • Benefits for children in families where income increases,
  • Health benefits for workers whose income increases,
  • Reduced need for publicly-funded safety net programs,
  • Stimulation of the local economy,
  • Reduced income inequality,
  • Increased incentive to work for low-wage workers, and
  • Reduced turnover, less absenteeism, and improved worker productivity in businesses where workers’ pay increases.

First and foremost, increasing the minimum wage would increase the incomes of many workers, both those earning the minimum wage and those earning just above the current and new minimum wage levels. And these aren’t teenagers working part-time: 91% are over age 20 and 57% work full-time. More than half of minimum wage workers are the primary source of income for their families, and over 20% have a college degree. [1]

Nationally, 42% of all workers earn less than $15 per hour. The commitments in New York and California to increase their minimum wages to $15 are estimated to raise the incomes of over a third of workers in those states. [2] Even at $15 per hour (i.e., $30,000 per year based on 50 weeks at 40 hours per week), a single person in many areas of our country would have a barely adequate after-tax income to live on. A family with one or more children and one parent working full-time at $15 would be struggling to get by, let alone to provide the kinds of experiences that support good child outcomes. At the current federal minimum wage of $7.25, a parent working full-time earns roughly $14,500 a year – an income that leaves a family in poverty.

The evidence is very strong that children’s outcomes improve when their family’s income increases. Children, and especially young children, are disproportionately in low income families. In Massachusetts, 22% of working parents would benefit from a $15 minimum wage, while 31% of children would. Parents experiencing less economic stress are more likely to have the time and energy to be nurturing parents. And they have more money to purchase all the things that support strong child development, from good food to books.

Raising the minimum wage improves workers’ health according to studies in the U.S. and in Great Britain. Workers who benefited from an increase in the minimum wage have been found to have reduced anxiety and depression. Increased income has been found to reduce the number of low birthweight babies and neonatal deaths. Low income has been linked to higher rates of obesity, high blood pressure, heart disease, smoking, diabetes, and arthritis. [3]

Low income may harm health a) because of the increased stress of trying to make ends meet, b) because health care and medicine are unaffordable, and c) because healthy food is less available and affordable. Therefore, an increase in the minimum wage and in workers’ incomes is likely to have health benefits and to help restrain increases in health care costs.

When a person’s or family’s income increases, they are less likely to need publicly-funded safety net programs. Therefore, taxpayers and government save money due to a reduced need for subsidies for food, housing, child care, and health insurance.

Increasing the minimum wage stimulates the economy. The increased spending and consumer demand from workers whose incomes increase has positive effects on other local workers and businesses. Because of the multiplier effect, [4] the stimulus effect on local economies is substantial. A fundamental reality of economics – not just a theory or “law” – is that when workers have more money, they consume more and, therefore, businesses have more customers and sales, so they hire more workers, reducing unemployment.
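The multiplier logic described above can be sketched as a geometric series: each dollar of new income is partly re-spent locally, that re-spending is itself partly re-spent, and so on. This is a minimal illustration only; the marginal propensity to consume used below is a made-up figure for the sketch, not a number from any of the cited sources.

```python
# Illustrative sketch of the multiplier effect: each dollar spent
# locally is partly re-spent, and that re-spending repeats endlessly.
# The mpc value below is an assumption for illustration only.

def total_local_spending(initial_dollars: float, mpc: float) -> float:
    """Sum of the infinite re-spending cycle (a geometric series):
    initial * (1 + mpc + mpc^2 + ...) = initial / (1 - mpc)."""
    return initial_dollars / (1 - mpc)

# If 60 cents of each dollar is re-spent locally, every new dollar
# of income supports $2.50 of total local spending.
print(total_local_spending(1.0, 0.6))  # → 2.5
```

In practice the size of the local multiplier depends on how much of each dollar leaks out of the local economy to savings, taxes, and purchases made elsewhere.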

Every additional dollar per hour means $2,000 per full-time worker per year in additional income to spend. Multiplied across millions of workers, that is billions of additional dollars spent in our economy – a significant contribution to strengthening our economic recovery.
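As a quick check of this arithmetic (a back-of-the-envelope sketch, using the same full-time assumption of 40 hours a week for 50 weeks that appears earlier in the post):

```python
# Quick arithmetic check of the figure above, assuming a full-time
# schedule of 40 hours per week for 50 weeks per year (the same
# assumption used earlier in the post).
HOURS_PER_WEEK = 40
WEEKS_PER_YEAR = 50

def annual_raise(hourly_increase: float) -> float:
    """Extra annual income from a given hourly wage increase."""
    return hourly_increase * HOURS_PER_WEEK * WEEKS_PER_YEAR

print(annual_raise(1.00))  # → 2000.0
# Across 10 million workers (an illustrative number), a $1/hour
# raise would add $20 billion of spendable income to the economy.
print(annual_raise(1.00) * 10_000_000)  # → 20000000000.0
```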

Furthermore, this increase in economic activity will increase governments’ tax revenues. Some of these revenues should be used to ameliorate any negative effects of a minimum wage increase. Unemployment benefits, job training and placement programs, and other social supports should be provided to help anyone who lost a job. Small businesses that experienced significant negative effects should receive assistance, such as low cost loans to help bridge the transition.

Because an increase in the minimum wage would raise the incomes of those at the bottom of our income distribution, it would reduce income inequality. Other policy changes are needed to address this issue, but increasing the minimum wage is one important step.

Employers, as well as workers, will benefit. Workers whose wages increase because of a higher minimum wage (both those at and those just above the new minimum) will have a greater incentive to work because their time is more highly rewarded. They will work more hours and be more motivated. As a result, absenteeism will decline and productivity will improve. Furthermore, increases in pay have been found to reduce turnover – a major benefit to employers, since recruiting and training new workers is a major expense.

The evidence is clear that an increase in the minimum wage will have significant benefits for many workers and their families, for businesses and employers, and for our economy and society as a whole. A national, $15 minimum wage, phased in over a few years and then indexed to increase with inflation, is both economically sound policy and the right thing to do.

[1]       Chaddha, A., Sept. 2016, “A $15 minimum wage in New England: Who would be affected?” Federal Reserve Bank of Boston

[2]       Howell, D.R., Summer 2016, “Reframing the minimum-wage debate,” The American Prospect

[3]       Leigh, J.P., 7/28/16, “Raising the minimum wage could improve public health,” Economic Policy Institute, Working Economics Blog

[4]       The multiplier effect refers to the fact that each dollar spent in the local economy supports additional spending by the individual or business that received it. This cycle of re-spending of every dollar spent is repeated endlessly. Therefore, the impact of each additional dollar spent in the local economy is multiplied.


Whenever a proposal to raise the minimum wage is put forth, especially one for a significant increase such as to $15 per hour (the current federal minimum wage is $7.25), the business community and its allies among elected officials immediately warn that there would be dramatic negative effects on the number of jobs and the growth of the economy.

However, there is no actual evidence that raising the minimum wage to $15 over the course of a few years would reduce the number of jobs or slow economic growth. These assertions by the business sector are pure speculation based on the economic theory of ideal markets (which don’t exist in reality). The warnings are meant to create fear among voters and elected officials, and therefore foster opposition to increasing the minimum wage.

Past increases in the minimum wage have not led to increases in unemployment. In January 1950, the minimum wage was increased 87.5% (from $0.40 to $0.75). Over the next 15 months, the unemployment rate fell from 7.9% to 3.1%. A similar result occurred after a 33.3% increase in the minimum wage in March 1956. And a study by the New York Department of Labor found that employment increased after six of the eight increases in New York’s minimum wage between 1991 and 2015.

When San Jose increased its minimum wage by $2 in 2013, the business community and particularly restaurants and small businesses predicted disaster. However, new business registrations grew and unemployment fell, including in the restaurant and hospitality sector where 4,000 jobs were added over the next year. [1]

Washington State has the highest minimum wage in the country at $9.47, and it applies to tipped workers. (This is four and a half times the federal minimum wage for tipped workers of $2.13.) And yet Seattle has the second highest concentration of restaurants per capita in the country (behind only San Francisco, where the city’s minimum wage is even higher). Washington State also boasts the highest rate of small-business job growth in the country.

In 2014, when Seattle raised its city minimum wage to $15, the restaurant industry and the business sector predictably claimed that disaster would follow. But six months later, Seattle’s restaurant industry was growing faster than ever. And in early 2016, Washington State was first in the country in job and wage growth.

International comparisons demonstrate that a high minimum wage does not reduce the number of low paying jobs or increase the unemployment rate of low-education workers. Among 18 countries with advanced economies, the U.S. has the highest proportion of low-wage jobs (25%) but only an average employment rate for low-education workers (57%). In other words, having lots of low-wage jobs in the U.S. has not led to high employment among workers with low levels of education.

A high minimum wage and collective bargaining for workers are what explain the presence of jobs with good wages in other countries. Furthermore, most of the 18 other countries have stronger social supports for workers and families than the U.S. in areas such as health care, housing, education, and especially child care. The lower minimum wage and weaker social supports in the U.S. reflect the lack of political power of ordinary workers in America. [2]

It has been seven years since the federal minimum wage was raised to $7.25. That’s seven years without a raise for many workers, while housing, food, and health care costs have risen. Not since the 1930s has the American workforce experienced such a low-wage and insecure labor market. Relatively high unemployment, very high under-employment, and the rise of part-time and contingent jobs with their uncertain incomes are all symptoms of that insecurity.

Today’s low wages (which have been falling after adjusting for inflation) and job insecurity are largely the result of decreased union membership and weakened government regulation of the labor market. As Adam Smith observed over 200 years ago, if workers negotiate wages and working conditions individually with their employers, the employers will always have the upper hand.

In competitive markets for goods and services, without government regulation (such as a strong minimum wage law) and collective bargaining for workers, the job market becomes a race to the bottom. Employers will drive down wages, benefits, and working conditions to maximize competitiveness and profits.

This is what has happened in the U.S. since 1968 as government regulation and union membership have declined. Using 1968 as the reference point, the current federal minimum wage of $7.25 would instead be:

  • $9.63 if it had kept up with inflation (in other words, the minimum wage today has roughly 25% less purchasing power than it had in 1968);
  • $11.35 if it had kept up with the average wage in the economy; or
  • $18.85 if it had kept up with the improvement in workers’ productivity [3] (in other words, the value of today’s workers’ increased production over that of 1968’s workers is not being paid to the workers but is going to managers, investors, and shareholders).
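The "roughly 25% less purchasing power" figure follows directly from the two wage numbers above – a quick arithmetic check, not new data:

```python
# Checking the purchasing-power claim from the figures above.
current_minimum = 7.25       # today's federal minimum wage
inflation_adjusted = 9.63    # the 1968 minimum wage in today's dollars [3]

loss = 1 - current_minimum / inflation_adjusted
print(f"{loss:.1%}")  # → 24.7%
```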

So, the truth about increasing the minimum wage is that it doesn’t increase unemployment or slow economic growth. In fact, the opposite may occur. Furthermore, there are many benefits to increasing the minimum wage (which I’ll discuss in my next post) that outweigh any possible negative effects.

[1]       Hanauer, N., Summer 2016, “Confronting the parasite economy,” The American Prospect

[2]       Howell, D.R., Summer 2016, “Reframing the minimum-wage debate,” The American Prospect

[3]       Cooper, D., 7/25/16, “The federal minimum wage has been eroded by decades of inaction,” The Economic Policy Institute