
The Four Biggest Right-Wing Lies About Inequality

Even though French economist Thomas Piketty has made an air-tight case that we’re heading toward levels of inequality not seen since the days of the nineteenth-century robber barons, right-wing conservatives haven’t stopped lying about what’s happening and what to do about it.

Herewith, the four biggest right-wing lies about inequality, followed by the truth.

Lie number one: The rich and CEOs are America’s job creators. So we dare not tax them.

The truth is the middle class and poor are the job creators through their purchases of goods and services. If they don’t have enough purchasing power because they’re not paid enough, companies won’t create more jobs and the economy won’t grow.

We’ve endured the most anemic recovery on record because most Americans don’t have enough money to get the economy out of first gear. The economy is barely growing and real wages continue to drop.

We keep having false dawns. An average of 200,000 jobs a month were created in the United States over the last three months, but huge numbers of Americans continue to drop out of the labor force.

Lie number two: People are paid what they’re worth in the market. So we shouldn’t tamper with pay.

The facts contradict this. CEOs who got 30 times the pay of typical workers forty years ago now get 300 times their pay not because they’ve done such a great job but because they control their compensation committees and their stock options have ballooned.

Meanwhile, most American workers earn less today than they did forty years ago, adjusted for inflation, not because they’re working less hard now but because they don’t have strong unions bargaining for them.

More than a third of all workers in the private sector were unionized forty years ago; now, fewer than 7 percent belong to a union. 

Lie number three: Anyone can make it in America with enough guts, gumption, and intelligence. So we don’t need to do anything for poor and lower-middle class kids.

The truth is we do less than nothing for poor and lower-middle class kids. Their schools don’t have enough teachers or staff, their textbooks are outdated, they lack science labs, and their school buildings are falling apart.

We’re the only rich nation to spend less educating poor kids than we do educating kids from wealthy families. 

All told, 42 percent of children born to poor families will still be in poverty as adults – a higher percentage than in any other advanced nation.

Lie number four: Increasing the minimum wage will result in fewer jobs. So we shouldn’t raise it.

In fact, studies show that increases in the minimum wage put more money in the pockets of people who will spend it – resulting in more jobs, and counteracting any negative employment effects of an increase in the minimum. 

Three of my colleagues here at the University of California at Berkeley – Arindrajit Dube, T. William Lester, and Michael Reich – have compared adjacent counties and communities across the United States, some with higher minimum wages than others but similar in every other way.

They found no loss of jobs in those with the higher minimums.

The truth is, America’s lurch toward widening inequality can be reversed. But doing so will require bold political steps.

At the least, the rich must pay higher taxes in order to pay for better-quality education for kids from poor and middle-class families. Labor unions must be strengthened, especially in lower-wage occupations, in order to give workers the bargaining power they need to get better pay. And the minimum wage must be raised. 

Don’t listen to the right-wing lies about inequality. Know the truth, and act on it. 

The Great U-Turn

Do you recall a time in America when the income of a single school teacher or baker or salesman or mechanic was enough to buy a home, have two cars, and raise a family? 

I remember. My father (who just celebrated his 100th birthday) earned enough for the rest of us to live comfortably. We weren’t rich but never felt poor, and our standard of living rose steadily through the 1950s and 1960s. 

That used to be the norm. For three decades after World War II, America created the largest middle class the world had ever seen. During those years the earnings of the typical American worker doubled, just as the size of the American economy doubled. (Over the last thirty years, by contrast, the size of the economy doubled again but the earnings of the typical American went nowhere.)  

In that earlier period, more than a third of all workers belonged to a trade union – giving average workers the bargaining power necessary to get a large and growing share of the large and growing economic pie. (Now, fewer than 7 percent of private-sector workers are unionized.) 

CEO pay then averaged about 20 times the pay of the typical worker (now it’s over 200 times).

In those years, the richest 1 percent took home 9 to 10 percent of total income (today the top 1 percent gets more than 20 percent). 

Then, the tax rate on the highest-income Americans never fell below 70 percent; under Dwight Eisenhower, a Republican, it was 91 percent. (Today the top tax rate is 39.6 percent.)

In those decades, tax revenues from the wealthy and the growing middle class were used to build the largest infrastructure project in our history, the Interstate Highway system. And to build the world’s largest and best system of free public education, and dramatically expand public higher education. (Since then, our infrastructure has been collapsing from deferred maintenance, our public schools have deteriorated, and higher education has become unaffordable to many.)

We didn’t stop there. We enacted the Civil Rights Act and Voting Rights Act to extend prosperity and participation to African-Americans; Medicare and Medicaid to provide health care to the poor and reduce poverty among America’s seniors; and the Environmental Protection Act to help save our planet. 

And we made sure banking was boring. 

It was a virtuous cycle. As the economy grew, we prospered together. And that broad-based prosperity enabled us to invest in our future, creating more and better jobs and a higher standard of living.  

Then came the great U-turn, and for the last thirty years we’ve been heading in the opposite direction. 

Why?

Some blame globalization and the loss of America's  manufacturing core. Others point to new technologies that replaced routine jobs with automated machinery, software, and robotics. 

But if these were the culprits, they only raise a deeper question: Why didn’t we share the gains from globalization and technological advances more broadly? Why didn’t we invest them in superb schools, higher skills, a world-class infrastructure?

Others blame Ronald Reagan’s worship of the so-called “free market,” supply-side economics, and deregulation. But if these were responsible, why did we cling to these ideas for so long? Why are so many people still clinging to them? 

Some others believe Americans became greedier and more selfish. But if that’s the explanation, why did our national character change so dramatically? 

Perhaps the real problem is we forgot what we once achieved together. 

The collective erasure of the memory of that prior system of broad-based prosperity is due partly to the failure of my generation to retain and pass on the values on which that system was based. It can also be understood as the greatest propaganda victory radical conservatism ever won.

We must restore our recollection. In seeking to repair what is broken, we don’t have to emulate another nation. We have only to emulate what we once had.

That we once achieved broad-based prosperity means we can achieve it again – not exactly the same way, of course, but in a new way fit for the twenty-first century and for future generations of Americans. 

America’s great U-turn can be reversed. It is worth the fight.

The "Paid-What-You're-Worth" Myth

It’s often assumed that people are paid what they’re worth. According to this logic, minimum wage workers aren’t worth more than the $7.25 an hour they now receive. If they were worth more, they’d earn more. Any attempt to force employers to pay them more will only kill jobs. 

According to this same logic, CEOs of big companies are worth their giant compensation packages, now averaging 300 times the pay of the typical American worker. They must be worth it or they wouldn’t be paid this much. Any attempt to limit their pay is fruitless because their pay will only take some other form.

“Paid-what-you’re-worth” is a dangerous myth.  

Fifty years ago, when General Motors was the largest employer in America, the typical GM worker got paid $35 an hour in today’s dollars. Today, America’s largest employer is Walmart, and the typical Walmart worker earns $8.80 an hour.

Does this mean the typical GM employee a half-century ago was worth four times what today’s typical Walmart employee is worth? Not at all. Yes, that GM worker helped produce cars rather than retail sales. But he wasn’t much better educated or even that much more productive. He often hadn’t graduated from high school. And he worked on a slow-moving assembly line. Today’s Walmart worker is surrounded by digital gadgets – mobile inventory controls, instant checkout devices, retail search engines – making him or her quite productive. 

The real difference is the GM worker a half-century ago had a strong union behind him that summoned the collective bargaining power of all autoworkers to get a substantial share of company revenues for its members. And because more than a third of workers across America belonged to a labor union, the bargains those unions struck with employers raised the wages and benefits of non-unionized workers as well. Non-union firms knew they’d be unionized if they didn’t come close to matching the union contracts.

Today’s Walmart workers don’t have a union to negotiate a better deal. They’re on their own. And because fewer than 7 percent of today’s private-sector workers are unionized, non-union employers across America don’t have to match union contracts. This puts unionized firms at a competitive disadvantage. The result has been a race to the bottom. 

By the same token, today’s CEOs don’t rake in 300 times the pay of average workers because they’re “worth” it. They get these humongous pay packages because they appoint the compensation committees on their boards that decide executive pay. Or their boards don’t want to be seen by investors as having hired a “second-string” CEO who’s paid less than the CEOs of their major competitors. Either way, the result has been a race to the top. 

If you still believe people are paid what they’re worth, take a look at Wall Street bonuses. Last year’s average bonus was up 15 percent over the year before, to more than $164,000. It was the largest average Wall Street bonus since the 2008 financial crisis and the third highest on record, according to New York’s state comptroller. Remember, we’re talking bonuses, above and beyond salaries.

All told, the Street paid out a whopping $26.7 billion in bonuses last year. 

Are Wall Street bankers really worth it? Not if you figure in the hidden subsidy flowing to the big Wall Street banks that ever since the bailout of 2008 have been considered too big to fail. 

People who park their savings in these banks accept a lower interest rate on deposits or loans than they require from America’s smaller banks. That’s because smaller banks are riskier places to park money. Unlike the big banks, the smaller ones won’t be bailed out if they get into trouble.

This hidden subsidy gives Wall Street banks a competitive advantage over the smaller banks, which means Wall Street makes more money. And as their profits grow, the big banks keep getting bigger. 

How large is this hidden subsidy? Two researchers, Kenichi Ueda of the International Monetary Fund and Beatrice Weder di Mauro of the University of Mainz, have calculated it’s about eight tenths of a percentage point. 

This may not sound like much but multiply it by the total amount of money parked in the ten biggest Wall Street banks and you get a huge amount – roughly $83 billion a year.  

Recall that the Street paid out $26.7 billion in bonuses last year. You don’t have to be a rocket scientist or even a Wall Street banker to see that the hidden subsidy the Wall Street banks enjoy because they're  too big to fail is about three times what Wall Street paid out in bonuses.
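
If you want to check that arithmetic yourself, here is a minimal back-of-the-envelope sketch in Python. The 0.8-percentage-point rate, the $83 billion subsidy, and the $26.7 billion bonus pool come from the figures above; the implied amount of money parked in the ten biggest banks is simply derived from them, not a separately reported number.

```python
# A rough check of the subsidy arithmetic described above.
subsidy_rate = 0.008       # eight tenths of a percentage point
subsidy_total = 83e9       # roughly $83 billion a year
bonus_pool = 26.7e9        # last year's Wall Street bonus pool

# Total money parked in the ten biggest banks implied by those two figures.
implied_base = subsidy_total / subsidy_rate
print(f"Implied funds in the ten biggest banks: ${implied_base / 1e12:.1f} trillion")

# The subsidy works out to roughly three times the bonus pool.
print(f"Subsidy relative to bonuses: {subsidy_total / bonus_pool:.1f}x")
```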

Without the subsidy, no bonus pool. 

By the way, the lion’s share of that subsidy ($64 billion a year) goes to the top five banks – JPMorgan, Bank of America, Citigroup, Wells Fargo, and Goldman Sachs. This amount just about equals these banks’ typical annual profits. In other words, take away the subsidy and not only does the bonus pool disappear, but so do all the profits.

The reason Wall Street bankers got fat paychecks plus a total of $26.7 billion in bonuses last year wasn’t because they worked so much harder or were so much more clever or insightful than most other Americans. They cleaned up because they happen to work in institutions – big Wall Street banks – that hold a privileged place in the American political economy. 

And why, exactly, do these institutions continue to have such privileges? Why hasn’t Congress used the antitrust laws to cut them down to size so they’re not too big to fail, or at least taxed away their hidden subsidy (which, after all, results from their taxpayer-financed bailout)? 

Perhaps it’s because Wall Street also accounts for a large proportion of campaign donations to major candidates for Congress and the presidency of both parties. 

America’s low-wage workers don’t have privileged positions. They work very hard – many holding down two or more jobs. But they can’t afford to make major campaign contributions and they have no political clout. 

According to the Institute for Policy Studies, the $26.7 billion of bonuses Wall Street banks paid out last year would be enough to more than double the pay of every one of America’s 1,085,000 full-time minimum wage workers. 

The remainder of the $83 billion of hidden subsidy going to those same banks would almost be enough to double what the government now provides low-wage workers in the form of wage subsidies under the Earned Income Tax Credit.
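
Those comparisons are easy to sanity-check. Here is a minimal sketch assuming a 2,080-hour work year at the $7.25 federal minimum wage – the full-time-hours assumption is mine, not a figure from the Institute for Policy Studies.

```python
# Rough check of the bonus-versus-minimum-wage comparison above.
minimum_wage = 7.25        # federal minimum, dollars per hour
full_time_hours = 2080     # 40 hours x 52 weeks (assumed)
workers = 1_085_000        # full-time minimum wage workers
bonus_pool = 26.7e9        # Wall Street bonuses paid out last year
subsidy_total = 83e9       # the hidden too-big-to-fail subsidy

annual_pay = minimum_wage * full_time_hours   # about $15,080 per worker
cost_to_double = annual_pay * workers         # extra money needed to double every paycheck

print(f"Cost to double their pay: ${cost_to_double / 1e9:.1f} billion")
print(f"Bonus pool left over:     ${(bonus_pool - cost_to_double) / 1e9:.1f} billion")
print(f"Subsidy beyond bonuses:   ${(subsidy_total - bonus_pool) / 1e9:.1f} billion")
```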

But I don’t expect Congress to make these sorts of adjustments any time soon. 

The “paid-what-you’re-worth” argument is fundamentally misleading because it ignores power, overlooks institutions, and disregards politics. As such, it lures the unsuspecting into thinking nothing whatever should be done to change what people are paid, because nothing can be done.

Don’t buy it. 

David Brooks’ Utter Ignorance About Inequality

Occasionally David Brooks, who personifies the oxymoron “conservative thinker” better than anyone I know, displays such profound ignorance that a rejoinder is necessary lest his illogic permanently pollute public debate. Such is the case with his New York Times column last Friday, arguing that we should be focusing on the “interrelated social problems of the poor” rather than on inequality, and that the two are fundamentally distinct.

Baloney.

First, when almost all the gains from growth go to the top, as they have for the last thirty years, the middle class doesn’t have the purchasing power necessary for buoyant growth. Once the middle class has exhausted all its coping mechanisms – wives and mothers surging into paid work (as they did in the 1970s and 1980s), longer working hours (which characterized the 1990s), and deep indebtedness (2002 to 2008) – the inevitable result is fewer jobs and slow growth, as we continue to experience. Fewer jobs and slow growth hit the poor especially hard because they’re the first to be fired, last to be hired, and most likely to bear the brunt of declining wages and benefits.

Second, when the middle class is stressed, it has a harder time being generous to those in need. The “interrelated social problems” of the poor presumably will require some money, but the fiscal cupboard is bare. And because the middle class is so financially insecure, it doesn’t want to, nor does it feel it can afford to, pay more in taxes.

Third, America’s shrinking middle class also hobbles upward mobility. Not only is there less money for good schools, job training, and social services, but the poor face a more difficult challenge moving upward because the income ladder is far longer than it used to be, and its middle rungs have disappeared.

Brooks also argues that we should not be talking about unequal political power, because such utterances cause divisiveness and make it harder to reach political consensus over what to do for the poor.

Hogwash. The concentration of power at the top – which flows largely from the concentration of income and wealth there – has prevented Washington from dealing with the problems of the poor and the middle class. To the contrary, as wealth has accumulated at the top, Washington has reduced taxes on the wealthy, expanded tax loopholes that disproportionately benefit the rich, deregulated Wall Street, and provided ever larger subsidies, bailouts, and tax breaks for large corporations. The only things that have trickled down to the middle and poor besides fewer jobs and smaller paychecks are public services that are increasingly inadequate because they’re starved for money.

Unequal political power is the endgame of widening inequality – its most noxious and nefarious consequence, and the most fundamental threat to our democracy. Big money has now all but engulfed Washington and many state capitals – drowning out the voices of average Americans, filling the campaign chests of candidates who will do their bidding, financing attacks on organized labor, and bankrolling a vast empire of right-wing think-tanks and publicists that fill the airwaves with half-truths and distortions.

That David Brooks, among the most thoughtful of all conservative pundits, doesn’t see or acknowledge any of this is a sign of how far the right has moved away from the reality most Americans live in every day.

How the Right Wing is Killing Women

According to a report released last week in The Lancet, the widely respected health research journal, the United States now ranks 60th out of 180 countries on maternal deaths occurring during pregnancy and childbirth.

To put it bluntly, for every 100,000 births in America last year, 18.5 women died. That’s compared to 8.2 women who died during pregnancy and birth in Canada, 6.1 in Britain, and only 2.4 in Iceland.

A woman giving birth in America is more than twice as likely to die as a woman in Saudi Arabia or China.

You might say international comparisons should be taken with a grain of salt because of difficulties of getting accurate measurements across nations. Maybe China hides the true extent of its maternal deaths. But Canada and Britain?

Even if you’re still skeptical, consider that our rate of maternal death is heading in the wrong direction. It’s risen over the past decade and is now nearly the highest in a quarter century.

In 1990, the maternal mortality rate in America was 12.4 women per 100,000 births. In 2003, it was 17.6. Now it’s 18.5.

That’s not a measurement error because we’ve been measuring the rate of maternal death in the United States the same way for decades.

By contrast, the rate has been dropping in most other nations. In fact, we’re one of just eight nations in which it’s been rising.  The others that are heading in the wrong direction with us are not exactly a league we should be proud to be a member of. They include Afghanistan, El Salvador, Belize, and South Sudan.

China was ranked 116th in 1990. Now it’s moved up to 57th. Even if China’s way of measuring maternal mortality isn’t to be trusted, China is going in the right direction. We ranked 22nd in 1990. Now, as I’ve said, we’re down to 60th place.

Something’s clearly wrong.

Some say more American women are dying in pregnancy and childbirth because American girls are becoming pregnant at younger and younger ages, when pregnancy and birth can pose greater dangers.

This theory might be convincing if it had data to support it. But contrary to the stereotype of the pregnant young teenager, the biggest rise in pregnancy-related deaths in America has occurred in women 20-24 years old.

Consider that in 1990, 7.2 women in this age group died for every 100,000 live births. By 2013, the rate was 14 deaths in this same age group – almost double the earlier rate.

Researchers aren’t sure what’s happening but they’re almost unanimous in pointing to a lack of access to health care, coupled with rising levels of poverty.

Some American women are dying during pregnancy and childbirth from health problems they had before they became pregnant but worsened because of the pregnancies – such as diabetes, kidney disease, and heart disease.

The real problem, in other words, was they didn’t get adequate health care before they became pregnant.

Other women are dying because they didn’t have the means to prevent a pregnancy they shouldn’t have had, or they didn’t get the prenatal care they needed during their pregnancies. In other words, a different sort of inadequate health care.

One clue: African-American mothers are more than three times as likely as their white counterparts to die as a result of pregnancy and childbirth.

The data tell the story: A study by the Roosevelt Institute shows that U.S. states with high poverty rates have maternal death rates 77 percent higher than states with lower levels of poverty. Women with no health insurance are four times more likely to die during pregnancy or in childbirth than women who are insured.

What do we do about this? Yes, of course, poor women (and the men who made them pregnant) have to take more personal responsibility for their behavior.

But this tragic trend is also a clear matter of public choice.

Many of these high-poverty states are among the twenty-one that have so far refused to expand Medicaid, even though the federal government will cover 100 percent of the cost for the first three years and at least 90 percent thereafter.

So as the sputtering economy casts more and more women into near poverty, they can’t get the health care they need.

Several of these same states have also cut family planning, restricted abortions, and shuttered women’s health clinics.

Right-wing ideology is trumping the health needs of millions of Americans.

Let’s be perfectly clear: These policies are literally killing women.

American Bile

Not long ago I was walking toward an airport departure gate when a man approached me.

“Are you Robert Reich?” he asked.

“Yes,” I said.

“You’re a Commie dirtbag.” (He actually used a variant of that noun, one that can’t be printed here.)

“I’m sorry?” I thought I had misunderstood him.

“You’re a Commie dirtbag.”

My mind raced through several possibilities. Was I in danger? That seemed doubtful. He was well-dressed and had a briefcase in one hand. He couldn’t have gotten through the checkpoint with a knife or gun. Should I just walk away? Probably. But what if he followed me? Regardless, why should I let him get away with insulting me?

I decided to respond, as civilly as I could: “You’re wrong. Where did you get your information?”

“Fox News. Bill O’Reilly says you’re a Communist.”

A year or so ago Bill O’Reilly did say on his Fox News show that I was a Communist. I couldn’t imagine what I’d done to provoke his ire except to appear on several TV shows arguing for higher taxes on the wealthy, which hardly qualified me as a Communist. Nor am I exactly a revolutionary. I served in Bill Clinton’s cabinet. My first full-time job in Washington was in the Ford administration, working for Robert H. Bork at the Justice Department.

“Don’t believe everything you hear on Fox News,” I said. The man walked away, still irritated.

It’s rare that I’m accosted and insulted by strangers, but I do receive vitriolic e-mails and angry Facebook posts. On the Internet and on TV shows, name-calling substitutes for argument, and ad hominem attack for reason.

Scholars who track these things say the partisan divide is sharper today than it has been in almost a century. The typical Republican agrees with the typical Democrat on almost no major issue. If you haven’t noticed, Congress is in complete gridlock.

At the same time, polls show Americans to be more contemptuous and less trusting of major institutions: government, big business, unions, Wall Street, the media.

I’m 67 and have lived through some angry times: Joseph R. McCarthy’s witch hunts of the 1950s, the struggle for civil rights and the Vietnam protests in the 1960s, Watergate and its aftermath in the 1970s. But I don’t recall the degree of generalized bile that seems to have gripped the nation in recent years.

The puzzle is that many of the big issues that used to divide us, from desegregation to foreign policy, are less incendiary today. True, we disagree about guns, abortion and gay marriage, but for the most part have let the states handle these issues. So what, exactly, explains the national distemper?

For one, we increasingly live in hermetically sealed ideological zones that are almost immune to compromise or nuance. Internet algorithms and the proliferation of media have let us surround ourselves with opinions that confirm our biases. We’re also segregating geographically into red or blue territories: chances are that our neighbors share our views, and magnify them. So when we come across someone outside these zones, whose views have been summarily dismissed or vilified, our minds are closed.

Add in the fact that most Americans no longer remember the era, from the Great Depression through World War II, when we were all in it together — when hardship touched almost every family, and we were palpably dependent on one another. There were sharp disagreements, but we shared challenges that forced us to work together toward common ends. Small wonder that by the end of the war, Americans’ confidence in major institutions of our society was at its highest.

These changes help explain why Americans are so divided, but not why they’re so angry. To understand that, we need to look at the economy.

Put simply, most people are on a downward escalator. Although jobs are slowly returning, pay is not. Most jobs created since the start of the recovery, in 2009, pay less than the jobs that were lost during the Great Recession. This means many people are working harder than ever, but still getting nowhere. They’re increasingly pessimistic about their chances of ever doing better.

As their wages and benefits shrink, though, they see corporate executives and Wall Street bankers doing far better than ever before. And they are keenly aware of bailouts and special subsidies for agribusinesses, pharma, oil and gas, military contractors, finance and every other well-connected industry.

Political scientists have noted a high correlation between inequality and polarization. But economic class isn’t the only dividing line in America. Many working-class voters are heartland Republicans, while many of America’s superrich are coastal Democrats. The real division is between those who believe the game is rigged against them and those who believe they have a decent shot.

Losers of rigged games can become very angry, as history has revealed repeatedly. In America, the populist wings of both parties have become more vocal in recent years — the difference being that the populist right blames government more than it does big corporations while the populist left blames big corporations more than government.

Widening inequality thereby ignites what the historian Richard Hofstadter called the “paranoid style in American politics.” It animated the Know-Nothing and Anti-Masonic movements before the Civil War, the populist agitators of the Progressive Era and the John Birch Society — whose founder accused President Dwight D. Eisenhower of being a “dedicated, conscious agent of the Communist conspiracy” — in the 1950s.

Inequality is far wider now than it was then, and threatens social cohesion and trust. I don’t think Bill O’Reilly really believes I’m a Communist. He’s just channeling the nation’s bile.

[I wrote this for today’s New York Times.]

Raising Taxes on Corporations that Pay Their CEOs Royally and Treat Their Workers Like Serfs

Until the 1980s, corporate CEOs were paid, on average, 30 times what their typical worker was paid. Since then, CEO pay has skyrocketed to 280 times the pay of a typical worker; in big companies, to 354 times.

Meanwhile, over the same thirty-year time span the median American worker has seen no pay increase at all, adjusted for inflation. Even though the pay of male workers continues to outpace that of females, the pay of the typical male worker between the ages of 25 and 44 peaked in 1973 and has been dropping ever since. Since 2000, wages of the median male worker across all age brackets have dropped 10 percent, after inflation.

This growing divergence between CEO pay and that of the typical American worker isn’t just wildly unfair. It’s also bad for the economy. It means most workers these days lack the purchasing power to buy what the economy is capable of producing – contributing to the slowest recovery on record. Meanwhile, CEOs and other top executives use their fortunes to fuel speculative booms followed by busts.

Anyone who believes CEOs deserve this astronomical pay hasn’t been paying attention. The entire stock market has risen to record highs. Most CEOs have done little more than ride the wave.

There’s no easy answer for reversing this trend, but this week I’ll be testifying in favor of a bill introduced in the California legislature that at least creates the right incentives. Other states would do well to take a close look.

The proposed legislation, SB 1372, sets corporate taxes according to the ratio of CEO pay to the pay of the company’s typical worker. Corporations with low pay ratios get a tax break. Those with high ratios get a tax increase.

For example, if the CEO makes 100 times the median worker in the company, the company’s tax rate drops from the current 8.8 percent down to 8 percent. If the CEO makes 25 times the pay of the typical worker, the tax rate goes down to 7 percent.

On the other hand, corporations with big disparities face higher taxes. If the CEO makes 200 times the typical employee, the tax rate goes to 9.5 percent; 400 times, to 13 percent.
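
To make the incentive structure concrete, here is a minimal sketch of the rate schedule implied by the examples above. The cutoffs between the listed ratios are my assumptions for illustration; the bill itself spells out a fuller schedule.

```python
# Illustrative sketch of SB 1372's incentive structure, built only from the
# example points given above; the cutoffs between those points are assumed.

def sketch_tax_rate(pay_ratio: float) -> float:
    """Hypothetical corporate tax rate for a given CEO-to-median-worker pay ratio."""
    if pay_ratio <= 25:
        return 7.0      # well below the current 8.8 percent rate
    if pay_ratio <= 100:
        return 8.0      # still a modest tax break
    if pay_ratio <= 200:
        return 9.5      # big disparities start paying more
    return 13.0         # the examples above stop at a 400-times ratio

for ratio in (25, 100, 200, 400):
    print(f"CEO paid {ratio}x the typical worker -> {sketch_tax_rate(ratio)}% state tax rate")
```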

The California Chamber of Commerce has dubbed this bill a “job killer,” but the reality is the opposite. CEOs don’t create jobs. Their customers create jobs by buying more of what their companies have to sell – giving the companies cause to expand and hire.

So pushing companies to put less money into the hands of their CEOs and more into the hands of average employees creates more buying power among people who will buy, and therefore more jobs.

The other argument against the bill is it’s too complicated. Wrong again. The Dodd-Frank Act already requires companies to publish the ratios of CEO pay to the pay of the company’s median worker (the Securities and Exchange Commission is now weighing a proposal to implement this). So the California bill doesn’t require companies to do anything more than they’ll have to do under federal law. And the tax brackets in the bill are wide enough to make the computation easy.

What about CEOs gaming the system? Can’t they simply eliminate low-paying jobs by subcontracting them to another company – thereby avoiding large pay disparities while keeping their own compensation in the stratosphere?

No. The proposed law controls for that. Corporations that begin subcontracting more of their low-paying jobs will have to pay a higher tax.  

For the last thirty years, almost all the incentives operating on companies have been to lower the pay of their workers while increasing the pay of their CEOs and other top executives. It’s about time some incentives were applied in the other direction.

The law isn’t perfect, but it’s a start. That the largest state in America is seriously considering it tells you something about how top heavy American business has become, and why it’s time to do something serious about it.

Fear is Why Workers in Red States Vote Against Their Economic Self-Interest

Last week’s massive spill of the toxic chemical MCHM into West Virginia’s Elk River illustrates another benefit to the business class of high unemployment, economic insecurity, and a safety net shot through with holes. Not only are employees eager to accept whatever job they can get; they’re also unwilling to demand healthy and safe environments.

The spill was the region’s third major chemical accident in five years, coming after two investigations by the federal Chemical Safety Board in the Kanawha Valley, also known as “Chemical Valley,” and repeated recommendations from federal regulators and environmental advocates that the state embrace tougher rules to better safeguard chemicals. 

No action was ever taken. State and local officials turned a deaf ear. The storage tank that leaked, owned by Freedom Industries, hadn’t been inspected for decades. 

But nobody complained. 

Not even now, with the toxins moving down river toward Cincinnati, can the residents of Charleston and the surrounding area be sure their drinking water is safe – partly because the government’s calculation for safe levels is based on a single study by the manufacturer of the toxic chemical, which was never published, and partly because the West Virginia American Water Company, which supplies the drinking water, is a for-profit corporation that may not want to highlight any lingering danger.  

So why wasn’t more done to prevent this, and why isn’t there more of an outcry even now?

The answer isn’t hard to find. As Maya Nye, president of People Concerned About Chemical Safety, a citizens’ group formed after a 2008 explosion and fire killed workers at West Virginia’s Bayer CropScience plant, explained to the New York Times: “We are so desperate for jobs in West Virginia we don’t want to do anything that pushes industry out.”

Exactly.

I often heard the same refrain when I headed the U.S. Department of Labor. When we sought to impose a large fine on the Bridgestone-Firestone Tire Company for flagrantly disregarding workplace safety rules and causing workers at one of its plants in Oklahoma to be maimed and killed, for example, the community was solidly behind us – that is, until Bridgestone-Firestone threatened to close the plant if we didn’t back down.

The threat was enough to ignite a storm of opposition to the proposed penalty from the very workers and families we were trying to protect. (We didn’t back down and Bridgestone-Firestone didn’t carry out its threat, but the political fallout was intense.)

For years political scientists have wondered why so many working class and poor citizens of so-called “red” states vote against their economic self-interest. The usual explanation is that, for these voters, economic issues are trumped by social and cultural issues like guns, abortion, and race. 

I’m not so sure. The wages of production workers have been dropping for thirty years, adjusted for inflation, and their economic security has disappeared. Companies can and do shut down, sometimes literally overnight. A smaller share of working-age Americans hold jobs today than at any time in more than three decades. 

People are so desperate for jobs they don’t want to rock the boat. They don’t want rules and regulations enforced that might cost them their livelihoods. For them, a job is precious – sometimes even more precious than a safe workplace or safe drinking water. 

This is especially true in poorer regions of the country like West Virginia and through much of the South and rural America – so-called “red” states where the old working class has been voting Republican. Guns, abortion, and race are part of the explanation. But don’t overlook economic anxieties that translate into a willingness to vote for whatever it is that industry wants. 

This may explain why Republican officials who have been casting their votes against unions, against expanding Medicaid, against raising the minimum wage, against extended unemployment insurance, and against jobs bills that would put people to work, continue to be elected and re-elected. They obviously have the support of corporate patrons who want to keep unemployment high and workers insecure because a pliant working class helps their bottom lines. But they also, paradoxically, get the votes of many workers who are clinging so desperately to their jobs that they’re afraid of change and too cowed to make a ruckus.  

The best bulwark against corporate irresponsibility is a strong and growing middle class. But in order to summon the political will to achieve it, we have to overcome the timidity that flows from economic desperation. It’s a diabolical chicken-and-egg conundrum at the core of American politics today.

The Share-the-Scraps Economy

How would you like to live in an economy where robots do everything that can be predictably programmed in advance, and almost all profits go to the robots’ owners?

Meanwhile, human beings do the work that’s unpredictable – odd jobs, on-call projects, fetching and fixing, driving and delivering, tiny tasks needed at any and all hours – and patch together barely enough to live on.

Brace yourself. This is the economy we’re now barreling toward.

These workers are Uber drivers, Instacart shoppers, and Airbnb hosts. They include Taskrabbit jobbers, Upcounsel’s on-demand attorneys, and Healthtap’s online doctors.

They’re Mechanical Turks.

The euphemism is the “share” economy. A more accurate term would be the “share-the-scraps” economy.

New software technologies are allowing almost any job to be divided up into discrete tasks that can be parceled out to workers when they’re needed, with pay determined by demand for that particular job at that particular moment.

Customers and workers are matched online. Workers are rated on quality and reliability.

The big money goes to the corporations that own the software. The scraps go to the on-demand workers.

Consider Amazon’s “Mechanical Turk.” Amazon calls it “a marketplace for work that requires human intelligence.”

In reality, it’s an Internet job board offering minimal pay for mindlessly boring, bite-sized chores. Computers can’t do them because they require some minimal judgment, so human beings do them for peanuts – say, writing a product description, for $3; or choosing the best of several photographs, for 30 cents; or deciphering handwriting, for 50 cents.

Amazon takes a healthy cut of every transaction.

This is the logical culmination of a process that began thirty years ago when corporations began turning over full-time jobs to temporary workers, independent contractors, free-lancers, and consultants.

It was a way to shift risks and uncertainties onto the workers – work that might entail more hours than planned for, or was more stressful than expected.

And a way to circumvent labor laws that set minimal standards for wages, hours, and working conditions – laws that also enabled employees to join together to bargain for better pay and benefits.

The new on-demand work shifts risks entirely onto workers, and eliminates minimal standards completely.

In effect, on-demand work is a reversion to the piece work of the nineteenth century – when workers had no power and no legal rights, took all the risks, and worked all hours for almost nothing.

Uber drivers use their own cars, take out their own insurance, work as many hours as they want or can – and pay Uber a fat percentage. Worker safety? Social Security? Uber says it’s not the employer so it’s not responsible.

Amazon’s Mechanical Turks work for pennies, literally. Minimum wage? Time-and-a-half for overtime? Amazon says it just connects buyers and sellers so it’s not responsible.

Defenders of on-demand work emphasize its flexibility. Workers can put in whatever time they want, work around their schedules, fill in the downtime in their calendars.

“People are monetizing their own downtime,” Arun Sundararajan, a professor at New York University’s business school, told the New York Times.

But this argument confuses “downtime” with the time people normally reserve for the rest of their lives.

There are still only twenty-four hours in a day. When “downtime” is turned into work time, and that work time is unpredictable and low-paid, what happens to personal relationships? Family? One’s own health?

Other proponents of on-demand work point to studies, such as one recently commissioned by Uber, showing Uber’s on-demand workers to be “happy.”

But how many of them would be happier with a good-paying job offering regular hours?

An opportunity to make some extra bucks can seem mighty attractive in an economy whose median wage has been stagnant for thirty years and almost all of whose economic gains have been going to the top.

That doesn’t make the opportunity a great deal. It only shows how bad a deal most working people have otherwise been getting.

Defenders also point out that as on-demand work continues to grow, on-demand workers are joining together in guild-like groups to buy insurance and other benefits.

But, notably, they aren’t using their bargaining power to get a larger share of the income they pull in, or steadier hours. That would be a union – something that Uber, Amazon, and other on-demand companies don’t want.

Some economists laud on-demand work as a means of utilizing people more efficiently.

But the biggest economic challenge we face isn’t using people more efficiently. It’s allocating work and the gains from work more decently.

On this measure, the share-the-scraps economy is hurtling us backwards.

The Real Truth About ObamaCare

Despite the worst roll-out conceivable, the Affordable Care Act seems to be working. With less than two weeks remaining before the March 31 deadline for coverage this year, five million people have already signed up. After decades of a rising percentage of Americans lacking health insurance, the uninsured rate has dropped to its lowest level since 2008.

Meanwhile, the rise in health care costs has slowed drastically. No one knows exactly why, but the new law may well be contributing to this slowdown by reducing Medicare overpayments to medical providers and private insurers, and creating incentives for hospitals and doctors to improve quality of care.

But a lot about the Affordable Care Act needs fixing — especially the widespread misinformation that continues to surround it. For example, a majority of business owners with fewer than 50 workers still think they’re required to offer insurance or pay a penalty. In fact, the law applies only to businesses with 50 or more employees who work more than 30 hours a week. And many companies with fewer than 25 workers still don’t realize that if they offer plans they can qualify for subsidies in the form of tax credits.

Many individuals remain confused and frightened. Forty-one percent of Americans who are still uninsured say they plan to remain that way. They believe it will be cheaper to pay a penalty than buy insurance. Many of these people are unaware of the subsidies available to them. Sign-ups have been particularly disappointing among Hispanics.

Some of this confusion has been deliberately sown by outside groups that, in the wake of the Supreme Court’s “Citizens United” decision, have been free to spend large amounts of money to undermine the law. For example, Gov. Rick Scott,  Republican of Florida, told Fox News that the Affordable Care Act was “the biggest job killer ever,” citing a Florida company with 20 employees that expected to go out of business because it couldn’t afford coverage.

None of this is beyond repair, though. As more Americans sign up and see the benefits, others will take note and do the same.

The biggest problem on the horizon that may be beyond repair — because it reflects a core feature of the law — is the public’s understandable reluctance to be forced to buy insurance from private, for-profit insurers that aren’t under enough competitive pressure to keep premiums low.

But even here, remedies could evolve. States might use their state-run exchanges to funnel so many applicants to a single, low-cost insurer that the insurer becomes, in effect, a single payer. Vermont is already moving in this direction. In this way, the Affordable Care Act could become a back door to a single-payer system — every conservative’s worst nightmare.

The Six Principles of the New Populism (and the Establishment's Nightmare)

More Americans than ever believe the economy is rigged in favor of Wall Street and big business and their enablers in Washington. We’re five years into a so-called recovery that’s been a bonanza for the rich but a bust for the middle class. “The game is rigged and the American people know that. They get it right down to their toes,” says Senator Elizabeth Warren.

Which is fueling a new populism on both the left and the right. While still far apart, neo-populists on both sides are bending toward one another and against the establishment.

Who made the following comments? (Hint: Not Warren, and not Bernie Sanders.)

A. We “cannot be the party of fat cats, rich people, and Wall Street.”

B. “The rich and powerful, those who walk the corridors of power, are getting fat and happy…”

C. “If you come to Washington and serve in Congress, there should be a lifetime ban on lobbying.”

D. “Washington promoted moral hazard by protecting Fannie Mae and Freddie Mac, which privatized profits and socialized losses.”

E. “When you had the chance to stand up for Americans’ privacy, did you?”

F. “The people who wake up at night thinking of which new country they want to bomb, which new country they want to be involved in, they don’t like restraint. They don’t like reluctance to go to war.”

(Answers: A. Rand Paul, B. Ted Cruz, C. Ted Cruz, D. House Republican Jeb Hensarling, E. House Republican Justin Amash, F. Rand Paul)

You might doubt the sincerity behind some of these statements, but they wouldn’t have been uttered if the crowds didn’t respond enthusiastically – and that’s the point. Republican populism is growing, as is the Democratic version, because the public wants it.

And it’s not only the rhetoric that’s converging. Populists on the right and left are also coming together around six principles:

1. Cut the biggest Wall Street banks down to a size where they’re no longer too big to fail. Left populists have been advocating this since the Street’s bailout; now they’re being joined by populists on the right. David Camp, House Ways and Means Committee chair, recently proposed an extra 0.035 percent quarterly tax on the assets of the biggest Wall Street banks (giving them an incentive to trim down). Louisiana Republican Senator David Vitter wants to break up the big banks, as does conservative pundit George Will. “There is nothing conservative about bailing out Wall Street,” says Rand Paul.

2. Resurrect the Glass-Steagall Act, separating investment from commercial banking and thereby preventing companies from gambling with their depositors’ money. Elizabeth Warren has introduced such legislation, and John McCain co-sponsored it. Tea Partiers are strongly supportive, and critical of establishment Republicans for not getting behind  it. “It is disappointing that progressive collectivists are leading the effort for a return to a law that served well for decades,” writes the Tea Party Tribune. “Of course, the establishment political class would never admit that their financial donors and patrons must hinder their unbridled trading strategies.”

3. End corporate welfare – including subsidies to big oil, big agribusiness, big pharma, Wall Street, and the Ex-Im Bank. Populists on the left have long been urging this; right-wing populists are joining in. Republican David Camp’s proposed tax reforms would kill dozens of targeted tax breaks. Says Ted Cruz: “We need to eliminate corporate welfare and crony capitalism.” 

4. Stop the National Security Agency from spying on Americans. Bernie Sanders and other populists on the left have led this charge but right-wing populists are close behind. House Republican Justin Amash’s amendment, which would have defunded NSA programs engaging in bulk data collection, garnered 111 Democrats and 94 Republicans last year, highlighting the new populist divide in both parties. Rand Paul could be channeling Sanders when he warns: “Your rights, especially your right to privacy, is under assault… if you own a cellphone, you’re under surveillance.”

5. Scale back American interventions overseas. Populists on the left have long been uncomfortable with American forays overseas. Rand Paul is leaning in the same direction. Paul also tends toward conspiratorial views about American interventionism. Shortly before he took office he was caught on video claiming that former vice president Dick Cheney pushed the Iraq War because of his ties to Halliburton.

6. Oppose trade agreements crafted by big corporations. Two decades ago Democrats and Republicans enacted the North American Free Trade Agreement. Since then populists in both parties have mounted increasing opposition to such agreements. The Trans-Pacific Partnership, drafted in secret by a handful of major corporations, is facing so strong a backlash from both Democrats and tea party Republicans that it’s nearly dead. “The Tea Party movement does not support the Trans-Pacific Partnership,” says Judson Phillips, president of Tea Party Nation. “Special interest and big corporations are being given a seat at the table” while average Americans are excluded.

Left- and right-wing populists remain deeply divided over the role of government. Even so, the major fault line in American politics seems to be shifting, from Democrat versus Republican to populist versus establishment – those who think the game is rigged versus those who do the rigging.

In this month’s Republican primaries, tea partiers continue their battle against establishment Republicans. But the major test will be 2016 when both parties pick their presidential candidates.

Ted Cruz and Rand Paul are already vying to take on Republican establishment favorites Jeb Bush or Chris Christie. Elizabeth Warren says she won’t run in the Democratic primaries, presumably against Hillary Clinton, but rumors abound. Bernie Sanders hints he might.

Wall Street and big business Republicans are already signaling they’d prefer a Democratic establishment candidate over a Republican populist.

Dozens of major GOP donors, Wall Street Republicans, and corporate lobbyists have told Politico that if Jeb Bush decides against running and Chris Christie doesn’t recover politically, they’ll support Hillary Clinton. “The darkest secret in the big money world of the Republican coastal elite is that the most palatable alternative to a nominee such as Senator Ted Cruz of Texas or Senator Rand Paul of Kentucky would be Clinton,” concludes Politico.

Says a top Republican-leaning Wall Street lawyer, “it’s Rand Paul or Ted Cruz versus someone like Elizabeth Warren that would be everybody’s worst nightmare.” 

Everybody on Wall Street and in corporate suites, that is. And the “nightmare” may not occur in 2016. But if current trends continue, some similar “nightmare” is likely within the decade. If the American establishment wants to remain the establishment it will need to respond to the anxiety that’s fueling the new populism rather than fight it.

 

 


Out with 2014, In with 2015, and Up with People

We’ve made progress this year – raising the minimum wage in dozens of states and cities, providing equal marriage rights in a majority of states, limiting carbon emissions. But there’s far more to do. 

The economy looks like it’s improving but most Americans are still stuck in recession, and almost all the economic gains are still going to the top. The only way we can have an economy that works for the many, not the few, is to get big money out of politics – so the rules of the economic game aren’t biased in favor of big corporations, Wall Street, and the rich. And to get more people fighting for equal opportunity and shared prosperity.  

But many Americans have become so cynical about politics they no longer even bother to vote. Turnout in the 2014 midterm elections was the lowest in decades. This is exactly what the moneyed interests want. If we give up on politics we give up on democracy, and they can take over all of it. 

Never underestimate what we can, and will, accomplish together. Organizing. Mobilizing. Energizing. Making a ruckus.

Here’s to you and yours for a great 2015.

The New Tribalism and the Decline of the Nation State

We are witnessing a reversion to tribalism around the world, away from nation states. The same pattern can be seen even in America – especially in American politics.

Before the rise of the nation-state, between the eighteenth and twentieth centuries, the world was mostly tribal. Tribes were united by language, religion, blood, and belief. They feared other tribes and often warred against them. Kings and emperors imposed temporary truces, at most.

But in the past three hundred years the idea of nationhood took root in most of the world. Members of tribes started to become citizens, viewing themselves as a single people with patriotic sentiments and duties toward their homeland. Although nationalism never fully supplanted tribalism in some former colonial territories, the transition from tribe to nation was mostly completed by the mid twentieth century.

Over the last several decades, though, technology has whittled away the underpinnings of the nation state. National economies have become so intertwined that economic security depends less on national armies than on financial transactions around the world. Global corporations play nations off against each other to get the best deals on taxes and regulations.

News and images move so easily across borders that attitudes and aspirations are no longer especially national. Cyber-weapons, no longer the exclusive province of national governments, can originate in a hacker’s garage.

Nations are becoming less relevant in a world where everyone and everything is interconnected. The connections that matter most are again becoming more personal. Religious beliefs and affiliations, the nuances of one’s own language and culture, the daily realities of class, and the extensions of one’s family and its values – all are providing people with ever greater senses of identity.

The nation state, meanwhile, is coming apart. A single Europe – which seemed within reach a few years ago – is now succumbing to the centrifugal forces of its different languages and cultures. The Soviet Union is gone, replaced by nations split along tribal lines. Vladimir Putin can’t easily annex the whole of Ukraine, only the Russian-speaking part. The Balkans have been Balkanized.

Separatist movements have broken out all over – Czechs separating from Slovaks; Kurds wanting to separate from Iraq, Syria, and Turkey; even the Scots seeking separation from the United Kingdom.

The turmoil now consuming much of the Middle East stems less from democratic movements trying to topple dictatorships than from ancient tribal conflicts between the two major denominations of Islam – Sunni and Shia.

And what about America? The world’s “melting pot” is changing color. Between the 2000 and 2010 censuses the share of the U.S. population calling itself white dropped from 69 to 64 percent, and more than half of the nation’s population growth came from Hispanics.

It’s also becoming more divided by economic class. Increasingly, the rich seem to inhabit a different country than the rest.

But America’s new tribalism can be seen most distinctly in its politics. Nowadays the members of one tribe (calling themselves liberals, progressives, and Democrats) hold sharply different views and values than the members of the other (conservatives, Tea Partiers, and Republicans).

Each tribe has contrasting ideas about rights and freedoms (for liberals, reproductive rights and equal marriage rights; for conservatives, the right to own a gun and do what you want with your property).

Each has its own totems (social insurance versus smaller government) and taboos (cutting entitlements or raising taxes). Each, its own demons (the Tea Party and Ted Cruz; the Affordable Care Act and Barack Obama); its own version of truth (one believes in climate change and evolution; the other doesn’t); and its own media that confirm its beliefs.

The tribes even look different. One is becoming blacker, browner, and more feminine. The other, whiter and more male. (Only 2 percent of Mitt Romney’s voters were African-American, for example.)

Each tribe is headed by rival warlords whose fighting has almost brought the national government in Washington to a halt. Increasingly, the two tribes live separately in their own regions – blue or red state, coastal or mid-section, urban or rural – with state or local governments reflecting their contrasting values.

I’m not making a claim of moral equivalence. Personally, I think the Republican right has gone off the deep end, and if polls are to be believed, a majority of Americans agree with me.

But the fact is, the two tribes are pulling America apart, often putting tribal goals over the national interest – which is not that different from what’s happening in the rest of the world.

The New Billionaire Political Bosses

Charles and David Koch should not be blamed for having more wealth than the bottom 40 percent of Americans put together. Nor should they be condemned for their petrochemical empire. As far as I know, they’ve played by the rules and obeyed the laws.

They’re also entitled to their own right-wing political views. It’s a free country.  

But in using their vast wealth to change those rules and laws in order to fit their political views, the Koch brothers are undermining our democracy. That’s a betrayal of the most precious thing Americans share.

The Kochs exemplify a new reality that strikes at the heart of America. The vast wealth that has accumulated at the top of the American economy is not itself the problem. The problem is that political power tends to rise to where the money is. And this combination of great wealth with political power leads to greater and greater accumulations and concentrations of both – tilting the playing field in favor of the Kochs and their ilk, and against the rest of us.

America is not yet an oligarchy, but that’s where the Kochs and a few other billionaires are taking us.

American democracy used to depend on political parties that more or less represented most of us. Political scientists of the 1950s and 1960s marveled at American “pluralism,” by which they meant the capacities of parties and other membership groups to reflect the preferences of the vast majority of citizens.

Then around a quarter century ago, as income and wealth began concentrating at the top, the Republican and Democratic Parties started to morph into mechanisms for extracting money, mostly from wealthy people.

Finally, after the Supreme Court’s “Citizens United” decision in 2010, billionaires began creating their own political mechanisms, separate from the political parties. They started providing big money directly to political candidates of their choice, and creating their own media campaigns to sway public opinion toward their own views.

So far in the 2014 election cycle, “Americans for Prosperity,” the Koch brothers’ political front group, has aired more than 17,000 broadcast TV commercials, compared with only 2,100 aired by Republican Party groups.

“Americans for Prosperity” has also been outspending top Democratic super PACs in nearly all of the Senate races Republicans are targeting this year. In seven of the nine races the difference in total spending is at least two-to-one and Democratic super PACs have had virtually no air presence in five of the nine states.

The Kochs have spawned several imitators. Through the end of February, four of the top five contributors to 2014 super PACs were giving money to political operations they themselves created, according to the Center for Responsive Politics.

For example, billionaire TD Ameritrade founder Joe Ricketts and his son, Todd, co-owner of the Chicago Cubs, have their own $25 million political operation called “Ending Spending.” The group is now investing heavily in TV ads against Republican Representative Walter Jones in a North Carolina primary (they blame Jones for too often voting with Obama).

Their ad attacking Democratic New Hampshire Senator Jeanne Shaheen for supporting Obama’s health-care law has become a template for similar ads funded by the Kochs’ “Americans for Prosperity” in Senate races across the country.

When billionaires supplant political parties, candidates are beholden directly to the billionaires. And if and when those candidates win election, the billionaires will be completely in charge. 

At this very moment, casino magnate Sheldon Adelson (worth an estimated $37.9 billion) is busy interviewing potential Republican candidates whom he might fund, in what’s being called the “Sheldon Primary.”

“Certainly the ‘Sheldon Primary’ is an important primary for any Republican running for president,” says Ari Fleischer, former White House press secretary under President George W. Bush. “It goes without saying that anybody running for the Republican nomination would want to have Sheldon at his side.”

The new billionaire political bosses aren’t limited to Republicans. Democratic-leaning billionaires Tom Steyer, a former hedge-fund manager, and former New York Mayor Michael Bloomberg, have also created their own political groups. But even if the two sides were equal, billionaires squaring off against each other isn’t remotely a democracy.

In his much-talked-about new book, “Capital in the Twenty-First Century,” economist Thomas Piketty explains why the rich have become steadily richer while the share of national income going to wages continues to drop. He shows that when wealth is concentrated in relatively few hands, and the income generated by that wealth grows more rapidly than the overall economy – as has been the case in the United States and many other advanced economies for years – the richest receive almost all the income growth.

Logically, this leads to greater and greater concentrations of income and wealth in the future – dynastic fortunes that are handed down from generation to generation, as they were prior to the twentieth century in much of the world.  
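
To see how quickly this compounds, here is a minimal numerical sketch of the dynamic described above. It is my own toy illustration, not Piketty’s model or data: the 5 percent return on wealth, the 2 percent economic growth rate, the starting values, and the assumption that all returns are reinvested are purely illustrative. Even so, it shows how capital’s share of national income keeps climbing whenever the return on wealth outpaces growth.

# A minimal numerical sketch (illustrative assumptions only, not Piketty's data)
# of the dynamic described above: when the return on existing wealth outpaces
# the growth of the overall economy, the share of national income flowing to
# wealth-holders keeps rising.

def capital_income_share(r=0.05, g=0.02, wealth=300.0, national_income=100.0, years=60):
    """Return capital's share of national income for each simulated year."""
    shares = []
    for _ in range(years):
        capital_income = wealth * r                  # income thrown off by accumulated wealth
        shares.append(capital_income / national_income)
        wealth += capital_income                     # stylized: all returns are reinvested or inherited
        national_income *= (1 + g)                   # the rest of the economy grows at g
    return shares

if __name__ == "__main__":
    s = capital_income_share()
    print(f"capital's share of income, year 1:  {s[0]:.0%}")   # about 15% under these assumptions
    print(f"capital's share of income, year 60: {s[-1]:.0%}")  # about 83% under these assumptions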

The trend was reversed temporarily in the twentieth century by the Great Depression, two terrible wars, the development of the modern welfare state, and strong labor unions. But Piketty is justifiably concerned about the future.

A new gilded age is starting to look a lot like the old one. The only way to stop this is through concerted political action. Yet the only large-scale political action we’re witnessing is that of Charles and David Koch, and their billionaire imitators.

What Walmart Could Learn from Henry Ford

Walmart just reported shrinking sales for a third straight quarter. What’s going on? William S. Simon, the chief executive of Walmart U.S., explained, referring to the company’s customers: “their income is going down while food costs are not. Gas and energy prices, while they’re abating, I think they’re still eating up a big piece of the customer’s budget.”

Walmart’s CEO gets it. Most of Walmart’s customers are still in the Great Recession, grappling with stagnant or declining pay. So, naturally, Walmart’s sales are dropping. 

But what Walmart’s CEO doesn’t get is that a large portion of Walmart’s customers are lower-wage workers who are working at places like … Walmart. And Walmart, not incidentally, refuses to raise its median wage of $8.80 an hour (a figure that includes its army of part-timers).

Walmart isn’t your average mom-and-pop operation. It’s the largest employer in America. As such, it’s the trendsetter for millions of other employers of low-wage workers. As long as Walmart keeps its wages at or near the bottom, other low-wage employers keep wages there, too. All they need do is offer $8.85 an hour to have their pick.

On the other hand, if Walmart were to boost its wages, other employers of low-wage workers would have to follow suit in order to attract the employees they need.

Get it? Walmart is so huge that a wage boost at Walmart would ripple through the entire economy, putting more money in the pockets of low-wage workers. This would help boost the entire economy – including Walmart’s own sales. (This is also an argument for a substantial hike in the minimum wage.)

Walmart could learn a thing or two from Henry Ford, who almost exactly a century ago decided to pay his workers three times the typical factory wage at the time. The Wall Street Journal called Ford a traitor to his class, but he proved to be a cunning businessman.

Ford’s decision helped boost factory wages across the board – enabling so many working people to buy Model Ts that Ford’s revenues soared far ahead of his increased payrolls, and he made a fortune.

So why can’t Walmart learn from Ford? Because Walmart’s business model is static, depending on cheap labor rather than increased sales, and it doesn’t account for Walmart’s impact on the rest of the economy.

You can help teach Walmart how much power its consumers have: Stand with its workers who deserve a raise, and boycott Walmart on the most important sales day of the year, November 29.

Today's Jobs Report and the Supreme Court's "McCutcheon" Debacle

What does the Supreme Court’s “McCutcheon” decision this week have to do with today’s jobs report, showing 192,000 new jobs for March?

Connect the dots. More than five years after Wall Street’s near meltdown the number of full-time workers is still less than it was in December 2007, yet the working-age population of the U.S. has increased by 13 million since then.

This explains why so many people are still getting nowhere. Unemployment among those 18 to 29 is 11.4 percent, nearly double the national rate.

Most companies continue to shed workers, cut wages, and hoard their cash because they don’t have enough customers to warrant expansion. Why? The vast middle class and poor don’t have enough purchasing power, as 95 percent of the economy’s gains go to the top 1 percent.

That’s why we need to (1) cut taxes on average people (say, exempting the first $15,000 of income from Social Security taxes and making up the shortfall by taking the cap off income subject to it), (2) raise the minimum wage, (3) create jobs by repairing roads, bridges, ports, and much of the rest of our crumbling infrastructure, (4) add teachers and teachers’ aides to now over-crowded classrooms, and (5) create “green” jobs and a new WPA for the long-term unemployed.

And pay for much of this by raising taxes on the top, closing tax loopholes for the rich, and ending corporate welfare.

But none of this can be done because some wealthy people and big corporations have a stranglehold on our politics. “McCutcheon” makes that stranglehold even tighter.

Connect the dots and you see how the big-money takeover of our democracy has led to an economy that’s barely functioning for most Americans.

Antitrust in the New Gilded Age

We’re in a new gilded age of wealth and power similar to the first gilded age when the nation’s antitrust laws were enacted. Those laws should prevent or bust up concentrations of economic power that not only harm consumers but also undermine our democracy – such as the pending Comcast acquisition of Time Warner.

In 1890, when Republican Senator John Sherman of Ohio urged his congressional colleagues to act against the centralized industrial powers that threatened America, he did not distinguish between economic and political power because they were one and the same. The field of economics was then called “political economy,” and inordinate power could undermine both. “If we will not endure a king as a political power,” Sherman thundered, “we should not endure a king over the production, transportation, and sale of any of the necessaries of life.”

Shortly thereafter, the Sherman Antitrust Act was passed by the Senate 52 to 1, and moved quickly through the House without dissent. President Harrison signed it into law July 2, 1890.

In many respects America is back to the same giant concentrations of wealth and economic power that endangered democracy a century ago. The floodgates of big money have been opened even wider in the wake of the Supreme Court’s 2010 decision in “Citizens United v. FEC” and its recent “McCutcheon” decision.

Seen in this light, Comcast’s proposed acquisition of Time Warner for $45 billion is especially troublesome – and not just because it may be bad for consumers. Comcast is the nation’s biggest provider of cable television and high-speed Internet service; Time Warner is the second biggest.

Last week, Comcast’s executives descended on Washington to persuade regulators and elected officials that the combination will be good for consumers. They say it will allow Comcast to increase its investments in cable and high-speed Internet, and encourage rivals to do so as well. 

Opponents argue the combination will give consumers fewer choices, resulting in higher cable and Internet bills. And any company relying on Comcast’s pipes to get its content to consumers (think Netflix, Amazon, YouTube, or any distributor competing with Comcast’s own television network, NBCUniversal) also will have to pay more – charges that will also be passed on to consumers.

I think the opponents have the better argument. Internet service providers in America are already too concentrated, which is why Americans pay more for Internet access than the citizens of almost any other advanced nation. 

Some argue that the broadband market already has been carved up into a cartel, so blocking the acquisition would do little to bring down prices. One response would be for the Federal Communications Commission to declare broadband service a public utility and regulate prices. 

But Washington should also examine a larger question beyond whether the deal is good or bad for consumers: Is it good for our democracy?

We haven’t needed to ask this question for more than a century, because not since the first gilded age has America experienced such a concentration of economic wealth and power.

But were Senator John Sherman alive today, he’d note that Comcast is already a huge political player, contributing $1,822,395 so far in the 2013-2014 election cycle, according to data collected by the Center for Responsive Politics – ranking it 18th among the 13,457 corporations and organizations that have donated to campaigns since the cycle began.

Of that total, $1,346,410 has gone to individual candidates, including John Boehner, Mitch McConnell, and Harry Reid; $323,000 to leadership PACs; $278,235 to party organizations; and $261,250 to super PACs.

Last year, Comcast also spent $18,810,000 on lobbying, the seventh highest amount of any corporation or organization reporting lobbying expenditures, as required by law.

Comcast is also one of the nation’s biggest revolving doors. Of its 107 lobbyists, 86 worked in government before lobbying for Comcast. Its in-house lobbyists include several former chiefs of staff  to Senate and House Democrats and Republicans as well as a former commissioner of the Federal Communications Commission.

Nor is Time Warner a slouch when it comes to political donations, lobbyists, and revolving doors. It also ranks near the top.

When any large corporation wields this degree of political influence it drowns out the voices of the rest of us, including small businesses. The danger is greater when such power is wielded by media giants because they can potentially control the marketplace of ideas on which a democracy is based.

When two such media giants merge, the threat is extreme. If film-makers, television producers, directors, and news organizations have to rely on Comcast to get their content to the public, Comcast is able to exercise a stranglehold on what Americans see and hear. 

Remember, this is occurring in America’s new gilded age – similar to the first one, in which a young Teddy Roosevelt castigated the “malefactors of great wealth,” who were “equally careless of the working men, whom they oppress, and of the State, whose existence they imperil.”

It’s that same equal carelessness toward average Americans and toward our democracy that ought to be of primary concern to us now. Big money that engulfs government makes government incapable of protecting the rest of us against the further depredations of big money.

After becoming President in 1901, Roosevelt used the Sherman Act against forty-five giant companies, including the Northern Securities Company, which threatened to dominate transportation in the Northwest. William Howard Taft continued to use it, busting up the Standard Oil Trust in 1911.

In this new gilded age, we should remind ourselves of a central guiding purpose of America’s original antitrust law, and use it no less boldly. 

When Charity Begins at Home (Particularly the Homes of the Wealthy)

It’s charity time, and not just because the holiday season reminds us to be charitable. As the tax year draws to a close, the charitable tax deduction beckons.

America’s wealthy are its largest beneficiaries. According to the Congressional Budget Office, $33 billion of last year’s $39 billion in total charitable deductions went to the richest 20 percent of Americans, of whom the richest 1 percent reaped the lion’s share.

The generosity of the super-rich is sometimes proffered as evidence they’re contributing as much to the nation’s well-being as they did decades ago when they paid a much larger share of their earnings in taxes. Think again.

Undoubtedly, super-rich family foundations, such as the Bill and Melinda Gates Foundation, are doing a lot of good. Wealthy philanthropic giving is on the rise, paralleling the rise in super-rich giving that characterized the late nineteenth century, when magnates (some called them “robber barons”) like Andrew Carnegie and John D. Rockefeller established philanthropic institutions that survive today.

But a large portion of the charitable deductions now claimed by America’s wealthy are for donations to culture palaces – operas, art museums, symphonies, and theaters – where they spend their leisure time hobnobbing with other wealthy benefactors.

Another portion is for contributions to the elite prep schools and universities they once attended or want their children to attend. (Such institutions typically give preference in admissions, a kind of affirmative action, to applicants and “legacies” whose parents have been notably generous.)

Harvard, Yale, Princeton, and the rest of the Ivy League are worthy institutions, to be sure, but they’re not known for educating large numbers of poor young people. (The University of California at Berkeley, where I teach, has more poor students eligible for Pell Grants than the entire Ivy League put together.) And they’re less likely to graduate aspiring social workers and legal defense attorneys than aspiring investment bankers and corporate lawyers.

I’m all in favor of supporting fancy museums and elite schools, but face it: These aren’t really charities as most people understand the term. They’re often investments in the life-styles the wealthy already enjoy and want their children to have as well. Increasingly, being rich in America means not having to come across anyone who’s not.

They’re also investments in prestige – especially if they result in the family name engraved on a new wing of an art museum, symphony hall, or ivied dorm.

It’s their business how they donate their money, of course. But not entirely. As with all tax deductions, the government has to match the charitable deduction with additional tax revenues or spending cuts; otherwise, the budget deficit widens.

In economic terms, a tax deduction is exactly the same as government spending. Which means the government will, in effect, hand out $40 billion this year for “charity” that’s going largely to wealthy people who use much of it to enhance their lifestyles.

To put this in perspective, $40 billion is more than the federal government will spend this year on Temporary Assistance for Needy Families (what’s left of welfare), school lunches for poor kids, and Head Start, put together.

Which raises the question of what the adjective “charitable” should mean. I can see why a taxpayer’s contribution to, say, the Salvation Army should be eligible for a charitable tax deduction. But why, exactly, should a contribution to the Guggenheim Museum or to Harvard Business School?

A while ago, New York’s Lincoln Center held a fund-raising gala supported by the charitable contributions of hedge fund industry leaders, some of whom take home $1 billion a year. I may be missing something but this doesn’t strike me as charity, either. Poor New Yorkers rarely attend concerts at Lincoln Center.

What portion of charitable giving actually goes to the poor? The Washington Post’s Dylan Matthews looked into this, and the best he could come up with was a 2005 analysis by Google and Indiana University’s Center for Philanthropy showing that even under the most generous assumptions only about a third of “charitable” donations were targeted to helping the poor.

At a time in our nation’s history when the number of poor Americans continues to rise, when government doesn’t have the money to do what’s needed, and when America’s very rich are richer than ever, this doesn’t seem right.

If Congress ever gets around to revising the tax code, it might consider limiting the charitable deduction to real charities.

What Tuesday's Election Results Really Mean

Pundits who are already describing the victories of Terry McAuliffe in Virginia and Chris Christie in New Jersey as a “return to the center” of American politics are confusing the “center” with big business and Wall Street.

A few decades ago McAuliffe would have been viewed as a right-wing Democrat and Christie as a right-wing Republican. Both garnered their major support from corporate America, and both will reliably govern as fiscal conservatives who won’t raise taxes on the wealthy.

Both look moderate only by contrast with the Tea Partiers to their extreme right.

The biggest game-changer, though, is Bill de Blasio, the mayor-elect of New York City, who campaigned against the corporatist legacy of Michael Bloomberg – promising to raise taxes on the wealthy and use the revenues for pre-school and after-school programs for the children of New York’s burdened middle class and poor.

Those who dismiss his victory as an aberration confined to New York are overlooking three big new things:

First, the new demographic reality of America gives every swing state at least one large city whose inhabitants resemble those of New York.

Second, de Blasio won notwithstanding New York’s position as the epicenter of big business and Wall Street, whose money couldn’t stop him.

Third, Americans are catching on to the scourge of the nation’s raging inequality, and its baleful consequences for our economy and democracy.