Lecturing the World -

Greece, the European Union, and the Free-Market (or Capitalist) Transformation of the Global Economy

If the question is, 'Do we have capitalism to thank for the benefits of the global market?' -
The answer is: It depends on who you ask.

By Jack Barkstrom

Austerity, privatization, and competitiveness - The magic wand to restore economic health

Yanis Varoufakis, in his book 'Adults in the Room: My Battle with the European and American Deep Establishment,' discusses his struggles, as Greek finance minister in 2015, with the European Union over the restructuring of Greece's debt. When it came to modifying the terms for the restructuring, he ran into a brick wall, even though his suggestions might have benefited both Greece and her creditors.[1]

Mr. Varoufakis found himself negotiating with a host of officials, although the most influential were perhaps Dr. Wolfgang Schäuble, the German finance minister, and Thomas Wieser, president of the Eurogroup Working Group.[2] Schäuble was pushing the Greek government to reduce its size and expenditures. His insistence on a no-nonsense, follow-the-rules austerity, where Greece was concerned, came at a time when Germany, and its businesses, were blatantly ignoring the rules of restraint demanded of others. Daimler-Chrysler had accepted a bailout from the U.S. government to save its U.S. operations, while Angela Merkel was asking the Bundestag to bail out German banks for their investments in US-based toxic derivatives to the tune of €406 billion. When that proved insufficient, the EU was persuaded to look the other way while German banks got the EU to assume part of their Greek debt guarantees, so they would have time to offload whatever remained of Greek debt.[3] Not long after, it would come to light that Volkswagen's management had been involved in a software scheme to evade and violate U.S. environmental standards - (very devious, but incredibly innovative). That actually was somewhat consistent with a free market philosophy which viewed all government regulations as impediments to business operations.

There were business reasons why the IMF and the EU would favor and back financial and business interests, which would be needed in any potential recovery. Conservatives did have a point when it came to limiting governmental spending: large governmental budget deficits are unsustainable over the long term. At the same time, there was a conservative bias against governmental safety-net programs and trade unions. That bias demanded that the first to absorb, or pay for, reductions in government spending would be Greek pension funds, semi-public institutions, and Greek savers.[4] The drive for efficiency, which saw trade union rights as an obstacle in the path of competitiveness, would seek to limit those rights as well.[5] The push for privatization was based on the free market belief that the value, or best use, of any asset should be determined by the market. In some ways, it was a valid argument. What was the point of keeping valuable works of art in a museum if people were unwilling to pay the admission price to see them? Public displays cost money, and someone must come up with the money to pay for building maintenance, security, or even exhibit repairs.

Capitalism 101 - We didn't actually invent markets but we're the only ones who understand how they work!

About the time the Berlin Wall came down, there were a number of writers who saw the collapse of the Soviet Union as vindication of free market beliefs and proof that capitalism had proved itself superior. Futurist George Gilder proclaimed: "It is the entrepreneurs who know the rules of the world and the laws of God." Similarly, Francis Fukuyama would assert: "Liberal democracy combined with open market economics has become the only model a state could follow." Such sentiments would be repeated in the 2002 "National Security Strategy of the United States of America": "The great struggles of the twentieth century between liberty and totalitarianism ended with a decisive victory for the forces of freedom - and a single sustainable model for national success: freedom, democracy, and free enterprise."[6] In 1994, Milton Friedman would write: "Today there is wide agreement that socialism is a failure, and capitalism a success."[7] Others would express similar thoughts: Free-enterprise capitalism has been "the most powerful creative system of social cooperation and human progress ever conceived."[8]

What started off as a broad philosophy or statement of general beliefs gradually was turned into a sure-fire formula for success - a detailed list of procedures or steps which, if followed, would produce successful businesses and, on a broader scale, prosperous national economies. It was no secret that free market advocates hated government programs, since they smacked of socialism and communism, a holdover from economic systems which had been discredited. The ideal economic system was one without any government at all, or if that was not possible, one in which the private sector dominated. Privatization became the watchword of the day. As US News would put it, in the late 1990s: "privatize, deregulate, and do not interfere with the market."[9]

The beauty of the capitalist model was that it functioned perfectly. Inputs went in one end, products came out the other, market demand dictated what was produced, people made profits, and the world prospered. It was the perfect economic model - perfect, at least, if you believed life's only purpose was to construct and watch the perfect economic model operating perfectly. There was a certain fascination with the success stories of wealthy individuals - often how they had struggled against overwhelming odds to become successful. Where the early focus was on the great captains of industry, such as Andrew Carnegie or John D. Rockefeller, the modern version revolves around Silicon Valley in the U.S. Oddly enough, even allowing for examples of failure in a near-perfect model, there seemed to be a greater number of success stories coming out of the United States, or Europe, than anywhere else. (For success stories, the U.S. now finds itself in competition with China, which has begun to produce a number of rising millionaire and billionaire entrepreneurs - as if competition in a trade war weren't enough.)

Capitalist success stories seemed to focus exclusively on the efforts of individuals. Ignored or downplayed has been the role inputs - resources and raw materials - have played. The story was all about the entrepreneur. If America was the land of opportunity for entrepreneurs, it was also the location of abundant resources. Since Colonial times, America has been home to one of the largest supplies of untapped resources, from timber to iron to coal and oil. While it could be claimed that the capitalist or free market system was what distinguished America and Europe from the rest of the world, it was not the only thing which was different.

Privatization - Something from Nothing

In his book 'The Global Minotaur: America, Europe and the Future of the Global Economy,' Yanis Varoufakis recounts the legend of the Cretan Minotaur, the half-human, half-bull creature, which developed a taste for human flesh, fed by an annual tribute paid by Athens.[10] The Minotaur, for the book's purposes, is a symbolic stand-in for the deficits of the United States, which generate such overwhelming economic power that lesser economies are forced to pay, or send, an annual tribute. Whether or not the deficits are the actual source of the economic disparity, there is clearly some truth to the idea that the American economy exerts a substantial amount of influence and dominance over smaller economies.

The seeming success of the U.S. economy has lent itself to, or carried over into, the field of economics itself, where 'capitalist' or 'free market' theory seems to dominate all other potential theoretical rivals. In this view, there is actually only one competing theory - communism, with its associated labels, such as socialism or Marxism - and it was largely discredited by the collapse of the Soviet Union. Capitalist and free market theories still form the core of what might be called free market economic theory, but other associated terms, if not replacing them, have sprouted up and gained popularity, notably 'privatization' and 'competitiveness.' In some ways they are meant as subtle digs at government, a reminder that government, or 'big government,' programs are a carryover from the 'socialist' policies of the Soviet Union.

In political campaigns there is the claim that candidates have worked in the 'private sector,' which is where entrepreneurs come from, where jobs have been created, and from which the term 'privatization' originated. The 'public sector' or 'government sector' is where bureaucrats work, creating bureaucracy and ever-more regulations which stifle the creative energies of entrepreneurs and the free market. Criticism of government has spawned almost a separate branch of free market economics. Where economies need to be competitive to survive in the world of globalization, governments need to 'live within their means,' reduce debt, and curb unnecessary spending and programs. Austerity and privatization, the two terms heard most often as cures for big government, are somewhat different: austerity means reducing government budgets while maintaining government control, while privatization means transferring control to the private sector.

Austerity and privatization, as goals or economic theories, have been an attempt to reconcile the fact of governmental involvement in economic activity with the economic ideal of a freely operating economy. If government could not be eliminated altogether, at least it could be made to operate like a business. Its purpose was to build highways, maintain basic services, and provide protection, and it should confine its activities to those services. It was not to provide social benefits, act as a safety net, or regulate and interfere in the operations of the free market.

The obsession with privatization, governmental austerity, and the goal of creating competitive economies, or the fear, at the other end of the argument, that countries and societies are losing their competitiveness, is almost a separate branch of economics. It assumes that the economic starting point for all economies is the same and that economies should follow the same path to development. The model is a static one, which assumes that the model for economic development which works in one country should be applied, without adjustments, to other economies. If a tropical rain forest is the ideal, then the goal is to change a region like the Sahara desert into a tropical rain forest. If the United States, with its mineral and energy resources, is the ideal model for industrial development, the assumption is that that industrial model should be used to change an agriculturally-based economy into an economy based on manufacturing.

The Russian Experiment of the 1990s - Privatization Shock Therapy

The Bolsheviks, when they took power, had made the elimination of private property almost the centerpiece of their economic programs. As long as they remained in power, that idea survived. However, its special status provoked something of a backlash among Russian economics students as the Soviet Union entered its final days. Among the new generation of economic thinkers was Yegor Gaidar. While a student at Moscow State University, he had read the works of Western economists, such as John Maynard Keynes and Milton Friedman, and of a Hungarian economist named János Kornai, and came to believe that a Western market-style approach, with its reliance on private property, was a more realistic approach to economic problems than socialism.[11]

His ideas were radical for Russia, but in the late 1980s, when he was writing for the journal 'Kommunist,' they were not quite as dangerous as they would have been during Stalinist times. Gaidar's position as an academic was not quite as hazardous as those in the upper levels of Party politics. For Boris Yeltsin, as head of the Moscow Party organization in 1987, it could have been extremely dangerous, since he became involved in a fight with Mikhail Gorbachev.[12] Nevertheless, on June 12, 1991, Yeltsin was elected president of the Russian Republic.[13] In August 1991 Gorbachev himself would be lucky enough to survive a coup.[14]

Following the coup, Russia was in turmoil. The collective farms, normally a reliable supplier of food, halted grain deliveries to the centralized distribution centers. With the military divided, any government threats seemed less than credible, even to the experienced managers on the collectives. They fell back on a more reliable form of exchange - barter - rather than selling produce for rubles. Unfortunately, what the largest cities produced in their factories was of little value in a barter market dominated by agriculture. Factory production was geared to military or industrial markets. It was of little use to the grain producing regions.[15]

Boris Yeltsin, uncertain about which direction to take the country economically, turned to Gaidar for help. During the fall of 1991, he and a team he assembled worked to come up with a solution. In November 1991, Yeltsin formally appointed Gaidar as minister of the economy and finance.[16] Gaidar was not so enamored of free market ideas that he believed there weren't economic risks if they were adopted. But he also felt that the shortages were already so severe that lifting price controls offered the only hope of encouraging producers to put goods out on the market.[17]

On January 2, 1992, government price controls on consumer goods were lifted, with the exception of bread, milk, and alcohol. The move succeeded in making goods available, but, with prices up 352 percent, the goods were at prices few could afford. In January Yeltsin also signed a decree legalizing private commerce. People started selling goods on the streets of Moscow and other cities. Some markets developed to meet the newly discovered demand, although these markets largely left heavy industry out in the cold. Young entrepreneurial 'shuttle traders' traveled to Turkey and China and brought back jackets, coats, and underwear for re-sale.[18] Yeltsin resisted pressure from the deputies within the Congress to slow or reverse the reforms, going even further by summer and discontinuing all subsidies of consumer goods, including bread, milk, and alcohol. Inflation stabilized - prices remained high, but below hyperinflation levels. Yet if prices had stabilized, unemployment remained a problem.[19]

Gaidar was not the only economic theoretician disillusioned with what socialism had produced in the Soviet Union. Anatoly Chubais, the son of a Soviet army colonel, had studied economics in Leningrad, then become an economic adviser to Anatoly Sobchak, the mayor of Leningrad. From Leningrad, in 1991, he moved on to become part of President Yeltsin's government, as chairman of the Committee for the Management of State Property. His economic focus was on shortages - why command-and-control economies created shortages. He believed that what was needed was information, and that the market and market prices would provide realistic and reliable information about what was needed and should be produced.[20]

Gaidar and Chubais represented Russian economic thought - a pessimistic disillusionment with socialist planning. In contrast Western economists, particularly Americans, took a more optimistic, almost giddy, view of Russia and its economy - an opportunity for capitalism. Harvard economists, eager to put their free market theories to the test, were hired by the Russian government. They wanted the transformation to capitalism to proceed as fast as possible.[21]

Western consultancies did very well, in fact probably much better than the Russians themselves. Their assumption was that communist theory had so permeated every aspect of life that Russians either did not understand how life worked in the real world, or if they had once known, had long forgotten it. The real world, as they understood it, was the world of free markets and capitalism. Overlooked, or forgotten, was Russia's trading tradition. Peter the Great had commissioned the Danish navigator Vitus Bering in 1725 to explore the waters east of Siberia. By the 1780s, the Russians, under Alexander Andreyevich Baranov, had established a network of trading outposts in Alaska, and had even had to deal with a revolt by the Aleut tribe of Kodiak in 1800, organized by priests of the Russian Orthodox Church.[22] The Russian settlements in Alaska were enough to establish Russia's claims to the land, recognized when the U.S. Secretary of State, William Seward, agreed to pay the Russian government for its claims in 1867.[23]

For Western governments, and the consultants sent to Russia, privatization itself was the goal. It was also seen as proof that the transformation to a market economy had reached its final stages - both complete and successful. They counted the privatization project of the city of Nizhny Novgorod (formerly Gorky), when it sold off most of the state's assets to ordinary people, among their success stories.[24] It may have looked like a free market triumph to Americans, but it may have seemed to many Russians that the Americans had taken charge of selling off Russia, perhaps even a sense that Russia was being taken over. American companies wanted to sell Russians American products (which most Russians couldn't afford at the time), but America didn't seem interested in offering any serious economic help. Being lectured to, ordered around, and treated as a backward country did not sit well with many Russians.[25] Even Boris Yeltsin, who had acquiesced in the privatization policy, hinted at the festering resentment when he told President Clinton: "Russia isn't Haiti!"[26]

The Russian Revolution may have transformed Russia, but the process had taken years. There was a temptation, during the 1990s, to judge the Russian transition to the free market as an immediate success. Why wait for the long-term economic studies, when privatization was proof enough that the transition had been successfully made? Yet, within a few years, there was a reassessment of the initial economic success. On the political side, the festering resentment came to the fore in 1999, when American and NATO forces began bombing Serbia. Russians staged massive protest rallies outside the American embassy in Moscow. Someone even fired a grenade launcher at the embassy; when that failed, it was followed by automatic weapons fire.[27] By 2000, Russians were less inclined to view good relations with the West as a priority. A public opinion poll conducted in January 2000 found that 55 percent of the Russian population expected Vladimir Putin to return Russia to the status of a great and respected country, while only 8 percent expected him to bring Russia closer to the West.[28]

On August 19, 1992, Boris Yeltsin had been persuaded to announce the sale of state-owned assets to the Russian people - all 148 million of them. They would be issued a "privatization check," with a nominal value of ten thousand rubles. The idea was that they could be exchanged for shares in any one of the newly formed private enterprises.[29] Some people immediately sold them, or used them to invest in mutual funds. No one knew what their actual value was, but within two years, 70 percent of the state economy had been sold.[30]
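The scale of the voucher program lends itself to back-of-the-envelope arithmetic. A minimal sketch in Python of the nominal totals described above; the price multiple used to illustrate how quickly a check's real value eroded is an assumed figure for illustration, not one taken from the text:

```python
# Back-of-the-envelope arithmetic for the 1992 privatization-check program.
checks_issued = 148_000_000   # one check per Russian citizen
face_value_rubles = 10_000    # nominal value of each privatization check

# Total nominal value of the program: 1.48 trillion rubles.
nominal_total = checks_issued * face_value_rubles

# Hypothetical assumption for illustration: consumer prices rise 25-fold
# over the year. This multiple is NOT a figure from the text above.
assumed_price_multiple = 25.0

# What a check's face value could still buy at year's end,
# expressed in start-of-year prices.
real_value_year_end = face_value_rubles / assumed_price_multiple
```

Under that assumption, a ten-thousand-ruble check held in cash terms would retain the purchasing power of only a few hundred start-of-year rubles by December, which suggests one reason so many holders sold their checks immediately.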

Privatization shares, for ordinary Russians, did not bring instant wealth. Nevertheless, they were fascinated by the idea of instant wealth, which the new freedoms promised. They received a reality check in the summer of 1994, when a company called MMM collapsed. MMM turned out to be a pyramid scheme, but Russians had seen the TV spots promising an easy way of earning a living for life. In 1994, the founder of MMM was arrested for tax evasion and those who had bought shares discovered that they had lost everything they had invested.[31] The MMM collapse was followed in October 1994, by the collapse of the ruble. "Black Tuesday," October 11, 1994, was a day Russian currency traders would not forget. In a single session of currency trading, the ruble lost more than a quarter of its value against the dollar.[32]

Oil Prices - The Foundation of Economic Freedom

Privatization represents the non-governmental side of the capitalism debate; austerity is the governmental (or more accurately, the anti-governmental) side of the discussion. For those who wanted to lecture Greece on the need to live within its means and reduce government expenditures, governmental budgets represented the be-all and end-all of economic theory. At the same time, there is rarely a lecture aimed at the U.S., or at any budgetary need to reduce military spending. In fact, military spending is something of an exception - it is hardly mentioned at all when it comes to budgetary discussions. Government spending on social programs or a social safety net, by contrast, is treated as an unnecessary expenditure which governments cannot afford.

There have been claims that the Star Wars program and the increased military spending by the Reagan administration in an arms race, contributed to the collapse of the Soviet Union. The U.S., able to increase its military budget, forced the Soviet Union into a competition it could not afford. If the increased military spending meant that the U.S. was living beyond its means budgetarily, it was money well spent.

Whether or not the arms race had that much to do with Russia's budget problems, oil prices certainly did - and their impact on the Russian budget was something that could be quantified. In September 1985, Saudi Arabia increased its oil output, contributing to a glut on the world market. The price of a barrel dropped from $50.11 in 1985 to $24.71 in 1986, then stayed in the 20s for the next several years. The years 1987 and 1991 were exceptions, when the price was in the 30s. The Soviet Union lost around $20 billion a year in revenues as a result.[33]

In March 2000, Vladimir Putin won a landslide victory as president. A few months later, oil prices would begin to recover. In 1998, the spot price of Urals crude had been at $9.57 per barrel. By 2008 it would reach $94 per barrel.[34]

Russia may have begun its recovery in the 2000s, but the late 1990s had been a struggle. The Asian financial crisis had started in the summer of 1997, hitting first Thailand, then Indonesia, Malaysia, Singapore, and finally South Korea.[35] Russia was already borrowing heavily, and the Asian crisis meant that investors were demanding a higher rate of return.[36] Just before the Asian crisis began, the International Monetary Fund (IMF) had approved an $11.2 billion loan to Russia, part of a $22.6 billion package from international lenders.[37] In July 1998, the IMF agreed to another multi-billion-dollar rescue plan.[38]

Whether the help came too late or was not enough, on August 17th, Moscow devalued its currency and declared a ninety-day moratorium on the payment of foreign debts owed by Russian banks. Two days later, on August 19th, the Russian government defaulted on its domestic debt. By October 1998, Moscow was asking the international community for assistance to pay for food imports.[39]

Austerity - The Other Shoe Drops

Austerity measures were almost certain to follow the IMF's involvement in Russia's economic decisions. The focus of the IMF at the time seemed to be on stability and fiscal restraint. A stable currency and conservative governmental economic policies were meant to encourage investment, which over the long run would lead to economic growth. Yet, in some ways, the IMF had a warped view of economics, which focused on investment possibilities, to the exclusion of other considerations. World War I had supposedly been fought to 'Make the world safe for democracy,' but in the 1990s the message had been transformed - the goal of the post-Soviet world was to 'make the world safe for investment.'

If Yanis Varoufakis would later complain that the austerity program imposed on Greece would impact pensions and social welfare programs, and all but eliminate its cultural heritage, the impact of austerity measures imposed on Russia went much deeper. Under pressure from the IMF, Yeltsin's government slashed real spending by between 30 and 50 percent. Health, education, pensions, and social welfare budgets were drastically cut back. The health and education sectors were hit particularly hard. Some schools, to cover basic costs, began charging 'user fees,' which many parents couldn't afford. By 2000, more than a fifth of sixteen to seventeen-year-olds were not in school. Child vaccination rates fell and the lack of funding for primary care led to a doubling of the mortality rate for infectious and parasitic diseases and an increase in the number of tuberculosis cases.[40]

Whether the privatization or austerity models are true or have universal application is one question - that question (or answer) falls on the solution side: does the model provide solutions to economic problems? The other question falls on the problem side: why do so many seemingly different economies in different locations suffer from similar problems? Why is youth unemployment a problem in such seemingly diverse countries as Spain, Zimbabwe, Chile, and Poland, or any of a number of different countries? If Greece is suffering from the flight of professionals who leave for better job prospects elsewhere, so are Poland, the Philippines, Malaysia, Nigeria, and Ukraine. Even in the United States there are complaints that wages are not keeping pace with living costs. There are fears that even industrialized countries, such as Germany, the U.S., and China, are heading into recession and that their growth rates are slowing down. It is as if, for all the emphasis on competitiveness, even developed economies cannot grow fast enough, cannot provide enough jobs, and cannot find or create enough markets for their products to keep their economies going (let alone growing).

Competitiveness - What does it mean?

Just what does the term competitiveness mean? Does it have the same meaning when applied to individual businesses as it does to national economies? Does it have an objective meaning, or standard, or is it just a generic phrase thrown around from time to time, with no set definition? Generally, when competitiveness is discussed in the business world as a goal, it means companies are always looking for ways to make a better product, to make it more efficiently, in a way which will bring prices down for the consumer.

But did competitiveness always mean the same thing, or did what sounded like a universal ideal really have different meanings in different situations? In the case of Greece, did it mean creating a better product, or just forcing prices down far enough to make goods attractive to foreigners?[41] While the argument was made in general terms, was it really a reference to the tourist trade and hotel prices? Drop the price of a hotel room from, say, $175 a night to $100, and it will make a Greek vacation package a real bargain for foreign tourists. Was the assumption that hotel prices were too high anyway and that cutting prices was just cutting fat? How about forcing prices down to $50 a night, or down to $10? If hotels could sustain such cuts on an occasional, temporary basis, could they survive such cost-cutting on a continuing or long-term basis?
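The sustainability question above is ultimately margin arithmetic. A minimal sketch, assuming a hypothetical operating cost of $80 per occupied room-night (an illustrative figure, not one from the text):

```python
# Hypothetical margin arithmetic for the hotel price-cutting question.
# The $80 cost per occupied room-night is an assumed figure for illustration.
COST_PER_NIGHT = 80.0

def nightly_margin(price: float, cost: float = COST_PER_NIGHT) -> float:
    """Profit (or loss) per occupied room-night at a given price."""
    return price - cost

# Margins at the price points discussed above.
margins = {price: nightly_margin(price) for price in (175, 100, 50, 10)}
```

At $100 a night the hypothetical hotel still clears a thin margin; at $50 or $10 every occupied room loses money - the difference between an occasional promotional discount and a permanent, competitiveness-driven price level.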

When Greece was told to become more competitive by the IMF, why did the IMF seem so obsessed with Greek pharmacies? Was it because, with their monopoly power, Greek pharmacies were preventing Greek consumers from finding lower-cost drugs or benefiting from more convenient shopping locations? Was it an isolated example or was it intended as absolute proof that the Greek economy must be riddled with a series of other inefficient "Greek pharmacy" sectors? Or was it really chosen because it happened to be the one sector tourists were likely to encounter when they visited Greece? Was its real source, not a serious economic study, but complaints by tourists who were inconvenienced at not finding the same products, or hours, in Greek shops, that they found at home?[42] Or, more broadly, was it an indication that the 'economic theories' which were supposed to be based on long-standing 'principles' flowing from rigorous analysis were nothing more than a collection of tourist anecdotes? - If Greece were more competitive, I might be able to find that bar of soap I like. I can always find it at my supermarket at home.

Or was the push for competitiveness in Greece just another line of attack in the never-ending battle to put an end to unions and worker rights? Competitiveness certainly had the ring of authority, as if founded on some long-standing economic principle, without the abrasiveness of a term such as 'union busting.' It did have an historical ring - it sounded like a page from the American playbook of the 1880s or 1920s or 1950s. The 'sanctity of contract' had been a long-standing principle followed by U.S. courts. Was Europe really losing its competitiveness because its products were no longer up to standard, or was it just a sense that it was being overrun by labor unions and worker rights?

The curtailment of worker rights and the elimination of unions seems to be a cornerstone of conservative economic thought. If there is one subject guaranteed to make people see 'red,' it is the subject of unions. It can be interesting to watch how the tone of any discussion is likely to change when the word 'union' is brought up. The 'red' they see is not the red associated with communism, rather it is the red associated with the vegetable - as in 'beet red,' as in rising blood pressure 'beet red.' There is, however, no basis in ordinary economic theory that requires that worker rights or union rights be placed in a special category of their own. Labor costs, especially when associated with labor unions, may carry some special significance for the conservative business community, but they are not really that much different from fuel or raw materials costs - except that it may be easier to pressure workers into accepting reduced wages than it is to force another business owner to reduce the cost at which they will sell their product.

In one sense, competitiveness and austerity seem to have become code words for the curtailment of labor rights and the elimination of unions, in addition to pensions and safety net programs. Competitiveness can be associated with labor costs. If a competing business can offer its products at a lower price because it pays its workers less, there is a problem of competitiveness. Labor unions and labor contracts may limit the ability to respond by immediately cutting costs. But the term competitiveness can be more broadly applied to changes in industrial technology. The steel industry is a classic example, where technological changes led to cheaper production costs and higher quality. In that case competitiveness, or a lack of competitiveness, was a less politically-charged term.

While conservative or capitalist theorists may label countries or economies as competitive or not, based on whether they allow or limit union rights, they are assuming that their economic theories or goals take priority over all other societal goals, and that their goals and a society's goals must be one and the same. It is true that competitiveness, in the capitalist sense, may be required if a country wants to market its products and maintain its standard of living. It is also true that economic forces in the real world are not subject to theoretical rules of fair dealing or justice.

On the other hand, societies and governments decide what rules they will follow and what regulations they will adopt. They may be praised for a business-friendly environment or condemned for the inequities they allow, but they are free to ignore or bend to their critics. Competitiveness may be a priority for them or it may not. They may find business arguments persuasive, and often do. Apart from the debate over union rights, the most contentious issue in the world economy seems to be tax breaks offered to businesses to attract them to a particular city or country. Although there are claims that corporations should be paying their fair share of taxes, the real fight is over jobs. The criticism of corrupt politicians seems to be universal. Perhaps the bigger danger for politicians is not whether they are considered corrupt, rather it is whether they have been able to create jobs.

The Ruhr Crisis of 1923-1924 - Austerity demands, privatization solutions

Here and there in "Adults in the Room," there are hints or observations which suggest, or more than suggest, underlying tensions among the EU member countries. In broad terms, the prosperous countries, able to balance their government books, seem to dominate the weaker ones, whom they lecture about balancing theirs. [43] In more specific terms, Germany, an economic powerhouse, is suspected, or accused, of driving the economic agenda of the EU generally, and of pushing an agenda of balanced budgets, austerity, and privatization which, when applied to specific countries, has forced France, Spain, Portugal, Ireland, Cyprus, Italy, and, not to forget, Greece to adopt budget-cutting measures. High on the agenda of austerity and privatization, of course, is 'labor market reform,' and the weakening of worker rights.[44] The details are murky, but it also seems that German automakers like the market for German cars in Greece, which are sold on credit to Greek buyers. If that is the case, it appears that allowing Greek consumers to go into debt is fine when German automakers benefit, but that debt incurred by the Greek government is not so good, because governments need to observe strict rules of budgetary constraint. (There is always the accounting/auditing question of whether sales, such as sales of automobiles, should be recognized at the time of sale, if the sale is on credit and the consumer may default later.) Revenue definitely looks good if a $100,000 credit sale can be reported at the full $100,000 at the time of sale, even if it later turns out the consumer can only pay back $20,000 of it. There was also Argentina, where the IMF sought to diversify the market for austerity to a non-European audience. The theory was so successful that the real-world economy actually collapsed.[45]
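The revenue-recognition point can be made concrete with a toy calculation. The figures are the hypothetical $100,000/$20,000 example from the text, not numbers from any cited financial statement, and the function names are illustrative only:

```python
# Toy sketch of the accounting question raised above: a credit sale booked
# in full at the time of sale looks very different once the buyer defaults
# on most of it.

def revenue_booked_at_sale(sale_price: float) -> float:
    """Full accrual: the entire sale price is recognized when the sale closes."""
    return sale_price

def cash_eventually_collected(sale_price: float, paid_back: float) -> float:
    """What the seller actually receives if the buyer partially defaults."""
    return min(paid_back, sale_price)

sale = 100_000.0        # credit sale reported in full at time of sale
paid_back = 20_000.0    # what the consumer can actually repay

booked = revenue_booked_at_sale(sale)
realized = cash_eventually_collected(sale, paid_back)
shortfall = booked - realized   # revenue on the books that never materializes

print(booked, realized, shortfall)  # 100000.0 20000.0 80000.0
```

The gap between the two numbers is why accounting standards require an allowance for doubtful accounts rather than letting headline revenue stand alone.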

At one point in his book, Yanis Varoufakis suggests to Thomas Wieser, Angela Merkel's negotiator, that unless the IMF makes some concessions to Greece on its debt payments, Greece would very shortly default on its IMF loans. Wieser's unsympathetic response was that Greece could last longer if it plundered the reserves of non-governmental but publicly owned institutions such as pension funds, universities, utility companies and local authorities.[46] It is not the unsympathetic tone that is noteworthy, rather it is the similarity to proposals made by the French government nearly a hundred years ago, in 1922, when France was in a position to dictate terms, and Germany was on the receiving end.[47]

On January 11, 1923, units of a Franco-Belgian army marched into the Ruhr District of Germany, in what would become known as the Ruhr Crisis. Since the end of World War I, there had been a seemingly endless series of negotiations over what Germany owed from the war and how it would make payments. Prior to that action, on August 7, 1922, the Allied powers had opened a conference in London to discuss not only German war reparations from World War I, but also questions surrounding repayment of the loans the Allies had incurred during the War, owed primarily to the US. There was a sense among the Allies that the total debt needed to be reduced and that the repayment schedules needed to be revised.

In a role reversal, the France of 1922, was proposing something similar to what the German Ministry of Finance would be suggesting in 2015. If Thomas Wieser wanted Greek pension fund and utility company assets made available for private investors, French Prime Minister Raymond Poincaré, in 1922, had demanded similar access to German assets. On the opening day of the August conference, Poincaré proposed that a customs frontier be re-established on the Rhine, Prussian state forests and mines on the Rhine and Ruhr be seized, and 60 per cent of the equity in the great chemical companies located west of the Rhine be sequestered.[48] The proposal alarmed the Allies who, even if sympathetic to the damage France had suffered during the War, viewed it more as a power grab which would dramatically increase the economic and military dominance of France over Europe. Notwithstanding that Belgian troops would later join France in occupying the Ruhr, even the Belgian Foreign Minister earlier had observed that the French proposals hearkened back to the 'golden days of the Napoleonic Empire.'[49] Belgium itself may have lent its troops to the occupation out of fear that it would be surrounded by French-controlled lands and totally excluded by France from future decisions.[50] Belgium had reason to remember French history, in general, and Napoleon Bonaparte, in particular, having contributed troops to the army which defeated Napoleon at Waterloo.

Adolf Hitler would always claim that Germany had been treated unfairly after World War I, and France's hard-line attitude and Ruhr invasion have been seen as one of the major factors in the rise of Hitler and National Socialism. Seen in isolation, France's actions look to be those of a conservative and backward political elite, intent on a strict interpretation of treaty terms, ready to pounce at the slightest deviation from what was agreed upon. To outward appearances, Britain and the United States looked to be neutral observers, watching as helpless bystanders while France acted as a rogue state against Germany's evasions. Yet both Britain and the U.S. were themselves taking a hard-line attitude toward France and its financial obligations to them. The U.S., during the war, had been happy to provide loans to France, as well as Britain, and Britain had made large loans to France as well. [51] The Allies had begun the war as if it was a matter of right and wrong. Perhaps they should have paid more attention to the economic aspects of conflict, and calculated what the costs might be before the war started.

The British did not help with their 'Balfour Declaration' of August 1922. They publicly offered to renounce their financial claims on their allies in return for a similar renunciation by the United States. The U.S., rather than being conciliatory, was offended, partly because it made them look ungenerous. In response, they took a hard-line attitude, refusing to forgive any European loans, and insisting that the loans be repaid. The Republican administration, which had replaced Woodrow Wilson, would also not attempt to restructure the debt to at least create a more realistic repayment plan.[52] There was a vicious circle of accusations which followed. The British insisted that they needed France to repay the loans the British had granted them. France, in turn, claimed that they could not pay the British unless Germany paid France what it was owed under the reparations agreement.[53]

The Ruhr Crisis involved what might be called the perfect storm of conflicts and goals, part political and part economic. On the political side, France feared the military power of Germany, and wanted security guarantees against future German aggression - some means of either weakening Germany or establishing buffers between Germany and France. It could point to its recent defeat in the Franco-Prussian War as evidence of the danger Germany posed on that front. Almost at the other extreme of the political spectrum were the ambitions France itself had of dominating Europe. It had a form of Napoleonic complex which saw France as the dominant power in Europe. If Prussia had succeeded in defeating France in 1871, Napoleon had turned the tables on the Prussians in 1806, when his armies defeated them at Jena and Auerstedt, then moved to occupy Berlin.[54] Even earlier, at the time of the Thirty Years War, in the 1630s, Germany had found itself the battleground of Europe, while France was playing the power-broker role. In 1697, Louis XIV had gained control of Alsace under the Treaty of Ryswick.[55]

On the economic front, France had pinned its hopes on regaining Alsace and Lorraine, lost to Germany at the end of the Franco-Prussian War. Yet, even as Germany lost territory with the Treaty of Versailles, and France gained, German economic recovery seemed more robust than that of France. Seemingly handicapped by its treaty obligations, Germany had nevertheless invested in capital projects and the modernization of its steel industry. The German government had compensated companies which owned French subsidiaries, confiscated after the Armistice. Ironically, the dwindling value of the mark lowered the cost of capital improvements, making German modernization possible.[56]

Germany also benefited from scrap, which became more plentiful after disarmament, and from high-grade Swedish, Canadian, and Spanish ore. It no longer needed French iron ore, and French mine output fell with the market, while unemployment rose. German firms also maintained demand by dumping steel in foreign markets. If they lost money selling at lower prices, they at least maintained a market. France, and Lorraine in particular, had also relied on the German markets of the Rhineland and south Germany for its sales of semi-finished products. Overall, German heavy industrial productivity had eclipsed that of France even before 1914, and the gap was widening.[57]

Germany was doing well only in comparison to France, and primarily in terms of industrial output. Critically, it was short of food and consumer goods, and the labor force was malnourished, with little of the statistical gain in output being translated into a restoration of the standard of living.[58]

French politicians in 1922 were not inclined to sympathize with the hardships of the German labor force. However obsessively they focused on the Franco-Prussian humiliation, they worried almost as much about Germany's economic power as about its military superiority. Even in the 1890s German companies had bought up or established French subsidiaries, especially iron ore mines in French Lorraine and Normandy. Outwardly, France had resisted German encroachment - even a proposed German Chamber of Commerce in Paris met with political resistance - yet French consumers seemed to prefer German goods. The economic power of Germany was only one threat; France also feared the Anglo-American threat. Even the Minister of Commerce, Etienne Clémentel, along with his officials, was not optimistic during the war. France was in decline as a trading nation, they believed. France would find itself squeezed between the Anglo-Americans and a central European economy dominated by German industrial cartels.[59] France, at the end of the war, needed help. Wartime controls had kept inflation in check, yet once they were lifted it became a serious problem.[60]

In the larger economic context, there was a conflict of ideologies. The United States and Britain favored a free market approach.[61] There were even French businesses which favored less government involvement in managing the economy.[62] Clémentel, in September 1918, had wanted, if not government management, at least the prioritization of the reconstruction of northern France and of Belgium.[63] Still, the existing coalfields and developed heavy industry of the Ruhr District were a tempting target for French politicians, unwilling to undertake the necessary economic reforms or to make the investment within France itself, or too impatient to wait for the results of long-term economic development.[64]

It was hard to reconcile the idealistic French pleas for sympathy for France's wartime suffering with French determination to take maximum advantage of its position as a victor. They were up against the principle of national self-determination which the Allies espoused, which held that the Rhineland should remain German.[65] Clemenceau provided a glimpse of the French strategy toward the Rhineland: "In reality we shall occupy the country until it is willing to unite itself with France."[66] French indifference or hostility to the wishes of regional inhabitants was illustrated by its treatment of Lorraine, which saw 120,000 people expelled by the French by 1921 under their policy of purification. Clemenceau had gotten the Allies to agree to the expulsion of German immigrants from Alsace and Lorraine, as well as the liquidation of German industrial holdings there.[67] Under the peace terms proposed in April 1919, the territory which Germany had to give up contained a third of its coal and three-quarters of its iron ore reserves. In addition, it had to surrender its long-distance telegraph cables, 90 percent of its merchant marine, and 11 percent of its cattle - at a time when Germany was experiencing chronic food shortages. Germany was also required to deliver 40 million tons of coal annually to the Allies, which represented some 30 per cent of its postwar production. The actual demands were later scaled down and, in 1921 and 1922, it delivered only 16-17 million tons of coal and coke.[68]

It was sometimes said, or believed, that France's one overriding aim coming out of World War I was security. Clemenceau was prepared to compromise on most demands, as long as French security was guaranteed.[69][70] The French had been disappointed by the lack of American sympathy for their debt situation. The only thing more important, in their eyes, was the military guarantee President Wilson had promised. Yet Woodrow Wilson could not get the U.S. Senate to ratify the Versailles Treaty and, with that failure, hopes of keeping the U.S. involved in keeping the peace faded.[71]

In the late summer of 1922 French politicians began to think seriously of taking action against Germany alone. There was a feeling that force was required to compel Germany to comply with its treaty obligations. But forcing Germany to comply with treaty terms was only one part of their thinking. They also wanted to test how far they could push their allies. Plans were somewhat preliminary until August 16, when Poincaré called an extraordinary Cabinet meeting and it was decided, in principle, that the Ruhr was to be occupied.[72] There was a sense that, even if Britain and the other allies were unhappy about such action, they would do nothing. If successful, the occupation might result in an autonomous Rhineland. There were domestic political reasons for Poincaré's stance: he wanted to show the Right that he had a strong foreign policy.[73] He had expectations that it might cause Germany to collapse. Before taking action, the French decided they needed a pretext. The pretext was Germany's default on its deliveries of telegraph poles and cut timber.[74]

For France, the decision to send troops into the Ruhr proved to be a mistake. Diplomatically, it was a disappointment for France when Britain declined to participate in the occupation. French Communists came out in opposition to the policy and a number were arrested.[75] The Germans in the occupied regions began a campaign of passive resistance, refusing to operate the factories where they worked. While the French army could occupy the land, it could not run the factories itself. German reparations payments were suspended. French businesses were impacted directly by the resistance campaign. The French ironworks in Lorraine, dependent on German coke, could not maintain production when coke was no longer available. In Essen, in March, French soldiers trying to requisition lorries at the Krupp factory were confronted by angry protesters. For several hours there was a standoff, until a French soldier opened fire on the crowd with a machine gun, leaving thirteen workers dead.[76]

The German government encouraged the passive resistance campaign by continuing to pay subsistence wages to the two million workers participating in it. The government could not afford the payments, but continued printing money. By the end of 1923, there were 133 printing offices with 1,783 presses operating to print currency.[77] Inflation became a serious problem, and the value of the mark declined. By August 1923, its value was one-fortieth of what it had been at the beginning of April.[78] By November 1923, the value of the mark had collapsed.[79] At the end of the year, its low point, it stood at 25 billion marks to the dollar.[80]

Gustav Stresemann, who became Germany's chancellor in August 1923, recognized the need to stabilize the German economy. The subsidies to the resisters were costing the government 350 gold marks a week, at a time when the stoppage was also depriving the government of revenues. The subsidy policy was terminated and note printing was stopped, along with the withdrawal of the old currency and the issuance of a new one.[81] France may have been looking for a face-saving way out of its occupation predicament, which may have been provided by the Dawes Plan, negotiated during 1924.[82] Germany would resume making annual reparations payments, aided by the receipt of foreign loans, mostly from the United States, totaling 25 billion marks. In 1928 Germany's national income was 50 percent greater than in 1913.[83]

In Europe, France could partially justify its treatment of Germany on security grounds. In Asia, it could not use security as an excuse for its attitude toward China. China, following the end of the Boxer Rebellion in 1901, was required to pay an indemnity of £67 million.[84] The payments had been suspended during the First World War, but were to resume in December 1922. The Chinese government was reluctant to resume payments and put off restarting them. It was France, which had pleaded hardship when it came to repaying its war debts to Britain and America, which insisted that the Chinese government not only start paying again, but pay in gold francs, unlikely to depreciate.[85]

However badly the countries of Europe treated each other, France joined them in complaining about America's attitude. In 1926 the U.S. economy was reaching record levels of profitability and productivity. Yet it was insisting that Europeans repay their war debts. Congress confirmed the debt agreements already made. The French debt, of more than $4000 million, was not to be reduced, although it was to be repaid in sixty-two annual installments.[86] Both sides seemed eager to pour salt in the wounds. French veterans, on learning of the Senate action, protested in Paris, declaring that France would be 'enslaved' to the United States until 1992. Former Prime Minister Clemenceau even wrote an open letter to President Coolidge, complaining of America's 'hardness.' Senator Borah, Chairman of the Senate Foreign Relations Committee, was clearly irritated by the parade and Clemenceau's letter, responding: "If they want to cancel debts, let them cancel reparations as well, and show us that the benefits of cancellation will go to humanity and the betterment of Europe, and not to bolster up imperialistic schemes which are now crushing the life out of people who were in no sense responsible for the war."[87]

Radium - Miraculous Material, Unlimited Potential, Hidden Dangers

Radium, in 1916, was expensive - about $120,000 for a single gram (about $2.2 million today) - but even in small quantities it produced a fascinating glow.[88] It was not considered dangerous. In fact, it was considered a potential miracle cure for a number of ailments. A tiny amount was enough to make watch dials glow, and the quantity needed was small enough to keep watch sales profitable. The formula for the paint had been invented by an Austrian-born doctor, Sabin von Sochocky, in 1913. In the first year of production, he sold 2,000 luminous watches. Prospects looked so promising that he formed the Radium Luminous Materials Corporation with another doctor, George Willis. [89]

There had been warnings about radium, with a half-life of 1,600 years, almost as soon as it was discovered in 1898. Even von Sochocky, who had studied with Marie and Pierre Curie, was aware that it could be very dangerous. The Curies had suffered skin burns from handling it, and von Sochocky had hacked off the tip of his left index finger when he realized radium had found its way under his skin.[90] He once told one of his workers not to put a radium-paint brush in her mouth, as it would make her sick, and laboratory workers were issued lead-lined aprons and ivory forceps for handling tubes of radium.[91] Yet if he was aware of the potential dangers, at another point he would assure a group of worried workers that the levels of radium they were working with were too small to cause any harm. For all its dangers, von Sochocky and Willis both found radium fascinating - so fascinating that even they ignored the dangers, sometimes handling test tubes of radium with their bare hands.[92] In November 1928, von Sochocky would die of radium poisoning.[93] In September 1922, George Willis had had his right thumb amputated, after tests revealed it had become cancerous.[94]
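The 1,600-year half-life mentioned above translates into a simple remaining-fraction formula: after t years, a sample retains 0.5 raised to the power t/1600 of its radium. A short sketch (the function name is ours, for illustration only) shows why radium ingested by a dial painter was, for all practical purposes, permanent:

```python
# Fraction of an initial radium-226 sample remaining after a given number
# of years, using the standard half-life decay law N(t)/N0 = 0.5 ** (t / T).

def fraction_remaining(years: float, half_life: float = 1600.0) -> float:
    """Remaining fraction of radium after `years`, given its ~1,600-year half-life."""
    return 0.5 ** (years / half_life)

# After one full half-life, exactly half remains.
print(fraction_remaining(1600))             # 0.5
# Over an entire human lifetime, almost none of it decays.
print(round(fraction_remaining(80), 3))     # 0.966
```

On a human timescale the decay is negligible, which is why radium deposited in a worker's bones kept irradiating them for the rest of their life.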

The workers who produced the luminous dials were doing more than just holding glowing tubes of radium in their hands. They were tasting and swallowing it. There were a number of ways watch dials could be painted. In Europe, glass rods, small sticks, or metal needles were used. In America, the paint was applied by hand, with brushes dipped in a solution of water and radium powder. The bristles had a tendency to spread with use and, to keep them together, the workers put them in their mouths, in a technique called lip-pointing. Exposure was not limited to what was swallowed. When workers entered a dark room, there was a glow emanating from clothing or faces - anywhere the paint or dust had settled. Whatever the drawbacks, the job paid well.[95]

In the summer of 1921, both von Sochocky and George Willis were ousted from the company after its treasurer, Arthur Roeder, acquired a majority of the company's stock. Roeder would become president of the newly named United States Radium Corporation.[96]

In Newark, New Jersey, in early 1924, Walter Barry, a dentist, saw an increase in the number of patients with serious jaw and bone problems. He suspected it was due to their work at the radium plant, but thought the culprit was phosphorus.[97] He told them they needed to quit working there. The ultimatum began to impact the United States Radium Corporation. People were quitting, and it was having difficulty hiring replacements. It was when the mother of a sick worker threatened to file a claim for compensation that the company took action.[98] A study by a medical team from Harvard University in May concluded that the health problems experienced by the workers seemed about average, which seemed to vindicate the company - or at least that was how the company interpreted the report.[99] A number of medical experts and health agencies were beginning to take notice of worker health issues related to their employment. On February 5, 1925, one of the former workers filed suit against the United States Radium Corporation for $75,000 (about $1 million today).[100]

Publicly, the lawsuit had no immediate impact. It was noticed, however, in Ottawa, Illinois, where a company named Radium Dial was operating one of the largest dial-painting plants in the country. The company did begin a search for alternative methods of painting watch dials, but it did not prevent its workers from lip-pointing.[101]

While radium had been suspected as the cause of a number of ailments and deaths among dial-painting workers, there had been no proven link, even at the time of the filing of the Radium Corporation suit in February 1925. In June 1925, Dr. Harrison Martland, Chief Medical Examiner of Essex County, New Jersey, conducted an autopsy on Sarah Maillefer, a former dial painter at the Orange plant.[102] An examination revealed that her bones and her organs, including her spleen and liver, were radioactive. He strapped dental films to her bones, and within sixty hours the normally black film showed fog-like patches of white, indicating that radioactivity was exposing the film. Shortly after, he went public with his conclusion that her illness and death were the result of ingestion of luminous paint. [103]

In June 1924, when the United States Radium Corporation claimed the Harvard study of its Orange, New Jersey plant cleared it of responsibility for the illnesses of its workers, it was not being entirely truthful. What the company provided was its own summary of the results. The full report, which it withheld, did not state that the blood of employees was "practically normal" or that the girls in the plant were in "perfect condition." [104] The report stated that "no blood" taken from the employees "was entirely normal." Even the report's cover letter had attributed the illnesses in the plant to radium. [105]

Dr. Cecil K. Drinker, professor of physiology at the Harvard School of Public Health, had been asked by the president of the United States Radium Corporation, Arthur Roeder, in March 1924, to conduct the study at the Orange plant.[106] Drinker was an MD and an authority in the newly emerging field of occupational disease. On April 16, 1924 he arrived at the Orange plant to begin his investigation.[107] He and his wife, Dr. Katherine Drinker, were given a tour of the plant, and spoke to some of the women who worked there. Katherine Drinker insisted that they needed more time to conduct a survey. They came back for two days, on May 7 and 8, conducting a more thorough investigation of the plant and medical examinations of twenty-five employees, which included taking blood samples. Apart from the medical exams, there were some odd and striking observations. Dr. Edwin Leman, the chief chemist, had "serious lesions" on his hands.[108] He dismissed their concerned questions. On June 7, 1925, he died.[109] They also observed the luminous radium dust, which seemed to be everywhere and, once settled on the skin, seemed to remain there even after vigorous washing. They also talked to a local doctor and some of the dial-painters, who were suffering from unexplained pains and fatigue.[110] On June 3, 1924, they delivered their final report to the firm.[111]

On June 18, Harold Viedt, a vice president of USRC, informed John Roach, deputy commissioner of the New Jersey Department of Labor, that the report showed the girls in the plant were normal. Viedt did not send the full report, only a summary, and what he sent was also a misrepresentation of the data.[112]

The Drinkers were not the only investigators looking at the connection between dial-painting employment and disease. Dr. Frederick Hoffman, who worked for the Prudential Insurance Company, was a statistician who specialized in industrial diseases. He had been asked by Katherine Wiley to look into some unexplained illnesses suffered by women who were working or had worked in the dial-painting business. Katherine Wiley was the executive director of the Consumers League, a national organization that fought for better working conditions for women.[113]

In one sense, an insurance company was both part of the conservative corporate world, with a conservative outlook, and yet not part of it; its revenues came from the premiums businesses paid for insurance policies, but its costs were the result of paying out settlements for the mistakes or bad policy decisions businesses made. At any rate, it had the financial resources to do battle with other large businesses. If it was providing insurance for businesses involved in manufacturing, it did have an interest in finding out whether any of their operations represented an unusually high level of risk. For that reason, they were willing to support the work of Dr. Hoffman.

Hoffman began his investigation in December 1924, half a year after the Drinkers had completed their report, with a visit to one of the former workers, Marguerite Carlough.[114] By May 1925 he had completed his study and read it before the American Medical Association. His conclusion was that the women workers were being poisoned by minute quantities of a radioactive substance which had been introduced into their systems.[115] Five months was a very short period for a medical study. Hoffman had taken even less time to draw some preliminary conclusions about the relationship between the ailments of the women and their employment. He had written to Arthur Roeder in December 1924, shortly after visiting Marguerite Carlough and based largely on his observations of her illness, that he doubted the company would escape liability if the disease were compensable, and that it was self-evident it would be made compensable if more cases came to light. Two months later, on February 5, 1925, her lawsuit against the United States Radium Corporation was filed for $75,000 (about $1 million today).[116]

Most of Hoffman's investigation had involved sending questionnaires to the doctors and dentists who had attended the women, along with interviews of the women who had become ill. But he had also visited some of the plants, including the Radium Dial studio in Ottawa, Illinois, along with plants on Long Island.[117] He finally visited the Orange plant several times in April 1925 and, on his final visit, noted that the company had taken measures to remedy safety concerns. Arthur Roeder may have hoped the new measures would influence Hoffman to write a more favorable report, but he was informed by Hoffman that he had furnished the American Medical Association an abstract of his paper "some time ago" and the Handbook was already at the printer. Hoffman also added that he had agreed to provide a copy to the Bureau of Labor Statistics.[118]

Hoffman may have been encouraged in his conclusions by a letter he received from the inventor of the paint, Sabin von Sochocky, in February 1925, in which von Sochocky concluded that "the disease in question is, without doubt, an occupational disease." Hoffman had not even seen the cover letter which Cecil Drinker had included with the report delivered to Arthur Roeder in June 1924, which stated: "We believe that the trouble which has occurred is due to radium."[119]

In some ways, the timeline between the introduction of radium and an understanding of its dangers was no different than that of any other hazard. Even the medical community had been slow to understand the connection between germs or bacteria, which were too small to see, and disease. There would be later parallels following the development of the atomic bomb. Even after the dangers of radioactivity from radium should have hinted at the dangers of any radioactive exposure, the medical community and public were slow to comprehend the relationship between nuclear weapons production and testing and the dangers it posed. Just as the glow produced by radium was almost hypnotic in its fascination, the power of the atomic bomb and the positive reports of its potential served to mute any discussion of its dangers.

Before the medical studies began to focus on radium as the potential culprit, there were suspicions that something at the dial-painting plants was behind the unexplained illnesses. What it was, people did not know. Dr. Walter Barry, a dentist in Newark, New Jersey, who had been treating workers for unexplained tooth loss and mouth infection since the summer of 1922, saw such an increase in patients with similar problems in January 1924 that he gave his patients an ultimatum.[120] Unless they quit their jobs at the Orange studio, he would refuse to treat them. He suspected that phosphorus in the paint was the source of their problems.[121]

If Dr. Barry's conclusions were merely suspicions, there had been earlier, more specific conclusions about radium and its dangers. In December 1923, the U.S. Public Health Service had issued an official report on radium workers, serious enough to make a formal recommendation that all places where radium was used should undertake safety measures to protect those handling it. Although its recommendation was based on a study of only nine technicians, it had found among those nine two cases of skin erosion and one case of anemia.[122] A year earlier, in December 1922, a doctor, after a patient visit, had forwarded to the New Jersey Industrial Hygiene Division his conclusion that the patient suffered from phosphorus poisoning. The complaint was taken seriously enough that an inspector was sent to the Orange studio. The inspector observed many of the workers lip-pointing. He also noticed that one of the workers was limping. Before leaving, he took a paint sample for testing.[123]

John Roach sent the paint sample to Dr. Martin Szamatolski, a chemist. One possibility was phosphorus, although there had been no suggestion it was an ingredient. On January 30, 1923, before even beginning his tests, Szamatolski wrote to Roach: "It is my belief that the serious condition of the jaw has been caused by the influence of radium."[124] He added a note to the letter suggesting that operators be warned, with a printed leaflet, about the dangers of getting the paint on their skin, and especially in their mouths.[125] When the tests were completed in April 1923, his preliminary guess was confirmed. There was no phosphorus. On April 6, 1923, he wrote to Roach. His January opinion had been correct: "Such trouble as may have been caused is due to the radium."[126] It was not the only warning at the time. In February 1923, Dr. George Willis, the co-founder of the Radium Luminous Materials Corporation, published his personal findings about the dangers of radium in the 'Journal of the American Medical Association' (JAMA). Although cases were still rare, he warned, radium's reputation for harmlessness might change as more individuals became exposed. The effects of long-term exposure were unknown, he suggested, and neglecting precautions might result in serious injury to the radium workers themselves. His findings were very much the result of personal experience: his right thumb had been amputated just five months before, in September 1922, when it was found to be cancerous.[127]

Marguerite Carlough, who had filed suit against the radium company in February 1925, died just after Christmas that year, at the age of twenty-four.[128] In 1926 the company agreed to an out-of-court settlement of her case and two others. It did not admit guilt.[129] Shortly after, in July 1926, Arthur Roeder resigned as president of the United States Radium Corporation, although he remained a director on the board.[130] Things seemed to settle down for the company after that. Then, on May 18, 1927, Grace Fryer, a former employee, filed suit for $250,000 ($3.4 million).[131] She was soon joined by four other former workers, in a case closely followed by the press, which created headlines with 'The Case of the Five Women Doomed to Die.'[132]

In response to the new lawsuits, the company adopted a take-no-prisoners attitude. It hired detectives to follow the girls, looking for dirt. It attempted to discredit the reputations of expert witnesses and hired medical experts of its own to dispute them. But open combat was not the industry's only approach. When the attorney for the five women learned that there had been suspected radium-paint cases at plants in Connecticut, he was surprised to find that none had been reported to the Workmen's Compensation Commission. He also learned the reason why: the Waterbury Clock Company, which owned the plants, had been quietly settling cases without reporting them to the state authorities.[133]

On January 12, 1928, the trial began.[134] The women presented their evidence until April 27, 1928, when Raymond Berry, the attorney for the plaintiffs, rested his case. It was expected that the corporation would then begin to present its own. Instead it asked for a delay. The judge announced that it would be granted and that the court would re-convene on September 24.[135] It may have been a smart legal maneuver, but it did not play well with the public - or the press. One of the most powerful newspapers in America, the 'World,' and its writer, Walter Lippmann, had taken up the cause of the women.[136] Lippmann labeled the delay "one of the most damnable travesties of justice that has ever come to our attention."[137] The pressure had an impact. The case was almost immediately reset for an earlier trial date. Shortly after, the company offered to settle, and, by June 4, final settlement terms had been worked out.[138]

The settlement served to quiet public outrage over the USRC's treatment of its workers, but it was not enough to convince the authorities or the broader public that radium was dangerous enough to be banned. The fact that four of the five radium girls were still alive in 1933, nearly five years after their case settled, suggested to some that the lawsuit had been a fraudulent scheme to extort money from the radium company. It would take another attention-grabbing and bizarre headline to convince the authorities to take serious action.[139]

In 1927, Eben Byers, a wealthy industrialist, had been prescribed Radithor, a highly radioactive tonic produced by Bailey Radium Laboratories, for an injury.[140] It seemed to work so well that he drank several thousand bottles. He died of radium poisoning on March 30, 1932. The dangers of radium were summarized by a newspaper headline: "The Radium Water Worked Fine Until His Jaw Came Off." The case caught the attention of the Federal Trade Commission (FTC), which took his testimony - Radithor had killed him - before he died. In December 1931, it issued a cease-and-desist order against Radithor. The U.S. Food and Drug Administration would follow with its own declaration making radium medicines illegal. The American Medical Association removed the internal use of radium from its list of "New and Nonofficial Remedies."[141]

The radium industry was finished - or so it seemed. With most of the products using radium banned, there was not much of a market. In August 1932, the Orange plant was razed to the ground; USRC had tried to sell it but could not find a buyer. USRC had taken some major hits, but it had managed to survive. On December 17, 1935, it would win a favorable ruling. Irene La Porte had been a dial-painter for a relatively short time during the war, from the spring of 1917 until the spring of 1919. She had died of a sarcoma on June 16, 1931.[142] The company argued that the statute of limitations applied, since she had left the company in 1919. The judge ruled that, while the company's conduct in 1917 would have violated the standards of the 1930s, it could only be held to the standards and knowledge of 1917, and the case would have to be dismissed.[143] USRC, and the radium dial-painting business, had survived the worst, and would be around to enjoy a resurgence in World War II, when military demand for luminous dials returned. Radium use by the United States in the Second World War would total 190 grams, where it had used less than 30 grams in World War I.[144]

Ottawa, Illinois

The consumer market for radium products in 1932 may have collapsed, but not the demand for radium or the industrial market for radium products. Ottawa, Illinois, where the Radium Dial company had a plant, seemed oblivious to events, even immune from the lawsuits hitting the USRC. Ottawa had seen the headlines, but Radium Dial had let the USRC, as the flagship or public face of the industry, take the brunt of the bad publicity. Like the Waterbury Clock Company, Radium Dial managed to stay out of the limelight, although its safety record was every bit as bad.

The Radium Dial Company was headquartered in Chicago, but in September 1922 decided to set up a dial-painting plant in Ottawa, eighty-five miles southwest of Chicago. Ottawa was small, with a population of only 10,816.[145] Its main client was Westclox, which had a 60 percent share of the U.S. alarm-clock market.[146] Radium Dial had advertised for girls over 18 for fine brushwork and, when they started, they were shown how to point their brushes with their tongues. Mercedes 'Mercy' Reed, the wife of the assistant superintendent, worked at the plant as an instructress. To show the girls the radium was harmless, she ate the paint solution off a spatula, licking it in front of them.[147] Mercedes Reed seemed to live a charmed life; she is said to have died in 1971, at the age of eighty-six, of colon cancer.[148]

At the time, Radium Dial, like USRC, was unfamiliar with the dangers posed by radium exposure - or ingestion. By 1925, the Ottawa plant was supplying 4,300 dials a day and was the largest such plant in the country. That changed when Marguerite Carlough filed her suit against USRC in February 1925. Radium Dial was less concerned about the legal case itself than about the impact of the bad publicity on its workforce. The company opened a second studio in Streator, sixteen miles south, which it ran for nine months as a backup facility, in case the Ottawa force quit. When the Ottawa painters showed no signs of quitting, the Streator plant was shut down.[149]

As a precautionary measure, the company began having its workers medically tested, although not all the women were tested. The results were provided only to the company, without being shared with the workers. The company also looked for alternatives to lip pointing, but put little effort into the task.[150]

The Bureau of Labor Statistics, following the February 1925 lawsuit filing, had taken an interest in plant working conditions. In April 1925, Swen Kjaer, one of the Bureau's agents, was ordered to do a study on the conditions at the Ottawa plant. He first interviewed a vice president of the company and met with some laboratory workers at the Chicago office. What he noticed when meeting the workers was that they had lesions on their fingers - a striking similarity to the observation made by the Drinkers at the Orange plant the year before. The lab workers also conceded that radium was dangerous to handle without safeguards. Lead screens would be installed as protection.[151]

On April 20, he began his tour of the Ottawa plant. He was informed that there had been no signs of illness among the girls and, although he observed them lip-pointing, they did appear healthy. He also talked to three of the town dentists. They said that they had seen none of the conditions among their patients that the dentists in New Jersey had reported, but would notify the Bureau if anything did show up. Kjaer spent three weeks on the study before it was stopped, but it had been enough time for him to conclude that radium was dangerous.[152]

The June 1928 settlement of the five New Jersey workers hit the front page of the 'Ottawa Daily Times.' If the 1925 New Jersey lawsuit filing had attracted little notice in Ottawa, it was hard to ignore the headline: "More Deaths Raise Radium Paint Toll to 17." What caught the attention of the dial-painters at work was a smaller sentence in the article, which matter-of-factly stated that the first manifestation of radium poisoning was decay of the gums and teeth. One of the workers immediately recognized the symptoms: a tooth extraction she'd had the prior year had still not healed. Work slowed as a result, the women more concerned about the dangers than about the waiting dials. There was also Ella Cruse, who had died in 1927 at the age of twenty-four. She had had a tooth pulled that wouldn't heal. Then, in August, a pimple on her face swelled, septic poisoning set in, and her whole head turned black. The death certificate read "streptococcic poisoning."[153]

The company brought in experts and tested all the women. However, it did not share the results of the tests. The manager tried to reassure the women, but that didn't help. It was only after the Radium Dial Company placed a full-page ad in the paper, ostensibly providing the test results, that the dial-painters quieted down. None of the medical experts, the ad said, had found any conditions or symptoms close to those shown in 'radium poisoning' cases.[154] After the ad ran several more times, accompanied by reassuring statements from the company, the girls returned to work. The day after the ad first ran, the family of Ella Cruse filed a lawsuit against Radium Dial.[155] The case went nowhere. It was difficult to find anyone who knew about radium poisoning, and an autopsy, which would have provided definitive proof, was too expensive.[156]

Swen Kjaer, the radium-poisoning investigator of 1925, had not lost interest in Radium Dial or Ottawa, in spite of his truncated three-week investigation. He returned to Ottawa in February 1929, visited the plant, and talked to the same doctors and dentists he had visited before. They said the same thing: they had seen no unusual symptoms in the patients they treated. If the attorney for Ella Cruse seemed to have hit a brick wall with his lawsuit, Kjaer wanted more information from the company on her health records and cause of death. The only information the company provided was her employment dates. There was another worker, Margaret Looney, who had been found radioactive by electroscopic (breath) test in 1925 and again in 1928. Kjaer wanted information on her as well.[157] She would die later that year, on August 14th.[158]

Radium Dial's actions surrounding Margaret Looney's death were very strange. Knowing of the government's interest in her case, the company had her admitted to the company doctor's hospital when she collapsed at work on August 6th, and likely paid her bills. Suspected of having diphtheria, she was not allowed visitors, even family members. Yet the family was at the hospital when she died at 2:10 a.m. They were still there, in the middle of the night, when company men arrived to remove her body for burial. Jack White, her brother-in-law, was there, and imposing enough to argue with them. They backed down. Later, the family agreed to an autopsy, on the condition that their family doctor could be present. But when their doctor arrived at the time set for the autopsy, he found that it had been performed an hour before he got there. The company doctor had removed the remains of her jaw and taken her bones. Only the company received the autopsy results. Everything was reported as normal, and her upper and lower jaws showed none of the destructive changes associated with radium poisoning. Diphtheria was listed as the cause of death.[159]

The number of workers or former workers suffering serious medical problems continued to grow. In the summer of 1934, a large group of Ottawa dial-painters, frustrated at the unwillingness of Radium Dial to help them, filed suit for $50,000 ($884,391) each.[160] That October, the president of Radium Dial, Joseph Kelly, was ousted. Almost immediately he formed a new company, named Luminous Processes, which would continue operations in Ottawa, down the street from the Radium Dial studio.[161] Many of the dial-painters agreed to work for the new firm. On April 17, 1935, the lawsuit which had been filed in 1934 was dismissed. The law under which it was brought was declared invalid - it had not provided standards for measuring compliance. The dismissal was upheld on appeal.[162]

The legislature responded with a new law, the Illinois Occupational Diseases Act, which included a provision for industrial poisoning. It was signed by the governor, but would not become law until October 1936. It would not apply retroactively.[163]

Luminous Processes may have been a start-up business, but it was successful enough to finish Radium Dial off, at least in Ottawa. In December 1936, Radium Dial closed the doors of its Ottawa facility and left town.[164] The only thing it left in Illinois was a $10,000 ($164,595) bond posted with the Industrial Commission. The Commission had been informed by Joseph Kelly that his workers' compensation policy had been canceled in light of the risk associated with the dial-painting business.[165] Joseph Kelly and Luminous Processes may have been running the same operation, but legally they had no responsibility for Radium Dial's activities. It did not take long to discover where Radium Dial had gone. The 'Times' found that it had set up business in New York, where it was hiring women to paint dials.[166]

If the Ottawa dial-painters were fighting for only the $10,000, they nevertheless needed an attorney to represent them at the July 21, 1937 Industrial Commission hearing. Two days before the hearing they found one willing to take the case - Leonard Grossman - and he was no ordinary attorney: Clarence Darrow had recommended him. With only two days of preparation, Grossman asked for a postponement, and the first hearing was short. Radium Dial readily agreed.[167] The trial date was set for February 10, 1938.[168]

On April 5, 1938, following the trial, the Industrial Commission ruled that the illnesses suffered by the women were the result of their work. Radium Dial had, in effect, been found guilty, but the victory was largely symbolic. The Commission had no legal authority to levy on assets outside of Illinois; the $10,000 bond was the only amount available. An appeal by the company was dismissed by the Commission on July 6, 1938.[169] Radium Dial continued its appeals, until the case ended up at the U.S. Supreme Court. On October 23, 1939, the Court denied certiorari, letting the lower-court ruling against the company stand and refusing to hear the case. Radium Dial had lost its last appeal.[170]

The story of USRC and Radium Dial seemed to be a classic story of corporate greed and the triumph of good over evil. Yet the story cannot be viewed in clear black-and-white terms. If Radium Dial did everything to avoid its liability in Ottawa, even going so far as to lie to the public and its employees, it was not alone. The doctors and dentists there, who must have seen the symptoms of radium poisoning fairly early, refused to acknowledge or report them, perhaps out of fear of losing the jobs the company provided. The creation of the paint formula in 1913 by Sabin von Sochocky was an example of entrepreneurial innovation; the failure to find an alternative to lip pointing was an example of innovative laziness. The story showed the power of the press to arouse public opinion in support of a cause, but also revealed just how shallow that support could be: its staying power lasted only as long as the next spectacular or bizarre headline. If there was an argument to be made for government getting out of the job of regulation, for letting market forces regulate conduct, the story of the Radium Girls was not exactly the best case available to advance that argument.

When the Luminous Processes studio was finally shut down on February 17, 1978, inspectors measured radiation levels which were 1,666 times higher than was considered safe. Some graffiti artist had scrawled on the outside of the abandoned building the words: "Dial Luminous for Death."[171] In 1979 the EPA found that the former USRC site in Orange had levels of radioactivity twenty times higher than was considered safe. USRC had dumped its radioactive waste as landfill, over which 750 homes had been built. The government had to spend $144 million to clean up radium-contaminated sites across New York and New Jersey.[172]

From privatization as a goal, to entrepreneurship as a proven theory

The Stock Market Crash of 1929 should have served as a permanent warning to anyone tempted to invest in the stock market. The best advice is - don't even think about it! For those who just can't resist, who choose to ignore the alarms of 1929, the second most repeated piece of advice at any investment seminar is to at least diversify. While it is possible to pick individual winning stocks, it becomes increasingly difficult, statistically, to pick winners consistently over time. Over time, the odds of picking a winner are almost as great as those of picking a loser. Yet there are still people coming out of an investment seminar devoted entirely to diversification who will say they would like to get into this area, since they know how to pick winners. For anyone who thinks that picking winners is a matter of serious study, it is only necessary to ask for a show of hands of people who have invested in Bitcoin. The answer, among audience members, is likely to be anywhere from a third to a half of those attending.
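The odds described above can be made concrete with a toy simulation - a sketch under assumed numbers, not figures from the text: suppose each hypothetical stock either gains 30 percent or loses 20 percent in a given year, with equal probability, and compare a single-stock bet with an equally weighted fifty-stock portfolio over thirty years.

```python
import random

random.seed(42)

# Assumed toy model (not from the text): each year a stock either
# gains 30% or loses 20%, with equal probability.
def year_return():
    return 0.30 if random.random() < 0.5 else -0.20

def final_value(n_stocks, years=30):
    # Grow $1 split equally across n_stocks independent stocks.
    total = 0.0
    for _ in range(n_stocks):
        value = 1.0 / n_stocks
        for _ in range(years):
            value *= 1.0 + year_return()
        total += value
    return total

trials = 2000
single = [final_value(1) for _ in range(trials)]
diversified = [final_value(50) for _ in range(trials)]

# Fraction of trials that ended below the original $1 stake.
frac_lost_single = sum(v < 1.0 for v in single) / trials
frac_lost_div = sum(v < 1.0 for v in diversified) / trials
print(f"ended below the initial stake: single stock {frac_lost_single:.0%}, "
      f"fifty-stock portfolio {frac_lost_div:.0%}")
```

Under these assumed odds the single-stock bet ends up underwater in a substantial share of trials, while the averaged portfolio rarely does. Diversification narrows the spread of outcomes; it does not, as the passage goes on to argue, protect against every eventuality.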

A diversified stock portfolio is generally considered a good idea, but diversification only goes so far. It does not provide absolute protection against all eventualities. 'Adults in the Room' is a cautionary tale of the problems a poor country faces in the capitalist marketplace, with Greece serving as the fable's subject. When it comes to investments and investment decision-making, perhaps it might be helpful to look for an example among the prosperous - in this case Norway. Norway, planning for the eventual depletion of its oil reserves, is betting on the market. As a wise investor, it has diversified its portfolio, which should provide some protection against the few investments which go south. What it is betting on is the accumulation of a fund so large that its portfolio will survive both bad investment decisions and the loss of oil revenues. It remains to be seen whether that assumption is based on reality - or is just another theoretical assumption. The economic engine which sustains or drives economies is real, or hard, resources. Once they are depleted, the ability to rely on saved financial reserves becomes a race against statistical odds. A bad decision can be as likely as a good one.

A comparison of a modern, prosperous Norway with a poor, struggling Greece may seem unlikely. Yet there is a striking parallel between the economic circumstances of the two countries. Nearly two and a half millennia ago, Athens' circumstances stood in stark contrast to those in which Greece finds itself today. It maintained an empire which made it one of the wealthiest states of the Greek world. Its wealth was based on trade and an annual tribute it received from smaller states around the Mediterranean. In 431 B.C., on the eve of the Peloponnesian War, it had accumulated reserves in its treasury of some 6,000 talents. When the war ended with Athens' surrender in 404 B.C., its empire was gone, along with the accompanying revenues and the financial reserves it had held at the beginning of the war. It never fully recovered.

Privatization and free-market entrepreneurship have been touted as stand-alone activities which, by themselves, are capable of manufacturing products, creating demand, and fueling growth. Where even the most powerful economies are struggling to sustain a growth rate of two or three percent, advocates of entrepreneurship imagine a Silicon Valley-inspired world where growth will be in the two hundred to three hundred percent range. As important as innovative ideas are to economic development, it is difficult to find an example of an economy based on ideas alone. The silicon chip represented one of the most significant advances in the field of computing, but people did not spend money on the idea of a computer chip; they spent money on the computer it was installed in. It is true that people will pay a lot for an idea, but in the economic world that is because the idea can be converted into a product or service which the market will pay for. Economies have been quick to adopt ideas they found useful, but, for the most part, they remain dependent on tangible, and continuing, sources of revenue.

The problem with austerity and privatization, as economic theories or policies, has been the belief that government, government spending, and government programs represent the only problem facing countries and their economies. Governments have been ridiculed for spending on social programs, or for attempting to eliminate poverty with government spending, as unrealistic failures. Yet, when governments have adopted proposals for austerity and privatization, those policies alone did not bring growth or prosperity. The result was not that different from that of the government programs being criticized - failure. The stories coming out of Greece, or Argentina, or nearly every country which has tried austerity have been the same - not a miraculous turnaround and recovery or an entrepreneurial resurrection, but hyperinflation, high unemployment, and an exodus of skilled workers to foreign countries.

Perhaps the mythical creature which best symbolizes the latest free-market thinking is not the fearsome Minotaur, but another legendary hybrid. Half comes from Greek mythology: Icarus, the boy who flew too close to the sun, which melted the wax holding his wings together and sent him falling to his death in the sea. The other half is appropriately more modern - Hans Christian Andersen's 'The Emperor's New Clothes,' the story of the suit woven from invisible thread, which became invisible cloth, then an invisible suit - until a child bystander finally commented that 'the emperor has no clothes.' Perhaps the only difference is that, in contrast to the ancient Greek legend, the modern wings never reached the stage where Icarus could even attempt flight.

Athens - Democratic Ideals, Economic Abuses

One of the basic tenets or often-stated foundations of the free market is the idea of exchange between a willing seller and a willing buyer. In theory, it works. Yet, in the real world, there is rarely a negotiation between equals. The temptation to exploit power to take advantage of, or dictate to, others has always been there. It did not start with the European Union (EU), the European Central Bank (ECB), or the International Monetary Fund (IMF). Nor has it been confined to the world of economics. Stronger military powers have used their power to take advantage of weaker neighbors. Even today, religious debates are won or lost, not on theological arguments, but on the power and size of religious communities. Minority religions or communities struggle.

Using power to take advantage of weaker parties usually takes place in the context of military or economic struggles. Stronger nations have used armies to expand territory. Economically powerful countries or companies can obtain better contracts or extract concessions from business partners. Real-world economic or military advantage can spill over into the theoretical to win debates about ideas. In the world of business, it is not necessarily the best idea which wins, but the idea backed by the largest advertising budget. Religion has a long history of theological battles taking place on real-world battlefields. If economic theories are debated in the academic world, perhaps their validity is tested and decided, not with academic argument, but by the application of raw economic power.

No surprise, then, that Greece, in 2015, had to accept the capitalist-associated ideas of austerity, privatization, and competitiveness in exchange for another loan. If Greece now seems a victim of modern capitalism, there was a time when it was not in a subservient role. The terms capitalism, free markets, and economics had not even been invented. There was a time before Greece became a country called Greece, a time when Greece was really just Athens and Sparta - a time when Athens ruled an empire. It did not have humiliating terms imposed on it; rather, it dictated to others the terms they were forced to accept.

Ancient Athens is one of the earliest examples of economic development and the benefits of trade. It is also an example of how power, whether economic or military, can be abused. Athens, operating from a position of strength, did not so much negotiate agreements, as dictate terms. The Parthenon and the Athenian Acropolis have long served as both a symbol of democratic ideals, in the political sphere, and material prosperity and achievement, on the economic side. It takes little imagination to understand that Athens, in its day, was an economic powerhouse, as well as a military power. The battle of Plataea in 479 B.C. had ended Persian hopes of conquering Greece. In 477, Athens would convince many of the island states to form a confederacy, headquartered at Delos, which would offer protection from further Persian aggression.

The Confederacy of Delos started out as a voluntary military alliance, in which some states contributed ships and some paid money toward the maintenance of a fleet. The first assessment is said to have totaled 460 talents.[173] Militarily, the Confederacy operated as planned, attacking and weakening Persian-controlled states where they posed a threat. After a time, however, the economic interests of Athens came into conflict with those of its allies. In 465 B.C. the island city of Thasos revolted. Its fleet was defeated and it was forced, not only to continue in the Confederacy, but to pay an annual tribute.[174] Other cities attempted revolts and were forced to submit. By 454 B.C. the annual meetings at Delos had been discontinued, and the treasury was transferred to Athens.[175] In 449 Athens concluded a peace treaty with Persia.[176] While Persia still represented a potential threat, the original justification for the Confederacy no longer existed.

Around 450, Pericles proposed that Confederacy funds be used to restore the buildings destroyed during the Persian occupation. The proposal was popular with Athenians, less so with the allies. It marked the start of a building program which would continue for some twenty years, until 430. Construction of the Parthenon would begin in 447 and be completed in 432. It was not the only policy in which Athens ignored the wishes of others. Athens also began sending settlers to different areas. Some went to new colonies in abandoned places, but others were forced on rebellious cities, acting as garrisons.[177] More directly related to trade was a decree requiring that trade with Athens be conducted in Athenian coinage; allied cities were forbidden to mint their own coins. As a result, sometime around 445, the Athenian owl coin began to circulate widely.[178]

While it was dangerous to openly challenge Athens, economic interests and power changed over time. The city of Corinth, located on the Peloponnesian Peninsula, where it had the backing of Sparta, was in a position to defy Athens. As long as Corinth did not challenge Athens in the wider world of the Aegean or Mediterranean Seas, Athens was content to cede control of the Gulf of Corinth to it. The city of Megara, an ally of Corinth, was in a different situation - relatively close, geographically, to Athens itself. In 433 Megara had sent a force to assist Corinth at the Battle of Sybota. Knowing that war with Sparta was coming, Pericles seized on this involvement as a pretext. Athens retaliated against Megara, passing a decree excluding Megarians from the Athenian Agora, as well as from the other ports of the Empire. The Spartan Assembly would shortly after approve a declaration of war, which would be ratified by the Peloponnesian League members.[179]

The surrender of Athens in 404 B.C. might be viewed as a scene from a Greek tragedy, a morality play in which those who abuse their power or act unjustly are punished. Many of the Peloponnesian allies wanted to destroy Athens and sell her people into slavery. Sparta saved her. (Perhaps the thought of tearing down stone buildings and razing the entire city to the ground was a daunting prospect, even for the Spartans. Perhaps there was also a lingering fear that the gods might take revenge if their temples were damaged, even if Athens had abused its position by using the money of others to pay for the construction in the first place.) As Yanis Varoufakis learned, the lesson modern capitalists might draw from the parable of Ancient Athens was not one of sympathy for the modern Greek state, nor one involving moral judgments of right or wrong. In fact, if a New York or Chicago accent were added, the lesson would not be one of sympathy for the Megarians following their banishment; it might be more like: "You know that Pericles - What a guy! Did he know how to negotiate or what?" The other lesson would be just as practical. If Greece's art treasures were to be sold off in a fire sale to private individuals, they might command a higher price if it could be proved they had been owned by someone with name recognition, like Pericles or Leonidas, or had been part of the Parthenon.

Athens was punished when it surrendered in 404 B.C. It lost all of its foreign possessions, all but 12 triremes of its fleet, and had to tear down the Long Walls and the fortifications of its port, the Piraeus.[180] But it was punished, not for its unjust treatment of lesser states, but simply because it had lost. Sparta drew its own conclusions about the lessons history teaches. The lesson was not to treat others with fairness; it was to treat the lesser states in the same way Athens had treated them. The Athenian model, once Sparta was in control, actually looked pretty good as a model for rule. Sparta, now in a leadership position, became as hated for its own actions as Athens had been before it. It soon was at war with the allies who had helped it in the war with Athens. The Corinthians, who had been among the states appealing to Sparta for help in 431 B.C., now came into conflict with Sparta.[181] In 394 B.C. Corinth would build a defensive long wall across the northern Peloponnese, which was strong enough to prevent Sparta from sending troops into other parts of Greece.[182]

The Spartans, who had defeated Athens with Persian help, now began attacking Persian provinces and came into conflict with the local Persian satrap (administrative ruler), Pharnabazus. He not only retaliated by raiding Spartan lands, but also helped a recovering Athens, in 393 B.C. The Athenians, with the Spartans bottled up in their peninsula, now rebuilt their Long Walls and Piraeus fortifications. Athens, within ten years, had managed to recover from the humiliation of her surrender. She was not the power she once was, although she managed to assemble several fleets, which began, once again, to collect tribute from former island states. Her recovery came just in time. In 392 B.C., the Spartans would launch a series of attacks against the Corinthian fortifications, capturing some key positions, which they then abandoned. The next year they renewed the attack, once more providing a means of sending troops into the rest of Greece. Sparta worked diplomatically to restore relations with Persia and, in 387-386 B.C., forced Athens to agree to peace terms (the King's Peace) largely dictated by Persia.[183]

Sparta found itself once more in charge. In 386-5 B.C., it would attack Mantinea, one of its smaller neighbors. In 382 B.C., it attained a major prize, with the seizure of the Cadmea, the citadel of Thebes, and the installation of a friendly government in that city. Three years later, in the winter of 379-8, the Thebans managed a coup, which threw out the Spartans.[184] In 378 B.C. Athens would begin another war with Sparta and enter into an alliance with Thebes. The following year Athens managed to persuade some of the smaller states to form a Second Athenian Confederacy. Supposedly she provided guarantees which would prevent a repeat of the abuses of the Confederacy of Delos.[185] If it did not restore Athenian dominance, it did at least force Sparta into negotiations. In 371 B.C., a peace agreement, the Peace of Callias, was signed.[186]

Thebes had been excluded from the treaty. With Athens sidelined, the contest for dominance of Greece was now between Sparta and Thebes. Sparta was still considered the leading military power, but when she sent an army against Thebes in July 371 B.C., she suffered a major defeat in the Battle of Leuctra. Not only was her king, Cleombrotus, killed, but the Spartans lost 400 of their number, including members of the royal guard, which was wiped out.[187] The Thebans had trained and employed an elite force of 300 hoplites, called the Sacred Band, along with new tactics. Thebes followed up her victory by invading Laconia in 370 B.C., with her forces reaching the outskirts of Sparta. The city was too strong to attack, but Sparta had been unable to stop the invasion.[188] In 369 B.C., Thebes would invade Spartan territory for a second time, although she failed to repeat the success of the first invasion. Sparta, clearly a declining power, was invaded for a third time, in 366, and again, a fourth time, in 362. Thebes inflicted another serious defeat on the Spartans at Mantinea, but lost her inspirational general, Epaminondas, who was killed by a spear thrust while pursuing the retreating forces.[189]

The death of Epaminondas forced Thebes to abandon its most aggressive policies. There was continuing rivalry with Athens, a diminished, but still formidable, sea-power. Even as they fought each other, both kept an eye on Macedon, the rising power to the north. Athens had to worry about threats to her sea commerce. By 352 B.C. Philip of Macedon had created his own naval force, which was capturing Athenian grain ships and some of her island allies.[190] By 338 B.C. Athens and Thebes had entered into an alliance in hopes of stopping Philip. He had sent an army south and, at Chaeronea, between Phocis and Thebes, his force encountered the army of Athens and Thebes. The Theban Sacred Band formed the core of the defending army. On the opposing side, the Macedonian cavalry was commanded by Alexander, in what would be his first battle. It would be the cavalry which would decide the battle. At some point in the battle the Athenians, sensing victory, pushed forward, only to be hit from behind and on the flank by Alexander's cavalry. One thousand were killed and another 2,000 captured, while the rest fled. The Sacred Band did not run, but continued fighting to the end.[191]

Chaeronea marked the end of the power of the three major states which had dominated Greece: Thebes, Athens, and Sparta. Thebes, with her army gone, was treated as a conquered territory, her fortress garrisoned by Macedonian troops and potential political opponents executed. Athens was treated somewhat leniently: not occupied, her prisoners restored, and the bodies of those slain returned. She was, however, forced to join the new Hellenic union and dissolve her own confederacy. It was also clear she would no longer be free to challenge Macedonia. Sparta had to endure an invasion of the Peloponnese. She refused to formally submit, and while Philip left the city itself alone, his army devastated most of her surrounding lands and distributed parts of her territory to neighboring states, such as Argos, Tegea, Megalopolis, and Messenia.[192]

Austerity and Privatization - Mission Accomplished

If capitalist and free market economists are looking for an economic model in which to test their theories, then perhaps they should look to the ultimate macro-economic model - the United States. Rather than picking on Greece, they ought to do a study on how privatization, austerity, and competitiveness are working in the U.S. -- No problems there! - so it should be a relatively quick study. The U.S. is one country where their proposals to eliminate unions and curtail worker rights would be well-received. That struggle has been going on since the 1870s and 1880s - not quite since the beginning of time, but at least further back than a couple of years. Technically, the origins of the bad treatment of workers can be dated to 1844, with the Silesian weavers' revolt in Germany, or perhaps a little earlier, to 1839, with Engels' publication of "Letters from Wuppertal," or his 1845 work "The Condition of the Working Class in England in 1844." For sheer abysmal treatment of workers over a longer period of time, however, the U.S. will not take second place to Germany.

The economist Joseph E. Stiglitz may have best summarized the criticism of privatization in a single statement: "But moving people from low-productivity jobs in state enterprises to unemployment does not increase a country's income, and it certainly does not increase the welfare of the workers."[193] Criticism of privatization and austerity programs should perhaps be kept in perspective. It is easy to criticize the IMF or the EU for imposing restrictive, even harsh, terms on poor countries for the use of their money; for implementing policies which do not work or which fail to help those who need help the most. However, there are countries or regions which receive no aid at all and are forced to deal with their problems on their own; they do not have the luxury of deciding which budget programs are unnecessary or which needs can go unmet. In a world accustomed to assigning problems and solutions to governments or countries, there are an increasing number of problems and solutions which fall outside the domain of government. Refugees, whether fleeing poverty or war or ethnic conflict, who have become stateless, cannot turn to governments for help.

The triumph of capitalism and the free market threatened, for a time, to turn economics into an exciting, vibrant field. Capitalism would lead the world into a new Age of Prosperity and Enlightenment. It remains to be seen how that will work out in the end. The danger of attempting to turn boring economic analysis into an inspirational and exciting story of epic proportions is that economics, already saddled with the label of the dismal science, will be burdened with a new one - the dismally wrong science.


(1) Yanis Varoufakis, "Adults in the Room: My Battle with the European and American Deep Establishment" (New York:Farrar, Straus and Giroux, 2017)
(2) Varoufakis, ibid, p. 328
(3) Varoufakis, ibid, pp. 25-27
(4) Varoufakis, ibid, p. 41
(5) Varoufakis, ibid, p. 43
(6) Ellen Schrecker, editor, "Cold War Triumphalism: The Misuse of History After the Fall of Communism" (New York:The New Press, 2004), p. 104.
(7) Schrecker, ibid, p. 6.
(8) John Mackey & Raj Sisodia, "Conscious Capitalism: Liberating the Heroic Spirit of Business," (Boston:Harvard Business Review Press, 2014), p. 23.
(9) Schrecker, op.cit., p. 103.
(10) Yanis Varoufakis, "The Global Minotaur: America, Europe and the Future of the Global Economy" (London:Zed Books Ltd, 2015), pp. 24-25.
(11) Arkady Ostrovsky, "The Invention of Russia: From Gorbachev's Freedom to Putin's War" (New York:Viking, 2015), p. 82.
(12) Masha Gessen, "The Future is History: How Totalitarianism Reclaimed Russia," (New York:Riverhead Books, 2017), p. 73
(13) Gessen, ibid, p. 85
(14) Gessen, ibid, p. 92
(15) Gessen, ibid, pp. 106-107
(16) Gessen, ibid, pp. 107-108
(17) Gessen, ibid, p. 108
(18) Ostrovsky, op.cit., p. 131.
(19) Gessen, op.cit., p. 109.
(20) Richard Lourie, "Putin: His Downfall and Russia's Coming Crash," (New York:Thomas Dunne Books, 2017), p. 58.
(21) Angus Roxburgh, "The Strongman: Vladimir Putin and the Struggle for Russia" (New York:I.B Tauris & Co. Ltd, 2012), p. 7.
(22) Daniel B. Baker, ed., "Explorers and Discoverers of the World," (Detroit:Gale Research, Inc., 1993), pp. 42-43 & 61.
(23) National Geographic Society, "1000 Events That Shaped the World," (Washington, D.C.:National Geographic Society, 2007), p. 233.
(24) Roxburgh, op.cit., p. 11.
(25) Roxburgh, ibid, pp. 10-14.
(26) Lourie, op.cit., p. 81.
(27) Ostrovsky, op.cit., p. 235.
(28) Ostrovsky, ibid, p. 265.
(29) Gessen, op.cit., p. 110.
(30) Ostrovsky, op.cit., p. 136.
(31) Gessen, op.cit., pp. 127-128.
(32) Adam Tooze, "Crashed: How a Decade of Financial Crises Changed the World," (New York:Viking, 2018), p. 119.
(33) Lourie, op.cit., pp. 107-108.
(34) Tooze, op.cit., p. 128.
(35) Tooze, ibid, p. 255.
(36) Joseph E. Stiglitz, "Globalization and Its Discontents Revisited: Anti-Globalization in the Era of Trump," (New York:W. W. Norton & Company, Inc., 2018), p. 236.
(37) Stiglitz, ibid, p. 239.
(38) Stiglitz, ibid, p. 238.
(39) Tooze, op.cit., pp. 119-120.
(40) Tony Wood, "Russia Without Putin: Money, Power and the Myths of the New Cold War," (Brooklyn,NY:Verso, 2018), p. 65.
(41) Varoufakis, "Adults in the Room," op.cit., pp. 36-37.
(42) Varoufakis, "Adults in the Room," ibid, p. 367.
(43) Varoufakis, "Adults in the Room," ibid, p. 191.
(44) Varoufakis, "Adults in the Room," ibid, p. 200.
(45) Stiglitz, op.cit., pp. 116 & 164-165.
(46) Varoufakis, "Adults in the Room," op.cit., p. 327.
(47) Conan Fischer, "The Ruhr Crisis, 1923-1924," (New York:Oxford University Press, Inc., 2003), p. 24
(48) Fischer, ibid, p. 24
(49) Fischer, ibid, pp. 24-25
(50) Elspeth Y. O'Riordan, "Britain and the Ruhr Crisis," (New York:PALGRAVE, 2001), p. 10
(51) O'Riordan, ibid, p. 10
(52) O'Riordan, ibid, p. 11
(53) O'Riordan, ibid, p. 10
(54) Christopher Clark, "Iron Kingdom: The Rise and Downfall of Prussia, 1600-1947," (Cambridge, MA:The Belknap Press of Harvard University Press, 2006), pp. 296, 305, 307.
(55) Michael Worth Davison, MA, ed., "When, Where, Why & How It Happened," (London:Readers Digest Association Limited, 1993), p. 165.
(56) Fischer, op.cit., p. 15.
(57) Fischer, ibid, p. 15.
(58) Fischer, ibid, p. 16.
(59) Fischer, ibid, pp. 7-8.
(60) Fischer, ibid, p. 9.
(61) Fischer, ibid, p. 8.
(62) Fischer, ibid, p. 9.
(63) Fischer, ibid, p. 8.
(64) Fischer, ibid, p. 11.
(65) Fischer, ibid, p. 10.
(66) Fischer, ibid, p. 12.
(67) Fischer, ibid, p. 14.
(68) Fischer, ibid, p. 13.
(69) Margaret MacMillan, "Paris 1919: Six Months that Changed the World,"(New York:Random House Trade Paperbacks, 2002), p. 170.
(70) O'Riordan, op.cit., p. 5.
(71) O'Riordan, ibid, pp. 5, 10.
(72) Fischer, op.cit., p. 25.
(73) O'Riordan, op.cit., p. 17.
(74) Fischer, op.cit., p. 27-28.
(75) Martin Gilbert, "A History of the Twentieth Century, Volume One: 1900-1933," (New York:William Morrow and Company, Inc., 1997), p. 653.
(76) Gilbert, ibid, p. 654.
(77) Gordon A. Craig, "Europe Since 1815," 2nd edition, (New York:Holt, Rinehart and Winston, Inc., 1966), p. 626.
(78) Gilbert, op.cit., p. 654.
(79) Gilbert, ibid, p. 656.
(80) Craig, op.cit., p. 626.
(81) Craig, ibid, p. 631.
(82) Craig, ibid, p. 562.
(83) Craig, ibid, p. 632.
(84) Davison, op.cit., p. 283.
(85) Gilbert, op.cit., p. 663.
(86) Gilbert, ibid, p. 709.
(87) Gilbert, ibid, p. 709.
(88) Kate Moore, "The Radium Girls: The Dark Story of America's Shining Women" (Naperville, IL:Sourcebooks, Inc., 2017), p. 4.
(89) Moore, ibid, p. 16.
(90) Moore, ibid, pp. 16-17, 52.
(91) Moore, ibid, pp. 17, 24-25.
(92) Moore, ibid, pp. 17 & 22.
(93) Moore, ibid, p. 240.
(94) Moore, ibid, p. 52.
(95) Moore, ibid, p. 8.
(96) Moore, ibid, p. 32.
(97) Moore, ibid, p. 73.
(98) Moore, ibid, p. 78.
(99) Moore, ibid, p. 89.
(100) Moore, ibid, p. 97.
(101) Moore, ibid, p. 99.
(102) Moore, ibid, p. 129.
(103) Moore, ibid, pp. 134-135.
(104) Moore, ibid, p. 89.
(105) Moore, ibid, pp. 108-109.
(106) Moore, ibid, p. 79.
(107) Moore, ibid, p. 82.
(108) Moore, ibid, p. 83.
(109) Moore, ibid, p. 125.
(110) Moore, ibid, p. 84.
(111) Moore, ibid, p. 89.
(112) Moore, ibid, pp. 89 & 112.
(113) Moore, ibid, pp. 96-97.
(114) Moore, ibid, p. 97.
(115) Moore, ibid, p. 120.
(116) Moore, ibid, p. 97.
(117) Moore, ibid, p. 106.
(118) Moore, ibid, p. 115.
(119) Moore, ibid, pp. 107-108.
(120) Moore, ibid, p. 46.
(121) Moore, ibid, p. 73.
(122) Moore, ibid, p. 66.
(123) Moore, ibid, pp. 49-50.
(124) Moore, ibid, p. 50.
(125) Moore, ibid, p. 51.
(126) Moore, ibid, p. 52.
(127) Moore, ibid, p. 52.
(128) Moore, ibid, p. 155.
(129) Moore, ibid, p. 158.
(130) Moore, ibid, p. 171.
(131) Moore, ibid, p. 175.
(132) Moore, ibid, p. 177.
(133) Moore, ibid, pp. 181-183.
(134) Moore, ibid, p. 197.
(135) Moore, ibid, p. 216.
(136) Moore, ibid, pp. 190-191.
(137) Moore, ibid, p. 219.
(138) Moore, ibid, p. 227.
(139) Moore, ibid, p. 272.
(140) Moore, ibid, p. 122.
(141) Moore, ibid, p. 272.
(142) Moore, ibid, p. 266.
(143) Moore, ibid, p. 306.
(144) Moore, ibid, pp. 377-378.
(145) Moore, ibid, p. 41.
(146) Moore, ibid, p. 53.
(147) Moore, ibid, p. 43.
(148) Moore, ibid, p. 388.
(149) Moore, ibid, p. 98.
(150) Moore, ibid, pp. 98-99.
(151) Moore, ibid, p. 101.
(152) Moore, ibid, pp. 101-103.
(153) Moore, ibid, pp. 186-188.
(154) Moore, ibid, p. 232.
(155) Moore, ibid, p. 234.
(156) Moore, ibid, pp. 243-244.
(157) Moore, ibid, pp. 242-244.
(158) Moore, ibid, p. 247.
(159) Moore, ibid, pp. 247-248.
(160) Moore, ibid, p. 296.
(161) Moore, ibid, p. 301.
(162) Moore, ibid, p. 304.
(163) Moore, ibid, p. 308.
(164) Moore, ibid, p. 312.
(165) Moore, ibid, p. 335.
(166) Moore, ibid, p. 317.
(167) Moore, ibid, p. 322.
(168) Moore, ibid, p. 328.
(169) Moore, ibid, p. 369.
(170) Moore, ibid, p. 375.
(171) Moore, ibid, p. 399.
(172) Moore, ibid, p. 385.
(173) J.B. Bury and Russell Meiggs, "A History of Greece to the Death of Alexander the Great," Fourth Edition, (New York:St. Martin's Press, Inc., 1975), p. 203.
(174) Bury and Meiggs, ibid, p. 210.
(175) Bury and Meiggs, ibid, p. 211.
(176) Bury and Meiggs, ibid, p. 222.
(177) Bury and Meiggs, ibid, p. 225.
(178) Bury and Meiggs, ibid, pp. 533-534.
(179) Bury and Meiggs, ibid, p. 247.
(180) Bury and Meiggs, ibid, p. 317.
(181) Bury and Meiggs, ibid, p. 247.
(182) Bury and Meiggs, ibid, p. 341.
(183) Bury and Meiggs, ibid, pp. 344-345.
(184) Bury and Meiggs, ibid, p. 350.
(185) Bury and Meiggs, ibid, p. 351.
(186) Bury and Meiggs, ibid, pp. 355-356.
(187) Bury and Meiggs, ibid, p. 367.
(188) Bury and Meiggs, ibid, p. 372.
(189) Bury and Meiggs, ibid, p. 383.
(190) Bury and Meiggs, ibid, p. 424.
(191) Bury and Meiggs, ibid, p. 440.
(192) Bury and Meiggs, ibid, pp. 441-442.
(193) Stiglitz, op.cit., p. 153.