Welcome to the (New) Era of American Lawlessness

Caricatures by DonkeyHotey
Used with permission

By Chris Ricchetti | 30 June 2022

Rebuttal by James S. Eggert, Esq.

On the occasion of the end of the October 2021 Term of the Supreme Court of the United States

…to secure these [inalienable] Rights, Governments are instituted among Men, deriving their just Powers from the Consent of the Governed, …

Thomas Jefferson (1776)

Ordered liberty sets limits and defines the boundary between competing interests.

Associate Justice Samuel A. Alito, Jr. (2022)

Agreement Across the Abyss

As bitterly divided as this country is ideologically, there is greater than 60% agreement on at least a dozen important issues—including the right to abortion (61% – Pew, 2022), the right to gay marriage (71% – Gallup, 2022), reasonable gun restrictions (63-87%, depending on the proposed restriction – Pew, 2021), regulation of carbon emissions and public investments in clean energy (71-90%, depending on the proposal – Pew, 2020) and several others—including many aspects of police reform (63-89%, depending on the proposal – University of Maryland, 2020).

If democracy were working in the United States, we would already have Federal laws establishing and protecting abortion rights, marriage rights, gun safety, emissions standards, and the rest—because these are things that a supermajority of Americans want.

But we are falling far short of the promise of American democracy—government of, by and for the people. Objectively, a huge segment of our existing laws and regulations conflicts with the will of the majority of our citizens, and precious few broadly supported proposals for new legislation are ever enacted.

How did this happen?

Origins of the Electoral College

The Framers of the Constitution were determined to immunize the Republic against the “Tyranny of the Majority,” a danger that Western political philosophers dating back to ancient Athens had long held was the chief vulnerability of democracy, and against potential abuses of Presidential power. The Southern delegates to the Convention were concerned specifically with protecting the institution of chattel slavery from Northern abolitionist “tyranny.”

One of several preventative devices that the Framers established was to grant both greater parliamentary and greater electoral representation per capita to the citizens of less populous (“smaller”) states than to the citizens of more populous (“larger”) states. Without these assurances—to allay Southern fears that the more numerous and more populated Northern states might garner Congressional and Presidential power sufficient to end slavery—the Union of the colonies would not have endured.

The Electoral College was born of the same concern—to protect minority interests generally (not only slavery) from being trampled upon by “mob rule.” Ironically, for all their grand declarations about all men being equal and liberal democracy being the best form of government ever devised, many of the Framers had serious reservations about the new Enlightenment values. It all sounded great on paper. But, at the time, liberal democracy was largely untested, and no one could be certain that We the People would act out of reason, rather than passion. Many feared that a populist tyrant might assemble a popular majority and ascend to the Presidency by swaying impressionable, under-informed voters—perhaps they were onto something!

“Every white, male landowner shall have his say at the ballot box,” they reasoned. “But, as a safeguard against passion-driven voters unknowingly electing an un-democratic, incompetent, or even tyrannical Executive, we will let a small cadre of the ‘smartest’ men from each state—the ‘Electors’—make the final decision.”

From the beginning, the Framers installed an oligarchic “escape hatch,” just in case liberal democracy turned out to be messier than anticipated[1]. Very likely they were influenced by those among the ancient Greeks who were enamored with the possibility of a non-tyrannical aristocracy comprised of wise, benevolent, virtuous leaders acting in the public good (e.g., Plato’s “Philosopher Kings” who embody the union of wisdom with political power). While most of their writings reflect a quite enlightened view of human nature, the Framers could be intermittently naïve. According to Alexander Hamilton, they hoped that the decentralized structure and short tenure of each Electoral College would minimize the likelihood of widespread corruption (see The Federalist, No. 68).

The Framers wanted to believe wholeheartedly in government by consent of the governed, but they could not quite bring themselves to go all-in. That is why they established the Electoral College: its elites would save the people from their own irrationality, with discretion to overrule the People’s Will should the people ever make an “irrational” choice for President of the United States.

Over-Representation of Smaller States in Presidential Elections

The Electoral College as originally conceived was a temporary, independent body consisting of Electors, appointed by state legislatures[2], who had free rein to elect the President, regardless of general election results. In time, however, each state passed legislation compelling its Electors to vote in accordance with the will of the state’s voters, as reflected in the popular vote count. With Electors stripped of the power to vote independently, the Electoral College lost its raison d’être and electoral vote counting became a mere formality. The Framers’ oligarchic “escape hatch” was thereby rendered ineffectual.[3]

This disempowerment of the Electors brought the country a step closer to true representative democracy. That’s the good news.

Unfortunately, in the course of making this change, all but two of the states made a grievous error. They replaced the autonomy of the Electoral College with something almost as un-democratic: except for Nebraska and Maine, every state passed legislation allocating all of its electoral votes to the presidential candidate who wins its popular vote. As a result, presidential elections are decided based on fifty discrete state elections—each having a population-based “point value”—rather than one unified national election.

Consequently, in two presidential elections since 2000, the “winner” lost the national popular vote, yet he ascended to the Presidency on the basis of having tallied more electoral votes than his opponent.

If, instead, the states had decided to allocate their electoral votes in proportion to the number of popular votes cast for each candidate—going a step further than Nebraska and Maine, which at least break up the statewide winner-take-all bloc by awarding most of their electoral votes district by district—any incongruity between the popular and the electoral vote counts would be extremely unlikely[4].

It is not the existence of the Electoral College per se that is yielding un-democratic election results in the modern era. It is the “winner-take-all” allocation of electoral votes by the states that is fatally flawed.
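The difference between the two allocation schemes is easy to see with a toy example. The three states and vote totals below are entirely hypothetical (my illustration, not the author's data), but they show how statewide winner-take-all can hand the Presidency to the popular-vote loser while a proportional split tracks the national tally:

```python
# Each state: (candidate_A_votes, candidate_B_votes, electoral_votes).
# Hypothetical numbers: A wins two states narrowly, B wins one in a landslide.
states = {
    "X": (510_000, 490_000, 10),
    "Y": (520_000, 480_000, 10),
    "Z": (200_000, 800_000, 10),
}

def winner_take_all(states):
    """All of a state's electoral votes go to its popular-vote winner."""
    a = sum(ev for va, vb, ev in states.values() if va > vb)
    b = sum(ev for va, vb, ev in states.values() if vb > va)
    return a, b

def proportional(states):
    """Each state's electoral votes are split by popular-vote share."""
    a = b = 0
    for va, vb, ev in states.values():
        share_a = round(ev * va / (va + vb))
        a += share_a
        b += ev - share_a
    return a, b

pop_a = sum(va for va, vb, ev in states.values())
pop_b = sum(vb for va, vb, ev in states.values())
print(f"popular vote:        A {pop_a:,} vs B {pop_b:,}")
print(f"winner-take-all EVs: {winner_take_all(states)}")   # A wins 20-10
print(f"proportional EVs:    {proportional(states)}")      # B wins 18-12
```

Under winner-take-all, A takes the Presidency 20 electoral votes to 10 despite losing the popular vote by more than half a million; under proportional allocation, the electoral count (12 to 18) mirrors the popular result.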

What, specifically, is wrong with the existing “winner-take-all” approach?

The over-representation of small-state residents in presidential elections was agreed to by the Framers and has been a feature of our electoral system from the beginning. But the disparity has gotten worse over time.

In 1790, each resident of Delaware (the smallest state) enjoyed 81% more representation in the Electoral College than each resident of Virginia (the largest state)—not quite double[5] (see orange line in Chart 1).

Today, the residents of Wyoming have 281% more influence on the electoral vote count than their friends in California. Another way of saying the same thing is that the electoral influence per capita of Wyoming is almost quadruple that of California (see blue line in Chart 1).
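The per-capita arithmetic behind that claim is easy to check. The sketch below uses approximate 2020 census counts and the post-2020 apportionment of electoral votes—my assumed inputs, not necessarily the exact sources behind Chart 1:

```python
def evs_per_million(electoral_votes: int, population: int) -> float:
    """Electoral votes per one million residents of a state."""
    return electoral_votes / (population / 1_000_000)

# Approximate 2020 census populations; electoral votes per the
# post-2020 apportionment (assumptions for illustration).
wyoming    = evs_per_million(3, 576_851)        # ~5.20 EVs per million
california = evs_per_million(54, 39_538_223)    # ~1.37 EVs per million

ratio = wyoming / california
print(f"A Wyoming resident carries {ratio:.2f}x the electoral weight "
      f"of a Californian ({ratio - 1:.0%} more).")
```

With these inputs the ratio comes out to roughly 3.8—"almost quadruple," or about 281% more influence per capita, matching the figure in the text.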

Chart 1: Disparity in Electoral College influence of small states versus large states, 1790–2020 (Chris Ricchetti)

But the problem is worse than that—much worse.

Due to the “winner-take-all” allocation of electoral votes adopted by forty-eight of the fifty states, “blue” voters living in “red” states are quite literally disenfranchised, as are “red” voters living in “blue” states. For example, not only does voting for the Republican presidential candidate in the true-blue state of Illinois have no effect on the outcome of the election, simply by virtue of living in Illinois (and being counted in the census), Republican voters actually add to the electoral influence of their Democratic opponents!

At this point in our history, most states are either solidly “red,” or solidly “blue,” which means that voters in the remaining “swing states” are the real deciders. In only twelve of the fifty states is there any reasonable chance that either presidential candidate could win. And there are probably only seven states now that are truly competitive—Florida, Pennsylvania, Ohio, North Carolina, Michigan, Wisconsin, and Iowa. Georgia may soon become competitive and, eventually, Texas will too.

Even in the swing states, where voting matters most, all of the votes cast for the candidate who ultimately loses a state’s popular vote are essentially disregarded, as 100% of each state’s electoral influence goes to the state’s popular vote winner.

This is no way to run a democratic republic. Even the Framers were not altogether satisfied with the Electoral College system they had devised. But after one hundred days of exhausting deliberations, it was the only system on which they could agree.

Over-Representation of Smaller States in The Senate

Here again, some degree of unequal representation was agreed to by the Framers—to assure smaller states of meaningful participation in the legislative process.

In 1790, Delaware’s per capita representation in the Senate was 12.7 times greater than Virginia’s (see orange line in Chart 2).

Today, Wyoming’s per capita Senatorial representation is 68.5 times greater than California’s—more than a five-fold increase in the Senatorial representation gap since 1790 (see blue line in Chart 2).
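Because every state gets exactly two Senators regardless of size, the representation gap is simply the ratio of the two states' populations. The same back-of-the-envelope arithmetic reproduces both figures (census counts are approximate and assumed):

```python
def senators_per_million(population: int) -> float:
    """Senators per one million residents: every state has exactly two."""
    return 2 / (population / 1_000_000)

# Approximate census populations (assumptions for illustration).
delaware_1790, virginia_1790 = 59_096, 747_610      # smallest vs largest, 1790
wyoming_2020, california_2020 = 576_851, 39_538_223 # smallest vs largest, 2020

gap_1790 = senators_per_million(delaware_1790) / senators_per_million(virginia_1790)
gap_2020 = senators_per_million(wyoming_2020) / senators_per_million(california_2020)
print(f"1790 gap: {gap_1790:.1f}x    2020 gap: {gap_2020:.1f}x")
```

These inputs yield roughly 12.7x for 1790 and 68.5x for 2020, consistent with the figures cited for Charts 2.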

Chart 2: Disparity in Senatorial representation of small states versus large states, 1790–2020 (Chris Ricchetti)

In fact, the 25% of Americans who live in the smallest states are represented collectively by 61.0 senators, while the 25% of Americans who live in the largest states are represented by just 5.3 Senators[6] (see table). That leaves 33.7 Senators for the 50% of Americans who live in “medium-sized” states.

Table 1: Disparity in Senatorial representation of small states versus large states (Chris Ricchetti)

There is, of course, also a Senatorial representation gap between the “Two Americas.” Solid-Red America (33.85% of US population) controls forty-six Senate seats, while Solid-Blue America (43.19% of US population) controls only forty. Purple America (22.96% of US population) is represented by the remaining fourteen Senators.

We are a long way from the democratic ideal of “one person, one vote.”

The unequal distribution of Senatorial representation is hugely consequential.

The Senate has the power to confirm, to reject, or to delay indefinitely the seating of every judge across the entire Federal judicial system—the entire Third Branch of the Federal government—including the District Courts, the Circuit Courts of Appeals, and the Supreme Court.

The Senate has the power to confirm, to reject, or to delay indefinitely the filling of all fifteen Cabinet positions, all ambassadors to foreign nations, all directors of regulatory and some non-regulatory Federal agencies, members of the Federal Reserve’s Board of Governors, and others—nearly twelve hundred positions appointed by the President.

When the Senate is controlled by the party opposing the President, Senators can (and do) wreak havoc on the Executive Branch, maintain vacancies in important positions for years, and greatly impede the President’s effectiveness. Even when the opposing party is in the minority, there is a lot that Senators can do to thwart the President’s efforts to govern.

The Senate has the power to remove impeached government officials from office—or to turn a blind eye to high crimes and treason—as it did twice for former President Trump. That is certainly a consequential power.

Perhaps most importantly, nostalgic devotion to the Cloture Rule (the “Filibuster”)[7] in combination with the wide disparity in Senatorial representation, makes it far too easy for Senators, acting on behalf of narrow, minority interests, to block the passage of urgently needed legislation that is supported by a supermajority of Americans. It is maddeningly difficult for even the most well-intentioned lawmakers to get an up-or-down vote on many bills that the People want enacted.

As years of orchestrated gridlock and inaction roll by, the disconnect between the Will of the People and the Laws that govern them grows more pronounced. The existence of this growing disconnect is conclusive evidence that representative democracy in the United States is floundering.

Only as one begins to appreciate the magnitude of the electoral and Senatorial representation gaps do the otherwise bewildering phenomena of American politics begin to make sense.

Add to all of this, the gerrymandering of Congressional districts, laws aimed at disenfranchising minority voters, unlimited “soft” money, American lobbyists advocating for dictators and other corrupt foreign interests, electioneering by foreign actors, corporations as people, and money as speech.

Is it any wonder that the Will of the People goes unheeded?

Where Congress Fails to Act, SCOTUS is Happy to Oblige

Congress’ abdication of its responsibility to legislate the People’s Will means that six (unelected) Supreme Court Justices will now settle all the important issues that the Congress has failed to address. As a direct consequence of Congress’ failure, six theocratic reactionaries—who see the world very differently than most Americans—are now poised to radically re-shape the contours of American life in all kinds of ways that a supermajority of Americans do not support. Now we must contend with the Tyranny of the Minority.

But don’t blame the Justices. It is solely because of Congress’ fifty-year failure to codify the right to abortion that the issue came before the Supreme Court in the first place.

If you don’t see the Constitution as a living, breathing document, then Justice Alito’s originalist legal reasoning is quite sound. He is right. There is no explicit right to abortion granted under the Constitution, nor is there any right to abortion deeply rooted in our history and tradition. To the contrary, as Justice Alito rightly points out, abortion at any stage of pregnancy was illegal in three-fourths of the States on the day that Roe was decided. Justice Alito is also quite correct in his assessment of the reasoning provided in Roe. It is weak and, in places, not even factually correct.

But what of stare decisis—and, with regard to Roe and Casey, deference to precedent-on-precedent? It is hard to disagree with Justice Alito’s assertion that bad precedents ought to be overturned—“bad” according to whom, though?

Bruen (most especially) and Dobbs illustrate the absurdity of Originalism. Originalists seem to be possessed of the quasi-religious belief that anything short of total fidelity to the language and world view of the Eighteenth Century will take us down a slippery slope of relativism and lead to the complete breakdown of the Constitution as an institution. And so, as we move further and further from the language and world view of the Eighteenth Century, originalists like Justice Alito insist that we remain forever anchored in the inerrant wisdom and foresight of the Framers.

Is not the very idea that every aspect of modern life should be evaluated from the two-hundred-and-thirty-five-year-old perspective of the Framers absurd on its face?!

Must we continue to pretend that the Framers had any meaningful frame of reference for the advances in medicine that would come? Or the immense lethality that can now be carried over one’s shoulder into schools, churches, movie theaters, night clubs, and grocery stores? Or the establishment of a national standing army (most of the Framers were adamant that this country should never have one) on the scale of the modern US Military? Or the awesome power of cable “news” and the internet to bring out the worst in us? Or the catastrophic danger of global climate change that now threatens virtually all advanced life on this planet? Or the many ways in which American demographics and values would change and evolve? Or the vast power and influence of global corporations (British East India Company notwithstanding)? Or the unconscionable economic inequality that too many years of under-regulated capitalism would bring? Or the emergence of offshore corporate and personal tax havens? Or cryptocurrencies? Or the profound impact of American politics on the entire world? Or the dependence of democracy throughout the world on the military might of the United States? Or the possibility of mutually assured nuclear annihilation?

Each of these developments calls us—carefully, thoughtfully, in good faith, and with fear and trembling—to find better ways to honor all of the Spirit and most of the Letter of the Constitution, as we seek to apply its wisdom and authority to the realities of modern life.

I am certain that Thomas Jefferson would agree. As I have written previously, Jefferson was, by the standards of today’s conservatives, a radical, extremist anti-originalist, who intended a living, breathing, evolving Constitution.

The (New) Era of American Lawlessness[8]

It is bad enough that the Congress cannot move the country forward by passing laws that reflect the Will of the People. Now we have an ensemble of Christofascists ready, willing, and determined to move American Law backwards—even further away from what most Americans want!

In a purportedly democratic society what incentive is there for citizens to comply with a system of laws that most of them do not support?

Democracy derives from and depends upon the consent of the governed—remember?

The six conservative Justices do not have the sound judgment or the common sense to see that ruling against the will of the supermajority on so many aspects of American life will inevitably result in serious erosion of the Rule of Law.

The Justices should not take it for granted that law enforcement will have the will to enforce every change in the law that results from the rulings they make. Cops and prosecutors at all levels—Federal, state, and local—are going to decide for themselves which laws they enforce and prosecute within their jurisdictions, and which laws they ignore.

Individual citizens will do the same, deciding for themselves which laws to follow and which to ignore, based upon their own values and beliefs, greatly undermining respect for Law—not just the “controversial” laws, but all laws.

Worst of all, the ideological divide in this country is now going to become institutionalized, as the various state and municipal legislatures make their own “red laws” and “blue laws” without regard for any ruling of the Supreme Court with which they disagree—resulting in a patchwork of conflicting, yet overlapping legal regimes based on radically different ideologies.

It will be a jurisdictional nightmare, replete with divided loyalties, corruption and injustice, law enforcement personnel gone rogue, and spotty accountability. In some cases, Due Process of Law will become all but impossible to ensure.

In Chicago, a state’s attorney will decide to make an example of some red-state visitors in possession of assault rifles—regardless of any applicable ruling from SCOTUS. This will bring thousands of AR-15-toting “patriots” to Chicago in protest. How’s that going to end?

At some point, in flagrant disregard for a dozen laws, a prosecutor and a judge in Texas will orchestrate the literal kidnapping of California citizens who in some way facilitate abortions for Texas citizens, convict and incarcerate them in Texas, and the Governor will refuse to let them go. Imagine the uproar.

We already know that this Court will uphold any and every state law that allows states to “rig” their elections. Eventually—probably sooner, rather than later—some obnoxiously un-democratic law in a swing state will decide a Presidential election, and all hell will break loose.

The rulings handed down this term concerning abortion, school prayer, the reach of Federal regulators, the further expansion of gun rights (just days after the enactment of some modest, widely supported gun restrictions), and the use of public funds to pay for religious education are only the beginning of the broad rollback of 100+ years of American jurisprudential evolution that the six ultra-conservative Justices are hell-bent on completing.

Together with their Congressional enablers, Justices Thomas, Alito, Gorsuch, Kavanaugh, Coney Barrett and Roberts will forever be remembered for having cracked the foundation of the Rule of Law and ushered in the (New) Era of American Lawlessness.

Five years from now, when “Laws Were Made to be Broken” will have become the mantra of the American people, you will know who to thank.

Oh, and if you prefer airplanes that have had regular maintenance, ribeye that is free of Mad Cow Disease, and water uncontaminated by mercury and E. coli—fuggetaboutit!!! To make those assurances, we’d have to infringe corporate “liberty.” And that is something we simply cannot allow in the Land of the Free.

From here on out, it’s every patriot for herself.

End Notes

  1. The Framers would have characterized the Electoral College as republican, rather than oligarchic—simply an additional layer of intermediate representation in a representative democracy. That view, it would seem, requires inordinate faith in the integrity of the state legislators who elect the Electors. Perhaps their scheme does bear a passing resemblance to the manner in which some modern European parliaments elect their heads of government, although under the US Constitution, Electors must be private citizens who do not hold any public office.

  2. “Each State shall appoint, in such Manner as the Legislature thereof may direct, a Number of Electors, equal to the whole Number of Senators and Representatives to which the State may be entitled in the Congress…” (Constitution of the United States of America, Article II, Section 1)

  3. Today, as a way of rewarding key operatives who help candidates win state elections, the winning political party in each state appoints “Electors” to formally cast the electoral votes it has won—an entirely honorary and symbolic role.

  4. …although still mathematically possible under certain very specific, and very improbable, circumstances.

  5. In 1790, there were 19,698 people living in Delaware for each electoral vote allocated to the state, vs. 35,600 residents of Virginia for each of its electoral votes.

  6. In 1790, the quartile gap was 8.3 senators to 3.0 senators (out of 30 senators in total).

  7. The Cloture Rule has the effect of raising the number of Senators needed to pass most legislation from a simple majority (51) to 60. Without the consent of 60 senators, most matters can be filibustered and never even brought up for a vote.

  8. Some will react to the title of this essay by asking, “When was the United States ever not lawless?” Their point is well-taken. There is no doubt that, in many respects, our courts and systems of justice are exemplary. In countless cases, they yield outcomes that are fair, even-handed, and just. And, in far too many cases, the legal process yields outcomes that are extremely unjust. It’s both/and, not either/or. Therefore, with gratitude for the lawfulness and justice that does exist in our system, it is fair to ask: When ever has there not been selective prosecution? When ever have “white” people (in aggregate) and people of color (in aggregate) not had entirely different experiences with law enforcement and the criminal justice system? When ever have our civil courts not been a sanctioned forum for the expression of our most primal instincts to inflict maximum pain on our perceived “enemies”—even when we know that our revenge is far more severe than and totally out of proportion to the injury we have suffered? When ever has our legal system not been tainted by corruption? When ever has there been equal justice under law? The parentheses around the word “New” are intended as an acknowledgment of those realities. And yet, what is unfolding now is “next-level” lawlessness.

The Lies that Will Not Die

By Chris Ricchetti | October 29, 2021

As an eighteen-year-old college freshman, I knew intuitively that much of what I was being taught about how the economy works was flawed, distorted, limited in its explanatory power, entirely uninformed by human psychology, or patently false. It was the Reagan Era—“morning in America”—and most of the conservative-minded Economics Department faculty were amused by my frequent dissent. Figuratively, they would pat me on the shoulder and tell me, “Nice try, kid.”

In graduate business school, certain Nobel laureates politely dismissed my “socialist” thinking. At the University of Chicago, in the early nineties, faith in the power of the Free Market to usher in the Kingdom of Heaven was delivered with quasi-religious zeal. In every community, no matter how enlightened, there are certain things one simply does not question.

Now, after twenty-five years of working intimately with wealthy families, private business owners, and public company executives, here’s what I know about the Lies that Will Not Die:

➤ Broad-based consumer (i.e. middle-class) spending is the primary driver of economic growth and, therefore, employment (i.e. jobs).

➤ When a recession is deep enough that consumers are unable or unwilling to spend, government spending is the only evidence-based remedy for stimulating economic growth. Apart from the Federal government’s capacity for massive near-term spending, there is no “market-based” solution to this cyclical problem. It is unconscionable that the Congress is unfailingly abusive in deciding how to allocate the necessary spending, but even misguided, corrupt, and wasteful outlays are stimulative. Imagine how much more effective stimulus spending could be, if lawmakers pursued the greater good, rather than their own self-interest.

➤ Wealthy investors and business owners do not constitute a special class of “job creators.” They create jobs only when increasing demand for their products and services (which they largely do not control) dictates that more bodies are needed. Furthermore, firms have a strong tendency to resist hiring new workers, until it is absolutely necessary for them to do so.

➤ Taxing the rich—in fact soaking the rich, and I’m not saying we should do that, but there is abundant evidence from 1932-1981 to support this—has, at best, a modest effect on the amount of capital they invest in new or expanding businesses, and no effect whatsoever on their level of personal consumption.

➤ Increasing tax rates on the earnings of existing medium-sized and large businesses does not reduce the incentive for entrepreneurs to start new businesses (it turns out that entrepreneurs are motivated by several other things besides money—who knew?!). It does not even discourage existing firms from making new capital investments, from entering new markets, or from scaling up profitable business units.

➤ Yes, fairer (i.e. higher) business tax rates will send jobs overseas—but only because current tax policy makes it so ridiculously easy for American companies to reduce their US tax liability by going offshore. There must be non-negotiable responsibilities associated with the right to sell in US markets, including some level of taxation on global sales. Obviously, the answer is to close the loopholes, not to dispense with taxation.

Small businesses and startups should be taxed based upon a completely different set of principles that accounts for their vulnerabilities and the effects that tax policy does have on their business incentives and behavior.

➤ The preferential tax treatment afforded to capital gains, ordinary dividends, carried interest, and inherited capital assets does nothing to promote the general welfare and, in fact, is detrimental because it perpetuates ever-widening wealth inequality.

Undertaxing wealthy individuals and corporations does not incentivize anything that enhances the public good. The macroeconomic relationships postulated by supply-side economics have been shown to be tenuous at best (Arthur Laffer’s formulation of his own namesake curve and his positioning of the US tax system on it were both way off), and the “trickle-down” theory was always a fantasy. Economic growth is, first and foremost, about Aggregate Demand.

There’s not much debate about any of this anymore. Everyone knows these things are true. And yet, under Republicans and Democrats alike, US fiscal and tax policy continues to be heavily influenced by the Lies that Will Not Die.

Venture capitalist Nick Hanauer (2012) says it beautifully…

If lower income tax rates for the wealthy really worked we would be drowning in jobs…

His short, controversial 2012 TED Talk is a must-watch.

So Long, Tribe

By Chris Ricchetti | 3 October 2021

The End of an Era

The Cleveland Indians have played their last game as the Indians—a 6-0 shutout of the Texas Rangers today at Globe Life Field in Arlington.

The Tribe’s last-ever home game was an 8-3 win against the Kansas City Royals, on September 27, in which Cleveland center fielder Bradley Zimmer homered off his brother, Kansas City reliever Kyle Zimmer, to lead off the bottom of the 8th.

I have lived in Chicago since coming here for undergrad in 1985. But I grew up in suburban Cleveland and, though I raised a son here and have since become a passionate White Sox fan, I have never lost my deeply felt love for my boyhood team, the Cleveland Indians.

My father’s father emigrated to the United States from southern Italy and settled in Cleveland in 1919—four years after the ballclub elected to call itself the Indians. The team and its name have meant something now to four generations of Ricchettis, including some who have never lived anywhere near The Land. I carry with me many cherished memories of outings to Cleveland Municipal Stadium, and later, to “the Jake,” with my father, middle school, high school and college buddies, and my extended family.

I heartily support the name change and accept that it is long past time to move on from imagery and nomenclature that have been harmful. Whether or not and to what extent the harm was intended is not the point. Harm is harm.

Nonetheless, I am feeling sad and nostalgic today, as I watch the Indians Era come to a close.

A Club by Any Other Name

The Cleveland franchise dates back to 1901, when the American League, hitherto a minor league, declared itself a major. The minor-league forerunner to the 1901 ballclub had competed in the league, making Cleveland one of the eight charter members of the “upgraded” American League.[1]

In its early years, the team experimented with several monikers, starting with the “Bluebirds,” often shortened to the “Blues.” The players disliked the name and tried, unsuccessfully, to change it to the “Bronchos.” Inexplicably, some sportswriters continued to use the extremely unpopular name “Spiders” for several more years after the formation of the new major league franchise (see below, and Endnote 1).

In 1902, Napoleon “Nap” Lajoie, star second baseman with the Philadelphia Phillies, defected to the new American League, playing briefly for Connie Mack’s Philadelphia Athletics. But early in the season, he moved over to the Cleveland ballclub, lured by a three-year contract for $25,000—more than double what the Athletics were paying.

Nap was an immediate hit with Cleveland fans, and it wasn’t long before the team was renamed the “Naps.” In 1905, he became the club’s player-manager. The team struggled in the late aughts and early 1910s, leading some reporters to refer to them as the “Napkins.”

Napoleon “Nap” Lajoie (Baseball Hall of Fame)

Between 1912 and 1914, the team was known (unofficially) to some as the “Molly Maguires,” a reference to a group of Irish-American immigrants prone to violent retaliation against their employers over exploitative and dangerous working conditions. Whoever invoked the “Molly Maguires” as an alternative to the “Naps” must have been “trolling” club co-owner Charley Somers, who had made his fortune in the coal business—the industry in which the majority of real-life Molly Maguires unhappily labored.

After the 1914 season, Lajoie, very much past his prime, returned to the Athletics, precipitating the search for a new team name. With input from sportswriters, the team was renamed the “Indians” in 1915.

The Controversy

Baseball historians and fans have long debated whether the Indians were so named, at least in part, as a tribute to Louis Sockalexis, a Native American who played the entirety of his brief major league career (1897-1899) as an outfielder for the Cleveland Spiders—a National League team that found itself no longer able to compete at the major league level following a dismal 1899 season.[2] Sockalexis, a member of the Penobscot Nation, was among the first Native Americans (many believe he was the first) to play major league baseball.

Louis Sockalexis (American Indian Magazine)
St. Paul Globe, March 11, 1897: report of Sockalexis signing with Cleveland

For decades, the Cleveland Indians organization propagated the narrative that the team’s name was meant to honor Sockalexis, who, they insisted, was a “fan favorite.” It is true that, during his time with the Spiders, reporters and fans—with the encouragement of the club’s owners—often referred to the team as “Tebeau’s Indians,” purportedly in deference to both player-manager Oliver “Patsy” Tebeau and Sockalexis.

Skeptics have argued that because so many white people looked down upon Native Americans, it’s implausible that white owners of the early twentieth century would have named their team in honor of one. In a 2007 blog post, former Sports Illustrated writer and Cleveland native Joe Posnanski wonders, “Why exactly would people in Cleveland—this in a time when Native Americans were generally viewed as subhuman in America—name their team after a relatively minor and certainly troubled outfielder?”

Reporting on the name change in 1915, a writer for Cleveland newspaper The Plain Dealer opines that the name “also serves to revive the memory of a single great player who has been gathered to his fathers in the happy hunting grounds of the Abenakis,” perhaps reflecting both appreciation for Sockalexis’ athletic talent and insensitivity toward his indigenous heritage.[3]

NYU Professor Emeritus of Education and History Jonathan Zimmerman contends that, far from being a player beloved by fans, Sockalexis was the player that fans quite literally loved to hate. According to Zimmerman, the Indians moniker was intended not to honor Sockalexis, but to mock him. During his short stint in major league baseball, he endured constant taunts—frequently, but by no means exclusively—from opposing-team fans, for whom abusing Sockalexis apparently was an integral part of the “fun” of rooting against the Spiders. References to the “Cleveland Indians,” Zimmerman asserts, were intentionally sarcastic and demeaning.

Ed Rice, author of the Sockalexis biography, Baseball’s First Indian, agrees: “They called [the Cleveland Spiders] ‘Tebeau’s Indians.’ But it wasn’t meant to be flattering, of course. It was meant to make fun of the spectacle that Cleveland was going to be in 1897, putting an American Indian on the field.”

To muddy the waters further—because, why not?!—the Cleveland Spiders were sometimes referred to as “Tebeau’s Indians” and “Tebeau’s Braves” well before the club signed Sockalexis.

Nashville Tennessean, October 3, 1895
Baltimore Sun, February 23, 1897

Moreover, a bunch of Cleveland players and managers have been referred to as “Chief” or “Chief Wahoo,” both before and after the 1915 name change. And the “Chief (manager) / Indians (players)” metaphor has been used in reference to many teams, and may be as old as baseball itself.

Apart from any historical connection to Sockalexis, the name “Indians” may have appealed to white baseball fans of the time because it conveyed the supposed ferocity of a group that many regarded as “savages.” Shortly after the name change was announced, on January 17, 1915, the Cleveland Leader published this commentary: “In place of the Naps, we’ll have the Indians, on the warpath all the time, and eager for scalps to dangle at their belts.”

That same day, The Plain Dealer published a cartoon loaded with stereotypes and racist tropes, captioned “Ki Yi Waugh Woop! They’re Indians!”

The Plain Dealer

Beneath the cartoon, the paper reported the decision of the name selection committee convened by co-owner Charley Somers to solicit the input of sportswriters from Cleveland’s four[4] daily newspapers: “The title of ‘Indians’ was their choice, it having been one of the names applied to the old National League club of Cleveland many years ago.” Notably, the name was not intended to be permanent. The writer continues:

The nickname, however, is but temporarily bestowed, as the club may so conduct itself during the present season as to earn some other cognomen which may be more appropriate. The choice of a name that would be significant just now was rather difficult with the club itself anchored in last place.

Perhaps the name was chosen to take advantage of the excitement surrounding the 1914 “Miracle Braves” of Boston, who had come from last place in midseason to win the National League Pennant. Perhaps the name “Indians” could replicate for Cleveland the “magic” of the Boston club’s sanitized Native American ethos (see comments over the phallus in the center of the cartoon above).

According to sport sociologist and Ithaca College Professor of Sports Media Ellen Staurowsky, there were no references to Sockalexis in any accounts of the name selection process published in any of the four Cleveland newspapers—compelling evidence that the choice of the name “Indians” in January 1915 was not a direct reference to Sockalexis. In a 1998 scholarly article on the subject, Staurowsky writes:

As seen in the 1915 accounts, when the team faced the mammoth task of moving out of the basement in league standings while forging a new identity, there was no need to mention Sockalexis because it was the generic, plural "Indians" signifier that provided the marketing angle club President Charley Somers and the sportswriters sought.

However, use of the moniker in connection with the Cleveland Spiders, some eighteen years earlier, had been directly referential to Sockalexis, as evidenced by dozens of contemporaneous sources referencing “Indians” or “Tebeau’s Indians.” This one, about the newly-signed outfielder’s arrival in Cleveland, is from the March 27, 1897 issue of Sporting Life:

Sockalexis, the Indian, came to town on Friday, and in 24 hours was the most popular man about the Kennard House, where he is stopping... Why he has not been snatched up by some League club looking for a sensational player is beyond my comprehension... They're Indians now.

Perhaps something like the transitive property of equality (if A=B and B=C, then A=C) is applicable here:

➤ The 1897 Spiders were called Indians because of Sockalexis,

and

➤ The name Indians was chosen in 1915 because of the 1897 Spiders.

Therefore,

➤ The 1915 Indians were so named (indirectly) because of Sockalexis.

Cleveland-based sports historian Morris Eckhouse seems to agree: “Without Sockalexis, it’s unlikely the team would be called the Cleveland Indians.”

Of course, this tidy simplification leaves unresolved the question of why Sockalexis’ “Indian” heritage was evoked as a nickname for the Cleveland Spiders—was it out of disdain for him, or in celebration of his remarkable skill as an outfielder and as a hitter, or a confounding mixture of attitudes and beliefs that were characteristic of the time?

If, over these many years, anyone associated with Cleveland baseball—from owners, to managers and coaches, to players, to fans—has had any heartfelt intent to bestow honor upon Native Americans as a group and/or upon any specific Native American, or to empathize with their actual lived experience, it seems clear that none of us have done so very well.

Chief Wahoo

A precursor to the Native American caricature that came to represent the Cleveland Indians first appeared in 1932, on the front page of The Plain Dealer. For years thereafter, the “Little Indian,” as he came to be known, made regular appearances in the newspaper’s sports section, drawing readers’ attention to the latest Cleveland baseball news.

The first version actually commissioned by the Indians ballclub was designed by seventeen-year-old Walter Goldbach in 1947. The logo continued to evolve, culminating in the 1951 redesign that remained (with periodic minor design changes) until it was abandoned altogether after the 2018 season.

The Plain Dealer, May 3, 1932
Walter Goldbach’s first Indians logo, 1947
Cleveland Indians logo, 1951
Cleveland Indians logo, 2014-2018

Use of the nicknames “Chief” and “Chief Wahoo” in connection with certain Cleveland players predates the logo by several decades. In 1952, the nickname and the caricature were united, and Chief Wahoo became the official name of the Cleveland Indians mascot.

Some have noted that Chief Wahoo is actually a brave, not a chief, because his head is adorned with a single feather, whereas a chief would have worn a full headdress. Earlier team logos had included the full headdress.

Today, the twenty-eight-foot, neon-illuminated representation of Chief Wahoo stepping into his swing, which for thirty-one seasons (1962-1993) was mounted high above Gate D at Cleveland Municipal Stadium, is on exhibit in the Reinberger Gallery at the Western Reserve Historical Society.

Chief Wahoo exhibit at the Western Reserve Historical Society (photo: Dan Meek via Pinterest)

Wahoo is a switch hitter. He is the same on both sides and, back in the day, he rotated. Depending upon which side of Wahoo you looked at, he would appear to be batting righty or lefty. In his new home at the museum, it seems that he’ll be batting lefty forevermore.

Not All Heroes Are White

It is worth remembering that professional baseball in the 19th and early 20th centuries was an entirely different animal than the orderly, tightly-controlled product we see on our 4K and 8K televisions today.

In Sockalexis’ time, baseball was a rowdy, unsportsmanlike, often lawless, often violent brawl, played mostly by gritty, hardened, working-class immigrants, in which “might made right” and “winning at any cost” were both the expectation and the norm. Bullying, threats, intimidation, bribery, and flagrant physical assault were everyday occurrences. The game was a little cleaner by 1915, but not much.

Few players of that era were “honored” by sportswriters, teammates, or fans in the ways that decades of sports marketing have conditioned us to think that Sockalexis was “honored.” It simply was not part of the zeitgeist. It was raucous, take-no-prisoners entertainment, and the dignity of many was sacrificed in the production of it. In any such environment, people who are seen as “other” inevitably bear the worst of the pain. There is no reason to believe that Sockalexis would have been spared. As a Native American playing major league baseball just seven years after the Massacre at Wounded Knee, he was an American hero, simply for having had the courage to step onto the field.

The Dawn of a New Era

2016 Wild Card Game (screen capture by Ted Berg, USA Today)
If nothing else, we can all understand that lots of different people see lots of different things in images such as this.

The Indians Era has come to an end. Cleveland’s Major League Baseball club will henceforth be known as the “Guardians,” a name inspired by the eight statues (“Guardians of Traffic”) capping the pylons of the Hope Memorial Bridge that spans the Cuyahoga River, leading to the ballpark from the west.

Hope Memorial Bridge, Cleveland (MLB Advanced Media, LP)

Endnotes

  1. The 1901 Cleveland ballclub was an amalgam of two existing Cleveland teams. One of these, the Cleveland Lake Shores, was a minor league club affiliated with the American League, which promoted itself to major league status effective for the 1901 season. Charley Somers, co-owner of the Lake Shores, was a driving force in the early development of the American League. He purchased the Lake Shores ballclub (formerly the Grand Rapids Rustlers) and moved it to Cleveland in anticipation of the American League’s ascension to major league status. American League President Ban Johnson, Somers, and the other AL club owners were determined to break the National League’s near monopoly in professional baseball. The other existing Cleveland team, the Cleveland Spiders, was a major league club that competed in the National League. The Spiders’ roster had been decimated in 1899, when most of their star talent migrated to the National League club in St. Louis at the direction of the Robison brothers—Frank and Stanley—who owned both the Cleveland and the St. Louis National League teams. St. Louis was a larger market, and the Robisons had decided to go “all in” with their St. Louis team. They sold the Spiders’ remaining player contracts and other assets to the Cleveland Lake Shores. The combined club was thus a charter member of the new American League, calling itself the Cleveland “Bluebirds,” or “Blues,” for short.

  2. The Spiders performed miserably in 1899 because owners Frank and Stanley Robison (brothers) had bought the St. Louis Browns out of bankruptcy and transferred most of Cleveland’s star talent—including Cy Young and other eventual Hall-of-Famers—to the St. Louis club, renamed the “Perfectos.”

  3. According to Joe Posnanski and others, this lone sportswriter was the only reporter in any of the Cleveland newspapers to suggest an explicit connection with Sockalexis in the months after the name change was announced, in January 1915. Posnanski claims that Sockalexis was not named in The Plain Dealer a single time during the next ten years.

  4. Cleveland’s four daily newspapers of the time were the Cleveland Leader, the Cleveland News, the Cleveland Press, and The Plain Dealer.

T.J. the Anti-Originalist

By Chris Ricchetti | 16 September 2021

I set out on this ground, which I suppose to be self evident, that the earth belongs in usufruct to the living; that the dead have neither powers nor rights over it. The portion occupied by any individual ceases to be his when himself ceases to be, and reverts to the society.

Thomas Jefferson

Jefferson penned this declaration in Paris in a letter to James Madison on September 6, 1789, in which he examines the question of whether one generation has the right to bind another.

He goes on to say that the society may form rules for the disposition of land at death—to spouse, to children, to legatees, to creditors, etc. But there is no natural (i.e., no moral) right to a decedent’s property—neither creditors nor even family members have any moral claim to it. A legal successor’s interest, if any, is derived solely from the laws of the society of which they are members. It is a “municipal” (i.e., socially constructed) interest only.

Following a lengthy application of this principle to management of the public debt, Jefferson continues…

On similar ground it may be proved that no society can make a perpetual constitution, or even a perpetual law. The earth belongs always to the living generation. They may manage it then, and what proceeds from it, as they please, during their usufruct.

In fact, Jefferson argues that laws should “sunset” automatically, so that inaction on the part of government would not perpetuate the status quo by default: “…a law of limited duration,” he says, “is much more manageable than one which needs a repeal.”

Thomas Jefferson never publicly advocated for an actual sunsetting constitution. Presumably, he regarded the amendability of the U.S. Constitution as a more workable actualization of his usufruct philosophy. But mechanics of revision aside, there is little doubt that Jefferson was, by today’s standards, a radical, extremist anti-originalist, who intended a living, breathing, evolving constitution.

Thoughts on the Twentieth Anniversary of 9/11

Photo by Jin S. Lee
Director of Photography
9/11 Memorial & Museum

By Chris Ricchetti | 11 September 2021

America has been attacked, and it has been changed. This is the first great test of the new century for this nation, for its new president. It’s also a great test for us all—wherever we live, whatever our age, whatever our beliefs.

These were the words of NBC Nightly News anchor, Tom Brokaw, on the evening of September 11, 2001. He could not have known how portentous his words would prove to be.

The months between 9/11 and the end of 2001 were magical. For a fleeting moment, we were unified by the shared sadness of a common tragedy. People across the country were warmer and kinder, more patient, and more respectful. We were less competitive, noticeably less selfish, and more willing to help each other. For a few short months, the small irritations of daily living did not set us off. We simply let them go. Everyone, it seemed, was a little more human—more accepting of our own humanity, and more tolerant of the humanity of others. It was the kind of awakening that only grief can inspire.

As a country, we were unified. And the free world was with us. Only weeks after the Supreme Court had summarily delivered the 2000 election to George W. Bush, even I, for a moment, was willing to set aside the bitterness of electoral defeat and to rally around the President. Scores of Democrats joined with their Republican colleagues in calling for solidarity.

By October, we were in Afghanistan, where the US-led military coalition swiftly dealt a crippling blow to those who had attacked us (al-Qaeda) and toppled the de facto government of Afghanistan that had given them aid (Taliban). The Taliban fled to Pakistan, where they would regain the confidence to fight us again, while the remaining al-Qaeda scattered. We pursued them wherever they congregated, in many other places around the globe.

That wasn’t good enough for the “neo-cons” who postulated that we could re-make Afghanistan in the image of New Jersey. And we naively set out to do just that. It’s a mistake we repeatedly make—believing that it is within our power to control the political aspirations of entire populations.

Then Dick Cheney seized upon the opportunity to realize a whole portfolio of his own, personal ambitions, at the expense of military families and American taxpayers—not to mention hundreds of thousands of Iraqi casualties, and the de-stabilization of the Middle East that has been catastrophic for the entire world. Knowing that Saddam Hussein had ordered the attempt to kill Bush’s father in 1993, it must have been easy for Cheney to sell the invasion of Iraq to the President. Likely, war in Iraq was already on Bush’s agenda, as we know it was on Cheney’s, well before 9/11 provided them a pretext.

The Bush Administration, aided by the CIA, undertook an aggressive campaign to manufacture public support for another war. To achieve this, they deployed an arsenal of bald-faced lies: that Iraq played a role in the 9/11 attacks, that Iraq had a robust chemical and biological weapons program, including large stockpiles of weapons of mass destruction, and that Iraq had procured uranium from Niger and was on the brink of becoming a nuclear power—a rationale for war that the honest-by-nature General Colin Powell had trouble delivering to the UN Security Council with a straight face.

In October 2002, the Congress authorized[1] the President to use military force, if necessary, to compel Iraq’s compliance with its obligations under the cease-fire agreement of 1991 and numerous subsequent UN Security Council resolutions. Bush had “pinky promised” to pursue every possible diplomatic effort first, and to use our military might only as a last resort. Within five months, the full spectrum of diplomatic possibilities had apparently been exhausted, because Bush-Cheney ordered the invasion of Iraq. How dare they destroy so many lives under false pretenses.

In the ten-year period that followed, Halliburton stock performed three times better than the S&P 500.

It took less than a month to oust the Ba’athist regime of Saddam Hussein and to set up a provisional government. But we stuck around for another decade, certain that we could inspire Iraq’s three belligerent ethno-religious groups to set aside ancient hatreds and come together to form their own liberal democracy. Because we are that exceptional. Heck, if Donald Rumsfeld was to be believed, we could even do it on a budget!

Down the rabbit hole we went, in pursuit of a staggeringly expensive conflict[2] that made us less safe, more isolated, and less free.

Bush did not have it in him to meet the challenge of his era—to channel the fleeting spirit of unity, tolerance, and compassion that pervaded after 9/11 toward any positive ends. He squandered the goodwill and good faith that arose from our collective suffering in that magic moment. And we let him. We were all participants in the Grand Distractions that allowed that spirit of brother-and-sisterhood to go back into hibernation.

Since January 2020, we Americans have been dealing with a new collective sadness. COVID-19 has upended our way of life, killed hundreds of thousands of us, and scarred millions more with permanent health problems. This time, no magic moment of unity has arisen from our grief. This time, a global tragedy deepened our national divisions and hastened the unraveling of American society that was already under way. There are many reasons for it and plenty of blame to go around, but what’s different this time is that the nation is in a very different emotional state than it was on 9/11.

The chaos, the over-the-top intensity, and the utter irrationality of our public discourse, the literal insanity of it, our extreme villainization and dehumanization of each other, our intractable stubbornness—all are clear signs that we are acting out of our most primitive emotions. Individually and as a society, we have regressed to the bottom of our brain stems—to the amygdala, where we marinate continuously in fear and rage. We are no longer thinking, and our emotional repertoire is constrained. Hence the emergence of “alternative facts.”

It is well established in the field of neuropsychology that humans are incapable of thinking clearly or cogently in this hyper-aroused state. Nor can we feel the distinctively human emotions of empathy and compassion. In survival mode, the ventromedial prefrontal cortex, the part of the brain that is believed to modulate empathy, defers to the amygdala—essentially going “offline” until the perceived existential threat has passed. Under the constant influence of social media algorithms and other opportunistic media, many of us now spend most of our time “on edge” in basic survival mode. Almost all of us spend far too much time there.

The “Two Americas” hold vastly different views of what America means and what it means to be American. We cherish our respective conceptions of America with the same mix of reverence, tenderness, devotion, and love that we bestow upon our children, our partners, and members of our families. We may articulate “reasons” for loving the people that we love. We may think deeply about how to put our love into practice. We may make cognitive commitments that sustain our cherished connections as emotions inevitably wax and wane. But at their root, these attachments are purely emotional—as is our love for our country.

We are evolutionarily and biologically wired to respond aggressively to anyone we perceive as threatening to someone we cherish. This extends to whatever subgroup of our fellow citizens we perceive to be enemies of the America we love. We are stuck in attack mode, and we are getting stuck-er every day.

The situation does seem hopeless. It is certainly not sustainable. It would be entirely reasonable to conclude that Vladimir Putin and Xi Jinping have won Round 2 of the Cold War by a knockout.

On the other hand, given the primal impulses at work and the media forces keeping the combatants on each side marching in unison, one of the more astonishing features of the era is that we have not already erupted into all-out civil war. Maybe we are a ticking time bomb. Maybe we are not. Time will tell, as time always does.

In the meantime, as Sting so eloquently puts it, “Men go crazy in congregations. They only get better one by one.” Lasting societal progress begins at the individual level. Our hope for salvation lies in improving both our collective capacity for critical thinking and our collective emotional intelligence—one individual at a time.

Reason alone cannot save us. It never has. We have been thinking without drawing upon the powerful perceptions of our emotional brains for most of the time we have existed on this planet. If reason alone could lead us to the Truth, the smartest among us would already have agreed on everything. Disagreements between equally “smart” people generally have their basis in unconscious emotions that filter out certain data, assign significance to other data, and shape the workings of the rational mind. If we were all processing the same information in the same way (i.e., via “pure” thought), we would all come to the same conclusions.

The challenge we face as a nation is fundamentally emotional. There will be no thinking our way out of this. We can try to engage in rational debate. We can pretend that articulating our “reasons” will persuade others to support our positions. But a good-faith exchange of views is not possible when “people like YOU are destroying MY America!” Compromise and working through areas of disagreement are obviously out of the question.

This is why our social media rants never change any minds, and why there is no longer any actual debate in the purported “greatest deliberative body in the world,” the United States Senate.

In the face of a fundamentally emotional problem, greater emotional awareness, maturity, and sophistication are the way through. There is no getting around the fact that a critical number of us must be willing—and make the effort—to learn how better to navigate the landscape of our emotions for there to be any meaningful change on a societal level.

Sing along with me now… “Men go crazy in congregations; they only get better one by one. One by one.”

Endnotes


  1. On October 10, 2002, the Authorization for Use of Military Force Against Iraq Resolution of 2002 passed the House by a vote of 296-133. It passed the Senate by a vote of 77-23, the following day.

  2. Since 9/11, annual “defense” spending has more than doubled. With minimal oversight and much of it unaccounted for, we’ve spent nearly $5 trillion prosecuting the War on Terrorism. All of it—yes, ALL of it—was borrowed money. We owe $2 trillion more in military pension benefits. This gargantuan waste has been a windfall for defense contractors, who collect nearly half of the Pentagon’s budget, year after year. Throughout this period, there have been more contractor employees than military personnel on the ground. In recent years, there have been twice as many. Over the past twenty years, defense stocks, including LMT, BA, GD, RTX, and NOC, have outperformed the S&P 500 by about 60%.

100 Years of MLB Radio

By Chris Ricchetti

August 5, 2021—Today marks the 100th anniversary of the first live broadcast of a major league baseball game over commercial radio.

At 6:00 pm on the evening of November 2, 1920, Pittsburgh-based station KDKA had aired the first commercial radio broadcast—coverage of the returns of the Warren G. Harding-James M. Cox presidential election (the first in which white women were allowed to vote), announced by Leo Henry Rosenberg (1896-1988), to a small audience of shortwave radio listeners. The eighteen-hour broadcast—originating from the top floor of one of the buildings at the two-million-square-foot Main Works of the Westinghouse Electric & Manufacturing Company in East Pittsburgh—lasted until noon the following day. Real-time information about the election was supplied to KDKA via telephone by reporters at the Pittsburgh Post and the Pittsburgh Sun.

Then, on April 11, 1921, KDKA broadcast the first live sporting event—a boxing match between Johnny Ray and Johnny Dundee—originating from Motor Square Garden (Pittsburgh).

These events led up to the first baseball broadcast, on August 5, 1921 from Forbes Field in Pittsburgh. The play-by-play coverage of the game between the Pittsburgh Pirates and the Philadelphia Phillies was voiced by Harold W. Arlin (1895-1986), who had been the primary announcer on KDKA’s evening broadcasts during the previous months. The twenty-five-year-old Arlin announced the game using a repurposed telephone receiver as a microphone, and other makeshift equipment, situated behind home plate. The Pirates—who went on to finish second in the National League standings with a record of 90-63—won the August 5th game by a score of 8-5.

The 1921 Pirates roster included the colorfully named infielders Rabbit Maranville (5’5″, 155#), Cotton Tierney (5’8″, 175#), and Pie Traynor (6′, 170#), and outfielder Possum Whitted (5’8½”, 168#).