Echoes in Cyberspace: A History of Technology and Plagiarism
by Rachel Veroff
—@uncletypewriter on Twitter, responding to Melania Trump’s plagiarism of Michelle Obama at the 2016 Republican National Convention
Earlier this year, New York Daily News writer Shaun King was falsely accused of plagiarism. The accusation sent a shockwave of panic through the writing community. King is a civil rights activist and prolific journalist who built his name on covering sensitive issues like injustice and racial inequality—issues that many journalists avoid for fear of institutional backlash. King is a hero among grassroots writers and writers who care about social progress. The accusation against him was deeply upsetting.
Fortunately, just as fast as King’s critics could pick up their pitchforks, the record was set straight. A glitch in the Daily News’ online CMS (content management system) had caused two paragraphs of block-quoted text to lose their indented formatting. The same glitch caused links to properly cited sources to be dropped. If you’ve ever copy-and-pasted a body of text into WordPress, you know how easily these formatting problems can happen. Then, his editor failed to catch the error before hitting “publish.”
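The failure mode is easy to reproduce: any import filter that discards markup wholesale erases exactly the cues, indentation and links, that distinguish quotation from original prose. Here is a minimal sketch of that mechanism, using a hypothetical tag-stripping filter; the Daily News’ actual CMS behavior is not public, so this is an illustration, not a reconstruction.

```python
import re

def strip_markup(html: str) -> str:
    """Naively strip all HTML tags, the way an overzealous CMS
    import filter might when a story is copy-and-pasted in."""
    return re.sub(r"<[^>]+>", "", html).strip()

# A properly attributed passage: block-quoted text plus a source link.
story = (
    "<p>As one report put it:</p>"
    "<blockquote>Quoted analysis from the original article.</blockquote>"
    '<p>Source: <a href="https://example.com/original">original story</a></p>'
)

flattened = strip_markup(story)
print(flattened)
# The blockquote indentation and the citation URL are both gone;
# the quoted words now read as the writer's own prose.
```

Once the tags are gone, no downstream check can tell the quoted sentence from the writer’s own, which is why only a human editor comparing against the draft could have caught it.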
This series of mistakes was so mind-bogglingly stupid, so heart-wrenchingly understandable, it makes you cringe to think about. The resulting uproar on Twitter was serious enough that the editor lost his job. In times before social media, an honest snafu like this might have been addressed behind closed doors. A query might have been directed privately to the writer or the publication. Instead, King’s accusers incited a public spectacle that could not be ignored.
In the days following the false alarm, Glenn Greenwald chimed in to warn against the dangers of “online mob journalism.” He emphasized the importance of using care and wisdom when dealing with such grave matters as people’s reputations and livelihoods. Plagiarism is not a word to take lightly. It’s a word that strikes fear in the hearts of professional writers in all fields. It’s a word that ruins careers. But this instance is only one in a hailstorm of controversies that have plagued the writing world for ages. The only difference now, in the age of the Internet, is that these controversies are being illuminated in an entirely new way.
A few days later, with the fallout of King’s plagiarism scare still fresh in my mind, I clicked open a review of the movie Nina, the much-criticized biopic about Nina Simone. Normally I only glance through film reviews, but on that day, my attention halted on the first line. The article opened with these words from a poem by Langston Hughes:
What happens to a dream deferred?
Does it dry up
like a raisin in the sun?
Or fester like a sore—
There’s nothing surprising about using this verse to set the tone for a critique of colorism in contemporary American cinema. But one thing made me uneasy: the lines did not appear to be attributed. In fact, the quote was not presented as a quote at all. Instead, the words beamed from my computer screen as if they were the first sentences of the article itself. Even the line breaks had been removed.
The article was smart, insightful, and well-argued, but my attention kept drifting back to that opening line. I thought, I know those words are from a Langston Hughes poem. But is it safe to assume that everyone does? Would my mother recognize those lines? Would my dentist? Would a typical college student? These questions troubled me.
I imagine the writer intended the lines as a salute to the influential jazz poet. And it’s clever: making a sly tip of the hat to the canon shows the writer’s skill. The cadence of the poem lilts easily into the rest of the piece. This is obviously a famous line from American poetry—right? Good writers fudge boundaries like this all the time. I wanted to give the writer the benefit of the doubt.
At the same time, the review would not have suffered at all if it opened with the same words properly cited. Even if poetic license were a valid concern, there are plenty of creative ways to slip nondisruptive citations into a text. In Maggie Nelson’s The Argonauts, for example, references to diverse passages from philosophy and critical theory are noted neatly in the margins. It’s not a standard method for citing sources, but it works for that book, which is a feat of poetic form in its own right.
In the case of this ordinary film review—billowing across cyberspace like a renegade tumbleweed and lodging itself into a forgotten recess of collective consciousness—there is something dangerously cavalier about not even italicizing, not even including a link to the original poem.
The word “plagiarism” comes from the Latin plagiarius, meaning “kidnapper.” The first time it was used to describe intellectual theft was in the first century AD, when the Roman poet Martial accused rival poets of reciting his work in the streets for money. The furious Roman took a stand and delivered an impassioned, lyrical rant against those scoundrels who would dare “kidnap” his words, thus securing his spot in the annals of history forever.
In his rant, Martial explained that his outrage was twofold: first, he was angry that others were earning money from his work. Second, he was personally upset to not receive credit for his literary accomplishments. This was the first recorded instance of copying being likened to robbery. The case was simplistic because Martial’s plagiarists were not particularly clever thieves: they were street performers looking to make a dollar. The notion of copying as theft did not emerge again in popular discourse until many centuries later.
In ancient and medieval Europe, writing was not considered intellectual work. The printing press hadn’t been invented yet, so the task of preparing texts was more tedious than creative. A scribe’s job was to copy documents, functioning as little more than a secretary. Religious papers and other texts did not include writers’ names at all. These documents were circulated, re-hashed, and re-copied repeatedly without much concern for crediting the individuals who penned them.
In terms of literary writing, there were a very small number of revered authors, like Homer and Cicero, whom everybody else studied and transcribed laboriously. Books were rare items, so they were treated reverentially. Being able to read and write at all was a sign of class distinction. “Scholars” were people who had access to books, and “good writing” meant showing a mastery of the ancient classics. Good writing meant, quite simply, skillful reproduction.
The same was true in painting. Aspiring painters spent years—decades, even—in apprenticeship, learning the craft by strictly copying their teachers. And the focus wasn’t on artistic expression: most paintings before the 1400s were commissioned by the Catholic Church. These were masterful homages to religious tradition; they were not trying to be innovative. Painters rarely signed their own work.
The Renaissance changed all that. The Gutenberg press, in particular, led to a flowering in the creative arts, transforming the role of writers in society. Whereas before writers were consumed mostly with the labor of transcribing, now they had technology to do that for them. Writers were free to pursue new ideas, and they found more opportunities to distribute their work.
The concept of authorship came into vogue across all creative fields. Around 1500, Michelangelo carved his name like graffiti onto the Pietà in St. Peter’s Basilica at the Vatican, a momentous act that changed the course of art history forever. The statue was supposed to be for the glory of the church. The sculptor’s genius in interpreting Mary mourning her son after the Crucifixion shouldn’t have mattered—except that suddenly, it did. The Pietà is an incredibly soft, poignant, human statue; it is also the only work that Michelangelo ever signed. The Dark Ages were over, and artists wanted to be recognized.
This shift in how people thought about intellectual merit and originality did not come without complications. For the first time since the squabbling street performers of ancient Rome, writers and artists began accusing each other of copying. To accuse someone of stealing words or ideas became a serious insult, one that could result in lawsuits and duels. To art historians, this flip was fascinating but paradoxical. Before the 1500s, being a clever imitator meant you were a master of your craft—an artist. Now, this same skill could get you called a fraud.
Shakespeare and his peers brought the word “plagiarism” into the popular lexicon when they started bandying it about in their fashionable battles of wit. Of course, it’s common knowledge that Shakespeare stole most of his historical plots directly from Holinshed. He even borrowed the word “plagiarism” from Martial’s ancient Latin, along with many other words from unlikely sources.
Shakespeare is not the only famous writer to get caught mining other people’s works for material. Accusations of plagiarism have been so rampant throughout history that it would be impossible to provide a comprehensive list of offenders. Oscar Wilde; T.S. Eliot; Martin Luther King, Jr.; J.K. Rowling; Jane Goodall; Graham Swift; Alex Haley; and Led Zeppelin have all faced scrutiny at some point in their careers for accusations of idea-theft. Oscar Wilde was so well-known for borrowing from his peers that he once remarked to his friend, the painter James Whistler, “I wish I’d said that, James,” to which Whistler replied, “Don’t worry, Oscar, you will.”
These writers’ talents transcended their transgressions. Wilde may have had a tendency to ventriloquize, but he was witty and prolific enough to maintain his reputation. And while T.S. Eliot’s “The Waste Land” is in large part a pastiche of lesser-known poets, the reading public has largely chosen to read its borrowings as allusions or homages to earlier works—not dishonesty.
“I can sum my thoughts up on this in two lines,” the novelist Julian Barnes once wrote on the topic of plagiarism. “When Brahms wrote his first symphony, he was accused of having used a big theme from Beethoven’s Ninth. His reply was that any fool could see that.”
In Arne Birkenstock’s quirky 2014 documentary Beltracchi: The Art of Forgery, notorious German art forger Wolfgang Beltracchi admits to having made a fortune reproducing paintings by Max Ernst, Rembrandt, and Kees van Dongen, at least 300 of which are still in circulation as authentic artworks. He even baked his canvases in the oven and sprinkled them with dust to add to the illusion of antiquity. Beltracchi was arrested in 2010 and served jail time for his deceptions, but in the film he comes across as a compelling criminal: brilliant, bohemian, and likeable. You can’t help but appreciate his attention to detail, the ambition and creativity with which he duped art historians and elite buyers. His con was so spectacular and well-executed, it left many viewers with a distinct sense of admiration. In an age where digital reproduction has become the norm, the Beltracchi case shows us that a painstakingly forged copy can itself take on an aura of authenticity.
Stealing can be a kind of art—at least, it takes a real artist to get away with breaking the rules. As Eliot once said, “Immature poets imitate, mature poets steal; bad poets deface what they take, and good poets make it into something better, or something different.” In other words: all culture is plagiarism. Art mimics life and other art. Repeating an idea in a new context is a way of interacting with it, engaging with it. Art is a conversation.
Actors are called artists when they disappear into their roles: Kevin Spacey’s King Lear isn’t great because he’s Kevin Spacey. It’s great because he is King Lear. And in classical music, the musician’s role is to translate notes on the page into sound precisely as the composer intended. The point is not the special individuality of the performer. In fact, the real artistry lies in the performer’s technical ability to erase all signs of herself, leaving only the music. The place for originality in such work lies only in the smallest subtleties of interpretation. Is writing all that different? When you get down to it, isn’t the “creativity” in writing really about the skill involved in conjuring truths that readers are already intimately—surprisingly—familiar with?
Plagiarism sometimes happens at the professional level because of external pressures like deadlines. A writer who is dealing with a too-large workload, or racing against the clock to turn in an assignment, is more likely to make a slip. This can be as innocent as mis-transcribing notes from an interview. The original source of a phrase or argument can get erased in the scramble to type up a story.
Consider the case of 31-year-old New Yorker staff writer Jonah Lehrer, who was forced to resign after allegations surfaced that he had consistently recycled his own material for new stories. This sparked debates about whether or not self-plagiarism is actually dishonest. What could have possibly compelled a rising young writer with a bright future at The New Yorker to commit such a grave, icky mistake?
For Lehrer, it was the pressure to meet the expectations of the job. Twenty-seven-year-old reporter Jayson Blair cited the same reason when, in 2003, an investigation into the integrity of his work at The New York Times exposed instances of fraud in 36 of the 73 articles he had written over the course of four years at the newspaper. In addition to copying, Blair fabricated quotes and facts on numerous occasions. He claimed to have traveled to other cities for stories when he had in fact never made the trips. He went so far as to falsify receipts for expense reports to corroborate his fabrications.
It also came to light that Blair had a tendency to sensationalize the details of stories, to make them more poignant. He even won an award for a story about a high school student who died of a cocaine overdose. When further investigation revealed that the student had actually died of a heart ailment, Blair’s reputation was forever tarnished. For the sake of what? A prize? A congratulations?
At its murky depths, there is a much darker reason behind why many writers fabricate and plagiarize. The pressure to meet expectations from a job or employer is one thing, but the pressure writers put on themselves to produce good work is another. Most writers grapple at some point with the ghost in the back of the mind that whispers I want to be good enough. Please let me be good enough. This is the most disconcerting explanation that serious plagiarists give when they’re caught lying, and ultimately it is the least forgivable. These authors were not able to produce work consistently at a high level without cheating—so they cheated.
In the 1990s, Stephen Glass, a 25-year-old associate editor at The New Republic, was caught committing his own kind of forgery: he had been making up stories for the magazine and selling them as true. The scope of his orchestrations—he invented elaborate scenes that never happened, sources he never spoke to, and names of companies that didn’t exist—was shocking and in itself strangely impressive.
During his two-and-a-half year tenure at the magazine, Glass wrote 41 stories, 27 of which were later proven to be utter lies—masterfully constructed, beautifully written fiction. The scandal nearly destroyed the magazine, and Glass’s friends and mentors in the industry were left feeling betrayed and manipulated.
Vanity Fair published a riveting, in-depth exposé on Glass’s fabulism, describing how he faked notes from imaginary events to corroborate his fictions. He faked seating diagrams for closed-door meetings of Washington officials that never happened. He even inserted spelling mistakes and other errors into his stories so fact checkers would catch them and feel satisfied. He dodged other reporters’ queries about his stories by enlarging his fabrications.
Obviously, Glass wasn’t performing all this theater because he was a lazy writer. His provocative and colorful stories catapulted him into journalistic celebrity in a very short amount of time. He seemed to have been driven by an impulse to make the stories better than they really were. He was desperately eager to please his colleagues. But as soon as people caught on to how mind-bogglingly wide his web of lies was spun, his reputation plummeted just as quickly as it had risen.
Eighteen years later, Glass’s career still hasn’t recovered. He went to law school and passed the New York state bar exam, but was denied admission to the bar because of his past misconduct. He now holds a non-lawyer staff position at a firm in California. Glass’s former friends and editors at The New Republic have commented that they still feel oddly unsettled by his mind games, and many have declined to vouch for his character in the moral character evaluations that would allow him to practice law in the state of California.
The line between embellishment and fraud is a fine one. H.L. Mencken, A.J. Liebling, and Joseph Mitchell all had a penchant for marbling their reportage with literary flair. Mitchell especially, best known for his work at The New Yorker during the 1940s, ’50s, and ’60s, had a knack for writing composite characters who were just too good to be true. These characters evoked the vivid personalities of New York’s marginal neighborhoods—the South Street Seaport fish markets and Chinatown mob dens—better than anything straightforward reporting could accomplish.
Today, Mitchell is understood to have been one of the inventors of literary journalism. His stories wouldn’t pass the strict fact-checking standards of most newspapers, but, for some reason, his writing is still widely circulated and cherished. His readers all seem willing to accept the particular blurring of fact and fiction that characterizes his style.
In a recent book, Man in Profile: Joseph Mitchell of The New Yorker, biographer Thomas Kunkel pores over the magazine’s archives in an attempt to explain why Mitchell’s unique brand of fabulism passes where others have failed. It turns out that New Yorker founding editor Harold Ross, the man who essentially invented fact checking, was well aware of Mitchell’s stretches of truth. Ross encouraged him. Ross often gave Mitchell notes on his articles that drew out the creative, invigorating stories Mitchell is now known for.
One reason that we forgive Mitchell his departures from fact could be the subject matter of his work. Mitchell wrote about eccentric people on the fringes of New York society: barkeeps, bohemians, and prostitutes. Mitchell genuinely loved the real-life people who informed his stories. He could often be spotted walking in the fish markets and interacting with the diverse characters he wrote about. He befriended many of them, and his approach to investigating their lives and personalities was not exploitative. Rather, like a novelist, he took care to represent his sources in a sympathetic, human light. However, Kunkel’s research suggests that Mitchell might have invented more of his journalism than most of us realize.
A big part of Mitchell’s appeal lay in the supposed realism of his work. So in 1948, when he revealed that one of his most popular characters, a fishmonger named Mr. Flood, did not exist, he lost a few fans. People reported feeling betrayed. A story they loved—and believed to be true—turned out to be a manipulation of the truth. A more recent example of this particular kind of genre-confusion is James Frey’s alleged “memoir,” A Million Little Pieces, which should have been marketed as fiction to begin with. Random House, the book’s publisher, was eventually forced to offer refunds to readers who felt they had been defrauded by the book’s false claims.
It says much about the American reading public today that we are so intensely hungry for true stories. In an era obsessed with reality television and celebrity gossip, if Frey hadn’t pretended A Million Little Pieces was true, it might never have found a publisher at all.
In contemporary poetry, Kenneth Goldsmith’s work pushes the boundaries of an opposite extreme: intentional plagiarism. Goldsmith calls his peculiar brand of borrowing “uncreative writing.” For example, his book Day is a typed copy of the September 1, 2000, edition of The New York Times. The book is 836 pages long, and it took him a year to type; long stretches of it consist of stock listings and financial tables.
“It’s a great book, and I didn’t write any of it,” Goldsmith said in an interview with The New Yorker. “When you take a newspaper and reframe it as a book, you get pathos and tragedy and stories of love.” Elsewhere, Goldsmith has called himself “the most boring writer who ever lived.”
The concept of “uncreative writing” is more compelling than the actual texts it produces. Probably no one has sat down and read Day from cover to cover. Nevertheless, Goldsmith is the Museum of Modern Art’s first poet laureate, and he teaches writing at the University of Pennsylvania. His course requires students to appropriate texts they didn’t write and present them as their own. For one entire semester, students are penalized for self-expression, originality, sincerity, and creativity. Surprisingly, many students say they finish the class feeling invigorated and inspired.
When a writer is forced to fill a page by copy-and-pasting other texts, self-expression still finds ways to break free. The curatorial process itself is creative, especially for college students who have spent the majority of their writing lives patchworking words and ideas on the sly; the exercise in intentional plagiarism forces them to consider the biases and passions that guide their choices in what to steal. The most creative students end up being the ones who choose the most wisely—the ones who re-frame their lifted work with implications and skill.
Goldsmith claims that uncreative writing is in many ways a reaction to the Internet. Writers are no longer individuals struggling with their imaginations to produce original work. Tons and tons of original work is readily available online. In the digital age, our task now is to navigate all the information, to make meaning out of context more so than content. The writer’s role now is almost surgical: re-typing, re-casting, skimming, deleting, re-Tweeting, bookmarking, archiving, assembling.
Literary critic Marjorie Perloff calls this new trend in writing “unoriginal genius.” She argues that today’s writer is more of a programmer of information than an isolated, romantic figure. She coined the phrase “moving information” to refer to this new act of pushing information around, as well as being emotionally moved by that process. Thanks to technology, we might very well be on the brink of a new era of writing altogether, one where plagiarism, as in past eras, is readily present, accepted—even encouraged.
In 2010, The New York Times reported that plagiarism is on the rise at universities. Students often claim they “did not know” it’s unacceptable to lift entire passages from Wikipedia. These students assume that because information on Wikipedia is in the public domain, it doesn’t belong to anyone, so copying isn’t stealing. No matter what your take on the importance of originality, such lack of concern for where ideas come from is disconcerting.
Cases like these are a result of bad training in ethical versus unethical writing. The film critic who began her review with a few verses of unattributed poetry also falls into this category: she should have known better. The editor who approved her article should have known better. Such an infraction may seem minor, but it paves the way for more serious violations to occur.
The Internet is slowly erasing the stigma that once surrounded plagiarism. Paradoxically, it may also be the best thing that ever happened to ethics in writing. Instances of falsification and copying come to light much more quickly online than they would in print. The transparency and immediacy of discourse on the Web means more accountability, and Web-linking makes it easier than ever to cite sources. Thanks to the democratic nature of public forums like Twitter, where any discerning reader can cry foul and provoke an immediate discussion, it would be almost impossible for a serial plagiarist to last as long today as Glass or Blair did only 15 years ago.
Consider that the student writers who misunderstand the nuances of plagiarism today are of the same generation that grew up with pirated music, file-sharing, and open-source software. Take 17-year-old German author Helene Hegemann: her 2010 debut novel Axolotl Roadkill—about the Berlin drug and music underground—contains pages lifted from lesser-known Berlin writers. In the face of accusations, the teen writer released the following bold statement: “There is no such thing as originality anyway, just authenticity.”
Hegemann apologized publicly for not being more open about her sources, but the scandal did not prevent her from being named a finalist for the $20,000 Leipzig Book Fair Prize that year. The judges apparently saw something worthwhile in her style of mixing source materials in the same way the DJs in her novel sample each other’s work. If you follow this line of thinking to its end, you might find yourself wondering if the concept of proprietary authorship won’t become entirely obsolete in the future. As Roland Barthes predicted way back in 1967, l’auteur est mort—“the author is dead.”
The notion of a text, image, or lyric as singular almost disintegrates online. Sound bites ricochet across the Web and echo inside our brains ad infinitum. It’s as if we’re transitioning into an era of real-time “hive mind,” where the most creative work that artists and DJs are putting out consists mostly of remixing earlier influences, over and over. This kind of hyper-referentiality in creative content is in fact deeply rooted in the technologies we’re using today. But there is an even more powerful undercurrent at work here, too.
The Internet may be the next frontier for high-speed information sharing, but our analog brains haven’t quite caught up. We’re living in a moment of history where it is surely more important than ever for writers and editors to fine-tune our instincts of right and wrong, but we are struggling. If we’re not careful, we run the risk of erasing integral elements of the very ideas we’re trying to spread: their origins, meaning, and value.
Whether it’s done carelessly or maliciously, erasing people’s names from their work is a powerful act. It is no coincidence that T.S. Eliot, Oscar Wilde, Shakespeare, and other examples of successful plagiarists in history are mostly all white men. Kenneth Goldsmith only really gets away with calling his work “art” because he has the privilege of working inside the status quo. Meanwhile, women, minorities, and other marginalized communities continue to struggle for equal footing at major media companies, often receiving the short end of the stick in deals and contracts that dictate what will happen with their work.
The responsibility of editors and writers is to proceed with thoughtfulness, humility, and care. Our responsibility is to uphold a diversity of voices, whether we’re splicing jazz poems into our essays or mixing our friends’ ideas into our fiction. There is a right way to do it and then there is the lazy, clumsy way, a way that presents serious dangers.
Social media plays a valuable role here, too. While false statements can sometimes spread like wildfire on social media, the truth has a way of emerging to douse the flames. In scandals like allegations of plagiarism, the truth usually surfaces very quickly: the hive mind is on it. This is an incredible process to watch in action. The literate Web is truly democratic consensus at work, shuffling the best ideas to the top of the conversation based on shares, likes, and re-Tweets alone. Social media, when used properly, has the potential to restore faith in humanity.
Especially in cases like what happened to Shaun King earlier this year, where it was very likely his political enemies who made the most noise trying to tarnish his reputation, forums like Twitter become a real-time lifeline. Twitter is the main reason King was vindicated so quickly, albeit only after a great deal of panic and turmoil. It’s also the reason I was able to reach out to the author of the previously mentioned film review discreetly, without embarrassing her.
Today, we have Melania Trump’s embarrassing debacle to thank for the reminder that stealing someone else’s words is wrong. Tomorrow, the reminder will come from somewhere else. It is up to all of us to uphold ethical standards in our work. Whether it be the journalist hot on the trail of uncovering facts or the poet whose task is to pursue the higher truths, truth is the underlying intent behind all good writing. Let’s do what we can to honor that.