Category Archives: Reading-Response Blog Posts

The Cyclical Nature of Novels and Culture

In his essay “Graphs, Maps, Trees,” Franco Moretti concludes that, because the graphs of literature and genres are cyclical, there is never a definite winner among the trends that occur in writing over time. Debating the question of male or female dominance of the British novel, he states:

“…No victory is ever definitive, neither men nor women writers ‘occupy’ the British novel once and for all, and the form oscillates back and forth between the two groups.”

This conclusion got me thinking. He makes a very valid point: trends involving novels tend to be very cyclical. Thinking about this, I realized that not only novels but our culture as a whole can move in a sort of cycle. Sometimes I talk to my mother about clothes and what is, for lack of a better phrase, “in” or “out” at the time. I tell her that guys now wear shorts that sit just above their knees and sunglasses with larger lenses (such as Aviators), whereas girls have been wearing higher-waisted shorts and pants in the summer and high boots in the colder times of the year. After she processes everything, she almost always says, “Those types of clothes have made a comeback?! I remember your father and I always wearing those types of clothes when we were together back in high school and college.” Even music nowadays tends to follow old-school rules, with artists such as Justin Timberlake creating jazzy, retro beats and even setting up his concert stages to evoke the classiness of earlier decades. There has never been a single culture or trend that has thrived and dominated America, but rather several cultures that come and go and then repeat themselves in later years.
In regard to what Moretti states, I think he is correct when he says that novels, and human culture more broadly, have a cyclical nature.

How Big Events Shape the Novel World


In his essay “Graphs,” Moretti asks, “What would happen if literary historians, too, decided to shift their gaze from the extraordinary to the everyday, from exceptional events to the large mass of facts?” He shows how this is possible by using charts to display the rise and fall of novel production in Britain, Italy, France, India, Spain, and Japan from the 1700s to the 1900s. After correlating these trends with external factors of all magnitudes, he suggests an interesting theory to explain the volatility of the novel world.

Using data from numerous scholars, Moretti shows that British novelistic genres between 1740 and 1900 were segmented (page 17, Figure 9), and that the decline of one genre always coincided with the rise of another. He identifies five big shifts in the novelistic field during this time frame, which leads him to theorize that these shifts were caused by the birth of new generations that differed significantly from the preexisting ones. To address the question, “But since people are born every day, not every twenty-five years, on what basis can the biological continuum be segmented into discrete units?” he states that the birth of these generations was triggered by large-scale external events, such as war or natural disaster. For example, harsher living conditions in 19th-century Britain created a generation that would find Gothic literature more appealing than epistolary literature because of its darker subject matter, which would explain its rise coinciding with the decline of epistolary novels.

There is a saying that we are a summation of our experiences. Moretti understood this and applied it to explain that these trends were being shaped by external influences applied across an entire country, resulting in new generations of people with personalities distinct from those of former ones.

Quantitative Research and its Misconceptions

In this week’s reading, author Franco Moretti examines trends in novelistic literature over a span of several decades and argues that literary history is defined by data sets rather than by individual works. He states that:

Quantitative research provides… data, not interpretation. Quantitative data can tell us when Britain produced one new novel per month, or week, or day, or hour for that matter, but where… and why–is something that must be decided on a different basis.

While I agree that quantitative research provides data, such data sets are capable of exhibiting a limited form of interpretation. The collection of large data sets can be passed off to outside viewers as completely unbiased, but for the most part, data mining does not exist for the sake of mining data; the motivation is almost never that circular. Big data therefore presents a watered-down interpretation of a subject through both the data it provides and the data that has been purposefully omitted from it. Applying a more concrete argument to a data set is essential to strengthening the claims made by both, but the fact is that data sets are assembled for specific, situational purposes, and therefore carry within them implicit arguments to be deciphered by the viewer or reader.

Genres, Generation Cycle

The book ‘Graphs, Maps, Trees’ by Franco Moretti explores the cultural influences on novels, emphasizing the effect various aspects of culture have on the novels being read at a given point in time. Moretti claims in the book that the regular rhythmic decline of novelistic genres after 25 or 30 years is caused by ‘generations’. Moretti writes, “but (almost) all genres active at any time seem to arise and disappear together according to some hidden rhythm.” Figure 9 of the book shows that certain genres last for some years before a new set of genres replaces them. He also states, “this, then, is where those 25 – 30 years come from: generations.”


Moretti argues that if one genre replaces another, there can be a reasonable internal cause, but when several genres vanish collectively from the literary field and a different, unrelated group of genres enters, the change must relate to a change in audience. This is very plausible, because only something external to the genres could cause such a swap. It would be rather strange for a group of genres to disappear randomly, so it makes sense that the audience dies out and a new one comes in. This leads to the idea of ‘generations’.

He says that “books survive if they are read and disappear if they aren’t: and when an entire generic system vanishes at once, the likeliest explanation is that its readers vanished at once.” This rings true because people in the same generation tend to think alike. Though the term ‘generation’ is not exactly precise, it can account for the 25–30-year period of each genre. When a generation dies out and a new one comes in, it arrives with different tastes, so it demands a new genre, and when its time is up, the cycle continues. To answer Moretti’s question, “since people are born every day, not every twenty-five years, on what basis can the biological continuum be segmented into discrete units?”, Mannheim argued in ‘The Problem of Generations’ that it doesn’t matter exactly when a new generational style emerges; what matters is the cultural trigger that creates a bond between members of a generation. Mannheim referred to this process as ‘dynamic destabilization.’ This is a very good way to look at generations, because it is wrong to categorize them by a regular time interval. Cultural forces exert a strong effect on the members of a generation, giving them certain shared tastes. Summed up, these assumptions can explain the cultural effect on genres, and Moretti uses them together to show that generations affect which genres are active during a given period.

Truth Be Told- Moretti’s assumptions in “Graphs, Maps, and Trees”

In the reading from Franco Moretti’s “Graphs, Maps, Trees,” Moretti argues that literary history cannot be fully grasped by studying individual books; it must instead be studied by analyzing the system of literature as a whole, using large sets of data rendered as graphs, maps, and trees. Using such literary data, Moretti makes strong claims about various cultures around the world, including the culture of Japan beginning in the 1700s (page 9). Moretti attributes the growth and decline of novels in Japan to the politics of that era, specifically to:

“A direct, virulent censorship during the Kansei and Tempo periods, and an indirect influence in the years leading up to the Meiji Restoration, when there was no specific repression of the book trade.”

The growth and decline of the novel in Japan is shown in the graph below, which does indicate a number of shifts in the number of novels produced per year. However, Moretti’s claim makes many assumptions about the political arena in Japan that are not supported with any further evidence.

[Moretti’s graph of Japanese novel production, page 10]

Although Moretti’s assumptions about Japanese history are not supported with evidence in the text, they are historically significant and accurate. The Kansei and Tempo periods in Japanese history, which occurred from 1787–1793 and 1830–1844 respectively, saw harsh censorship and government control under military rule. The Meiji Restoration began in 1868, when the strict government was overthrown, leading to a rise in independence and creativity in Japan. These periods greatly affected the publication of books in Japan, which Moretti accurately captured in his study of big sets of literary data, shown in the graph above. Therefore, Moretti’s assumptions about Japanese politics are very accurate, which further strengthens his claim that a nation’s culture can be traced by studying literary systems.




Moretti’s Ever-Changing Novel

As separate as their paths may sometimes appear, scientific reasoning and literature cross occasionally, producing this selection by Moretti. He argues that the novel has changed its role in literature since the 1700s. What used to sweep a continent by storm and leave a lasting cultural impact has metamorphosed into a revenue-generating machine: “A new novel per week, by contrast, is already the great capitalist oxymoron of the regular novelty: the unexpected that is produced with such efficiency and punctuality that readers become unable to do without it.” Moretti even goes so far as to compare it to the film industry and its reputation for watered-down writing: “—novels make readers lazy, stupid, dissolute, insane, insubordinate: exactly like films two centuries later—.”

The author makes a legitimate claim in this piece. Even over the past five years, I have noticed the “here and gone” fanaticism that comes hand in hand with a new novel. This is not the case for all new fiction, however; some novels quietly fade into the background, making it to the shelves for only a quick stay. The real issue behind this phenomenon is how constantly the human race changes its preferred form of communication. I believe the social aspect of reading has dramatically changed how and why people read, and that it will continue to shift as long as humanity continues to share information.

Language and Culture

The book Uncharted: Big Data as a Lens on Human Culture explores the relationship between language and culture. Aiden and Michel assert that both the form and the use of language change significantly as culture changes, and that studying language is therefore crucial to improving the study of culture. This idea is deeply related to the program they created, the Ngram Viewer, with which they studied how the use of certain words changes over time. For example, the word ‘tea’ was used far more often than the word ‘coffee’ through most of English history; yet since the 1970s, as coffee became the dominant beverage among common people, it has become the much more frequently used word. Such examples show that tracing the use of language can lead to a better understanding of culture in general. The book compares culture to dinosaurs: both can be found and studied through traces from the past. Just as the study of dinosaurs is made possible by fossils, the study of culture can be improved through its traces from the past, which here means the use of language.
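To make the Ngram idea concrete, here is a minimal sketch of how an Ngram-style relative-frequency series can be computed from a corpus organized by year. The function name and the toy corpus are invented for illustration; this is not Aiden and Michel’s actual pipeline, only the quantity such a viewer plots.

```python
from collections import Counter

def ngram_share(corpus_by_year, word):
    """Fraction of all tokens in each year's text that match `word` —
    the relative frequency an Ngram-style viewer plots over time."""
    series = {}
    for year, text in sorted(corpus_by_year.items()):
        tokens = text.lower().split()
        series[year] = Counter(tokens)[word] / len(tokens)
    return series

# Toy corpus mimicking the tea/coffee example: 'tea' dominates early,
# 'coffee' later.
corpus = {
    1900: "tea tea tea coffee",
    1980: "coffee coffee coffee tea",
}
print(ngram_share(corpus, "coffee"))  # {1900: 0.25, 1980: 0.75}
```

Plotting such a series for ‘tea’ and ‘coffee’ over a real corpus is exactly the kind of cultural trace the chapter describes.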

This assumption might not always be right. Sometimes language reacts later than a change of trend in culture, so language cannot directly mirror cultural trends or changes. Yet it is true that language is among the best ways to observe cultural change. Language is easily observable through books and other works of literature, and it is the most commonly used method of communicating and exchanging ideas. Tracing culture through language, a kind of cultural fossil, may not be an exact method of examination, yet it is definitely a revolutionary one.


Enter the minds of Aiden and Michel, the two men who managed to turn something as free-flowing and liberal as the English language into something incredibly concrete and predictable. Aiden and Michel measure the trends of certain words in the English language and track their usage. Using this method, they can then predict how the English language will evolve, and more specifically which words will still be around in, say, a year from now. With enough data, they can even go as far as predicting, with surprising accuracy, when a particular word will be phased out.

Aiden and Michel also pioneer the study of irregular verbs, which take on a vowel change to signify a change of tense, and of how they have managed to coexist with the simpler verbs that take suffixes instead. After careful research, they determined that the irregular verbs are a remnant of a language from nearly 12,000 years ago. This makes sense, as English descended from this language, as did a number of other languages. As a result, the data that Aiden and Michel have collected could easily be applied to every other language that came from the same mother language, which is a rather huge breakthrough.

G.K. Zipf and the Fossil Hunters – Reading Response

This chapter of the book Uncharted, written by Aiden and Michel, focuses on the appearance of certain words in the English language; more specifically, on irregular verbs. By analyzing the appearance of the irregular and “regular” versions of the same verb, the phasing out of the irregular form can be predicted mathematically, based solely on the frequency of the verb’s use in the English language. The clear example presented is “throve” vs. “thrived.” Clearly we mostly use “thrived” instead of “throve,” but this isn’t the case when we look at the comparison between “drove” and “drived.” According to Aiden and Michel, the only difference between these two verbs, both of which have been irregular at some point, is that “drive” is used much more often than “thrive.” As Aiden and Michel state on page 44,

“…once one took frequency into account, the process of regularization was mathematically indistinguishable from the decay of a radioactive atom. Moreover, if we knew the frequency of an irregular verb, we could use a formula to compute its half-life.”


For the most part, Zipf, Aiden, and Michel used literary resources to make their predictions about verb frequency. They state that the sole factor influencing the “regularization” of irregular verbs is the frequency of the verb in question in literature. Although their prediction may be correct to a certain extent, they disregarded the effect of social influences in their main argument. On the second page of the anecdote “Burn, baby, burnt,” it states,

“A few days later, he saw another distressing headline, this one in the Los Angeles Times: “Kobe Bryant Says He Learned a Lot from Phil Jackson.” The student knew nothing about Phil Jackson, but was still shocked that Kobe had learned from Phil. If anything, he should have learnt.”

Although pure analysis of the frequency of irregular verbs such as “learnt” may be a good determinant of the future regularization of a particular verb, it does not take into account social factors or other determinants that affect the verb’s frequency. It may be that the regularization of a verb starts with a simple news headline that uses the regularized version, inadvertently sparking its popularity with the general public and eventually setting off the cycle by which the word makes its way into formal literature. These social effects may speed up or slow down the regularization of certain words. For example, if kids are taught, generation after generation, that the correct past tense of “drive” is “drove” and not “drived,” then these social pressures may affect the eventual outcome of the word, regardless of frequency. The significance of social factors on the regularization of irregular verbs can only be determined through further careful analysis.
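The radioactive-decay analogy quoted from page 44 can be sketched numerically. The toy model below treats the irregular form’s share of usage as decaying exponentially, with a half-life that grows with the verb’s frequency (Aiden and Michel’s finding was that half-life scales roughly with the square root of frequency). The function names and the scaling constant are invented for illustration, not fitted values from their study.

```python
import math

def half_life_years(freq_per_million, k=400.0):
    # Illustrative: half-life grows with the square root of frequency.
    # k is a made-up scaling constant, not the fitted value.
    return k * math.sqrt(freq_per_million)

def irregular_share(t_years, half_life):
    # Exponential decay: fraction of usage still irregular after t years.
    return 0.5 ** (t_years / half_life)

# A rare verb (thrive-like) regularizes much faster than a common one
# (drive-like): 400 vs. 4000 years in this toy parameterization.
rare, common = half_life_years(1.0), half_life_years(100.0)
print(rare, common)                  # 400.0 4000.0
print(irregular_share(400.0, rare))  # 0.5 — half the usage regularized
```

This is exactly why “throve” vanished while “drove” persists: under any such frequency-dependent half-life, the rarer irregular form crosses the threshold of extinction far sooner.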

Early Ancestors of Big Data

Erez Aiden and Jean-Baptiste Michel discuss the evolution of language and come to the conclusion that something similar to natural selection might be affecting modern forms of communication. They use the example of how, in English, the “-ed” past-tense ending of Proto-Germanic replaced the Proto-Indo-European practice of indicating tense by vowel changes, with irregular verbs the only words unaffected by the change. To test their theory, Aiden and Michel came up with an approach they dubbed culturomics: the use of large amounts of digital information and big data to track changes in language and culture.

In their book, Uncharted, Aiden and Michel claim that language is the primary method for communicating culture around the globe. Since it has a written form, they state, it is a convenient data set for scientific analysis. Language is the basis for communication, but it is not the only method of communicating the arts and the other manifestations of human intellectual achievement regarded collectively. The authors seem to assume that language is the primary form of sharing culture. What I question is whether there is a way to test their theory on other forms of expression, to see if a form of Darwinian evolution affects not only our genes but our culture as well.

The Evolution of Language

In Aiden and Michel’s “Uncharted: Big Data as a Lens on Human Culture,” several arguments and claims are made concerning language. The book focuses on specific words and develops the idea that culture can be defined by our use of words in a language. It focuses specifically on a man named George Kingsley Zipf, who came up with this theory. It was his idea that words were not all equal, and that there were certain words a culture valued more than others. In an experiment, Zipf counted every occurrence of every word in the book Ulysses and ranked each by importance, only to find his theory confirmed. People tend to use words such as “the” and “I” much more than ones like “quintessence.” From my own perspective, this seems obvious, considering the first two words convey ideas and connections needed in our everyday lives, whereas the latter is not necessary in most scenarios because of how specific its definition is. What was interesting, however, is this: Zipf found that “There was an inverse relationship between the rank of a word and its frequency of use” (Aiden 34). In other words, the further down the list a word appeared, the less frequently it was used.

Now, think about the English language. The reason many people say it is so hard to learn is all the irregulars present within it; these irregulars seem to follow no rules and conjugate as they please. The book points out something interesting: the words toward the top of Zipf’s list, the more important ones, tend to have irregular qualities, while those near the end of the list tend to follow the same rules and conjugate accordingly. There is another theory I want to bring up that is mentioned quite a bit in this reading: irregular words and conjugations will change with time. More simply, words will be conjugated differently in the future than they are now. How close does this come to the truth, though? While this theory makes sense considering the evolution from Old English to current language, I do not believe our language will change that drastically in the future. The transformation from Old English to current English involved the creation, if you will, of an entirely new language: the way people pronounced words was different, and the words themselves were completely different as well. The word “thou” is not the same as “you.” I agree with the idea that language may change over time; however, I strongly doubt that conjugations will be the only things that change. If our language is going to change, it will all have to change together, for as long as past generations are teaching current generations, the word “stinked” will always be incorrect.
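Zipf’s rank-frequency relationship described above is easy to demonstrate on any text. The sketch below is a minimal illustration (the function name and sample sentence are invented for the example, not taken from Zipf’s Ulysses counts): ranking words by how often they occur, with rank 1 the most frequent, exhibits the inverse relationship Zipf observed, under which the product rank × frequency stays roughly constant.

```python
from collections import Counter

def rank_frequency(text, top=10):
    """Rank words by frequency, Zipf-style: rank 1 = most frequent.
    Returns (rank, word, count) tuples."""
    counts = Counter(text.lower().split())
    return [(rank, word, count)
            for rank, (word, count) in enumerate(counts.most_common(top), start=1)]

sample = "the the the the of of and and the of i"
for rank, word, count in rank_frequency(sample):
    print(rank, word, count)
# (1, 'the', 5) comes first; rarer words trail off down the ranking.
```

On a real corpus the counts follow the famous power-law curve; this toy text only shows the ordering.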

Culture Is Influenced by Natural Selection

“If you take the long view, the question of why we say drove and not drived becomes a scientific quest for the forces that shape evolution of human culture. For a long time, we had no idea how to even begin to uncover those forces.” (Problem Child) In addressing this question, which children frequently ask, the authors of the reading, Erez Aiden and Jean-Baptiste Michel, draw on phenomena from different disciplines in order to understand the reason behind it. By solving the question, we may find the force that shapes the evolution of our cultures.

Aiden and Michel believe that the selection of the words we use follows a rule similar to the biological theory established by Charles Darwin more than 150 years ago. “The reason we still say drove – whereas we’ve abandoned other irregular forms, like thrive, in droves – is that drive is far more frequent than thrive.” (The Once and Future Past) According to many linguists today, regularization refers to the process in which the less frequently used transformation of a word is forgotten and the word comes to be used according to the most popular transformation in society. Just like language evolution, our cultural evolution follows the rules of natural selection: selection based on frequency of encounter implies selection that maximizes the chance of survival. In our daily life, people adopt the most popular usage so that communication with others provokes fewer conflicts, and likewise we use different slangs or jargons to get recognized by a certain group of people. For a similar reason, it is in our nature to discard memories of things we do not use frequently, in order to reserve capacity in our brains for something more important. For example, we will not try to memorize all the mathematical formulae we need if they are easily accessible online.

Culture represents the different ways of life a certain group of people has adopted based on its experiences, and it changes whenever people learn from trial and error. Since the language we use has never stopped changing, our culture will never cease to change either.

The Backbone of an Argument

Claims used in arguments must be properly supported in order to contribute to the argument as a whole. If the original information is changed or exaggerated, the overall credibility of the work can be called into question. Darrell West’s report on big data’s application in education (link) retains its credibility because it uses reliable and accurate citations as the backbone of its argument.

To prove that West’s report can be trusted, one must look closely at how he cites his sources and how those sources shape his argument (or how he shapes his sources to match his argument). At the bottom of each page that contains an external reference, West points the reader to his sources.

To prove that West’s use of other researchers’ knowledge is consistent with their research, it is necessary to look closer at the reference in the footer. By taking the title of Joseph Beck and Jack Mostow’s work, listed in the footer as source number 5, and searching for the document online, one can easily find an abstract of the original document (link). While this work also contains references to external sources, the finding West was referring to (that reading one story multiple times does not lend to as much learning as reading a variety of stories) was researched and carried out by the authors of the source. This makes the document the primary source for this particular piece of information in West’s report.

The work by Joseph Beck and Jack Mostow contained information that was consistent with what West claimed in his report:

[Screenshot of the relevant passage from Beck and Mostow’s paper]

West’s use of the source was honest and accurate. He brought in external information, properly cited it, and correctly reported the content of the source. His individual interpretation of the source (and of how it affects education), as with any citation, is what provides backing for his argument. In this case, the source addressed the effects of rereading on learning, and West showed that this can be applied to education through the use of computer-aided instruction. The source provided the backbone information, and West shaped it in a way that supports his argument.


The Strongest Link: Finding Suitable Sources for Your Research Paper

My post revolves around my pursuit of the source denoted at the end of page 5 in this week’s class reading; for those who don’t know which one I am referring to, it is 16 McGraw-Hill, “Building the Best Student Assessment Solution,” New York: Acuity, 2009.


I was actually surprised at the ease with which I found what I believe was the original source of information behind the citation; it was literally the second link I encountered when searching for the full citation on Google. The webpage contained a full-length PDF file explaining the Best Student Assessment, and I felt I could trust the site’s authenticity. I decided to experiment with three other sources found in the weekly reading, and they too were discovered within the first five links when searched on Google. I am not pointing this out to say that all research-paper sources can be found this easily; I just assume I had good luck.

Regardless, finding a reliable source does not by itself constitute a research paper; the source employed in its construction must be utilized in such a manner that its general focus coincides with that of the paper. In addition, the research paper must accurately refer to the source and either make effective use of its contents or reinterpret them to reinforce the paper’s claim.

In the case of Big Data and the Best Student Assessment source, both the paper and its source promote the process of data mining (or data warehousing) as a new and effective means of pushing along student achievement and improving the student learning experience. The paper and the source agree on this point; the two possess a certain synergy when paired together, and they serve to further reinforce the author’s claims about data mining in education.

Life begins at conception?

One of the major controversies at hand in America is the argument over when life begins. This issue has plagued American society for many years, and the American people are deeply divided on it. One of the many papers written about this controversy is “Life Begins at the Beginning” by Dr. Fritz Baumgartner, MD (1).

Upon searching for further information about the veracity of this article and its goal of establishing that life begins at conception, I found several articles that support the idea posted on this website. The website itself states several very heated reasons why the view that life begins at conception is correct, and the article poses a one-sided, heated explanation of why Dr. Fritz Baumgartner is right and other scientists are wrong. Based on the author’s bias, it might seem that the site cannot be trusted, because biased and heated statements are often not supported by factual evidence; this happens in cases where a heated author lacking evidence lashes out at an opponent as a defense. However, upon further investigation of the website and its resources, this is not the case. At the end of the article, Dr. Baumgartner’s education and work history are posted to add veracity to the author’s claims. In addition, the article contains other resources that support the author’s claims, such as the article “Scientists Attest to Life Beginning at Conception” by Randy Alcorn (2), in which Alcorn lists the names of prominent scientists with research histories who all share ideas about when life begins.

The article “Life Begins at the Beginning” is well supported, and even though it is slightly heated, its sources are trustworthy.



Government Secrecy: Bad or Bad?

Greenwald uses a quote from a Washington Post article claiming (in the context of No Place to Hide) that “much of our government’s business [is] so large, so unwieldy, that no one knows how much money it costs, how many people it employs, how many programs exist within it or exactly how many agencies do the same work.” By searching Google for the quote, I was readily able to locate the original source, an article entitled “A hidden world, growing beyond control,” at the very first link.

In Greenwald’s context, the quote supports the claim that too much government business is “conducted in secret.” Interestingly, Greenwald chose to leave out a few words when he quoted this evidence (found in the first paragraph of the original source), which says that the government’s business is “so large, so unwieldy, and so secretive.” Greenwald may have done this to hide the fact that the quote as a whole is not about government being conducted in secrecy; it is merely a portion of the claim being made in the original piece. Leaving that part out creates the illusion that the entire quote is about government secrecy. The Washington Post article does not, in fact, discuss individual privacy at all. It instead focuses on the claim that the government and its individual departments and agencies are growing so much that they are becoming counterproductive. Priest and Arkin, the authors of the Post article, claim that this is occurring because the government is too secretive: if it were more transparent, its abundant wastefulness would be exposed, and it could be made more efficient. The article does not claim that the NSA is too invasive; it claims that the NSA collects unnecessarily copious amounts of information that clog the inlet through which useful information comes. It claims that “secrecy within the intelligence world hampers effectiveness.”

The evidence is reliable because it is from a credible news source, the Washington Post. The quote is misleading in the Greenwald piece, however, because it leads the reader to believe that the Washington Post supports Greenwald’s claim. This is not necessarily true, because the Washington Post never addresses that claim and approaches the issue of government secrecy from an entirely different direction. Therefore, the quote’s reliability is compromised: Greenwald evidently manipulates it to fit his claim, which is not the purpose it was originally intended to serve.

Vatican proposes a change in viewpoint on gays?

The original source is an article from CNN about the possibility that the Vatican’s viewpoint on gays could change in the near future. The evidence in this article was a quote from a report released as a summary of the discussions within the Vatican this past week. However, the article didn’t list a source, so I googled the quote and found it in an article on Patheos. That article mentioned that the quote came from the Relatio, so I googled this as well. It led me to ZENIT (The World Seen From Rome) and a page titled “Synod14: Full Text of Relatio Post Disceptationem”, which contained the original quote among other text from the same document. I then googled “Relatio Post Disceptationem” and found a website for the Bollettino with “Synod14 – Eleventh General Assembly: “Relatio post disceptationem” of the General Rapporteur, Card. Péter Erdő, 13.10.2014” as the page title. When I went to the homepage, I discovered that this was the official bulletin of the Holy See Press Office. My search stopped there because the official bulletin containing updates from within the Vatican seemed like a trustworthy source.

It was somewhat difficult to find the original source of the information because it was an official document that has been quoted many times. The original source was an update from the Vatican, while the article I read was both an informative piece and a summary of the reactions occurring in the United States. Though the purposes of the two sources differed, the meaning of the quote did not change from one to the other. In addition, both sources aim to inform the public about the change within the Vatican, and both do just that.

Maybe Snowden Isn’t That Bad


The article “Switzerland May Take Edward Snowden in Return for Testimony on Spying” reports that U.S. fugitive Edward Snowden could be granted safe passage to Switzerland in exchange for testifying in an important Swiss legal case. Swiss authorities state that his addition to the case would be a huge help because of his extensive experience in the legal and intelligence-gathering fields. Switzerland also would not comply with a U.S. extradition request should he be accused of treason or of divulging state secrets.
This article shows that Snowden is a man of great experience and that, despite the reputation he has acquired in the United States, he is still viewed around the world in a fairly positive light. To further reinforce this point, German officials also wanted to invite Snowden into their country but had to refuse him due to a possible clash with the United States over its extradition request. In fact, the picture above shows German protesters in Berlin holding up signs reading “Welcome Snowden” and the like.
It was rather easy for Snowden to find asylum in another country, and there is a good chance he will be offered it in many more, because so many countries want to hear what he has to say. They want to know whether the American NSA has been doing anything shady within their borders, and though it is technically treason for him to speak out, if he is in another country nobody can really stop him.

60 Minutes: Inside the NSA

60 Minutes

In this fifteen-minute segment, 60 Minutes interviews a few of the leading generals in the National Security Agency about the latest procedures and data collection techniques used to promote homeland security. Lately, due to the recent leaks released by the now infamous Edward Snowden, the NSA has been in the spotlight, criticized for its pervasiveness and the lack of restraints on its tracking abilities. Over the course of these interviews, the NSA explains itself in an attempt to prove to the public that it does not use invasive procedures to gather its information and data, a misconception that has gone viral since the Snowden incident.

Like almost every citizen in America, I have never been directly affected by the processes the NSA uses to obtain information. As explained in the video, the Agency collects personal data from all citizens, yet all sources are anonymous and the methods noninvasive. Several people still feel they are being spied on, mostly due to a misunderstanding (or lack of understanding) of the NSA’s procedures and to exaggerations of the truth. Much speculation has come from only a few, extremely rare instances in which Agency employees have broken their own rules and snooped on their subjects. Matters of privacy landed in the hot seat, however, due to the extremely large number of leaks released by Edward Snowden. A considerable number of people see Snowden as a man of the people, releasing top-secret government files that reveal the truth about government actions. On the other hand, the NSA views him as a traitor to the American people because he has the ability to reveal flaws in the country’s safety precautions and procedures that could lead to countless problems for the USA’s security in the future. Although he is sometimes called a “hero” of the people, Snowden sees himself only as an average American exercising his rights.

The most bothersome clip in this segment begins around the 10:15 mark, when John Miller asks the Agency’s leading general about the amount of power and the breadth of knowledge contained within the millions of files stolen by Snowden. The general admits that several of the stolen files could create serious problems for American security, containing information about weaknesses in the country’s defense systems and its gaps in knowledge regarding other nations, such as China, North Korea, and Russia. Fortunately, those files have not been released to the public, but the general understands that their release would cause severe problems for homeland security.

As I stated earlier, I have never been bothered by the way the NSA performs its job, but the chance that such a revelation could come in the near future is quite nerve-racking and could lead to several more problems for the NSA and the nation’s defense as a whole. Although some leaders in the NSA would like to bargain or correspond with Snowden in order to recover the lost information, others wish to grant him no mercy and not let him get away with such a huge incident. Although no deal has been made between the two parties, I feel it is safe to assume that the NSA is working to retrieve the files and make the best of the Edward Snowden situation.

Edward Snowden saga

Edward Snowden

This article analyzes the overall story of Edward Snowden: what he did and what happened afterward. It starts by mentioning the top-secret surveillance activities of the National Security Agency (NSA), reported by Glenn Greenwald in the British newspaper The Guardian on June 5, 2013. On June 9, 2013, Edward Snowden, a former CIA agent, admitted that he was the source of this disclosure and revealed that much private information belonging to US citizens, such as phone calls and emails, was being collected by the NSA. He then fled to Hong Kong to avoid potential punishment by the US government. The US government, however, tried to bring him back and requested his extradition from many countries, including Hong Kong. Realizing that there were not many places left for him to stay, Snowden moved to Moscow and asked the Russian government for asylum. Russia, with a few conditions attached, granted him asylum, letting him stay in the country until 2017.

Snowden’s disclosure, though it concerned only the US, was a warning signal to many people around the globe. Whether the secrets he revealed are true or not, people began to recognize how unprotected their privacy was. Yet this article does not simply praise Snowden as a hero who risked his life to reveal the truth; it also gives attention to the opinions vilifying him. As Obama remarked, there cannot be 100% security with 100% privacy. Some also argue that Snowden has magnified the actions of the NSA, overstating the faults of government policy. The fact that the NSA’s activities have prevented more than 50 potential terrorist plots since 2001 is solid evidence for this view.

It is also interesting to see how Snowden’s movements have affected international relations. Tension between Russia and the US increased sharply after Russia granted Snowden asylum instead of extraditing him. Though Russia told Snowden that he would be extradited if he took any action that harmed the US, its refusal to send him back made President Obama uncomfortable.