Saturday, August 31, 2019

Piracy Case Essay

In general, I support the act of piracy in certain ways. Piracy is not inherently bad; some may agree and some may disagree, because the term is debatable depending on one's perspective. Viewed objectively, piracy is bad, but what if we shift to a subjective perspective? The topic of piracy is also somewhat hypocritical, because much of society engages in it even while knowing it is illegal. Of course, society generally regards piracy as a bad thing, because many resources are licensed. Take movies or games as examples: developers strive hard to produce this entertainment, but downloading it illegally wastes their efforts as sales of the product decrease. Some people say that piracy is a good thing, and there is some truth to that: it affects our current society on a larger scale, whether in education, entertainment, or any other field. What Google is trying to do here is reduce the act of piracy, but we all know that is a hard task. The internet itself opens up a wide field for piracy; if we want to put a stop to piracy, we might as well ban the use of the internet. Sometimes piracy can be a good thing: the internet holds all sources of information, and by downloading it, people from all around the world can benefit. People from rural areas often cannot afford these resources, so they have to use alternative ways to acquire them. In that sense, the act of piracy can be good. To conclude, the act of piracy is good in some ways; in my opinion, it goes both ways.

Friday, August 30, 2019

Is Walmart good for America? Essay

As the largest retailer in history, it’s no surprise that Walmart is the target of both vicious attacks and effusive praise. According to its own website, Wal-Mart Stores, Inc. operates more than 8,000 stores, employs more than 2.1 million people, and sells more than $400 billion worth of goods every year. Though this bulk intimidates those who fear for the viability of “mom and pop” retailers, Walmart’s great strength is that it devotes its considerable power to American consumers. Its size enables it to provide services that other retailers cannot, and it has deservedly become an integral part of the modern American economy. Criticisms of Walmart’s effect on small retailers fall flat because of Americans’ role in that effect. Consumption is the only democratic component of the corporate world: small retailers fail because Americans choose Walmart. Walmart provides cheaper, better, more accessible services than its competition. While competing stores’ closings produce touching hard-luck stories, the shift to Walmart is beneficial for society, because Walmart is much more efficient at every stage of its business. The benefits of this efficiency are less personal and more broadly spread than the costs to smaller competitors, but such dissemination of value demonstrates one of the best qualities of Walmart – its egalitarianism. Walmart provides a good that is accessible to virtually all Americans. The 2006 book The Walmart Effect estimates that 97% of Americans live within twenty-five miles of a Walmart, and Walmart’s low prices assure that the store is also economically accessible. As long as consumers continue to choose Walmart (for understandable reasons), the onus is on small retailers to find better ways to compete. The second main argument against Walmart deals with its impact on suppliers. Because Walmart has such immense buying power, it carries great influence with manufacturers.
Fortunately, Walmart uses its substantial bargaining power in the interests of American consumers by demanding ever-decreasing prices. Though manufacturers often complain about this pressure, it forces constant innovation, which ultimately benefits consumers. Walmart has much to teach American businesses. Despite its size, Walmart is a paragon of corporate efficiency. It has compiled the largest sales data set of any American retailer and analyzes this data using the second largest supercomputer in the world (trailing only the Pentagon). Aided by this number-crunching, Walmart excels at knowing what its consumers want. Walmart’s purchasing decisions thus reflect American preferences. In short, Walmart is a driving force in the American economy, leading to smarter, more streamlined production and (as always) lower prices for consumers. The benefits of Walmart’s efficiency are not only economic, as illustrated by the company’s response to Hurricane Katrina. Walmart’s response to the hurricane was lauded even by its critics: it donated more than $20 million worth of merchandise, including food for 100,000 meals, and it promised jobs for all of its displaced workers. But what I wish to extol is not Walmart’s largesse, which bore immediate public relations benefits, but rather the utility of its efficient distribution system. The first supply truck to arrive at the Superdome after the hurricane came from Walmart, not from FEMA. The administrative particulars of Walmart’s response to the hurricane, detailed in a study by Steven Horwitz, are both fascinating and inspiring. Walmart’s existing distribution chain was – and is – able to deliver needed goods faster and more efficiently than a government agency, which (besides being inept) had no existing infrastructure to respond to the disaster.
The Coast Guard, another organization praised for its post-Katrina efforts, was great for rescuing people from flooded houses, but it was incapable of providing them with sufficient supplies afterwards. Without the aid of Walmart, the aftermath of the hurricane would have been even more catastrophic. Regardless of its reputation or its value to society, Walmart is here to stay. Consumption drives our daily lives and accounts for some 70% of America’s GDP. As long as Walmart continues to increase the accessibility and quality of consumption, it will remain America’s top retailer and continue to grow. Whether or not you choose to shop at Walmart, everyone should appreciate it as an outstanding American institution.

Thursday, August 29, 2019

Case Study About Trust Report Essay

Trust is the ability to rely confidently on an individual or, in this scenario, on a company’s product. It is judged on three dimensions: technical competence; benevolence, that is, interests and motives; and, finally, integrity. A positive judgment reflects well on customers’ willingness to take part in the organization’s dealings. This may involve buying the company’s products, investing in its stocks, or being an employee. If any of these attributes becomes questionable, customers may become wary and reluctant to take risks (Kourdi & Bibb, 2007). Distrust in an organization may increase the inefficiencies of innovation and damage relationships.

Causes leading to the loss of trust

Toyota Motor Corporation is a Japan-based motor manufacturer headquartered in Aichi, Japan. The corporation was founded in 1937 and has since been among the best-performing motor manufacturers and dealers in the world. With more than 3 billion yen in profit in a fiscal year, as per the 2013 financial report, Toyota could be counted among what Forbes magazine would name the top 100 best corporations (Kourdi & Bibb, 2007). From 2004 to 2010, there were several complaints against Toyota Motors concerning engines and accelerators. On 28 August 2009, a tragic accident occurred in San Diego involving a family travelling in a Toyota Lexus. The car lost control and all the passengers died. Toyota, known for its impeccable reputation for reliability and quality products, suddenly had to deal with a trust crisis. A deficiency in the attributes that lead to trust in a company’s products and services, in the form of a scandal, can lead to an instant loss of trust (Blackshaw, 2008). An effective response to a trust scandal or failure needs interventions aimed at curbing distrust and rebuilding trustworthiness.
Distrust regulation can be done by enforcing controls, conditions, and constraints on employees in order to rectify the failure. Intervening may require removing guilty parties, changing the organization’s cultural norms, and introducing new incentives or revising existing ones (Blackshaw, 2008). This is not sufficient, however. Statements and actions are also needed to demonstrate trustworthiness: statements that show the company’s compelling ability, integrity, and benevolence, as well as apologies, transparency, and ethical practice.

How effective do you consider the taken mitigation actions?

Effective repair of trust should follow simple steps, the first being an immediate response. Toyota Corporation’s belated communications, belated recalls, and public apologies damaged its reputation more than the original accident (Liker, 2004). The company ended up losing sales, investors, market share, and customer confidence. Toyota expressed concern by releasing a statement apologizing to the family of the victims and pledging to carry out investigations. Regrettably, however, the company did not point out possible causes. This seems like an effective immediate response, but a company is also expected to identify possible causes. Later, the floor mats were suspected to be the likely cause of two accidents that had occurred earlier, but this did not prompt the company to issue a customer warning (Liker, 2004). The company acted on the suspicions five days after the analysis of the cause was confirmed, nineteen days after the fatal accident. In order to rebuild customer, employee, and investor trust, Toyota Motors released a statement assuring customers that the floor mats were in good condition and safe, praising them as being among the safest mats. This statement was later challenged by the NHTSA, which accused the company of releasing misleading and inaccurate reports.
In a bid to save itself from further downfall, Toyota Motors reacted by offering a remedy for the sticky floor mats. This action aroused suspicion among investors, who thought the company had had unclear motives when it released the first statement (Liker, Hoseus, & Center for Quality People and Organizations, 2008). This further dented shareholders’ trust. Toyota’s mitigation process took a long time, leaving more damage to be controlled, and it was ineffective at the beginning, which was a blow to the shareholders. Company president Akio Toyoda later sent out apologies and, through the Wall Street Journal, expressed his commitment to reforming the company towards better and safer products, with the aim of repairing the damage that had been done (Liker, Hoseus, & Center for Quality People and Organizations, 2008). Through the court, the company also compensated the family that had lost their relatives in the accident, a step that conveyed the company’s acceptance of guilt.

Consequences of not addressing trust issues

Failure to respond to issues and address the remedies publicly can lead to severe disciplinary actions against a company, which may include its termination and the payment of fines. Toyota, due to its sluggish manner of responding to the claims against its products and its failure to warn its customers thereafter, was fined $16.4 million (Pelletier, 2005). Toyota accepted its penalty.

Do you believe that the company’s reputation can be rebuilt, or will they suffer the consequences also in the years to come?

Despite the tarnishing of Toyota Corporation’s reputation, customers’ and investors’ trust will be rebuilt. The actions the company undertook, such as restructuring its management team and procuring a new safety system, have seen it rise once again to being among the most profitable companies in the world (Pelletier, 2005).
The company has been rebuilding itself since the 2009 failure. It has had numerous innovations and recently announced a mass hiring of employees.

References

Bibb, S., & Kourdi, J. (2007). A question of trust: The crucial nature of trust – and how to build it in your work and life. London: Cyan.

Blackshaw, P. (2008). Satisfied customers tell three friends, angry customers tell 3,000: Running a business in today’s consumer-driven world. New York: Doubleday.

Liker, J. K. (2004). The Toyota way: 14 management principles from the world’s greatest manufacturer. New York: McGraw-Hill.

Liker, J. K., Hoseus, M., & Center for Quality People and Organizations. (2008). Toyota culture: The heart and soul of the Toyota way. New York: McGraw-Hill.

Pelletier, R. (2005). It’s all about service: How to lead your people to care for your customers. Hoboken, N.J.: John Wiley & Sons.

ANALYSE THE WIDER IMPACT(S) UPON THE WORLD ECONOMY OF THE RISE OF THE Essay

On the other hand, economic growth is always accompanied by advancement in military power (Perkins 2009). China invests heavily in military power, which makes future security levels unpredictable, since no one is sure how the nation will use its power if it happens to become the world’s most powerful nation. Therefore, since China’s growth is progressive and promising, the rest of the world ought to be ready for the expected impact. Exports of both products and labour, as well as foreign direct investment, make up the greatest percentage of China’s GDP growth. The implication is that China’s economy greatly depends on its relations with other economies. For China, this overdependence on exports poses minimal threat, if any, given that China not only enjoys a competitive advantage over other economies but also has a government policy that supports foreign relations. Notably, China’s initial economic bump, recorded in the early 1970s, was attributed to internal factors with negligible dependence on exports. As a matter of fact, this initial economic growth resulted from increases in domestic consumption and government expenditure (Zhu & Kotz 2010). China is densely populated and thus provides a reliable market for its manufactured products. Notably, the increased domestic consumption resulted from the transformation of the Chinese economy from an agricultural nation into an industrial economy, which nonetheless remained somewhat closed. By then, China’s growth had little or no economic impact on the global economy. The rapid growth in GDP provoked an urge for extra markets and hence led to the adoption of the ‘market reform’ policy in 1978 (Zhu & Kotz 2010). This policy saw China enter the global market in search of markets for its excess production. China might not be well endowed with capital, as

Wednesday, August 28, 2019

Lack of adequate clinical data on non-pharmacological aspects relevant Literature review

Type 2 diabetes is one of the types of diabetes responsible for the deaths, and it creates an extra expense for public health. Ali (2010, p. 21) mentions that the health department does not have clinical audit data, so it is not possible to know whether patients receive appropriate diabetes care. According to his prospective survey, particular moral and ethical issues of concern emerged relating to end-of-life care in diabetes. His study, covering non-pharmacological interventions in Type 2 diabetes, was carried out over a period of three weeks. Ali issued questionnaires to fifteen patients, including three teenagers, six male patients, and the same number of female patients. Ali (2010, p. 34) identified primary prevention measures as appearing to be the best option for first-time patients. These include, among others, specific assistance to patients to reduce weight, reduction of calories, pharmacotherapy, and increased physical activity, all of which fall under structured lifestyle programs. Whitaker (1987, p. 59) explains in his research that the health department needs to take an all-inclusive approach to managing the Type 2 diabetes condition. In this method, new mechanisms will involve the integration of the community, health policies, and practices when implementing primary prevention strategies. Bernstein (2005, p. 23) mentions in his research the importance of structuring people's lifestyles, saying that it reduces morbidity and premature deaths brought on by Type 2 diabetes. Having applied non-probability sampling criteria, his study avers that effective management entails giving the community a chance to participate in public health care, which is an integrative primary prevention methodology.
This approach puts the strength of countering Type 2 diabetes at the community level, where the health department empowers people to take care of their own health conditions. Primary health care prevention measures reduce the extra expense that diabetes puts on the public, and the burden incurred by the public justifies their involvement in prevention measures. Bernstein (2005, p. 51) explains that it is essential to note at this level that the cost of treating Type 2 diabetes and maintaining the condition is excessively high, and many people may not be able to afford it. In this case, the fact that the study was conducted several times adds to its authenticity. Furthermore, the cost of treating Type 2 diabetes may redirect a large portion of income from other core functions. Conversely, as a weakness, the treatment has harmful side effects, including causing hypoglycaemia. These issues pose a challenge to people who cannot easily access medical care. Ezrin (1999, p. 41) disagrees with other scholars in his studies, arguing that the health department needs to consider these facts and involve the community in preventing the occurrence of Type 2 diabetes. He posits that since most of the scholars applied the non-probability sampling technique, they denied other people an opportunity to participate in the research, which might have changed the flow and conclusions of the studies. According to him, preventing Type 2 diabetes through lifestyle modification also brings secondary benefits to the community, which most researchers did not capture due to the sampling module used. Following this approach, Ezrin (1999, p. 49) says, reduces the chances of getting certain cancers and heart diseases and lowers the risks of hyperlipidemia and hypertension. Storrie (1998, p. 31) supports Ezrin in the sense that

Tuesday, August 27, 2019

A Quit Smoking Education Program For Parents Assignment

Instead, what works for one user may not work for another, and vice versa. Nonetheless, certain key factors (a ‘best-practice’ process) help most people quit: the user should make the decision to quit; set a quit date; prepare for how to react to quitting obstacles; get support from family, friends, or even successful quitters; if necessary, get medication; and finally stay quit by finding ways to deal with relapse and sustain the quit status (Stead et al., 2008). In order to generate a cost-effective delivery model and eventually succeed, the program should implement an improved curriculum, a healthy canteen, a staff exercise group, and a school vegetable garden. For an improved curriculum, a number of decisions and planning steps need to be followed when preparing each aspect. The instructors should equip themselves for the challenge, with continuous training ensured to equip them with the necessary skills, both in helping users opt out of smoking and in advising those in need against the practice. Once the course has been developed, assessment instruments should also be generated in the form of checklists, objective tests, or rubrics (Jarvis & Wardle, 1999). For the medical attention that might be required by smokers battling relapse, the medical staff should ensure availability, sustainability, and continuity. As such, patients can benefit from ‘walk-in’ sessions, an appointment and patient follow-up system, and availability regardless of pharmacist workload. Jarvis, M., & Wardle, J. (1999). Social patterning of individual health behaviors: the case of cigarette smoking. In Marmot, M., & Wilkinson, R. (Eds.), Social determinants of health. Oxford: Oxford University Press.

Monday, August 26, 2019

Urban Design Term Paper Example | Topics and Well Written Essays - 500 words

Any such construction work should be in sync with the natural environment. Whyte supported the cluster-zoning concept, lately referred to as planned unit development. The idea was to secure more open landscape by constructing houses in comparatively limited spaces. Under the fifth planning principle, it was decided to zero in on regions where the natural ecology needed to be secured. The purpose was to carry out developmental work only in areas where it would not result in long-term loss to the environment. The impact of this planning principle was visible only to a limited extent, as it could not offer a complete solution to the outcomes of migration to suburban and rural areas (Barnett 39). The prevalent urban form suffers from the insecurity of not finding long-term solutions for sustainable urbanism. Self-validation mars the impact that a comprehensive policy could create: for example, a certified green building may be enveloped by paved parking, a residential locality may become unsustainable because it is not energy-efficient, or land usage may be afflicted by faulty construction. The Urban Growth Boundaries (UGB) were created with the aim of demarcating land usage beyond a limit. The UGB succeeded in its purpose of developing land within an allotted region, but its sustainability purpose regarding the quality of the developed land was compromised; it turned out to be well-positioned but bad development (Farr 28). The efforts made by the Congress for the New Urbanism (CNU) have not delivered results because it devoted itself to amending traditional regulatory practices to modernize them for urban sustainability. There are still unfriendly singular parameters on the built environment that contribute to climate-changing sprawl. Further, the United States Green Building Council’s (USGBC) targets for Leadership in Energy and Environmental Design (LEED) are flawed. LEED has not been

Sunday, August 25, 2019

Anemia Essay Example | Topics and Well Written Essays - 250 words - 1

The body may also fail to generate red blood cells, leading to a shortage in the body. Alternatively, the rate of degeneration of red blood cells may be higher than the rate of regeneration, and this shortage manifests as anemia. Anemia also exists in different forms; examples are “iron deficiency anemia,” “vitamin deficiency anemia,” “anemia caused by underlying diseases,” and anemia that results from hereditary diseases (Women’s Health, 2012, p. 1). Certain symptoms indicate the possible existence of anemia; examples include “fatigue, weakness, dizziness, headache, low body temperature, pale skin, and shortness of breath” (Women’s Health, 2012, p. 1). Existing tests for the condition are limited to physical examination of the symptoms, and treatments such as blood transfusion, suppression of the immune system, facilitated generation of red blood cells, and consumption of necessary supplements exist. The treatments focus on each cause of the condition. In severe conditions, anemia can lead to cardiac arrest (Chem,

Saturday, August 24, 2019

Revolutions and Political Change Essay Example | Topics and Well Written Essays - 2000 words

Americans attempted to apply the doctrine of popular sovereignty prior to the territorial struggle over slavery that is attributed to the emergence of the Civil War. Political scientist Donald S. Lutz observed that, in the American notion, popular sovereignty meant placing ultimate and unyielding authority in the people, given that there are varied ways in which sovereignty can be expressed, covering multiple institutional possibilities, be they the passing of laws, elections, or recalls (Constitution Society, n.d.). The American Revolution marked a departure from the concept of popular sovereignty as it had been known and used in the European historical context (Constitution Society, n.d.). Thus, with the revolution, the Americans were able to replace the sovereignty that had resided in the person of King George III. Goldstone (2014) concurs that prior to this, however, the powers of declaring war, levying general taxes, and making peace were vested in the Federal government, with the government of the Union drawing similarities to the King’s Government in the old French monarchy. The spirit of popularity and conciliation would have the Federal legislature of the Union composed of a Senate and a House of Representatives. Another parallel can be drawn in the executive powers: the executive powers in the Northern States were limited and partial, while the English executive represented supremacy. Thus, pursuant to popular sovereignty, the president acted merely as the executor of the law that the populace willed, answerable with his life, his honour, and his pledge, and when he was incompetent, the people could vote him out as per the constitutional agreement. De Tocqueville (1831) explains that the Queen/King was independent in their decisions and exercises, representing a monarchy to which the people were expected to concede. The duration of the two powers also shows discrepancies, while the term of the president was subject to the executive authority.
The monarchy was indisputable and would only

Friday, August 23, 2019

Marks and Spencer Company Case Study Example | Topics and Well Written Essays - 2000 words

It can be summed up as follows: the company has a triangular top management structure consisting of three Board Committees, namely the Audit Committee, the Remuneration Committee, and the Nomination Committee. All three board committees supervise and exercise power over different aspects of the company's activities and operations, and each consists of different members with specialised tasks. The Audit Committee consists of three members and manages the financial activities inside the company, such as supervising the company's periodic audits, coordinating with the external auditors conducting the annual mandatory audits, and superintending the formulation of the annual statutory accounts and their presentation to the shareholders. The Remuneration Committee manages remuneration-related activities for the company's employees, such as bonuses and increments. The Nomination Committee manages the activities concerning the appointment and designation of the company's directors and managers. Apart from the above-mentioned committees working as part of the board, the company's top management also consists of seven directors and one group secretary, who also acts as the head of Corporate Governance. The financial base consists of two Group Finance Directors, Ian Dyson and Alison Reed, who undertake the responsibility to control and regulate the financial matters of the company.

MARKS AND SPENCER - THE COMPANY HISTORY

Marks and Spencer plc has more than a century of history of expansion, augmentation and amplification. It started when a Jewish immigrant, Michael Marks, opened a store on his own in the year 1884. He continued to run and manage the small business alone for a decade, and after 1894 he was joined by another individual, Thomas Spencer. The business continued to expand and grow under the management of these two legends.
They both believed in expanding the scope of their business and therefore, by the end of the 19th century, they managed to open and run 24 stalls and 12 shops in England. At the beginning of the 20th century in particular, they practically stepped into the corporate world by registering as a private company under the name of "Marks and Spencer Ltd". After adopting this name, the company continued to gain popularity and public acceptance all over Great Britain with a strategy of selling British-made products only. In this way, the company earned prestige and stabilised relationships with reputable manufacturers and suppliers in Great Britain. The son of Michael Marks, Simon Marks, became the Chairman of the company in the year 1916, along with his brother-in-law Israel Sieff, and with joint efforts they turned the company into a retail chain store. In the year 1926, the company's shares were floated on the stock exchange, and thus Marks and Spencer continued its further expansion with a significant share of the public in its capital and profits. In the middle of the 20th century, the company expanded the scope of its business operations

Thursday, August 22, 2019

Joe Gransden Jazz Jam Essay Example for Free

In the recent course of music, it is evident that change and development are present. The variety of genres has expanded, which has led to the arrival of a brand new breed of artists able to represent various entities and cultures. One of the genres that is developing is jazz, and for many years different jazz musicians have been present. In the current state of music, it is clear that music keeps flowing. This realization made it necessary for me, as a researcher, to see a jazz performer in order to fully appreciate the development of the music as well as the instrumentation of a band. I shall therefore provide details of my observations during the performance of the Joe Gransden Quartet Jazz Jam. The Joe Gransden Quartet Jazz Jam is a group composed of different individuals actively performing on different instruments. The most important instrument is the trumpet, which is used in nearly every song. The band relies far more on instruments than on vocals. There is a satisfying sound from the bass, drums, and cello, which is highlighted by the shallower sound of the lighter instruments. The whole set played by the band was complete with different instruments that added drama and emotion to the songs. The band is definitely very good, for the members are able to present their craft and draw a response from the audience. Because the band plays jazz, the audience is calm and very appreciative of the music. Given the band's excellence, it is inevitable that the audience will be entertained and amazed at how individuals can use music as a form of emotional and artistic expression. Attending such a concert is similar to drinking a cold glass of water: this kind of band is refreshing to the ears and to the soul. The music of the band and its musicality are unquestionable.
Moreover, the excellence of each member on their instrument leaves little to improve, and each instrument is in line with the others. From this, I see the Joe Gransden Quartet Jazz Jam as a unique band which presents jazz as a genre unlike any other. The presentation is highly classical from my perspective; the instruments and the notes lean toward the classical music usually heard in operas. Each instrument plays a huge role in its own unique way, and all the instruments complement each other, so each is highlighted in its own way. Looking at the solo acts, it is evident that each of the musicians has their own flair for bringing a different style to their instrument. In addition, the leader of the band, Joe Gransden, is its most important player. Because he leads the whole group towards his visions, he is arguably the most prominent individual in the group. Although most of the musicians in the group are great on their own instruments, it is through the vision and knowledge of Joe Gransden that the group has risen to the levels of acknowledgement it enjoys from the press and the audience. Moreover, the whole band was a breath of fresh air for audiences who are interested

Wednesday, August 21, 2019

Evolution Of The Video Essay

Evolution Of The Video Essay Abstract The ubiquitous development of technology and computers has changed the way people live, work, play and interact. The profile of business has also changed dramatically throughout the years. With the development of faster bandwidths, video was also introduced as part of the myriad of services that cyberspace had to offer its growing patrons. Raynovich (2005) wrote that video is slowly migrating into cyberspace to cater to the more sophisticated demands of the tech savvy. Several technological innovations in the video-Internet interface are streaming, Internet TV, video conferencing and online gaming. The interface between the Internet and video is inexorable as customers demand better quality and easier access to the medium. The development of video on the Internet is still in constant flux. The current video Internet protocols still need more time to evolve. It is apparent that video on the Internet is the wave of the future and something to look forward to. Introduction The ubiquitous development of technology and computers has changed the way people live, work, play and interact. The profile of business has also changed dramatically throughout the years. Technological advancements have driven the shift in business strategies of many firms and made traditional business models obsolete. Upheavals wrought by these developments have forced many corporations to restructure and seek new directions. Financial markets are not spared from the upheaval. World capital markets throughout the globe are now interlinked via satellite, networks and technology. Globalization has linked formerly independent economies. When a cataclysm occurs within a globally linked financial system, the entire global market feels the ripples of the event. Businesses are no longer isolated entities that operate autonomously. People can live and work in virtual reality. It is no longer important that one be physically present in a given work area.
Because of computers and connectivity, people can choose to work where they like, when they like and how they like to do their jobs. With the click of a mouse, an ordinary worker can communicate with his counterpart elsewhere in the globe to discuss work and exchange ideas. The development of computers gave birth to the Internet. In the early 1960s, a few visionaries saw great potential in information exchange within the scientific and military circles (Howe, 2005). By 1969, ARPANET, originally conceived by the Advanced Research Projects Agency (ARPA), went online. Only four computers from partner universities in the southwestern US (UCLA, Stanford Research Institute, UCSB, and the University of Utah) were able to establish contact (Howe, 2005). The Internet, or simply the Net, is a publicly accessible worldwide system of interconnected computer networks that transmit data by packet switching using a standardized Internet Protocol (IP). A few other institutions were later linked to the network. Initially, the web provided information services like "electronic mail, online chat, and the interlinked Web pages and other documents of the World Wide Web" (Internet, n.d.). With the development of faster bandwidths, video was also introduced as part of the myriad of services that cyberspace had to offer its growing patrons. Raynovich (2005) wrote that video is slowly migrating into cyberspace to cater to the more sophisticated demands of the tech savvy. Legacy video networks like cable television, television broadcasts and the DVD and VHS formats had been the prevailing formats for the past decades. However, with the entry of faster bandwidths and improved connectivity, it is apparent that the Internet is the wave of the future in video technology. Developments of Video on the Internet By the mid-1990s, service providers began introducing photos, audio, video and animations, broadening the scope of the Internet from merely text-based transmissions. RealAudio ver.
1.0, developed by Progressive Networks in 1995, allowed Internet users to receive audio in real time without the need to download the file first. This new technological breakthrough is known as streaming. Streaming allowed consumers to access audio files immediately, with less download time. The user received a transmission of the audio file as soon as it was released. In 1997, the same company introduced RealVideo; this time, images were streamed and transmitted over the Internet. Microsoft was not far behind when it introduced NetShow 2.0, which used better bandwidths. It was later renamed Windows Media Player 6.0 in 1999. The program allowed users to play both audio and video streaming formats. By 2000, the improved program could accommodate MP3 formats. In 2003, an improved version, Windows Media Player 9.0, allowed users to queue, cross-fade and play back audio and video clips. A video-smoothing technology was also included in the new version that allowed content encoding at lower speeds, which was ideal for slow Internet connections (Shaw, n.d.). In streaming, there are two types of servers: a streaming server and a regular web server. A streaming server sends data in packets and determines the speed of the user's connection. The server buffers the data so the video can be viewed continuously even when the speed becomes intermittent. The streaming server sends video files in three ways: unicast, multicast or reflected multicast (Streaming video on the Internet, 2000). On a regular web server, the video files are treated as regular file transfers. The files are also buffered to ensure continuous play, and the video is played back from the user's computer storage, not the server's. The diagram in figure 1 illustrates the process. Aside from streaming, live Web broadcasting, or live webcast, is another way of transmitting video onto the Internet.
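The buffering behavior just described can be sketched in a few lines. This is a toy model of a client-side playout buffer, not any real streaming protocol; the class name and threshold value are invented for illustration:

```python
from collections import deque

class PlayoutBuffer:
    """Toy model of a streaming client's playout buffer: playback
    starts only after a threshold of packets is buffered, so short
    gaps in packet arrival do not interrupt continuous viewing."""

    def __init__(self, start_threshold=5):
        self.packets = deque()
        self.start_threshold = start_threshold
        self.playing = False

    def receive(self, packet):
        self.packets.append(packet)
        if not self.playing and len(self.packets) >= self.start_threshold:
            self.playing = True  # enough data buffered to begin playback

    def next_frame(self):
        if self.playing and self.packets:
            return self.packets.popleft()
        return None  # stall: wait for more data to arrive

buf = PlayoutBuffer(start_threshold=3)
for i in range(3):
    buf.receive(f"pkt{i}")
print(buf.playing, buf.next_frame())
```

The same pre-fill idea is why a viewer sees a brief "buffering" pause before an intermittent connection yields smooth playback.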
As a computer plays back the video content, a streaming server accepts the broadcast. Anyone accessing the server at the time of broadcast is able to view the video as it is being played (Streaming video on the Internet, 2000). Figure 1 – Process of video streaming (Streaming video on the Internet, 2000). Streaming video is particularly useful as a learning tool in many technology-driven classrooms. Shepard (2003) differentiated streaming video from traditional video media like CD-ROM, DVD or VHS tapes. The publishers of a CD-ROM or DVD inadvertently lose copyright control of their products once purchased, while in streaming the publisher retains copyright control because video streams may not be stored on the viewer's computer. Compared to VHS, streaming is more flexible and interactive (p. 297). Streaming videos allow students to access a demonstration or lecture at their own pace. Some of the important points of a lecture, for example, are hyperlinked to other sources that students can explore. Video streaming can also be used to facilitate examinations, where teachers may post their questions and the students may send their answers. Video streaming presents an alternative form of learning and allows teacher-student interaction. Another development in video on the Internet is the introduction of Internet television. Internet television allows viewers to access television programs on the Internet; however, the Internet offers more versatility and interactivity. The programs are watched on the user's computer system while according users more control over what they watch, and ancillary information can be obtained over the Internet simultaneously (Noll, 2004, p. 4). Presently, web TV has received lukewarm responses from users. Web TV allows users access to the Internet through the use of a keyboard attached to a telephone line, with a television set to provide the display.
HDTV offers clearer transmission because it broadcasts programs in digital format; the scan lines are doubled compared to conventional television, and it uses the UHF bandwidth. Raynovich (2005) wrote that in the future, improvements in Internet access and bandwidth will allow integration of the Internet and video without mimicking existing cable television business models. The future of Internet television will ignore linear programming, where the providers control the viewing choices and schedule. Internet television should allow users to access programs anytime, anywhere and any way the viewer wants. Internet protocol television, or IPTV, is a current development in Internet television. Mike Volpi, senior vice president and general manager, Routing and Service Provider Technology Group, cited the new developments in Internet television in an interview. IPTV is not simply television delivered over the Internet; it uses the same language and technology as the Internet. The principle of IPTV follows that of traditional television, cable or satellite, but delivered with "a higher degree of personalization and searchability" (Cisco, 2006). On IPTV, users are allowed to pick their favorite television programs and watch them on demand. IPTV's interactivity differentiates it from traditional television and cable broadcasts. Video and audio conferencing have been in use for many years over a variety of media such as the telephone, television and the Internet; on the Internet, video conferencing is delivered through streaming. The first video conferencing was "Ericsson's demonstration of the first trans-Atlantic LME video telephone calls" (Roberts, 2004). The network video protocol (NVP) was introduced in 1976 and the packet video protocol (PVP) in 1981 (Roberts, 2004). Video conferencing has also become one of the more popular media of communication, but it remains limited in scope; not all telephone companies offer the service to their customers.
The Virtual Room Videoconferencing System (VRVS) was developed at Caltech-CERN in July 1997. The initial intention was to provide communication tools for researchers and scientists involved in the Large Hadron Collider Project and scientists in the High Energy and Nuclear Physics community in the U.S. and Europe. It has since been expanded to include other professions such as geneticists, doctors, and a host of other scientists who require such a facility (Roberts, 2004). In 2000, Microsoft introduced the software NetMeeting to support video conferencing using the computer. There are two ways to conduct video conferencing on the web: point-to-point and multipoint services. Point-to-point, or P2P, can link two locations with live audio and video feeds, while the multipoint system can link three or more locations. P2P uses the H.323 protocol to establish contact between two points; once connected, both parties can exchange audio and video over the Internet. For the multipoint system, a multipoint control unit, or MCU, is necessary to make three or more connections on the H.323 protocol (Hunter, n.d.). When Steve Russell developed the first computer game, "Spacewar", in 1961, videogames became a byword in many homes in the United States. Entrepreneurs saw an opportunity in the videogame industry, and this marked the beginnings of the major leaders in videogames. Nolan Bushnell, the Atari founder, was the first to turn video games into a lucrative venture. He developed games without complicated computing requisites and sold them to the public. A modest 1,500 units were sold through a pinball company. In 1972, Atari introduced Pong and generated revenues ten times greater than the pinball machine's. Bushnell later designed a simpler machine for home use. By 1976, the industry players had grown to twenty, and their combined earnings grew from $200 million in 1978 to $1 billion in 1981.
The biggest players at that time included National Semiconductor, Fairchild, General Instrument, Coleco, and Magnavox (Aoyama and Izushi, 2003, p. 427). After several years of successful ventures, the market for video games crashed in 1983-1984. Aoyama and Izushi (2003) attributed the crash to oversupply and sub-standard software designs (p. 427). The introduction of 3D and multimedia in the 1990s revolutionized gaming to include network gaming. By the late 1990s, the MUD, or multi-user domain, protocol had become a requisite in most videogames to allow multiple players in online gaming (Newman, 2004, p. 115). The trend in online gaming is changing so rapidly that what is in vogue today may be obsolete in a few months. In online gaming, players are able to pit themselves against each other despite geographic and spatial distance. The Internet also allows online chat while players compete against each other. The ubiquitous technology of the Internet has extended videogames from an individual's living room into a global domain. Issues with Internet Video The main problems that usually hound providers are bandwidth and economics. In streaming technologies, most users have limited modem speeds. While speeds slowly improve, there are still gaps that need to be addressed. For example, streaming video files require compression ratios of roughly 2500:1 to 5000:1. A multimedia video consumes about 2.4M bits/second, roughly 80 times more than the bandwidth capacity of a regular 28.8K modem connection (Currier, 1996). For good transmission, the available bandwidth must be slightly higher than the stream requires. The second issue is the time delay that video and audio content may experience on the Internet. Unpredictable load and traffic may disrupt transmission, producing corrupted images or audio, and disruption can cause the loss of data. One solution to the problem is to change analogue lines into digital ones to increase bandwidth.
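The arithmetic behind the quoted figures is easy to verify. A quick back-of-the-envelope check (the variable names are mine; the numbers come from the passage):

```python
# Check the figures quoted above: a ~2.4 Mbit/s multimedia video
# versus a regular 28.8 kbit/s modem connection.
video_rate_bps = 2_400_000  # multimedia video, bits per second
modem_rate_bps = 28_800     # 28.8K modem, bits per second

ratio = video_rate_bps / modem_rate_bps
print(f"video needs about {ratio:.0f}x the modem's capacity")
```

The exact ratio is just over 83, consistent with the passage's rounded figure of "about 80 times".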
A time delay of two seconds can render video conferencing useless, and a TCP/IP drop rate of 5% will inevitably translate into transmission loss. The level of acceptance of IPTV or Internet TV is still low. The proliferation of video and Internet television is also highly dependent on costs; very few investors at the moment are willing to put money into the medium. The medium also competes with the traditional programming delivery of regular television broadcasts and cable service. Conclusion The interface between the Internet and video is inexorable as customers demand better quality and easy access to the medium. The development of video on the Internet is still in constant flux. The current video Internet protocols still need more time to evolve. It is apparent that video on the Internet is the wave of the future and something to look forward to. References Aoyama, Y. and Izushi, H. (2003). Hardware gimmick or cultural innovation? Technological, cultural, and social foundations of the Japanese video game industry. Research Policy 32: 423-444. Cisco (2006). Cisco's vision for the evolution of video communications and entertainment: Mike Volpi discusses the strategic importance of video in communications and media markets. Retrieved February 18, 2007 from: http://newsroom.cisco.com/dlls/2006/ts_121206.html Currier, B. (1996). Is the Internet ready for video? Retrieved February 18, 2007 from: http://www.synthetic-ap.com/qt/internetvideo.html Howe, W. (2005). An anecdotal history of the people and communities that brought about the Internet and the Web. Retrieved February 18, 2007 from: http://www.walthowe.com/navnet/history.html Hunter, J. (n.d.). Video conferencing: An introduction. Retrieved February 18, 2007 from: http://ezinearticles.com/?Video-ConferencingAn-Introductionid=70930 Internet (n.d.). Retrieved February 18, 2007 from: http://en.wikipedia.org/wiki/Internet Newman, J. (2004). Videogames. London: Routledge. Noll, M.A. (2004).
Chapter 1: Internet television: Definition and prospects. In Internet Television, Darcy Gerbarg, Jo Groebel and Eli Noam (eds.). Mahwah, NJ: Lawrence Erlbaum Associates: 1-8. Raynovich, R.S. (2005). Video is the Internet. Retrieved February 19, 2007 from: http://www.lightreading.com/document.asp?doc_id=72472 Roberts, L.P. (2004). The history of video conferencing: Moving ahead at the speed of video. Retrieved February 19, 2007 from: http://ezinearticles.com/?The-History-of-Video-ConferencingMoving-Ahead-at-the-Speed-of-Videoid=5369 Shaw, R. (n.d.). The evolution of rich media. Retrieved February 18, 2007 from: http://www.imediaconnection.com/content/2618.asp Shepard, K. (2003). Questioning, promoting and evaluating the use of streaming video to support student learning. British Journal of Educational Technology 34(3): 295-308. Streaming video on the Internet. (2000). Retrieved February 2007 from: http://www.dps.com/custserv/doclib.nsf/55f584d47a8fd27585256bf300554e9f/9cb11874854c451c85256aaf00681f80/$FILE/Streaming%20Video%20White%20Paper%20v1-0.pdf

The History Of Virtualization Information Technology Essay

The History Of Virtualization Information Technology Essay Introduction Virtualization is one of the hottest innovations in the Information Technology field, with proven benefits that propel organizations to strategize for rapid planning and implementation of virtualization. As with any new technology, managers must be careful to analyze how that technology would best fit in their organization. In this document, we will provide an overview of virtualization to help shed light on this quickly evolving technology. History of Virtualization Virtualization is Brand New Again! Although virtualization seems to be a hot new cutting-edge technology, IBM originally used it on its mainframes in the 1960s. The IBM 360/67 running the CP/CMS system used virtualization as an approach to time sharing: each user would run his or her own virtual 360 machine, and storage was partitioned into virtual disks called P-Disks for each user. Mainframe virtualization remained popular through the 1970s, but during the 1980s and 1990s virtualization largely disappeared. During the 1980s, there were a couple of products made for Intel PCs: Simultask and Merge/386, both developed by Locus Computing Corporation, would run MS-DOS as a guest operating system. In 1988, Insignia Solutions released SoftPC, which ran DOS on Sun and Macintosh platforms. The late 1990s ushered in the new wave of virtualization. In 1997, Connectix released Virtual PC for the Macintosh; Connectix would later release a version for Windows and subsequently be bought by Microsoft in 2003. In 1999, VMware introduced its entry into virtualization. In the last decade, every major player in servers has integrated virtualization into its offerings. In addition to VMware and Microsoft, companies such as Sun, Veritas, and HP have all acquired virtualization technology. How Does Virtualization Work? In the enterprise IT world, servers are necessary to do many jobs.
Traditionally each machine does only one job, and sometimes many servers are given the same job. The reason behind this is to keep hardware and software problems on one machine from causing problems for several programs. There are several problems with this approach, however. The first is that it doesn't take advantage of modern server computers' processing power [11]; most servers use only a small percentage of their overall processing capability. The other problem is that the servers begin to take up a lot of physical space as the enterprise network grows larger and more complex; data centers can become overcrowded with racks of servers consuming a lot of power and generating heat. Server virtualization tries to fix both of these problems in one fell swoop [16]. Server virtualization uses specially designed software with which an administrator can convert one physical server into multiple virtual machines. Each virtual server acts as a unique physical device that is capable of running its own operating system. Until recent technological developments, the only way to create a virtual server was to design special software to trick a server's CPU into providing processing power for several virtual machines. Today, however, processor manufacturers such as Intel and AMD offer processors with the capability of supporting virtual servers already built in. Even in such a virtualized environment, the hardware doesn't create the virtual servers; network administrators or engineers still need to create them using the right software [11]. In the world of information technology, server virtualization is still a hot topic. Still considered a new technology, several companies offer different approaches to server virtualization. There are three ways to create virtual servers: full virtualization, para-virtualization, and OS-level virtualization. All three variations share a few common traits: the physical server is always called the host, and the virtual servers are called guests.
The virtual servers all behave as if they were physical machines; however, each method uses a different approach to allocating the physical server's resources to the virtual servers' needs [11]. Full Virtualization The full virtualization method uses software called a hypervisor. The hypervisor works directly with the physical server's CPU and disk space and serves as the platform for each virtual server's operating system. This keeps each server completely autonomous and unaware of the other servers running on the same physical machine. If necessary, the virtual servers can run different operating system software, such as Linux and/or Windows. The hypervisor also monitors the physical server's resources, relaying resources from the physical machine to the appropriate virtual server as the virtual servers run their applications. Finally, because hypervisors have their own processing needs, the physical server must reserve some processing power and resources to run the hypervisor application. If not done properly, this can affect overall performance and slow down applications [11]. Para-Virtualization Unlike the full virtualization method, the para-virtualization approach allows the guest servers to be aware of one another. Because each operating system in the virtual servers is conscious of the demands being placed on the physical server by the other guests, the para-virtualization hypervisor doesn't require as much processing power to oversee the guest operating systems. In this way the entire system works together as a unified organization [11]. OS-Level Virtualization The OS-level virtualization approach doesn't use a hypervisor at all; the virtualization capability is part of the host OS instead, and the host OS performs all of the functions of a fully virtualized hypervisor.
Because OS-level virtualization operates without a hypervisor, it limits all of the virtual servers to one operating system, whereas the other two approaches allow for different OS usage on the virtual servers. The OS-level approach is known as a homogeneous environment because all of the guest operating systems must be the same [11]. With three different approaches to virtualization, the question remains as to which method is best. This is where a complete understanding of enterprise and network requirements is imperative. If the enterprise's physical servers all run the same OS, then the OS-level approach might be the best solution, as it tends to be faster and more efficient than the others. However, if the physical servers are running several different operating systems, para-virtualization or full virtualization might be the better approach. Virtualization Standards Despite the ever-increasing adoption of virtualization, very few standards actually prevail in this technology. As the migration to virtualization grows, so does the need for open industry standards, which is why the standards work on virtualization is viewed by several industry observers as a giant step in the right direction. The Distributed Management Task Force (DMTF) currently promotes standards for virtualization management to help industry suppliers implement compliant, interoperable virtualization management solutions. The strongest standard created for this technology is the Standardization of Management in a Virtualized Environment, accomplished by a team building on standards already in place. This standard lowers the IT learning curve and complexity for vendors implementing this support in their management solutions, and its ease of use makes it successful.
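The rule of thumb above can be restated as a tiny decision helper. This is purely illustrative; the function name and return strings are my own, not part of any product or standard:

```python
def suggest_virtualization(guest_oses):
    """Mirror of the rule of thumb above: a homogeneous set of guest
    operating systems suits OS-level virtualization, while a mixed
    set requires a hypervisor (para- or full virtualization)."""
    if len(set(guest_oses)) == 1:
        return "OS-level virtualization"
    # Mixed OSes: para-virtualization is lighter-weight, while full
    # virtualization gives each guest the strongest isolation.
    return "para-virtualization or full virtualization"

print(suggest_virtualization(["Linux", "Linux", "Linux"]))
print(suggest_virtualization(["Linux", "Windows"]))
```

In practice the choice also weighs isolation, performance overhead, and management tooling, but the homogeneity of the guest operating systems is the first fork in the decision.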
The new standard recognizes supported virtualization management capabilities, including the ability to:
- discover and inventory virtual computer systems
- manage the lifecycle of virtual computer systems
- create, modify and delete virtual resources
- monitor virtual systems for health and performance
Virtualization standards are not suffering as a result of poor development but rather because of the common IT challenge involved in pleasing all users. Until virtualization is standardized, network professionals must continue to meet these challenges within a dynamic data center. For example, before the relationship between Cisco and VMware was established, Cisco's Data Center 3.0 was best described as scrawny. 150 million dollars later, Cisco was able to establish a successful integration that allows the VFrame to load VMware ESX Server onto bare-metal computer hardware (something that previously could only be done with Windows and Linux) and configure the network and storage connections that ESX requires. In addition, Microsoft made pledges only in the Web services arena, where it faces tougher open-standards competition. The company's Open Specification Promise allows every individual and organization in the world to make use of the Virtual Hard Disk (VHD) image format forever, Microsoft said in a statement. VHD allows the packaging of an application with that application's Windows operating system; several such combinations, each in its own virtual machine, can run on a single piece of hardware. The future standard of virtualization is the Open Virtualization Format (OVF). OVF doesn't aim to replace the pre-existing formats, but instead ties them together in a standards-based XML package that contains all the necessary installation and configuration parameters. This, in theory, will allow any virtualization platform that implements the standard to run the virtual machines. OVF will set some safeguards as well.
The format will permit integrity checking of the VMs to ensure they have not been tampered with after the package was produced. Virtualization in the Enterprise: Microsoft's Approach Virtualization is an approach to deploying computing resources that isolates different layers - hardware, software, data, networks, storage - from each other. Typically today, an operating system is installed directly onto a computer's hardware, applications are installed directly onto the operating system, and the interface is presented through a display connected directly to the local machine. Altering one layer often affects the others, making changes difficult to implement. By using software to isolate these layers from each other, virtualization makes it easier to implement changes. The result is simplified management, more efficient use of IT resources, and the flexibility to provide the right computing resources, when and where they are needed. - Bob Muglia, Senior Vice President, Server and Tools Business, Microsoft Corporation. The typical discussion of virtualization focuses on server hardware virtualization (which will be discussed later in this article). However, there is more to virtualization than just server virtualization. This section presents Microsoft's virtualization strategy; by looking at it, we can see other areas besides server virtualization where virtualization can be used in the enterprise infrastructure. Server Virtualization: Windows Server 2008 Hyper-V and Microsoft Virtual Server 2005 R2 In server virtualization, one physical server is made to appear as multiple servers. Microsoft has two products for virtual servers: Microsoft Virtual Server 2005 R2, which was made to run on Windows Server 2003, and the current product, Windows Server 2008 Hyper-V, which runs only on 64-bit versions of Windows Server 2008. Both products are considered hypervisors, a term coined by IBM in 1972.
A hypervisor is the platform that enables multiple operating systems to run on a single physical computer. Microsoft Virtual Server is considered a Type 2 hypervisor; a Type 2 hypervisor runs within the host computer's operating system. Hyper-V is considered a Type 1 hypervisor, also called a bare-metal hypervisor; Type 1 hypervisors run directly on the physical hardware (bare metal) of the host computer. A virtual machine - whether from Microsoft, VMware, Citrix, or Parallels - basically consists of two files: a configuration file and a virtual hard drive file. This is true for desktop virtualization as well. For Hyper-V, there is a .vmc file for the virtual machine configuration and a .vhd file for the virtual hard drive; the virtual hard drive holds the OS and data for the virtual server. Business continuity can be enhanced by using virtual servers. Microsoft's System Center Virtual Machine Manager allows an administrator to move a virtual machine to another physical host without the end users realizing it. With this feature, maintenance can be carried out without bringing the servers down. Failover clustering between servers can also be enabled: should a virtual server fail, another virtual server can take over, providing a disaster recovery solution. Testing and development are also enhanced through the use of Hyper-V. Virtual server test systems that duplicate the production systems are used to test code. In UCF's Office of Undergraduate Studies, a virtual Windows 2003 server is used to test new web sites and PHP code; the virtual server and its physical production counterpart have exactly the same software installed, allowing programmers and designers to check their web applications before releasing them to the public. By consolidating multiple servers to run on fewer physical servers, cost savings may be found in lower cooling and electricity needs, lower hardware needs, and less physical space to house the data center.
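The two-file model described above (a .vmc configuration file plus a .vhd virtual hard drive) can be sketched as a simple data structure. The class, method, and file names here are illustrative inventions, not Microsoft's API:

```python
from dataclasses import dataclass

@dataclass
class VirtualMachine:
    """A Hyper-V style virtual machine reduced to its two essential
    files: a configuration file (.vmc) and a virtual hard drive
    (.vhd) that holds the guest OS and its data."""
    name: str
    config_file: str  # machine settings, e.g. "web01.vmc"
    disk_file: str    # guest OS and data, e.g. "web01.vhd"

    def migrate_to(self, host):
        # Moving a VM between physical hosts amounts to transferring
        # these two files and registering them with the new host's
        # hypervisor - which is why live migration tools can relocate
        # a running workload without the end users noticing.
        return f"{self.name}: {self.config_file} + {self.disk_file} -> {host}"

vm = VirtualMachine("web01", "web01.vmc", "web01.vhd")
print(vm.migrate_to("host2"))
```

The same pairing underlies the portability benefits discussed later: copy the configuration and disk files and the machine goes with them.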
Server consolidation is also a key technology for green computing initiatives, and computer resources are optimized as well; for example, CPUs see less idle time. Server virtualization also maximizes licensing: for example, purchasing one Microsoft Server Enterprise license allows you to run four virtual servers under the same license. Desktop Virtualization: Microsoft Virtual Desktop Infrastructure (VDI) and Microsoft Enterprise Desktop Virtualization (MED-V) Desktop virtualization is very similar to server virtualization. A client operating system, such as Windows 7, is used to run a guest operating system, such as Windows XP. This is usually done to support applications or hardware not supported by the current operating system (this is why Microsoft included Windows XP Mode in versions of Windows 7). Microsoft's Virtual PC is the foundation for this desktop virtualization. Virtual PC allows a desktop computer to run a guest operating system, an independent instance of an OS, on top of the host OS. Virtual PC emulates a standard PC hardware environment and is independent of the host's hardware or setup. Microsoft Enterprise Desktop Virtualization (MED-V) is a managed, client-hosted desktop virtualization solution. MED-V builds upon Virtual PC and adds features to deploy, manage, and control the virtual images; the images can also be remotely updated. The virtual machines run on the client computer, and applications that have been installed on the virtual computer can be listed on the host machine's Start menu or as desktop shortcuts, giving the end user a seamless experience. MED-V can be very useful for supporting legacy applications that may not be able to run on the latest deployed operating system. The virtual images are portable, and that makes them useful in a couple of scenarios. Employees who use their personal computers for work can now use a corporate-managed virtual desktop.
This solves a common problem where the personal computer might be running a home version of the operating system that does not allow it to connect to a corporate network. This also means that the enterprise makes changes only to the virtual computer and makes no changes to the personal computer's OS. The other scenario where portability plays a factor is that the virtual image could be saved to a removable device, such as a USB flash drive. The virtual image could then be run from the USB drive on any computer that has an installation of Virtual PC. Although this is listed as a benefit by Tulloch, I also see some problems with this scenario. USB flash drives sometimes get lost, and losing a flash drive in this scenario is like losing a whole computer, so caution should be exercised so that sensitive data is not kept on the flash drive. Secondly, based on personal experience, even with a fast USB flash drive, the performance of a virtual computer running from the USB flash drive is poor compared to running the same image from the hard drive. Virtual Desktop Infrastructure (VDI) is server-based desktop virtualization. In MED-V, the virtual image is on the client machine and runs on the client hardware. In VDI, the virtual images are on a Windows Server 2008 with Hyper-V server and run on the server. The user's data and applications, therefore, reside on the server. This solution is essentially a combination of Hyper-V and Terminal Services (discussed later in this section). There are several benefits to this approach. Employees can work from any desktop, whether in the office or at home. Also, the client requirements are very low. Using VDI, the virtual images can be deployed not only to standard desktop PCs, but also to thin clients and netbooks. Security is also enhanced because all of the data is housed on servers in the data center. Finally, administration is easier and more efficient due to the centralized storage of the images.
Application Virtualization: Microsoft Application Virtualization (App-V)

Application virtualization allows applications to be streamed and cached to the desktop computer. The applications do not actually install themselves into the desktop operating system. For example, no changes are actually made to the Windows registry. This allows for some unusual virtual tricks, like being able to run two versions of Microsoft Office on one computer. Normally, this would be impossible. App-V allows administrators to package applications in a self-contained environment. This package contains a virtual environment and everything that the application needs to run. The client computer is able to execute this package using the App-V client software. Because the application is self-contained, it makes no changes to the client, including no changes to the registry. Applications can be deployed or published through the App-V Management Server. App-V packages can also be deployed through Microsoft's System Center Configuration Manager or standalone .msi files located on network shares or removable media. App-V has several benefits for the enterprise. There is centralized management of the entire application life cycle. There is faster application deployment due to less time performing regression testing. Since App-V applications are self-contained, there are no software compatibility issues. You can also provide on-demand application deployment. Troubleshooting is also made easier by using App-V. When an application is installed on a client, it creates a cache on the local hard drive. If an App-V application fails, it can be reinstalled by deleting the cache file.

Presentation Virtualization: Windows Server 2008 Terminal Services

Terminal Services, which has been around for many years, has been folded into Microsoft's virtualization offerings. A terminal server allows multiple users to connect.
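A terminal server's division of labor, in which programs execute on the server while the client receives only the rendered screen view, can be sketched with a toy model. All names below are invented for illustration; this is not a real Terminal Services or RDP API.

```python
# Toy model of presentation virtualization: the application runs on the
# server; the client sends keystrokes and receives only screen output.
class TerminalServer:
    def __init__(self):
        self.sessions = {}  # user -> list of output lines (their "screen")

    def connect(self, user: str) -> None:
        """Open a session; the client holds no application state at all."""
        self.sessions[user] = []

    def keystroke(self, user: str, command: str) -> str:
        # Execution happens here, on the server; only the rendered
        # result travels back to the client.
        result = f"ran {command} on server"
        self.sessions[user].append(result)
        return result

ts = TerminalServer()
ts.connect("alice")
print(ts.keystroke("alice", "winword.exe"))  # ran winword.exe on server
```

Because the client holds nothing but a screen view, a cheap thin client is enough, which is exactly the hardware trade-off the section describes.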
Each user receives a desktop view from the server in which they run applications on the server. Any programs run within this desktop view actually execute on the terminal server. The client only receives the screen view from the server. The strategy employed here is that since the applications only use resources on the server, money can be spent on strong server hardware and saved on lighter-weight clients. Also, since the application is only on the server, it is easier to maintain the software, since it only needs to be updated on the server and not on all of the clients. Also, since the application runs on the server, the data can be stored on the server as well, enhancing security. Another security feature is that every keystroke and mouse movement is encrypted. The solution is also scalable and can be expanded to use multiple servers in a farm. Terminal Services applications can also be optimized for both high- and low-bandwidth scenarios. This is helpful for remote users accessing corporate applications over less-than-optimal connections.

User-State Virtualization: Roaming User Profiles, Folder Redirection, Offline Files

This is another set of technologies that has been around since Windows 95 but has now been folded into the virtualization strategy. A user profile consists of registry entries and folders which define the user's environment. The desktop background is a common setting that you will find as part of the user profile. Other items included in the user profile are application settings, Internet Explorer favorites, and the documents, music, and picture folders. Roaming user profiles are profiles saved to a server that follow a user to any computer that the user logs in to. For example, a user with a roaming profile logs on to a computer on the factory floor and changes the desktop image to a picture of fluffy kittens. When he logs on to his office computer, the fluffy kittens are on his office computer's desktop as well.
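The profile-follows-the-user behavior in the example above boils down to copying the profile down at logon and back up at logoff. The following is a toy model with invented names, not any real Windows mechanism:

```python
# Toy sketch of roaming profiles: the profile lives on a server and is
# synchronized to whichever computer the user logs on to. Invented model.
class ProfileServer:
    def __init__(self):
        self.profiles = {}  # username -> settings dict held on the server

    def logon(self, username: str, workstation: dict) -> None:
        """Copy the server-side profile down to the workstation."""
        workstation.update(self.profiles.get(username, {}))

    def logoff(self, username: str, workstation: dict) -> None:
        """Copy the (possibly changed) profile back up to the server."""
        self.profiles[username] = dict(workstation)

server = ProfileServer()
factory_pc, office_pc = {}, {}

server.logon("juan", factory_pc)
factory_pc["wallpaper"] = "fluffy_kittens.jpg"  # changed on the factory floor
server.logoff("juan", factory_pc)

server.logon("juan", office_pc)                 # later, at the office
print(office_pc["wallpaper"])  # fluffy_kittens.jpg
```

Note that the whole profile is copied at each logon and logoff, which is exactly the synchronization cost discussed next.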
When using roaming profiles, one of the limitations is that the profile must be synchronized from the server to the workstation each time the user logs on. When the user logs off, the profile is then copied back up to the server. If folders, such as the documents folder, are included, the downloading and uploading can take some time. An improved solution is to use redirected folders. Folders, such as documents and pictures, can be redirected to a server location. This is transparent to the user, who will still access his documents folder as if it were part of his local profile. This also helps with data backup, since it is easier to back up a single server than document folders located on multiple client computers. A limitation with roaming user profiles occurs when the server, or network access to the server, is down. Offline Files attempts to address that limitation by providing access to network files even if the server location is inaccessible. When used with Roaming User Profiles and Folder Redirection, files saved in redirected folders are automatically made available for offline use. Files marked for offline use are stored on the local client in a client-side cache. Files are synchronized between the client-side cache and the server. If the connection to the server is lost, the Offline Files feature takes over. The user may not even realize that there have been any problems with the server. Together, Roaming User Profiles, Folder Redirection, and Offline Files are also an excellent disaster recovery tool. When a desktop computer fails, the biggest loss is the user's data. With these three technologies in place, all the user would need to do is log into another standard corporate-issued computer and resume working. There is no downtime spent trying to recover or restore the user's data, since it was all safely stored on a server.

Review of Virtualization in the Enterprise

Virtualization can enhance the way an enterprise runs the data center.
Server virtualization can optimize hardware utilization. Desktop virtualization can provide a standard client for your end users. Application virtualization allows central administration of applications and fewer chances of application incompatibilities. Presentation virtualization allows central management of applications and allows low-end clients, such as thin clients and netbooks, to run software beyond their hardware limitations. User-state virtualization gives users a computer environment that follows them no matter what corporate computer they use.

Benefits and Advantages of Virtualization

Virtualization has evolved into a very important platform for IT, used by countless companies both large and small. This is due to virtualization's capability to simplify IT operations and allow IT organizations to respond faster to changing business demands. Although virtualization started out as a technology used mostly in testing and development environments, in recent years it has moved toward the mainstream in production servers. While there are many advantages to this technology, the following are the top five.

Virtualization Is Cost Efficient

Virtualization allows a company or organization to save money on hardware, space, and energy. By using existing servers and/or disks to add more performance without adding additional capacity, virtualization directly translates into savings on hardware requirements. When it is possible to deploy three or more servers on one physical machine, it is no longer necessary to purchase three or more separate machines, which may in fact have been used only occasionally. In addition to one-time expenses, virtualization can help save money in the long run as well, because it can drastically reduce energy consumption. When there are fewer physical machines, less energy is needed to power (and cool) them.
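As a back-of-the-envelope sketch of the savings argument: the server count and per-server power draw below are invented assumptions, while the four-guests-per-license ratio is the one cited earlier for the Microsoft Server Enterprise license.

```python
# Rough consolidation arithmetic. Power draw and server counts are
# illustrative assumptions; one Enterprise license covers four guests.
physical_servers_before = 12
guests_per_host = 4                       # assumed consolidation ratio

hosts_after = -(-physical_servers_before // guests_per_host)  # ceiling division
licenses_needed = hosts_after             # one license per host covers its 4 guests

watts_per_server = 400                    # assumed average draw per machine
watts_saved = (physical_servers_before - hosts_after) * watts_per_server

print(hosts_after, licenses_needed, watts_saved)  # 3 3 3600
```

Even with these made-up numbers, the shape of the argument is visible: hardware count, license count, and power draw all shrink together.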
Virtualization Is Green

Green IT is not just a fashion trend. Eco-friendly technologies are in high demand, and virtualization solutions are certainly among them. As already mentioned, server virtualization and storage virtualization lead to decreased energy consumption; this automatically includes them in the list of green technologies.

Virtualization Eases Administration and Migration

When there are fewer physical machines, their administration also becomes easier. The administration of virtualized and non-virtualized servers and disks is practically the same. However, there are cases when virtualization poses some administration challenges and might require some training in how to handle the virtualization application.

Virtualization Makes an Enterprise More Efficient

Increased efficiency is one more advantage of virtualization. Virtualization helps to utilize the existing infrastructure in a better way. Typically an enterprise uses a small portion of its computing power; it is not uncommon to see server load in the single digits. Keeping underutilized machines is expensive and inefficient, and virtualization helps to deal with this problem as well. When several servers are deployed onto one physical machine, capacity utilization can increase to 90 percent or more.

Improved System Reliability and Security

Virtualization of systems helps prevent system crashes due to memory corruption caused by software like device drivers. VT-d for Directed I/O architecture provides methods to better control system devices by defining the architecture for DMA and interrupt remapping, ensuring improved isolation of I/O resources for greater reliability, security, and availability.

Dynamic Load Balancing and Disaster Recovery

As server workloads vary, virtualization provides the ability for virtual machines that are over-utilizing the resources of a server to be moved to underutilized servers.
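This rebalancing decision can be sketched as a toy function that moves one virtual machine from an over-utilized host to the least-loaded one. The threshold, data layout, and migration policy are invented for illustration; real schedulers weigh far more factors.

```python
# Toy rebalancing sketch: thresholds and the data model are invented.
def rebalance(hosts: dict, threshold: float = 0.80) -> dict:
    """Move one VM from an over-utilized host to the least-loaded host.

    hosts maps a host name to a list of (vm_name, cpu_fraction) pairs.
    """
    load = {h: sum(cpu for _, cpu in vms) for h, vms in hosts.items()}
    busiest = max(load, key=load.get)
    idlest = min(load, key=load.get)
    if load[busiest] > threshold and busiest != idlest:
        vm = hosts[busiest].pop()   # naive choice: migrate the last-listed VM
        hosts[idlest].append(vm)
    return hosts

hosts = {"hostA": [("web", 0.5), ("db", 0.45)], "hostB": [("mail", 0.1)]}
rebalance(hosts)
print(len(hosts["hostA"]), len(hosts["hostB"]))  # 1 2 -- one VM migrated
```

Hosts under the threshold are left alone, so a balanced farm stays put; only genuine hot spots trigger a migration.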
Dynamic load balancing of this kind creates efficient utilization of server resources. In addition, disaster recovery is a critical component for IT, as system crashes can create huge economic losses. Virtualization technology enables a virtual image on one machine to be instantly re-imaged on another server if a machine failure occurs.

Limitations and/or Disadvantages of Virtualization

While one could conclude that virtualization is the perfect technology for any enterprise, it does have several limitations and disadvantages. It's very important for a network administrator to research server virtualization and his or her own network's architecture and needs before attempting to engineer a solution. Understanding the network's architecture and needs allows for the adoption of a realistic approach to virtualization and for better judgment of whether it is a suitable solution in a given scenario. Some of the most notable limitations and disadvantages are having a single point of failure, hardware and performance demands, and migration.

Single Point of Failure

One of the biggest disadvantages of virtualization is that there is a single point of failure. When the physical machine where all the virtualized solutions run fails, or if the virtualized solution itself fails, everything crashes. Imagine, for example, you're running several important servers on one physical host and its RAID controller fails, wiping out everything. What do you do? How can you prevent that? The disaster caused by physical failure can, however, be avoided with one of several responsible virtualized environment options. The first of these options is clustering. Clustering allows several physical machines to collectively host one or more virtual servers.
Clusters generally provide two distinct roles: to provide continuous data access, even if a failure of a system or network device occurs, and to load-balance a high volume of clients across several physical hosts. [14] In clustering, clients don't connect to a physical computer but instead connect to a logical virtual server running on top of one or more physical computers. Another solution is to back up the virtual machines with a continuous data protection solution. Continuous data protection makes it possible to restore all virtual machines quickly to another host if the physical server ever goes down. If the virtual infrastructure is well planned, physical failures won't be a frequent problem. However, this solution does require an investment in redundant hardware, which more or less eliminates some of the advantages of virtualization. [12]

Hardware and Performance Demands

While server virtualization may save money because less hardware is required, allowing a decrease in the physical number of machines in an enterprise, it does not mean that newer and faster computers are unnecessary. These solutions require powerful machines. If the physical server doesn't have enough RAM or CPU power, performance will suffer. Virtualization essentially divides the server's processing power up among the virtual servers. When the server's processing power can't meet the application demands, everything slows down. [11] Therefore, things that shouldn't take very long could slow down to take hours, or may even cause the server to crash. Network administrators should take a close look at CPU usage before dividing a physical server into multiple virtual machines. [11]

Migration

In current virtualization methodology, it is only possible to migrate a virtual server from one physical machine to another if both physical machines use the same manufacturer's processors.
For example, if a network uses one server that runs an Intel processor and another that uses an AMD processor, it is not possible to transfer a virtual server from one physical machine to the other. [11] One might ask why this is important to note as a limitation. If a physical server needs to be fixed, upgraded, or just maintained, transferring its virtual servers to other machines can decrease the amount of required downtime during the maintenance. If porting the virtual server to another physical machine weren't an option, then all of the applications on that virtual machine would be unavailable during the maintenance downtime. [11]

Virtualization Market Size and Growth

Market research reports indicate that the total desktop and server virtualization market value grew by 43% from $1.9 billion in 2008 to $2.7 billion in 2009. Researchers estimate that by 2013, approximately

Tuesday, August 20, 2019

Bush and Hitler - Parallel Lives

Bush and Hitler - Parallel Lives The 70th anniversary wasn't noticed in the United States, and was barely reported in the corporate media. But the Germans remembered well that fateful day seventy years ago - February 27, 1933. They commemorated the anniversary by joining in demonstrations for peace that mobilized citizens all across the world. It started when the government, in the midst of a worldwide economic crisis, received reports of an imminent terrorist attack. A foreign ideologue had launched feeble attacks on a few famous buildings, but the media largely ignored his relatively small efforts. The intelligence services knew, however, that the odds were he would eventually succeed. (Historians are still arguing whether or not rogue elements in the intelligence service helped the terrorist; the most recent research implies they did not.) But the warnings of investigators were ignored at the highest levels, in part because the government was distracted; the man who claimed to be the nation's leader had not been elected by a majority vote and the majority of citizens claimed he had no right to the powers he coveted. He was a simpleton, some said, a cartoon character of a man who saw things in black-and-white terms and didn't have the intellect to understand the subtleties of running a nation in a complex and internationalist world. His coarse use of language - reflecting his political roots in a southernmost state - and his simplistic and often-inflammatory nationalistic rhetoric offended the aristocrats, foreign leaders, and the well-educated elite in the government and media. And, as a young man, he'd joined a secret society with an occult-sounding name and bizarre initiation rituals that involved skulls and human bones. Nonetheless, he knew the terrorist was going to strike (although he didn't know where or when), and he had already considered his response. 
When an aide brought him word that the nation's most prestigious building was ablaze, he verified it was the terrorist who had struck and then rushed to the scene and called a press conference. "You are now witnessing the beginning of a great epoch in history," he proclaimed, standing in front of the burned-out building, surrounded by national media. "This fire," he said, his voice trembling with emotion, "is the beginning." He used the occasion - "a sign from God," he called it - to declare an all-out war on terrorism and its ideological sponsors, a people, he said, who traced their origins to the Middle East and found motivation for their evil deeds in their religion.

Monday, August 19, 2019

Sanchez Essay

The short story "Sanchez," written by Richard Dokey, is a story about Juan Sanchez and his family. "Sanchez" is told in many different settings, which are all unique and represent various feelings that Dokey portrays to his readers. The settings are described realistically; they affect Juan and Jesus in personal ways. The settings vary from a small village in Mexico to the Sierra Nevada in California. At first the story is set in Stockton in the San Joaquin Valley. Jesus, Juan's son, got his first job in a cannery called Flotill. Stockton is shown to be a working town where Juan had lived before. To Jesus, Stockton is his future and his hopes are large enough to shield him from the "skid row" section of town. Jesus was to live in a cheap hotel while he worked in the cannery. The hotel was described as stained, soiled, and smelly (151). Jesus is proud of his room and his job, but Juan only sees them as disappointing. Stockton, for Juan, brings back memories of hard work and time away from his wife, La Belleza. La Belleza was the prime focus of Juan's life and if he was away from her, he definitely wasn't happy; this is why Juan has bad feelings for Stockton. From the hotel, we, as readers, are taken through the town of Stockton. There are torn buildings and rubble all over the place. A "warm and dirty" pool hall was Jesus' "entertainment" (152). This smoky pool hall was recreation for Jesu...

Sunday, August 18, 2019

Marianne Moore's Life Essay

Marianne Moore's Life Marianne Moore was born on November 15, 1887 in Kirkwood, Missouri. Her father, who was an engineer, suffered a mental breakdown before her birth and was hospitalized before she could meet him. Moore lived with her mother, her brother, and her grandfather in Missouri until her grandfather's death in 1894. Moore's mother moved the family briefly to Pittsburgh and then to Carlisle, Pennsylvania. Moore attended Metzger Institute through high school and then enrolled at Bryn Mawr College in 1905. At Bryn Mawr, Moore published poems in two of the school's literary magazines: Tipyn O'Bob and the Lantern. She majored in history, law, and politics, and graduated in 1909. After graduating, Moore took secretarial courses at Carlisle Commercial College and then taught bookkeeping, stenography, typing, commercial English, and law. [i] In 1915 Moore began to publish poems professionally. Moore first published seven poems in the Egoist, a London magazine edited by Hilda Doolittle. Four poems were published in Poetry: A Magazine of Verse. Five of her poems were published in Others. In 1916 Moore moved with her mother to Chatham, New Jersey, to live with her brother, who was a Presbyterian minister. When he joined the Navy in 1918, Moore and her mother moved to Manhattan. It was at this time that she became friendly with other artists such as Alfred Kreymborg, photographer Alfred Stieglitz, and poets Wallace Stevens and William Carlos Williams. H.D., T.S. Eliot, and Ezra Pound also esteemed her. In 1920 Moore's work began to appear in the distinguished pro-modernist magazine, the Dial. From 1921 until 1925 Moore worked as an assistant in the Hudson Park branch of the... ...] Marianne Moore Chronology, http://mam.english.sbc.edu/TSE.html [xiv] Engel

Works Cited

Books:
Elizabeth W. Joyce, Cultural Critique and Abstraction (London: Associated University Press, 1998)
Charles Molesworth, Marianne Moore: A Literary Life (New York: Atheneum Publishing Company, 1990)

Websites:
Elaine Oswald and Robert L. Gale, On Marianne Moore's Life and Career (Modern American Poetry). http://www.english.uiuc.edu/maps/poets/m_r/moore/life.html
Bernard F. Engel, Marianne Moore (Heath Online Instructor's Guide), April 13, 2004. http://college.hmco.com/english/heath/syllabuild/iguide/moore.html
Marianne Moore (Academy of American Poets), April 13, 2004. http://www.poets.org/poets/poets.cfm?45442B7C000C0F02
Marianne Moore Chronology, http://mam.english.sbc.edu/TSE.html

Saturday, August 17, 2019

Birth Order

This paper reviews the possible connection between birth order and personality, with respect to the theory proposed by Alfred Adler. The paper takes a look at what the theory is all about and whether there is enough evidence to support Adler's claim. Alfred Adler was one of the pioneers of psychoanalysis, although he ventured away from some of Sigmund Freud's theories. It was due to their difference in ideas and contextual analysis that the separation from the psychoanalytic school happened. Adler created his own school of psychology, which he called "Individual Psychology". In this group, the concern focused on the so-called inferiority complex that humans possess (Ansbacher, 1964). With respect to Adler's theory, humans are originally weak and helpless. Humans are born without knowledge and must always be guided by those who are "superior" or by someone who is older (Ansbacher, 1964). Because of this, children strive to learn further and try in every way to exceed or at least achieve the same level as those people whom they perceive to be superior to them. This is what Adler speculated as the "inferiority complex", the driving force that fuels a human's emotions, actions, and reasons (Mosak, 1999). According to Adler, those people who strive to become the best or who try in every way to succeed are people who have a very high inferiority complex, while those who are easy-going are likely less affected by the inferiority complex (Mosak, 1999). However, a very strong inferiority complex might also have a negative effect on a particular person, most specifically when the person has failed in most of his/her endeavors. These experiences might leave the person feeling wrecked, hopeless, and unmotivated to strive toward future goals because of the overwhelming failure that the person has encountered. Adler supported Freud's hypothesis concerning the effects of parenting styles on the life of the person.
Adler hypothesized that there are two parenting styles, pampering and neglect, which affect the life of the adult person. Pampering is the process in which the parent gives too much attention to the child and protects the child, so that the child is presented with an ideal world. This does not help the child develop into a full social individual. Due to this kind of treatment, the child might develop a severe inferiority complex and might be shocked when faced with the realities of life on his/her own (Drescher & Stone, 2004). The child might not be able to perform well on his/her own because he/she was used to having his/her parents at his/her side whenever problems arose. Also, the child would be doubtful not only of his/her abilities but also of the decisions that he/she makes. With regard to the neglectful parenting style, the child is said to be exposed to all the extremes and problems that the world can offer. Since the child was left alone, he/she might be forced to struggle on his/her own, which leads to his/her mistrust of the people around him/her. Because of this, the child reared under a neglectful parenting style finds it very hard to create or build relationships with other people, especially with his/her peers (MacDonald, 1971). Therefore, Alfred Adler believed that parents must make sure the child is protected from the evils of the world, while at the same time being obligated to present to the child the harsh realities of life (Stein, 2007). Simply put, the child must be protected, but that does not mean the parent should deny the child the knowledge and opportunity of learning how to solve problems independently. Aside from parenting, another aspect that greatly affects a child's performance in his lifetime, according to Adler, is his/her birth order.
Indeed, this assumption has since garnered criticism and support from psychologists and social scientists. According to this "theory", the birth order of the child determines how he views himself as a person with respect to how the child is treated in his family. It is also the factor which is responsible for the behavior and personality of the person in his adult life (Stein, 2007).

The Firstborn: According to Adler's theory, the firstborn of the family is treated with utmost care and spoiled before the other siblings arrive. The situation for this child starts out with excited parents, and the child usually has everything he/she needs. When the younger sibling is conceived, there is a sudden shift in the parents' attention. The new baby now receives more attention than the firstborn child. Thus the firstborn, in Adler's theory, tries his/her best to regain this attention. The child desperately seeks the return of his/her parents' full love and interest (Stein, 2007). Most firstborns feel dethroned by their younger siblings because they now have to share everything with the next child; thus, a firstborn learns to share. Also, firstborns are given more responsibility than the other children, since they are the eldest; this could cause them to develop an authoritarian persona (Mosak, 1999). The parents' expectations for the firstborn are usually very high; they are pushed into the situation of being responsible and setting an example for their younger siblings. These experiences might lead the firstborn child to develop the qualities of a good leader, although firstborn children usually have lower self-confidence than other children (Stein, 2007).

The Middleborn: Middle children do not encounter the same expectations and are not spoiled like the firstborn; however, they still get a portion of the attention that the firstborn child enjoys.
What makes being the middle child interesting is that they not only experience a drive to be superior to the first child, but also must compete for attention and status among the younger siblings. These experiences could cause feelings of inferiority with reference to their older siblings but superiority with reference to their younger siblings. (Stein, 2007)

Friday, August 16, 2019

The tale of Beowulf

The tale of Beowulf begins and ends with the funerals of great kings. The funerals represented in this tale are decorated with rites that derive from the cultural traditions of the kings being laid to rest. Scyld Scefing is entombed within a barge decorated with signs of his accomplishments, while Beowulf is enshrined within a barrow filled with relics of his rule. Yet within these traditional burials one can find traces of the men themselves, as each makes requests that lead to the distinctiveness of their burials. This allows the funerals to become particularly distinctive as the author makes use of the elements, such as earth, fire, and water (Smith). Each funeral defines the symbolic ideas of motion versus grounded-ness represented in the lives of these two men, and the method in which each is carried out emphasizes the opposite ways in which they entered their lands and mounted their thrones. The funerals of both men are representative of their personalities as shown through their deeds and the ways in which they lived their lives. Though both men were valiant warriors and kings, their lives as youths and kings appear to be very different. The movement characteristic of Scyld Scefing's funeral represents a continuation of the boldness and vigor with which he sailed through life. His life was continually one of forward motion from low to high estate, and he does not cease this motion in his death. The poem continues, "Forth he fared at the fated moment, sturdy Scyld to the shelter of God" (lines 26-27). His clansmen and subjects seem determined that their king should keep moving though he has been cut off from life, as they immediately "bore him over to ocean's billow" (line 28). He is placed on a barge that is taken by the floods to an even higher and more celebrated place, and the words used by the author to describe this continue this motif of motion to an even higher estate.
Such words and phrases as "outbound" highlight this motion, and as "No man is able to say in sooth [...] who harbored that freight," his burial demonstrates that his resting place could mean yet another promotion for this king who had risen from foundling to royalty.

The funeral given Beowulf differs greatly from that granted Scyld Scefing. Beowulf's rites are those of a more grounded king who had been home grown and bred specifically to become royalty. His funeral demonstrates no great motion, as his lineage is anchored and steeped in royalty. The rites take place within the land of his birth, and his tomb is laid upon a foundation of the soil upon which his ancestors walked. The writer establishes this in his recounting of the events: "They fashioned for him the folk of Geats firm on the earth a funeral-pile" (line 2821). The firmness with which this tomb is established upon the earth symbolizes the strength of Beowulf's roots within his homeland. Around this is erected a wall, which further strengthens Beowulf's position as a foundational leader of his land. The monuments given to house this leader are built into the ground of the kingdom and given foundations akin to the roots that one finds in Beowulf's lineage. His burial is akin to burying treasure (gold and precious stones), "trusting the ground with treasures of earls, gold in the earth" (line 2850), and this is in essence an act of giving back to the earth the treasure it has afforded.

The funerals of Scefing and Beowulf also differ in the elements that attend each. According to critic George Clark in his essay "Beowulf's Armor," "Each funeral places the final offering of arms and armor and treasure in the context of one of the elements, water, fire, or earth" (429). While water is the dominant element in Scefing's funeral, fire is used to herald the burial of Beowulf.
The significance of the water for Scefing derives mainly from his history, as he was borne to the Danes on a small vessel as an abandoned infant. The water represents the deep, the void from which the king came and to which he is allowed to return. The story comes full circle for this king, as he is again borne away at the end of his life, given back to the water that offered him to the Danes. This is done deliberately by his clansmen and is highlighted by the narrator, who writes, "No less these loaded the lordly gifts, thanes' huge treasure, than those had done who in former time forth had sent him sole on the seas, a suckling child" (lines 43-46). He is again sent by himself "on the seas" into the unknown belly of the flood which had offered him up as a child.

The fire for Beowulf is the opposite of this water, and this too might be seen as a reference to the difference in his birth and youth. The narrative continues, "Wood-smoke rose black over blaze, and blent was the roar of flame with weeping (the wind was still), till the fire had broken the frame of bones" (lines 2827-30). While the water takes Scefing away from the land, Beowulf's fire offers up incense that rises and, as the ashes fall, remains forever mingled with the soil in the land of his birth. The narrator mentions that the wind was still, emphasizing the idea that no part of Beowulf's burnt body or ashes is allowed to fly beyond the land of his birth and rule. He utterly belongs to this land, and the roaring of the fire becomes a dirge that rises and mingles with the sound of his subjects' weeping. Yet the reader senses that Beowulf is not lost to his people.
This fire is allowed to burn beyond Beowulf's bones, consuming his flesh, and as "the smoke was by the sky devoured" (line 2838), the fire sends up Beowulf's essence as a protection and covering for his land and people.

Though the lives of Scefing and Beowulf were similar in many ways, they also differed in some very significant respects that have to do with how each came to be king. While Scefing begins life as a foundling and sustains an upward motion that raises him to the estate of ruler, Beowulf is born a prince whose roots are grounded in his homeland. The elements used to represent these two men are also representative of their origins. Water is used to symbolize the rootless Scefing, while fire and earth symbolize Beowulf's grounded ancestry. Both men are treasured by their people, yet each is allowed to fulfill his destiny by drifting or remaining rooted, as has been his custom.

Works Cited

Beowulf. Trans. Francis B. Gummere. The Harvard Classics, vol. 49. P. F. Collier & Son, 1910.

Clark, George. "Beowulf's Armor." ELH, vol. 32, no. 4, Dec. 1965, pp. 409-441.

Smith, Jennifer. "Paradise Lost and Beowulf: The Christian/Pagan Hybrids of the Epic Tradition." Department of English, California State University, Long Beach. http://www.csulb.edu/~jsmith10/miltbeow.htm