Do Social Network Providers Require (Further?) Regulation?
1. Introduction
On 15 March 2019, Brenton Tarrant shot 50 people dead during a terrorist attack at two mosques in Christchurch, New Zealand.[1] By now, the story of the attack is well known, but one aspect of it has received less attention: its social network dimension. Among other things, Tarrant published a manifesto, announced the attack and subsequently livestreamed it on social network channels.[2] It took Facebook 17 minutes to take the notorious video down, and although the platform removed some 1.5 million copies of the video in the first 24 hours, blocking 1.2 million of them at the point of upload, around 300,000 copies posted by people who had captured the livestream appeared on the platform before Facebook managed to remove them.[3] Even then, captured versions of the livestream were uploaded to YouTube.[4] Tarrant’s tactic was by no means exceptional:[5] the Islamic State of Iraq and Syria (ISIS) used the same tactic when it livestreamed the beheading of hostages on social networks.[6] Every time something similar to the New Zealand terrorist attack happens, there are calls for further regulation of social network providers.[7] This essay will assess whether such calls are justified and whether social network providers do indeed require further regulation.
2. The Case against the Further Regulation of Social Network Providers
Notwithstanding cases like that of Brenton Tarrant, there is no universal agreement that social network providers require further regulation.
2.1. Impact on Freedom of Speech
Social networks are often perceived as key facilitators of freedom of speech.[8] They enhance users’ ability to contribute and share content, and to read, listen to and watch content created by other users.[9] Social networks not only increase the amount of content created but also the number of contributors, which is seen as a positive development in that it enables more individuals to participate in democratic society.[10] Within this framework, any further regulation of social network providers is perceived as a limitation on freedom of speech. This is an issue particularly in states with strongly protective speech laws, such as the US. The case of Packingham v North Carolina[11] is an example. In 2002, Packingham was convicted of a sex offence involving a 13-year-old girl. In 2010, he posted on social media thanking God that a traffic ticket he had received had been dismissed.[12] The police saw the post and Packingham was subsequently convicted of a felony under §14-202.5 of the North Carolina General Statutes,[13] which prohibited registered child sex offenders from using social networks that allow minors to create accounts. Packingham challenged the constitutionality of this provision, and in 2017 the US Supreme Court unanimously held that it violated the First Amendment.[14][15] Whether the US Supreme Court was right is a value judgement, but the case highlights that some countries are reluctant to impose strict regulation on social network providers on grounds of freedom of speech.
One may agree with the US Supreme Court, as there have been cases of obvious damage to freedom of speech resulting from attempts to regulate the conduct of users on social networks. In PF v Mark Meechan,[16] Mark Meechan was found guilty of an offence under s.127(1)(a) Communications Act 2003[17] for posting a viral video on YouTube in which he showed his girlfriend’s pug reacting to the phrases “Sieg Heil” and “Gas the Jews”.[18] Despite the fact that many people found the video amusing, Sheriff Derek O’Carroll held that it was “grossly offensive and contained menacing, anti-Semitic and racist material.”[19] Following Meechan’s arrest and before he was sentenced, YouTube was ordered to restrict users’ access to the video, which it did.[20] If social network providers are obliged to remove content of the sort published by Mark Meechan, this will clearly damage freedom of speech: when sent a content removal request, social network providers often err on the side of caution and simply remove the allegedly problematic content, which may in fact be legal.[21] Hence, one could argue that social network providers should not be held liable in respect of certain types of speech, such as speech that some individuals find offensive but which is not linked to and does not incite crime or violence.
2.2. Impact on Social Network Providers’ Business Model
When the regulation of social network providers is discussed, providers and activists often argue that regulation is damaging to their business model. For example, following the New Zealand terrorist attack, it was suggested that social network providers should be compelled to delay the broadcasting of live videos to prevent terrorists such as Brenton Tarrant from spreading their propaganda.[22] Social network providers were quick to respond that this could harm their business model, which relies on advertising revenue and an expanding user base.[23] In this particular context the validity of the argument is questionable, since many Facebook advertisers decided to boycott the platform precisely because of its inability to prevent the spread of Brenton Tarrant’s video.[24] In other contexts, however, further regulation could be harmful to the business model of social network providers. For instance, the proposed Directive on Copyright in the Digital Single Market[25] allegedly requires social network providers to obtain authorisation from copyright owners when their users post memes.[26] Since this is practically impossible, critics argue that the Directive will in effect prohibit memes on social networks.[27] Memes are widely enjoyed by social network users, so banning them would reduce the number of users and, with it, the advertising revenue of social network providers. Therefore, it could be argued that in certain contexts the further regulation of social network providers is indeed damaging to their business model.
3. The Case for Further Regulation of Social Network Providers
The above discussion shows that there is a compelling case against imposing further regulatory burdens on social network providers. That case, however, is very much context-dependent. For example, regulation of online speech that is merely offensive to some individuals is undesirable, but regulation of online speech that incites violence is necessary.
3.1. The Position in the UK
According to the most recent report on social networks by the UK’s communications regulator, Ofcom, the main rationale for further regulation of social network providers is to protect people from harm.[28] Ofcom also cites other grounds for further regulation. Firstly, there is public concern about the lack of regulation of social network providers. A joint survey by Ofcom and the Information Commissioner’s Office (ICO) showed that 69 per cent of the UK’s adult Internet users are concerned about harmful content or conduct on social networks, 58 per cent worry about their data and privacy, and 54 per cent worry about their cybersecurity when using social networks.[29] Secondly, broadcasters are subject to extensive regulation, whereas social network providers are not; yet the boundaries between broadcasters and social network providers are becoming increasingly blurred, which makes the differing regulatory standards applied to the two kinds of stakeholder hard to justify.[30]
3.2. Discussion
The emphasis on harm prevention in social media regulation is not confined to Ofcom; it also enjoys significant support within the UK academic community.[31] However, there are a number of differences between Ofcom’s position and that of the academic community.
Firstly, according to the academic opinion that supports the harm-based approach, the fact that many people in the UK are concerned about harmful content, data and privacy on social networks does not automatically translate into a political decision to further regulate social network providers.[32] There are several reasons for this. One is that political parties are reluctant to regulate social network providers because they often rely on them for their political campaigns.[33] Another is that social network providers have learnt from previous episodes of regulation, such as the regulation of broadcasters, and have built strong lobbies to oppose further regulation.[34] A third is that the harm caused by social network providers is opposed not by companies but by individuals and civil society organisations, which lack the resources to exert sufficient pressure for further regulation.[35]
Secondly, unlike Ofcom, some academics have put forward concrete proposals on how to prevent harm on social networks without compromising freedom of speech or the legitimate interests of social network providers, such as their interest in preserving their business model. There are many ways to prevent harm, but not all of them are consistent with freedom of speech and those interests. One of the easiest is simply to outlaw a particular kind of speech, such as offensive speech or memes, but in that case social network providers would be obliged to monitor the content that their users upload.[36] This would create the risk of repeating the mistake made in the PF v Mark Meechan case.
According to Woods and Perrin, this is the wrong regulatory approach.[37] Instead, Parliament should impose a statutory duty of care on social network providers.[38] Social network providers would thus be obliged to prevent harm, but the choice of the best means of doing so would be left to them.[39] Statutory duties of care have long been used to create responsibility where harm can occur to users or the public.[40] For instance, employers have a duty of care towards their employees, and the owners or occupiers of public premises have a duty of care towards those who use them.[41] Failure by social network providers to observe their duty of care would result in large fines along the lines of those envisaged in the General Data Protection Regulation (GDPR).[42]
It is submitted that the harm reduction approach, which envisages the imposition of a duty of care on social network providers, is a reasonable one. It preserves the freedom of speech of social network users, since it imposes no content filtering obligations on providers, and it preserves the legitimate interests of the providers themselves by leaving them free to choose the best means of achieving the objective of preventing harm. Furthermore, statutory duties are broadly formulated and their content changes over time.[43] Hence, a statutory duty on social network providers to prevent harm could accommodate future threats created by the use of social networks without the need for significant legislative change. Finally, such a statutory duty would enable social network providers to differentiate between cases akin to PF v Mark Meechan, where there was no actual harm to the public, and cases akin to that of Brenton Tarrant, where the content published was clearly harmful.
As with most regulatory efforts, social network providers are likely to resist such a statutory duty at first, but this is not a unique challenge. Normally, after its initial protests against regulation, the Internet industry agrees to cooperate in the enforcement of the new legal requirements and a new status quo is established.[44]
4. Conclusion
The case against further regulation of social network providers is not convincing, because it does not hold in all contexts. While merely offensive speech should not be regulated, the same cannot be said of terrorist speech. Likewise, while it can be argued that prohibiting memes on social networks harms the legitimate interests of social network providers, the same argument cannot be applied to the livestreaming of videos that incite violence or the commission of crime. Hence, it has to be accepted that a certain level of further regulation is necessary. The question, however, is what further regulation of social network providers is desirable and how it can be achieved without interfering with the freedom of speech of social network users or the legitimate interests of the providers. Imposing bans on certain types of content is not the appropriate approach: it is too rigid and creates a likelihood of mistakes. Instead, Parliament should introduce a statutory duty of care for social network providers. In this manner, social network providers will have the opportunity to choose the most appropriate means of complying with their duty, they will remove content only when it is harmful, and they will be held to account for any harm suffered as a result of their actions or inactions.
BIBLIOGRAPHY
Primary Sources
EU Legislation
Proposed Directive on Copyright in the Digital Single Market, 2016/0280(COD) (adoption pending) <https://juliareda.eu/wp-content/uploads/2019/02/Copyright_Final_compromise.pdf> Accessed 21 March 2019
Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation – GDPR) [2016] OJ L 119 <https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679&from=EN> Accessed 21 March 2019
UK Legislation
Communications Act 2003, Chapter 21 <https://www.legislation.gov.uk/ukpga/2003/21/contents> Accessed 21 March 2019
US Legislation
Constitution of the United States 1787 <http://constitutionus.com/> Accessed 21 March 2019
North Carolina General Statutes (Last Updated August 2018) <https://www.ncleg.gov/enactedlegislation/statutes/pdf/bysection/chapter_14/gs_14-202.5.pdf> Accessed 21 March 2019
Case Law
Packingham v North Carolina 582 U.S. ___ (2017) <https://www.supremecourt.gov/opinions/16pdf/15-1194_08l1.pdf> Accessed 21 March 2019
PF v Mark Meechan [2018] Scottish Airdrie Sheriff Court (Judgment of 23 April 2018) (Unreported) <http://www.scotland-judiciary.org.uk/8/1962/PF-v-Mark-Meechan> Accessed 21 March 2019
Secondary Sources
Book Chapters
Woods L, ‘Social Media: It is not just about Article 10’. In D Mangan, L E Gillies (eds), The Legal Challenges of Social Media (Cheltenham/Northampton, Edward Elgar, 2017) 104
Articles
Eordogh F, ‘A Nuanced Take On Count Dankula’s Nazi Pug’, Forbes Magazine (online), 30 April 2018 <https://www.forbes.com/sites/fruzsinaeordogh/2018/04/30/a-nuanced-take-on-count-dankulas-nazi-pug/> Accessed 21 March 2019
Howe A, ‘Opinion analysis: Court invalidates ban on social media for sex offenders’, SCOTUS – Supreme Court of the United States Blog (online), 19 June 2017 <https://www.scotusblog.com/2017/06/opinion-analysis-court-invalidates-ban-social-media-sex-offenders/> Accessed 21 March 2019
Human Rights Watch, ‘Germany: Flawed Social Media Law: NetzDG is Wrong Response to Online Abuse’, Human Rights Watch (online), 14 February 2018 <https://www.hrw.org/news/2018/02/14/germany-flawed-social-media-law> Accessed 21 March 2019
Nguyen K, ‘Accused Christchurch mosque shooter Brenton Tarrant used same radicalisation tactics as Islamic State, expert says’, ABC News (online), 17 March 2019 <https://www.abc.net.au/news/2019-03-17/christchurch-shootings-brenton-tarrant-social-media-strategies/10908692> Accessed 20 March 2019
Reynolds M, ‘What is Article 13? The EU’s divisive new copyright plan explained’, Wired UK (online), 12 March 2019 <https://www.wired.co.uk/article/what-is-article-13-article-11-european-directive-on-copyright-explained-meme-ban> Accessed 21 March 2019
Walsh M, ‘Introducing a duty of care for social media’, DigiLeaders (online), 14 November 2018 <https://digileaders.com/introducing-a-duty-of-care-for-social-media/> Accessed 21 March 2019
Woods L, ‘Duty of Care’ (2019) 46(4) InterMEDIA 17, p. 19 <https://www.iicom.org/images/iic/intermedia/jan-2019/im2019-v46-i4-dutyofcare.pdf> Accessed 21 March 2019
Woods L, Perrin W, ‘Can Society Rein in Social Media?’, Carnegie Trust (online), 27 February 2018 <https://www.carnegieuktrust.org.uk/blog/social-media-harm-1/> Accessed 21 March 2019
Woods L, Perrin W, ‘Harm Reduction In Social Media – A Proposal’, Carnegie Trust (online), 22 March 2018 <https://www.carnegieuktrust.org.uk/blog/social-media-regulation/> Accessed 21 March 2019
Woods L, Perrin W, ‘Internet Harm Reduction: a Proposal’, Carnegie Trust (online), 30 January 2019 <https://www.carnegieuktrust.org.uk/blog/internet-harm-reduction-a-proposal/> Accessed 21 March 2019
Reports
Ofcom, ‘Addressing harmful online content: A perspective from broadcasting and on-demand standards regulation’ (Ofcom, 18 September 2018) <https://www.ofcom.org.uk/__data/assets/pdf_file/0022/120991/Addressing-harmful-online-content.pdf> Accessed 21 March 2019
Ofcom / Information Commissioner’s Office / Kantar Media, ‘Internet users’ experience of harm online: summary of survey research’ (June-July 2018) <https://www.ofcom.org.uk/__data/assets/pdf_file/0018/120852/Internet-harm-research-2018-report.pdf> Accessed 21 March 2019
News Sources
BBC News, ‘Christchurch shootings: 49 dead in New Zealand mosque attacks’, BBC News (online), 15 March 2019 <https://www.bbc.co.uk/news/world-asia-47578798> Accessed 20 March 2019
BBC News, ‘Christchurch shootings: Social media ‘too slow’ at removing footage’, BBC News (online), 17 March 2019 <https://www.bbc.co.uk/news/uk-wales-47601241> Accessed 21 March 2019
Hymas C, Zolfagharifard E, ‘New Zealand massacre: Tech giants told ‘enough is enough’ after shooting live-streamed in social media terror attack’, The Telegraph (online), 15 March 2019 <https://www.telegraph.co.uk/news/2019/03/15/new-zealand-massacre-tech-giants-told-enough-enough-shooting/> Accessed 20 March 2019
Japan Times, ‘Livestreaming delays proposed as way to discourage more viral massacre videos’, The Japan Times (online), 18 March 2019 <https://www.japantimes.co.jp/news/2019/03/18/asia-pacific/crime-legal-asia-pacific/livestreaming-delays-proposed-way-discourage-viral-massacre-videos/#.XJK2E_nFLIV> Accessed 20 March 2019
Kirkpatrick D D, ‘Massacre Suspect Traveled the World but Lived on the Internet’, The New York Times (online), 15 March 2019 <https://www.nytimes.com/2019/03/15/world/asia/new-zealand-shooting-brenton-tarrant.html> Accessed 20 March 2019
Silverstein J, ‘Facebook faces advertising boycott over livestream of New Zealand mosque shooting’, CBS News (online), 18 March 2019 <https://www.cbsnews.com/news/facebook-faces-advertising-boycott-over-live-stream-of-new-zealand-mosque-shooting/> Accessed 21 March 2019
Stanley-Becker I, ‘Who watches ISIS beheading videos in the U.S.? Men, Christians and the fearful, say psychologists.’, The Washington Post (online), 19 March 2019 <https://www.washingtonpost.com/nation/2019/03/19/who-watches-isis-beheading-videos-us-men-christians-fearful-say-psychologists/?utm_term=.97842e63d048> Accessed 20 March 2019
[1] BBC News, ‘Christchurch shootings: 49 dead in New Zealand mosque attacks’, BBC News (online), 15 March 2019 <https://www.bbc.co.uk/news/world-asia-47578798> Accessed 20 March 2019
[2] D D Kirkpatrick, ‘Massacre Suspect Traveled the World but Lived on the Internet’, The New York Times (online), 15 March 2019 <https://www.nytimes.com/2019/03/15/world/asia/new-zealand-shooting-brenton-tarrant.html> Accessed 20 March 2019
[3] BBC News, ‘Christchurch shootings: Social media ‘too slow’ at removing footage’, BBC News (online), 17 March 2019 <https://www.bbc.co.uk/news/uk-wales-47601241> Accessed 21 March 2019; Japan Times, ‘Livestreaming delays proposed as way to discourage more viral massacre videos’, The Japan Times (online), 18 March 2019 <https://www.japantimes.co.jp/news/2019/03/18/asia-pacific/crime-legal-asia-pacific/livestreaming-delays-proposed-way-discourage-viral-massacre-videos/#.XJK2E_nFLIV> Accessed 20 March 2019
[4] ibid.
[5] K Nguyen, ‘Accused Christchurch mosque shooter Brenton Tarrant used same radicalisation tactics as Islamic State, expert says’, ABC News (online), 17 March 2019 <https://www.abc.net.au/news/2019-03-17/christchurch-shootings-brenton-tarrant-social-media-strategies/10908692> Accessed 20 March 2019
[6] I Stanley-Becker, ‘Who watches ISIS beheading videos in the U.S.? Men, Christians and the fearful, say psychologists.’, The Washington Post (online), 19 March 2019 <https://www.washingtonpost.com/nation/2019/03/19/who-watches-isis-beheading-videos-us-men-christians-fearful-say-psychologists/?utm_term=.97842e63d048> Accessed 20 March 2019
[7] C Hymas, E Zolfagharifard, ‘New Zealand massacre: Tech giants told ‘enough is enough’ after shooting live-streamed in social media terror attack’, The Telegraph (online), 15 March 2019 <https://www.telegraph.co.uk/news/2019/03/15/new-zealand-massacre-tech-giants-told-enough-enough-shooting/> Accessed 20 March 2019
[8] L Woods, ‘Social Media: It is not just about Article 10’. In D Mangan, L E Gillies (eds), The Legal Challenges of Social Media (Cheltenham/Northampton, Edward Elgar, 2017) 104, p. 104
[9] ibid.
[10] ibid.
[11] Packingham v North Carolina 582 U.S. ___ (2017) <https://www.supremecourt.gov/opinions/16pdf/15-1194_08l1.pdf> Accessed 21 March 2019
[12] A Howe, ‘Opinion analysis: Court invalidates ban on social media for sex offenders’, SCOTUS – Supreme Court of the United States Blog (online), 19 June 2017 <https://www.scotusblog.com/2017/06/opinion-analysis-court-invalidates-ban-social-media-sex-offenders/> Accessed 21 March 2019
[13] §202.5(a) and (e), Chapter 14, North Carolina General Statutes (Last Updated August 2018) <https://www.ncleg.gov/enactedlegislation/statutes/pdf/bysection/chapter_14/gs_14-202.5.pdf> Accessed 21 March 2019
[14] First Amendment, Constitution of the United States 1787 <http://constitutionus.com/> Accessed 21 March 2019
[15] A Howe, ‘Opinion analysis: Court invalidates ban on social media for sex offenders’
[16] PF v Mark Meechan [2018] Scottish Airdrie Sheriff Court (Judgment of 23 April 2018) (Unreported) <http://www.scotland-judiciary.org.uk/8/1962/PF-v-Mark-Meechan> Accessed 21 March 2019
[17] Communications Act 2003, Chapter 21 <https://www.legislation.gov.uk/ukpga/2003/21/contents> Accessed 21 March 2019
[18] PF v Mark Meechan [2018]
[19] ibid.
[20] F Eordogh, ‘A Nuanced Take On Count Dankula’s Nazi Pug’, Forbes Magazine (online), 30 April 2018 <https://www.forbes.com/sites/fruzsinaeordogh/2018/04/30/a-nuanced-take-on-count-dankulas-nazi-pug/> Accessed 21 March 2019
[21] Human Rights Watch, ‘Germany: Flawed Social Media Law: NetzDG is Wrong Response to Online Abuse’, Human Rights Watch (online), 14 February 2018 <https://www.hrw.org/news/2018/02/14/germany-flawed-social-media-law> Accessed 21 March 2019
[22] Japan Times, ‘Livestreaming delays proposed as way to discourage more viral massacre videos’
[23] ibid.
[24] J Silverstein, ‘Facebook faces advertising boycott over livestream of New Zealand mosque shooting’, CBS News (online), 18 March 2019 <https://www.cbsnews.com/news/facebook-faces-advertising-boycott-over-live-stream-of-new-zealand-mosque-shooting/> Accessed 21 March 2019
[25] Proposed Directive on Copyright in the Digital Single Market, 2016/0280(COD) (adoption pending) <https://juliareda.eu/wp-content/uploads/2019/02/Copyright_Final_compromise.pdf> Accessed 21 March 2019
[26] M Reynolds, ‘What is Article 13? The EU’s divisive new copyright plan explained’, Wired UK (online), 12 March 2019 <https://www.wired.co.uk/article/what-is-article-13-article-11-european-directive-on-copyright-explained-meme-ban> Accessed 21 March 2019
[27] ibid.
[28] Ofcom, ‘Addressing harmful online content: A perspective from broadcasting and on-demand standards regulation’ (Ofcom, 18 September 2018), p. 14 <https://www.ofcom.org.uk/__data/assets/pdf_file/0022/120991/Addressing-harmful-online-content.pdf> Accessed 21 March 2019
[29] Ofcom / Information Commissioner’s Office / Kantar Media, ‘Internet users’ experience of harm online: summary of survey research’ (June-July 2018), p. 4 <https://www.ofcom.org.uk/__data/assets/pdf_file/0018/120852/Internet-harm-research-2018-report.pdf> Accessed 21 March 2019
[30] Ofcom, ‘Addressing harmful online content: A perspective from broadcasting and on-demand standards regulation’, pp. 16-18
[31] L Woods, W Perrin, ‘Can Society Rein in Social Media?’, Carnegie Trust (online), 27 February 2018 <https://www.carnegieuktrust.org.uk/blog/social-media-harm-1/> Accessed 21 March 2019
M Walsh, ‘Introducing a duty of care for social media’, DigiLeaders (online), 14 November 2018 <https://digileaders.com/introducing-a-duty-of-care-for-social-media/> Accessed 21 March 2019
L Woods, ‘Duty of Care’ (2019) 46(4) InterMEDIA 17, p. 19 <https://www.iicom.org/images/iic/intermedia/jan-2019/im2019-v46-i4-dutyofcare.pdf> Accessed 21 March 2019
[32] L Woods, W Perrin, ‘Can Society Rein in Social Media?’
[33] ibid.
[34] ibid.
[35] ibid.
[36] L Woods, W Perrin, ‘Harm Reduction In Social Media – A Proposal’, Carnegie Trust (online), 22 March 2018 <https://www.carnegieuktrust.org.uk/blog/social-media-regulation/> Accessed 21 March 2019
[37] ibid.
[38] L Woods, W Perrin, ‘Internet Harm Reduction: a Proposal’, Carnegie Trust (online), 30 January 2019 <https://www.carnegieuktrust.org.uk/blog/internet-harm-reduction-a-proposal/> Accessed 21 March 2019
[39] ibid.
[40] ibid.
[41] ibid.
[42] Art.83 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation – GDPR) [2016] OJ L 119 <https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679&from=EN> Accessed 21 March 2019
[43] L Woods, W Perrin, ‘Internet Harm Reduction: a Proposal’
[44] L Woods, W Perrin, ‘Can Society Rein in Social Media?’