Facebook is an online social networking website. It enables registered users to send messages, upload photos and videos, and create profiles. It is a convenient way for people to connect and share with their families and friends online. Facebook was established in February 2004 by Mark Zuckerberg together with several of his schoolmates and colleagues at Harvard University (Phillips, 2007). Facebook, as a social media platform, has an ethical but not a legal obligation to rescue a victim of crime. There are three approaches that social media platforms can employ to be more thorough and proactive in evaluating the kind of content posted on their platforms. Additionally, Facebook should employ certain safeguards to deter acts of inhumanity from being conveyed through its platform. Furthermore, Facebook has neither an oversight committee nor an ethics officer. Finally, there are proposals for adjustments that Facebook should apply to foster ethical use of the platform.
The question of whether Facebook has any ethical or legal duty to rescue a victim of crime is debatable. To begin with, fighting crime is legally the duty of law enforcement officers; therefore, Facebook does not have any legal duty to rescue a victim of crime. The only obligation Facebook has is to monitor and notify the relevant law enforcement officials about any crime conveyed through its platform. Facebook does not encourage any form of crime, including suicide or personal injury. The company has shown goodwill in situations where such incidents have occurred by collaborating with law enforcement. There is no evidence that Facebook has caused crime since it was invented; it has only been used as a platform by its users to organize and witness crime through communication and live video streams. According to Mangold and Faulds (2009), Facebook cannot be accused of the crimes because offenders have merely used it as a platform to broadcast their criminal acts, as in the case of the Cleveland shooting (NBC News, 2017). Few, if any, states have reviewed their laws to address the issue of witnessing crimes online. Most state laws require that a witness have been present at the crime scene for their testimony to be admissible in court. Therefore, such laws would not apply in a case like this, where Facebook's monitors are miles away from the crime scene.
Another reason Facebook is not liable for crimes that are streamed live is that, with the advanced technology available across the globe, it is difficult to determine whether an actual crime is taking place; a great deal of manipulated, odd and surreal video is circulated online every minute. According to Steinberg et al. (2015), Facebook has always encouraged its users to report any content on the platform that may be viewed as life-threatening. This is an ethical responsibility that reflects the company's moral standards and helps its users uphold them. Therefore, Facebook has an ethical responsibility to keep an eye on and manage life-threatening occurrences.
How to moderate user-generated content has been a headache for most platforms. However, they can be more thorough and proactive in reviewing the type of content that is posted. To start with, all social media platforms should incorporate image fingerprinting, which derives a compact algorithmic signature from each image, to filter unethical video and image content (Seo et al., 2004); a minimal sketch of such a filter is given below. Second, the reporting tools that individuals use to raise a red flag on content posted on social media should be reviewed and enhanced (Crawford & Gillespie, 2016). Automated systems should also be used in the review process and, where necessary, for the proactive removal of illegal content (Parasuraman & Miller, 2004). No number of human beings can manually moderate and review every video or image being uploaded; relying on human moderators alone would mean that some live video streams would begin and end before anyone was even aware that such a video had been aired.
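The following is a minimal, illustrative sketch of how a fingerprint-based filter might work. It uses a simple 64-bit average hash as the image signature and a hypothetical blocklist of signatures of previously removed content; a production system would rely on a far more robust fingerprint, such as the Radon-transform method described by Seo et al. (2004).

```python
# Illustrative sketch of fingerprint-based image filtering (not a production design).
# A 64-bit average hash stands in for the "algorithmic signature" discussed above.
from PIL import Image  # Pillow


def average_hash(path, size=8):
    """Derive a 64-bit perceptual signature from an image file."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits


def hamming(a, b):
    """Number of differing bits between two signatures."""
    return bin(a ^ b).count("1")


# Hypothetical blocklist of signatures of content that was previously removed.
BLOCKED_SIGNATURES = {0x8F3C00FF12AB34CD}


def should_block(path, threshold=10):
    """Flag an upload if its signature is close to any known-bad signature."""
    sig = average_hash(path)
    return any(hamming(sig, bad) <= threshold for bad in BLOCKED_SIGNATURES)
```

Matching on a small Hamming distance, rather than exact equality, is what allows slightly altered re-uploads of removed content to be caught automatically.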
Facebook and other social media platforms should incorporate two safeguards to prevent acts of violence from being aired. First, they should develop artificial intelligence they can rely on to help control the content that appears on their sites. Such systems would make it easier to establish whether content is offensive so that the necessary action can be taken immediately (Bench-Capon & Dunne, 2007). The second safeguard is a timeline algorithm that examines each post shown to a user who logs in and forecasts whether that user will comment on, share, like, mark as spam or hide the content (Bimal & Raju, 2014); a simple sketch of this idea follows.
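As an illustration of the second safeguard, the sketch below trains a toy logistic-regression model to forecast whether a user will hide or report a post. The features, training data and threshold are invented for the example and assume scikit-learn is available; a real timeline algorithm would draw on far richer behavioural signals.

```python
# Toy sketch of forecasting user reactions to a post (illustrative assumptions only).
from sklearn.linear_model import LogisticRegression

# Hypothetical training rows: [past_hides, past_reports, poster_is_friend]
X = [
    [0, 0, 1], [1, 0, 1], [5, 2, 0], [7, 3, 0],
    [0, 1, 1], [6, 4, 0], [2, 0, 1], [8, 5, 0],
]
y = [0, 0, 1, 1, 0, 1, 0, 1]  # 1 = the user hid or reported the post

model = LogisticRegression().fit(X, y)


def likely_to_hide(features, threshold=0.5):
    """Forecast whether a user will hide or report a post with these features."""
    return model.predict_proba([features])[0][1] >= threshold


print(likely_to_hide([4, 1, 0]))  # a frequent hider viewing a non-friend's post
```

Posts that the model predicts are likely to be hidden or reported could be routed to human reviewers before they are widely distributed.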
According to Zimmer (2010), Facebook has neither an ethics officer nor an oversight committee, and there is no need for Facebook to create such roles. This is because investing in a high-level ethics position would signal to policymakers and users that technology organizations are replacing them as the main actors shaping political, economic and social activities. Moreover, individuals working within Facebook already challenge management on critical ethical issues. It is never easy to balance ethics against commercial incentives (Zimmer, 2010; Studer, 2017); ethical concerns that conflict with an organization's commercial interests will usually be set aside. Therefore, I believe that no ethics officer or oversight committee can bring about the ethical sanity that many believe needs to be incorporated into Facebook.
Facebook should adopt the following two changes to foster ethical use of its platform. To begin with, it should initiate a bounty program that rewards individuals for promptly reporting content regarded as unethical. Facebook users would warmly welcome an arrangement in which they are rewarded for keeping watch on what their friends post. Despite being an extrinsically motivated approach, it would help reduce the proliferation of vulgar content on the platform. The second change is to advocate for new legislation that criminalizes the amplification of crime through social media. This would not appeal to most Facebook users, but it would go a long way toward curbing harmful content.
In summary, every social media platform has an ethical duty to ensure that its users do not fall victim to crime as a result of its mismanagement. It is therefore necessary to develop sound strategies for curbing the streaming of illegal content on these platforms.
Bench-Capon, T. J., & Dunne, P. E. (2007). Argumentation in artificial intelligence. Artificial intelligence, 171(10-15), 619-641.
Bimal, V. O., & Raju, G. (2014). Performance Analysis of TimeLine Algorithm in Grid Environment using Alea. International Journal of Computer Science Systems Engineering and Information Technology (IJCSSEIT), 7(2), 199-210.
Crawford, K., & Gillespie, T. (2016). What is a flag for? Social media reporting tools and the vocabulary of complaint. New Media & Society, 18(3), 410-428.
Mangold, W. G., & Faulds, D. J. (2009). Social media: The new hybrid element of the promotion mix. Business Horizons, 52(4), 357-365.
NBC News. (2017, April 17). Facebook has a unique challenge in policing depraved videos. Retrieved from http://www.nbcnews.com/tech/social-media/cleveland-shooting-highlights-facebook-s-responsibility-policing-depraved-videos-n747306
Parasuraman, R., & Miller, C. A. (2004). Trust and etiquette in high-criticality automated systems. Communications of the ACM, 47(4), 51-55.
Phillips, S. (2007, July 25). A brief history of Facebook. The Guardian.
Seo, J. S., Haitsma, J., Kalker, T., & Yoo, C. D. (2004). A robust image fingerprinting system using the Radon transform. Signal Processing: Image Communication, 19(4), 325-339.
Steinberg, A., Tonkelowitz, M., Deng, P., Mosseri, A., Hupp, A., Sittig, A., & Zuckerberg, M. (2015). U.S. Patent No. 9,110,953. Washington, DC: U.S. Patent and Trademark Office.
Studer, G. (2017). Live Streaming Violence over Social Media: An Ethical Dilemma. Charleston L. Rev., 11, 621.
Zimmer, M. (2010). “But the data is already public”: On the ethics of research on Facebook. Ethics and Information Technology, 12(4), 313-325.