Privacy and Security Issues Experienced by Consumers and Businesses Online

Executive Summary

The online environment compounds the privacy problems that people already experience in the physical world. Every move made on PCs, smartphones, and tablets becomes a data point that trackers can easily gather and share, and people effectively agree to such gathering and sharing whenever they sign up for an online service and consent to its privacy policy. There is a considerable gap between what people believe their privacy rights are online and what those rights actually are.
Consumers are concerned, first, about unauthorized access to personal data as a result of security breaches or a lack of internal controls. Second, consumers are concerned about the risk of secondary use: the reuse of their personal data for unrelated purposes without their consent. This includes sharing with third parties who were not part of the transaction in which the consumer disclosed his or her personal data, as well as the aggregation of a consumer's transaction data and other personal data to create a profile. The online dangers to privacy will continue to grow unless Congress and other decision-making bodies offer meaningful support for privacy. Watching the battle between privacy and civil-liberties advocates on one side and business and law-enforcement interests on the other may feel like watching a particularly ill-tempered tennis match, but it ultimately comes down to a question of openness versus secrecy. Privacy advocates see Do Not Track as a straightforward fix for the numerous privacy issues related to cookies, while marketers point to the ongoing success of data-driven, targeted Web advertising, which cookies make possible, as an implicit endorsement of their methods.



Introduction
As technologies available for the collection and analysis of Web data have become more elaborate, data privacy concerns among Internet users have grown (Pollach, 2007). Government and industry organizations have declared information privacy and security to be major obstacles to the development of consumer-oriented e-commerce. Risk perceptions regarding Internet privacy and security have been identified as issues for both new and experienced users of Internet technology (Miyazaki & Fernandez, 2001). The online environment compounds the privacy problems that people already experience in the physical world. Every move made on PCs, smartphones, and tablets becomes a data point that trackers can easily gather and share, and people effectively agree to such gathering and sharing whenever they sign up for an online service and consent to its privacy policy. There is a considerable gap between what people believe their privacy rights are online and what those rights actually are (Sheehan, 2002). People often assume that the presence of a privacy policy means they have privacy, when in reality such a policy frequently describes the rights they do not have (Miyazaki & Fernandez, 2001). Federal law may or may not lessen these privacy threats. “Efforts to update the Electronic Communications Privacy Act (ECPA) aim to make online data harder to collect and share” (Pike, 2011). In the meantime, proposed legislation called the Cyber Intelligence Sharing and Protection Act (CISPA) could make that data easier for authorities to obtain (Pike, 2011). While privacy is batted back and forth in this way, people should pay close attention to the major threats that exist online.
Findings and Analysis
Cookie Proliferation
The invisible cookie software agents that track people's browsing habits and personal data are likely to proliferate in the future. Advertising networks, marketers, and other data profiteers rely on cookies to learn more about who people are and what they may be interested in buying (Kawkins, 2011). Unless legislation imposes legal restraints on Web-browser tracking, browsers are likely to accumulate more cookies than one would find in a package of Oreos. Cookies have been multiplying at a rate that would impress epidemiologists: five to ten years ago, opening a website might have deposited a cookie or two. “Today one would probably get on the order of 50 cookies from all sorts of third parties: ad servers, data brokers and trackers” (Kawkins, 2011). These third parties can build detailed profiles of a person's browsing history, and the worst part is that the process is invisible to users; people have no idea what is going on. Marketers say that they keep user data private by viewing it only in aggregate, but the sheer volume of data a cookie can collect about any one person can enable the cookie's owner to deduce a surprising amount about the individuals being tracked.
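To make the mechanism concrete, the simplified sketch below shows how a third-party tracker can assign each browser a persistent identifier cookie and log every page that embeds its content against that identifier. It is a minimal illustration in Python using the Flask framework; the endpoint, cookie name, and in-memory log are hypothetical stand-ins rather than any real ad network's implementation.

# Minimal sketch of third-party cookie tracking (hypothetical tracker, not a real ad network).
# Assumes Flask is installed: pip install flask
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)
visit_log = {}  # in-memory stand-in for a tracker's profile database

@app.route("/pixel.gif")
def tracking_pixel():
    # Any page embedding <img src="https://tracker.example/pixel.gif?page=..."> triggers this view.
    visitor_id = request.cookies.get("track_id") or str(uuid.uuid4())
    page = request.args.get("page", "unknown")
    visit_log.setdefault(visitor_id, []).append(page)  # browsing history accumulates per visitor

    response = make_response(b"GIF89a")  # tiny placeholder payload for the invisible pixel
    # A long-lived cookie lets the tracker recognise the same browser across many different sites.
    response.set_cookie("track_id", visitor_id, max_age=60 * 60 * 24 * 365)
    return response

if __name__ == "__main__":
    app.run()

Multiplied across dozens of trackers embedded on thousands of sites, this simple pattern is what produces the detailed, invisible profiles described above.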
Seizing Cloud Data
People love how easy it is to reach their data in the cloud, and so do law enforcement agencies, and there will only be more of that data to love in coming years. “It is thought that 36 percent of U.S. consumer content will be stored in the cloud by 2016” (Anderson, 2010). Whether one uses a Web-based email service, keeps files in Google Drive, or uploads photos to Shutterfly, everything written, uploaded, or posted is stored on a server that belongs to the online service.
A huge worry about using the cloud is that data stored there does not have the same Fourth Amendment protections it would have if it were kept in a desk drawer or on a desktop computer. One important reason that privacy advocates and some legislators are trying to update the ECPA this year is that the current law treats data stored on a server for more than 180 days as abandoned (Pike, 2011). This statutory presumption is a relic of a time when servers held data only fleetingly before shunting it off to a local computer. Additionally, the law's definition of such data is ambiguous enough to cover not just email messages, a popular target of law enforcement agencies, but possibly other kinds of data stored on servers as well (Anderson, 2010). Now that so much data resides on servers owned by cloud-based services, and so many people keep content in the cloud for years, a great deal of long-stored material that people have by no means abandoned could be fair game for Big Brother.
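To show how mechanically that statutory line is drawn, the short sketch below checks whether a stored item has crossed ECPA's 180-day threshold. It is a hypothetical calculation for illustration only, not legal advice and not any provider's actual policy.

# Illustrative only: applying the 180-day threshold discussed above to a stored item's age.
from datetime import datetime, timedelta

ECPA_THRESHOLD = timedelta(days=180)

def treated_as_abandoned(stored_on, now=None):
    """Return True if the item has sat on a server for more than 180 days."""
    now = now or datetime.now()
    return (now - stored_on) > ECPA_THRESHOLD

# Example: an email left in a cloud inbox since early 2012 crossed the line long ago.
print(treated_as_abandoned(datetime(2012, 1, 1)))  # True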
Law-enforcement agencies are requesting cloud-based data with increasing (and disturbing) frequency. Law-enforcement interests have derailed past attempts to update the ECPA, so it is hard to say whether the present efforts will fare any better (Pike, 2011). The only true protection is to understand that anything placed in the cloud can be read by somebody else; if one does not want that to happen, one should not put it there.
Location Data Betrayal
Location data makes it progressively more difficult to wander around the world without someone knowing precisely where one is at any given moment (Desai, 2013). One's cell phone is the primary informer, but the location data that people post to social networking sites is an illuminating source, too. Identifying one's location will get easier still as other location-beaming devices come online, from smarter cars to smarter watches to Google Glass. Armed with this data, advertisers might, for instance, send promotions for nearby businesses wherever a person happens to be (Desai, 2013). The result could be a pleasant surprise, or not. And as with cloud-based data, the legal requirements for obtaining location data from one's mobile service provider are not very strict. In practice it is fairly easy for the government to gain access to location data, and very hard for users to stop that data from being collected.
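One concrete example of how casually shared content can betray location: many smartphone photos carry GPS coordinates in their EXIF metadata, which anyone who obtains the file can read in a few lines of code. The sketch below assumes the Pillow imaging library is installed and uses a hypothetical filename; many social networks strip this metadata on upload, but files shared directly often retain it.

# Reads any GPS EXIF tags embedded in a photo. Assumes Pillow is installed: pip install Pillow
from PIL import Image
from PIL.ExifTags import GPSTAGS

GPS_IFD = 0x8825  # standard EXIF pointer to the GPS information sub-directory

def extract_gps(path):
    """Return the photo's raw GPS EXIF tags as a readable dict (empty if none are present)."""
    exif = Image.open(path).getexif()
    gps = exif.get_ifd(GPS_IFD)  # returns {} when the photo carries no embedded location
    return {GPSTAGS.get(tag, tag): value for tag, value in gps.items()}

print(extract_gps("beach_photo.jpg"))  # hypothetical filename
# e.g. {'GPSLatitudeRef': 'N', 'GPSLatitude': (26.0, 7.0, 30.5), ...}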
Data Never Forgets a Face
Posting and tagging photos online may feel like innocent fun, but behind the scenes it helps build a facial recognition database that makes escaping notice increasingly difficult for anyone. Most consumers are already in the largest facial recognition database in the world, and that’s Facebook (McKenzie, 2011). Indeed, the immense quantity of photos uploaded to Facebook makes it the poster child—or rather, giant—for the privacy issues surrounding this technology. Facebook uses the tags associated with those photos to build ever-more-detailed “faceprints” of what people look like from every angle (McKenzie, 2011).
If Facebook used this data strictly to help users find people they know on Facebook, it might be acceptable. But when Facebook sells user data to third parties, photo data may be included, and what happens to that data afterward is uncertain (McKenzie, 2011). And Facebook is not the only source of facial-recognition data: companies such as Google and Apple have facial-recognition technology built into some of their applications, too, most notably online photo sites.
The future of facial recognition offers scant comfort. Continued advances in surveillance technology, including drones and super-high-resolution cameras, will make identifying individuals in public places easier than ever, especially if the entity doing the surveillance has a large facial-recognition database to consult (Li, Wang, Zhao & Ji, 2013). As with other cloud-based data, revisions to the ECPA could strengthen privacy protections for digital photos, depending on what gets enacted.
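To illustrate the general technique rather than Facebook's proprietary system, the sketch below uses the open-source face_recognition library to compute a numerical “faceprint” from a tagged reference photo and test whether a second, untagged photo shows the same person. The filenames are hypothetical, and the example assumes the library and its dlib dependency are installed.

# General face-matching sketch using the open-source face_recognition library,
# not any company's actual system. Assumes: pip install face_recognition
import face_recognition

# Build a "faceprint": a 128-number encoding of the face in a tagged reference photo.
known_image = face_recognition.load_image_file("tagged_profile_photo.jpg")   # hypothetical file
known_encoding = face_recognition.face_encodings(known_image)[0]

# Encode a face found in a new, untagged photo (for example, a crowd shot).
unknown_image = face_recognition.load_image_file("crowd_photo.jpg")          # hypothetical file
unknown_encoding = face_recognition.face_encodings(unknown_image)[0]

# Compare the encodings; True means the two faces likely belong to the same person.
print(face_recognition.compare_faces([known_encoding], unknown_encoding))  # e.g. [True]

Scaled up to billions of tagged photos, this same basic comparison is what makes a large faceprint database so powerful for identification.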
Scanning in the Name of Cybersecurity
One may not be a malicious hacker, but that does not mean one's online activity will not be scanned for telltale signs of cybercrime. The federal government has made cybersecurity a high priority as concerns grow over the vulnerability of the nation's infrastructure to a computer-based attack (Asllani, White, & Ettkin, 2013). The Presidential Policy Directive concerning cybersecurity lists business sectors that the Administration considers critical, and therefore in need of online watchdogging. Some sectors, such as “Commercial Facilities” and “Critical Manufacturing,” lend themselves to broad interpretation (Asllani, White, & Ettkin, 2013). The definition is still in flux, so there is a real question about what ‘critical infrastructure’ will ultimately encompass. There is reason to believe that the government plans to expand its scanning of Internet traffic beyond three defined sectors: financial institutions, utilities, and transportation companies. Collectively, that covers a lot of consumer activity.
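In spirit, the scanning at issue resembles simple signature matching run over very large volumes of traffic. The toy sketch below flags log lines that contain known indicators of compromise; the indicator list and sample log are invented for illustration and bear no relation to any actual government system.

# Toy signature-based scan: flag traffic-log lines that contain known bad indicators.
# The indicators and log lines below are invented purely for illustration.
SUSPICIOUS_INDICATORS = ["evil-c2.example.net", "' OR '1'='1"]

def scan(log_lines):
    """Yield (line_number, line) for every line matching a known indicator."""
    for number, line in enumerate(log_lines, start=1):
        if any(indicator in line for indicator in SUSPICIOUS_INDICATORS):
            yield number, line

sample_log = [
    "GET /index.html 200",
    "POST /login?user=admin' OR '1'='1 403",
]
for number, line in scan(sample_log):
    print(f"possible attack traffic at line {number}: {line}")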
Conclusions
Consumers are concerned, first, about unauthorized access to personal data as a result of security breaches or a lack of internal controls. Second, consumers are concerned about the risk of secondary use: the reuse of their personal data for unrelated purposes without their consent. This includes sharing with third parties who were not part of the transaction in which the consumer disclosed his or her personal data, as well as the aggregation of a consumer's transaction data and other personal data to create a profile. The online dangers to privacy will continue to grow unless Congress and other decision-making bodies offer meaningful support for privacy. Watching the battle between privacy and civil-liberties advocates on one side and business and law-enforcement interests on the other may feel like watching a particularly ill-tempered tennis match, but it ultimately comes down to a question of openness versus secrecy. Privacy advocates see Do Not Track as a straightforward fix for the numerous privacy issues related to cookies, while marketers point to the ongoing success of data-driven, targeted Web advertising, which cookies make possible, as an implicit endorsement of their methods.
Consumer behavior may be sending contradictory signals, but independent research suggests a need for more, not less, security. One thing is certain: resolving online privacy issues will be vital as new devices such as smart cars, smart watches, and Google Glass add to the mounting data stream. Make no mistake: everything digital that people touch in the future will be a data source, which makes it all the more imperative that these privacy issues be addressed.
Tackling privacy, though, is no easy matter. If nothing else, privacy discussions frequently turn heated very quickly. Some people consider privacy to be a fundamental right; others consider it a tradable commodity. The tension stems from a new technical environment for consumers and businesses, the resulting data flows that bring substantial benefits to businesses and consumers, consumer anxieties in this new environment, and regulatory attempts to govern it. It is important to understand each of these factors and the tradeoffs among them. Privacy as a business issue is extremely sensitive to changes in the surrounding framework.
Privacy is now understood by many to be a social construct, with expectations being the central consideration. Yet privacy is also treated as a public policy issue by regulators, who have nevertheless largely allowed the technology to unfold unimpeded to date. Security is now understood to be inherently imperfect: a continual cat-and-mouse game between security experts and hackers. Important technical advances have been deployed in the last five years; nevertheless, it is clear that organizational policies can play just as significant a role in security.

References
Anderson, W. L. (2010). Increased "Cloud" Adoption Accelerates the Need for Privacy Legislation Reform. Franklin Business & Law Journal, (4), 16-20.
Asllani, A., White, C. & Ettkin, L. (2013). Journal of Legal, Ethical & Regulatory Issues, 16(1), 7-14.
Desai, D. (2013). Beyond Location: Data Security in the 21st Century. Communications of the ACM, 56(1), 34-36. doi:10.1145/2398356.2398368
Kawkins, K. (2011). Controlling Your Privacy in the New Digital World. Officepro, 71(4), 12-15.
Li, Y., Wang, S., Zhao, Y. & Ji, Q. (2013). IEEE Transactions on Image Processing, 22(7), 2559-2573. doi:10.1109/TIP.2013.2253477
McKenzie, P. (2011). Weapons of Mass Assignment. Communications of the ACM, 54(5), 54-59. doi:10.1145/1941487.1941503
Miyazaki, A. D., & Fernandez, A. (2001). Consumer perceptions of privacy and security risks for online shopping. Journal of Consumer Affairs, 35(1), 27-44.
Pike, G. H. (2011). The Online Privacy Debate: How to Get to 'No'. Eventdv, 24(10), 24.
Pollach, I. (2007). What’s wrong with online privacy policies? Communications Of The ACM, 50(9), 103-108.
Sheehan, K. (2002). Toward a Typology of Internet Users and Online Privacy Concerns. Information Society, 18(1), 21-32.