Edition: 4th
Authors: Herman T. Tavani
ISBN: 1118281721, 9781118281727
Publisher: John Wiley and Sons
Publication year: 2012
Pages: 456
Language: English
File format: PDF (can be converted to EPUB or AZW3 on request)
File size: 3 MB
If you would like the file for Ethics and Technology: Controversies, Questions, and Strategies for Ethical Computing, 4th Edition converted to PDF, EPUB, AZW3, MOBI, or DJVU, you can notify support and the file will be converted for you.
Please note that Ethics and Technology: Controversies, Questions, and Strategies for Ethical Computing, 4th Edition is the original English-language edition, not a Persian translation. The International Library website offers original-language books only and does not provide any books translated into or written in Persian.
The Fourth Edition of Ethics and Technology introduces students to issues and controversies that comprise the relatively new field of cyberethics. This textbook examines a wide range of cyberethics issues--from specific issues of moral responsibility to broader social and ethical concerns that affect each of us in our day-to-day lives. Recent developments in machine ethics should also cause students to consider questions about conventional conceptions of autonomy and trust. Such topics and many other engaging ethical controversies--both hypothetical and actual cases--are discussed in this widely used and respected text.
Updates to the 4th Edition include
Cover......Page 1
Title Page......Page 3
Copyright Page......Page 4
Contents At A Glance......Page 7
Table of Contents......Page 9
Preface......Page 19
New to the Fourth Edition......Page 20
Audience and Scope......Page 21
Organization and Structure of the Book......Page 23
The Web Site for Ethics and Technology......Page 25
Note to Instructors: A Roadmap for Using This Book......Page 26
A Note to Computer Science Instructors......Page 27
Acknowledgments......Page 29
Foreword......Page 31
Scenario 1–1: A Fatal Cyberbullying Incident on MySpace......Page 33
Scenario 1–3: “The Washingtonienne” Blogger......Page 34
1.1 Defining Key Terms: Cyberethics and Cybertechnology......Page 35
1.1.1 What Is Cybertechnology?......Page 36
1.1.2 Why the Term Cyberethics?......Page 37
1.2 The Cyberethics Evolution: Four Developmental Phases in Cybertechnology......Page 38
1.3 Are Cyberethics Issues Unique Ethical Issues?......Page 41
Scenario 1–4: Developing the Code for a Computerized Weapon System......Page 42
1.3.1 Distinguishing between Unique Technological Features and Unique Ethical Issues......Page 43
1.3.2 An Alternative Strategy for Analyzing the Debate about the Uniqueness of Cyberethics Issues......Page 44
1.3.3 A Policy Vacuum in Duplicating Computer Software......Page 45
1.4 Cyberethics as a Branch of Applied Ethics: Three Distinct Perspectives......Page 46
1.4.1 Perspective #1: Cyberethics as a Field of Professional Ethics......Page 47
1.4.2 Perspective #2: Cyberethics as a Field of Philosophical Ethics......Page 50
Scenario 1–6: The Impact of Technology X on the Pleasantville Community......Page 53
1.5 A Comprehensive Cyberethics Methodology......Page 56
1.5.1 A “Disclosive” Method for Cyberethics......Page 57
1.5.2 An Interdisciplinary and Multilevel Method for Analyzing Cyberethics Issues......Page 58
1.6 A Comprehensive Strategy for Approaching Cyberethics Issues......Page 59
Review Questions......Page 60
Scenarios for Analysis......Page 61
Endnotes......Page 62
References......Page 63
Online Resources......Page 64
2.1 Ethics and Morality......Page 65
Scenario 2–1: The “Runaway Trolley”: A Classic Moral Dilemma......Page 66
2.1.1 What Is Morality?......Page 67
2.1.2 Deriving and Justifying the Rules and Principles of a Moral System......Page 70
2.2 Discussion Stoppers as Roadblocks to Moral Discourse......Page 74
2.2.1 Discussion Stopper #1: People Disagree on Solutions to Moral Issues......Page 75
2.2.2 Discussion Stopper #2: Who Am I to Judge Others?......Page 77
2.2.3 Discussion Stopper #3: Morality Is Simply a Private Matter......Page 79
2.2.4 Discussion Stopper #4: Morality Is Simply a Matter for Individual Cultures to Decide......Page 80
Scenario 2–2: The Perils of Moral Relativism......Page 81
2.3 Why Do We Need Ethical Theories?......Page 84
2.4 Consequence-Based Ethical Theories......Page 85
2.4.2 Rule Utilitarianism......Page 87
2.5 Duty-Based Ethical Theories......Page 88
2.5.1 Rule Deontology......Page 89
Scenario 2–4: Making an Exception for Oneself......Page 90
2.5.2 Act Deontology......Page 91
Scenario 2–5: A Dilemma Involving Conflicting Duties......Page 92
2.6 Contract-Based Ethical Theories......Page 93
2.6.1 Some Criticisms of Contract-Based Theories......Page 94
2.6.2 Rights-Based Contract Theories......Page 95
2.7.1 Being a Moral Person vs. Following Moral Rules......Page 96
2.7.2 Acquiring the “Correct” Habits......Page 97
2.8 Integrating Aspects of Classical Ethical Theories into a Single Comprehensive Theory......Page 98
2.8.1 Moor’s Just-Consequentialist Theory and Its Application to Cybertechnology......Page 99
2.8.2 Key Elements in Moor’s Just-Consequentialist Framework......Page 101
Review Questions......Page 102
Essay/Presentation Questions......Page 103
Endnotes......Page 104
Further Readings......Page 105
3.1 Getting Started......Page 106
3.1.1 Defining Two Key Terms in Critical Reasoning: Claims and Arguments......Page 107
3.1.3 The Basic Structure of an Argument......Page 108
3.2 Constructing an Argument......Page 110
3.3 Valid Arguments......Page 112
3.4 Sound Arguments......Page 115
3.5 Invalid Arguments......Page 117
3.6 Inductive Arguments......Page 118
3.7 Fallacious Arguments......Page 119
3.8 A Seven-Step Strategy for Evaluating Arguments......Page 121
3.9 Identifying Some Common Fallacies......Page 123
3.9.2 Slippery Slope Argument......Page 124
3.9.4 False Cause Fallacy......Page 125
3.9.6 Fallacy of Composition/Fallacy of Division......Page 126
3.9.8 Appeal to the People (Argumentum ad Populum)......Page 127
3.9.9 The Many/Any Fallacy......Page 128
3.9.10 The Virtuality Fallacy......Page 129
Discussion Questions......Page 130
Endnotes......Page 131
Further Readings......Page 132
CHAPTER 4: PROFESSIONAL ETHICS, CODES OF CONDUCT, AND MORAL RESPONSIBILITY......Page 133
4.1 Professional Ethics......Page 134
4.1.2 Who Is a Professional?......Page 135
4.1.3 Who Is a Computer/IT Professional?......Page 136
4.2.1 Safety-Critical Software......Page 137
4.3 Professional Codes of Ethics and Codes of Conduct......Page 138
4.3.1 The Purpose of Professional Codes......Page 139
4.3.2 Some Criticisms of Professional Codes......Page 140
4.3.3 Defending Professional Codes......Page 141
4.3.4 The IEEE-CS/ACM Software Engineering Code of Ethics and Professional Practice......Page 142
4.4.1 Do Employees Have an Obligation of Loyalty to Employers?......Page 144
4.4.2 Whistle-Blowing Issues......Page 146
Scenario 4–1: Whistle-Blowing and the “Star Wars” Controversy......Page 147
4.5 Moral Responsibility, Legal Liability, and Accountability......Page 149
4.5.1 Distinguishing Responsibility from Liability and Accountability......Page 150
4.5.2 Accountability and the Problem of “Many Hands”......Page 151
4.5.3 Legal Liability and Moral Accountability......Page 152
Scenario 4–3: The Aegis Radar System......Page 153
4.7 Do Some Computer Corporations Have Special Moral Obligations?......Page 154
4.7.1 Special Responsibilities for Search Engine Companies......Page 155
4.7.2 Special Responsibilities for Companies that Develop Autonomous Systems......Page 156
4.8 Chapter Summary......Page 157
Essay/Presentation Questions......Page 158
Scenarios for Analysis......Page 159
References......Page 160
Further Readings......Page 162
CHAPTER 5: PRIVACY AND CYBERSPACE......Page 163
5.1 Are Privacy Concerns Associated with Cybertechnology Unique or Special?......Page 164
5.2 What is Personal Privacy?......Page 166
5.2.2 Decisional Privacy: Freedom from Interference in One’s Personal Affairs......Page 167
5.2.4 A Comprehensive Account of Privacy......Page 168
5.2.5 Privacy as “Contextual Integrity”......Page 169
Scenario 5–3: Preserving Contextual Integrity in a University Seminar......Page 170
5.3 Why is Privacy Important?......Page 171
5.3.1 Is Privacy an Intrinsic Value?......Page 172
5.4.1 “Dataveillance” Techniques......Page 173
5.4.2 Internet Cookies......Page 174
5.4.3 RFID Technology......Page 175
5.4.4 Cybertechnology and Government Surveillance......Page 177
5.5.1 Merging Computerized Records......Page 178
Scenario 5–4: Merging Personal Information in Unrelated Computer Databases......Page 179
5.5.2 Matching Computerized Records......Page 180
Scenario 5–5: Using Biometric Technology at Super Bowl XXXV......Page 181
5.6.1 How Does Data Mining Threaten Personal Privacy?......Page 182
Scenario 5–6: Data Mining at the XYZ Bank......Page 183
Scenario 5–7: The Facebook Beacon Controversy......Page 186
5.7 Protecting Personal Privacy in Public Space......Page 188
Scenario 5–9: Shopping at Nile.com......Page 189
5.7.1 Search Engines and the Disclosure of Personal Information......Page 190
Scenario 5–10: Tracking Your Search Requests on Google......Page 191
5.7.2 Accessing Online Public Records......Page 192
Scenario 5–11: Accessing Online Public Records in Pleasantville......Page 193
5.8 Privacy-Enhancing Technologies......Page 194
5.8.2 PETs and the Principle of Informed Consent......Page 195
5.9.1 Industry Self-Regulation Initiatives Regarding Privacy......Page 196
5.9.2 Privacy Laws and Data Protection Principles......Page 198
5.10 Chapter Summary......Page 200
Discussion Questions......Page 201
Scenarios for Analysis......Page 202
References......Page 203
Further Readings......Page 205
6.1 Security in the Context of Cybertechnology......Page 206
6.1.2 Security and Privacy: Some Similarities and Some Differences......Page 207
6.2 Three Categories of Cybersecurity......Page 208
6.2.1 Data Security: Confidentiality, Integrity, and Availability of Information......Page 209
Scenario 6–1: The Conficker Worm......Page 210
Scenario 6–2: The GhostNet Controversy......Page 211
6.3 “Cloud Computing” and Security......Page 212
6.3.1 Deployment and Service/Delivery Models for the Cloud......Page 213
6.3.2 Securing User Data Residing in the Cloud......Page 214
6.4 Hacking and “The Hacker Ethic”......Page 215
6.4.1 What Is “The Hacker Ethic”?......Page 216
6.4.2 Are Computer Break-ins Ever Ethically Justifiable?......Page 218
6.5 Cyberterrorism......Page 219
6.5.1 Cyberterrorism vs. Hacktivism......Page 220
Scenario 6–3: Anonymous and the “Operation Payback” Attack......Page 221
6.5.2 Cybertechnology and Terrorist Organizations......Page 222
6.6.1 Information Warfare vs. Conventional Warfare......Page 223
6.6.2 Potential Consequences for Nations that Engage in IW......Page 224
6.7.1 The Risk Analysis Methodology......Page 226
6.7.2 The Problem of “De-Perimeterization” of Information Security for Analyzing Risk......Page 227
Review Questions......Page 228
Scenarios for Analysis......Page 229
References......Page 230
Further Readings......Page 232
7.1 Cybercrimes and Cybercriminals......Page 233
7.1.1 Background Events: A Brief Sketch......Page 234
7.2 Hacking, Cracking, and Counterhacking......Page 235
7.2.2 Active Defense Hacking: Can Acts of “Hacking Back” or Counter Hacking Ever Be Morally Justified?......Page 236
7.3 Defining Cybercrime......Page 237
7.3.1 Determining the Criteria......Page 238
Scenario 7–1: Using a Computer to File a Fraudulent Tax Return......Page 239
7.4 Three Categories of Cybercrime: Piracy, Trespass, and Vandalism in Cyberspace......Page 240
7.5.1 Some Examples of Cyber-Exacerbated vs. Cyber-Assisted Crimes......Page 241
7.5.2 Identity Theft......Page 243
Scenario 7–2: Intercepting Mail that Enters and Leaves Your Neighborhood......Page 245
7.6.1 Biometric Technologies......Page 246
7.6.2 Keystroke-Monitoring Software and Packet-Sniffing Programs......Page 247
Scenario 7–3: Entrapment on the Internet......Page 248
7.7.2 Enhanced Government Surveillance Techniques and the Patriot Act......Page 249
Scenario 7–4: A Virtual Casino......Page 250
Scenario 7–5: Prosecuting a Computer Corporation in Multiple Countries......Page 251
7.8.2 Some International Laws and Conventions Affecting Cybercrime......Page 252
7.9 Cybercrime and the Free Press: The WikiLeaks Controversy......Page 253
7.9.2 Are WikiLeaks’ Practices Criminal?......Page 254
7.9.3 WikiLeaks and the Free Press......Page 255
Review Questions......Page 257
Scenarios for Analysis......Page 258
Endnotes......Page 259
References......Page 260
Further Readings......Page 261
8.1 What is Intellectual Property?......Page 262
8.1.1 Intellectual Objects......Page 263
8.1.3 Software as Intellectual Property......Page 264
8.1.4 Evaluating an Argument for Why It is Wrong to Copy Proprietary Software......Page 265
8.2.1 The Evolution of Copyright Law in the United States......Page 267
8.2.2 The Fair-Use and First-Sale Provisions of Copyright Law......Page 268
Scenario 8–2: Decrypting Security on an e-Book Reader......Page 269
8.2.3 Software Piracy as Copyright Infringement......Page 270
8.2.4 Napster and the Ongoing Battles over Sharing Digital Music......Page 271
Scenario 8–3: The Case of MGM v. Grokster......Page 273
8.3.1 Patent Protections......Page 274
8.3.3 Trade Secrets......Page 275
8.4 Jurisdictional Issues Involving Intellectual Property Laws......Page 276
8.5.1 The Labor Theory of Property......Page 277
Scenario 8–4: DEF Corporation vs. XYZ Inc.......Page 278
Scenario 8–5: Sam’s e-Book Reader Add-on Device......Page 279
8.5.3 The Personality Theory of Property......Page 280
Scenario 8–6: Angela’s B++ Programming Tool......Page 281
8.6.1 GNU and the Free Software Foundation......Page 282
8.6.2 The “Open Source Software” Movement: OSS vs. FSF......Page 283
8.7 The “Common-Good” Approach: An Alternative Framework for Analyzing the Intellectual Property Debate......Page 284
8.7.1 Information Wants to be Shared vs. Information Wants to be Free......Page 286
8.7.2 Preserving the Information Commons......Page 288
8.7.3 The Fate of the Information Commons: Could the Public Domain of Ideas Eventually Disappear?......Page 289
8.7.4 The Creative Commons......Page 291
8.8 PIPA, SOPA, and RWA Legislation: Current Battlegrounds in the Intellectual Property War......Page 292
8.8.2 RWA and Public Access to Health-Related Information......Page 293
Scenario 8–7: Elsevier Press and “The Cost of Knowledge” Boycott......Page 294
8.8.3 Intellectual Property Battles in the Near Future......Page 295
Review Questions......Page 296
Scenarios for Analysis......Page 297
Endnotes......Page 298
References......Page 299
Further Readings......Page 300
CHAPTER 9: REGULATING COMMERCE AND SPEECH IN CYBERSPACE......Page 301
9.1.1 The Ontology of Cyberspace: Is the Internet a Medium or a Place?......Page 302
9.1.2 Two Categories of Cyberspace Regulation......Page 303
9.2 Four Modes of Regulation: The Lessig Model......Page 305
9.3.1 DRM Technology: Implications for Public Debate on Copyright Issues......Page 306
Scenario 9–1: The Sony Rootkit Controversy......Page 307
9.3.2 Privatizing Information Policy: Implications for the Internet......Page 308
9.4.1 Issues Surrounding the Use/Abuse of HTML Metatags......Page 310
9.4.2 Hyperlinking and Deep Linking......Page 311
Scenario 9–3: Deep Linking on the Ticketmaster Web Site......Page 312
9.5.1 Defining Spam......Page 313
9.5.2 Why Is Spam Morally Objectionable?......Page 314
9.6.1 Protecting Free Speech......Page 316
9.6.2 Defining Censorship......Page 317
9.7.1 Interpreting “Community Standards” in Cyberspace......Page 318
9.7.2 Internet Pornography Laws and Protecting Children Online......Page 319
9.7.3 Virtual Child Pornography......Page 320
Scenario 9–4: A Sexting Incident Involving Greensburg Salem High School......Page 322
9.8.1 Hate Speech on the Web......Page 324
9.9 “Network Neutrality” and the Future of Internet Regulation......Page 326
9.9.1 Defining Network Neutrality......Page 327
9.9.3 Future Implications for the Net Neutrality Debate......Page 328
9.10 Chapter Summary......Page 329
Discussion Questions......Page 330
Scenarios for Analysis......Page 331
References......Page 332
Further Readings......Page 333
CHAPTER 10: THE DIGITAL DIVIDE, DEMOCRACY, AND WORK......Page 335
10.1.1 The Global Digital Divide......Page 336
10.1.2 The Digital Divide within Nations......Page 337
Scenario 10–1: Providing In-Home Internet Service for Public School Students......Page 338
10.1.3 Is the Digital Divide an Ethical Issue?......Page 339
10.2 Cybertechnology and the Disabled......Page 341
10.2.1 Disabled Persons and Remote Work......Page 342
10.2.2 Arguments for Continued WAI Support......Page 343
10.3.1 Internet Usage Patterns......Page 344
10.3.2 Racism and the Internet......Page 345
10.4 Cybertechnology and Gender......Page 346
10.4.1 Access to High-Technology Jobs......Page 347
10.5 Cybertechnology, Democracy, and Democratic Ideals......Page 349
10.5.1 Has Cybertechnology Enhanced or Threatened Democracy?......Page 350
10.5.2 How has Cybertechnology Affected Political Elections in Democratic Nations?......Page 354
10.6.1 Job Displacement and the Transformed Workplace......Page 356
10.6.2 The Quality of Work Life in the Digital Era......Page 360
Scenario 10–2: Employee Monitoring and the Case of Ontario v. Quon......Page 361
10.7 Chapter Summary......Page 363
Discussion Questions......Page 364
Scenarios for Analysis......Page 365
Endnotes......Page 366
References......Page 367
Further Readings......Page 368
11.1.1 Online Communities vs. Traditional Communities......Page 369
11.1.3 Assessing Pros and Cons of Online Communities......Page 371
Scenario 11–1: A Virtual Rape in Cyberspace......Page 374
11.2 Virtual Environments and Virtual Reality......Page 375
11.2.1 What is Virtual Reality (VR)?......Page 376
11.2.2 Ethical Controversies Involving Behavior in VR Applications and Games......Page 377
11.2.3 Misrepresentation, Bias, and Indecent Representations in VR Applications......Page 381
11.3 Cyber Identities and Cyber Selves: Personal Identity and Our Sense of Self in the Cyber Era......Page 383
11.3.2 “MUD Selves” and Distributed Personal Identities......Page 384
11.3.3 The Impact of Cybertechnology on Our Sense of Self......Page 385
11.4.1 What is AI? A Brief Overview......Page 387
11.4.2 The Turing Test and John Searle’s “Chinese Room” Argument......Page 389
11.4.3 Cyborgs and Human-Machine Relationships......Page 390
11.4.4 Do (At Least Some) AI Entities Warrant Moral Consideration?......Page 393
Review Questions......Page 395
Essay/Presentation Questions......Page 396
Endnotes......Page 397
References......Page 398
Further Readings......Page 399
12.1 Converging Technologies and Technological Convergence......Page 400
12.2 Ambient Intelligence (AmI) and Ubiquitous Computing......Page 401
12.2.3 Intelligent User Interfaces......Page 403
12.2.4 Ethical and Social Issues in AmI......Page 404
Scenario 12–1: E. M. Forster’s Precautionary Tale......Page 405
Scenario 12–2: Jeremy Bentham’s Panopticon......Page 407
12.3.2 Ethical Issues and Controversies......Page 408
Scenario 12–3: deCODE Genetics Inc.......Page 409
12.3.3 ELSI Guidelines and Genetic-Specific Legislation......Page 412
12.4 Nanotechnology and Nanocomputing......Page 413
12.4.1 Nanotechnology: A Brief Overview......Page 414
12.4.2 Optimistic vs. Pessimistic Views of Nanotechnology......Page 415
12.4.3 Ethical Issues in Nanotechnology and Nanocomputing......Page 418
12.5 Autonomous Machines and Machine Ethics......Page 421
12.5.1 What is an Autonomous Machine (AM)?......Page 422
12.5.2 Some Ethical and Philosophical Questions Involving AMs......Page 425
12.5.3 Machine Ethics and Moral Machines......Page 430
12.6.1 Is an ELSI-Like Model Adequate for New/Emerging Technologies?......Page 434
12.6.2 A “Dynamic Ethics” Model......Page 435
Review Questions......Page 436
Scenarios for Analysis......Page 437
Endnotes......Page 438
References......Page 439
Further Readings......Page 441
Glossary......Page 443
Index......Page 449