“No good should humble themselves in the presence of Evil.”
Ethics in Computer Engineering: Professional Values and Moral Decision-Making
Course Module: Professional Ethics in Computing
Duration: 3-4 hours
Level: Undergraduate Computer Engineering
Learning Objectives
Upon completion of this lesson, students will be able to:
- Understand the fundamental principles of engineering ethics and their application in computer engineering
- Analyze the ACM/IEEE Code of Ethics and apply its principles to real-world scenarios
- Identify and evaluate ethical dilemmas in software development, artificial intelligence, and data management
- Apply systematic frameworks for ethical decision-making in professional contexts
- Recognize professional responsibilities including whistleblowing and social accountability
- Critically examine case studies involving ethical failures and successes in the technology industry
1. Introduction: Why Ethics Matter in Computer Engineering
Computer engineering has become one of the most influential fields shaping modern society. From smartphones that connect billions of people to artificial intelligence systems that make critical decisions about healthcare, finance, and justice, the work of computer engineers affects every aspect of human life. With this power comes profound responsibility.
Unlike traditional engineering disciplines where failures are often immediately visible—a bridge collapse, a building fire—the consequences of ethical failures in computer engineering can be subtle, pervasive, and far-reaching. A biased algorithm can perpetuate discrimination against millions. A security vulnerability can expose the personal data of entire populations. An autonomous vehicle’s programming decisions can determine who lives and who dies in unavoidable accidents.
In the Philippines, as we rapidly digitalize government services, banking, education, and healthcare, the ethical implications of technology decisions become even more critical. Computer engineers must be prepared not only with technical skills but with moral frameworks to guide their professional conduct.
2. Theoretical Foundation: What is Engineering Ethics?
2.1 Definition and Scope
Engineering ethics is the field of applied ethics that examines and sets standards for engineers’ professional conduct. It encompasses:
- Normative Ethics: What engineers ought to do
- Descriptive Ethics: What engineers actually do
- Meta-ethics: The nature and justification of ethical principles in engineering
2.2 Core Ethical Theories
Several philosophical frameworks inform engineering ethics:
| Ethical Theory | Core Principle | Application in Computing |
|---|---|---|
| Utilitarianism | Maximize overall happiness/minimize harm | Cost-benefit analysis of software features; weighing privacy vs. security |
| Deontology | Duty-based ethics; follow universal moral rules | Respecting user privacy regardless of consequences; honoring commitments |
| Virtue Ethics | Cultivate moral character and virtues | Developing habits of honesty, integrity, and responsibility in professional conduct |
| Care Ethics | Emphasize relationships and interdependence | Considering vulnerable populations in technology design; accessibility |
3. ACM/IEEE Code of Ethics for Software Engineers
The Association for Computing Machinery (ACM) and the Institute of Electrical and Electronics Engineers (IEEE) have established comprehensive codes of ethics that serve as the gold standard for computing professionals worldwide.
3.1 ACM Code of Ethics (2018) – Key Principles
The Code’s first section sets out seven General Ethical Principles; the eighth item below is drawn from its Professional Responsibilities section.
- Contribute to society and human well-being: Computing professionals should work to develop systems that benefit society and minimize negative consequences.
- Avoid harm: “Harm” includes unjustified physical or mental injury, destruction or disclosure of information, property damage, or environmental damage.
- Be honest and trustworthy: Honesty is essential to maintaining trust. This includes being transparent about system limitations and potential risks.
- Be fair and take action not to discriminate: Technologies should be designed to be fair and not discriminate against individuals or groups.
- Respect the work required to produce new ideas: Honor intellectual property, give proper credit, and respect privacy.
- Respect privacy: Collect only the personal information necessary for a legitimate purpose, use it only with informed consent, and protect it appropriately.
- Honor confidentiality: Protect confidential information except in cases where it is evidence of violation of law or ethical principles.
- Strive to achieve high quality: Computing professionals should insist on high quality work from themselves and colleagues.
3.2 IEEE Code of Ethics – Core Commitments
Among other commitments, IEEE members agree to:
- Hold paramount the safety, health, and welfare of the public
- Perform services only in areas of their competence
- Issue public statements only in an objective and truthful manner
- Act for each employer or client as faithful agents or trustees
- Avoid deceptive acts
- Conduct themselves honorably, responsibly, ethically, and lawfully
- Assist colleagues and co-workers in their professional development
- Treat all persons fairly and with respect
- Avoid injuring others, their property, reputation, or employment
- Support colleagues and co-workers in following this code of ethics
4. Real-World Ethical Dilemmas in Computer Engineering
4.1 Privacy and Data Protection
The Challenge: Modern applications collect massive amounts of personal data. Engineers face constant pressure to collect more data for analytics, personalization, and monetization, while users’ privacy rights must be protected.
Ethical Questions:
- How much data collection is justified?
- Is informed consent truly possible when terms of service are hundreds of pages long?
- Should engineers implement features they believe violate user privacy if directed by management?
- What responsibility do engineers have for data breaches resulting from their code?
4.2 Artificial Intelligence and Algorithmic Bias
The Challenge: AI systems can perpetuate and amplify existing societal biases. Algorithms trained on historical data may discriminate against protected classes in hiring, lending, criminal justice, and other critical domains.
Examples of Bias:
- Facial recognition systems with higher error rates for people with darker skin tones
- Resume screening algorithms that favor male candidates
- Predictive policing systems that over-target minority neighborhoods
- Healthcare algorithms that provide inferior care recommendations for Black patients
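One way engineers can surface this kind of bias before deployment is to audit model outcomes across demographic groups. The sketch below is a minimal, illustrative example rather than any particular vendor's tool: it computes per-group selection rates and a disparate-impact ratio, a common first-pass fairness check. The record fields and the 0.8 threshold (often called the four-fifths rule) are assumptions made for illustration.

```python
# Minimal bias-audit sketch (illustrative only).
# Assumed record fields: "group" (a demographic label) and "selected" (bool).
from collections import defaultdict

def selection_rates(records):
    """Return the fraction of selected candidates per group."""
    counts = defaultdict(lambda: [0, 0])      # group -> [selected, total]
    for r in records:
        counts[r["group"]][0] += int(r["selected"])
        counts[r["group"]][1] += 1
    return {g: sel / total for g, (sel, total) in counts.items()}

def disparate_impact(rates):
    """Ratio of the lowest group's selection rate to the highest group's."""
    return min(rates.values()) / max(rates.values())

if __name__ == "__main__":
    sample = [
        {"group": "A", "selected": True},  {"group": "A", "selected": True},
        {"group": "A", "selected": False}, {"group": "B", "selected": True},
        {"group": "B", "selected": False}, {"group": "B", "selected": False},
    ]
    rates = selection_rates(sample)
    ratio = disparate_impact(rates)
    print(rates, "disparate-impact ratio:", round(ratio, 2))
    if ratio < 0.8:  # four-fifths rule: a flag for human review, not an automatic verdict
        print("Warning: possible disparate impact; investigate before deployment.")
```

Passing such a check does not make a system fair, but failing it is a strong signal to re-examine the training data and features before release.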
4.3 Cybersecurity and Responsible Disclosure
The Challenge: When engineers discover security vulnerabilities, they face difficult choices about disclosure. Revealing vulnerabilities publicly could enable attacks, but keeping them secret leaves users at risk.
Competing Interests:
- Public safety vs. organizational reputation
- Immediate disclosure vs. coordinated vulnerability disclosure
- Legal liability vs. ethical responsibility
- National security concerns vs. transparency
4.4 Software Reliability and Safety-Critical Systems
The Challenge: In safety-critical systems (medical devices, autonomous vehicles, aviation software), software bugs can kill people. Engineers must balance time-to-market pressures with thorough testing and validation.
Key Considerations:
- When is software “safe enough” to deploy?
- How should engineers respond to management pressure to release untested code?
- What testing standards are appropriate for different risk levels?
- Who is liable when autonomous systems cause harm?
4.5 Environmental Impact of Computing
The Challenge: Data centers consume enormous amounts of energy, cryptocurrency mining has a massive carbon footprint, and electronic waste creates environmental hazards. Engineers must consider sustainability in their technical decisions.
Ethical Dimensions:
- Energy efficiency in algorithm design
- Hardware lifecycle and e-waste management
- Cloud computing’s hidden environmental costs
- Responsibility for downstream environmental impacts
5. Case Studies: Ethical Failures and Lessons Learned
Case Study 1: Cambridge Analytica and Facebook Data Scandal (2018)
What Happened: Cambridge Analytica, a political consulting firm, harvested personal data from millions of Facebook users without their consent. The data was obtained through a seemingly innocuous personality quiz app that collected data not only from users who took the quiz but also from all their Facebook friends—ultimately affecting 87 million users.
Ethical Violations:
- Lack of informed consent from affected users
- Misrepresentation of data collection purposes
- Facebook’s inadequate oversight of third-party developers
- Use of personal data for political manipulation
Consequences: Facebook paid a $5 billion fine to the U.S. Federal Trade Commission, Mark Zuckerberg testified before Congress, and the scandal intensified global attention to data privacy; it broke just before Europe’s GDPR (adopted in 2016) took effect in May 2018 and helped spur data privacy bills worldwide.
Lessons for Engineers:
- Design APIs and platforms with privacy protections by default (a minimal sketch follows this list)
- Question data collection practices that seem excessive
- Advocate for transparent data handling policies
- Consider downstream uses of data, not just immediate applications
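As a concrete illustration of the first lesson above, a platform API can be written so that a third-party app receives only the fields a user has explicitly granted, and never data about that user's friends. The sketch below is hypothetical; the types, field names, and scopes are invented for illustration and are not Facebook's actual API.

```python
# Privacy-by-default sketch: a third-party app sees only fields the user
# has explicitly consented to share. All types and scope names here are
# hypothetical, invented for illustration.
from dataclasses import dataclass, field

@dataclass
class User:
    user_id: str
    profile: dict                                       # e.g. {"name": ..., "email": ..., "likes": ...}
    granted_scopes: set = field(default_factory=set)    # e.g. {"name"}

def data_for_app(user: User, requested_scopes: set) -> dict:
    """Return only fields that are both requested AND consented to.
    Friends' data is never included: each friend must grant access themselves."""
    allowed = requested_scopes & user.granted_scopes    # default-deny
    return {k: v for k, v in user.profile.items() if k in allowed}

if __name__ == "__main__":
    alice = User("u1", {"name": "Alice", "email": "a@example.com", "likes": ["hiking"]},
                 granted_scopes={"name"})
    print(data_for_app(alice, {"name", "email", "likes"}))  # -> {'name': 'Alice'}
```

The design choice is default-deny: anything not explicitly granted is simply absent from the response, so a later product decision cannot quietly widen the data flow.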
Case Study 2: Volkswagen Emissions Scandal (2015)
What Happened: Volkswagen engineers programmed engine control software to detect when vehicles were undergoing emissions testing. During tests, the software activated full emissions controls, but during normal driving, it disabled or reduced these controls to improve performance and fuel efficiency. This “defeat device” allowed VW diesels to emit up to 40 times the legal limit of nitrogen oxides while passing regulatory tests.
Ethical Violations:
- Deliberate deception of regulators and consumers
- Environmental harm from excess pollution
- Public health risks from increased air pollution
- Breach of professional integrity by engineers
Consequences: VW paid over $30 billion in fines, settlements, and recalls. Several executives faced criminal charges. The company’s reputation suffered lasting damage, and trust in diesel technology collapsed.
Lessons for Engineers:
- Never implement features designed to deceive, even under management pressure
- Recognize when engineering work crosses ethical and legal boundaries
- Understand that “just following orders” is not an ethical defense
- Consider whistleblowing when witnessing serious ethical violations
Case Study 3: Boeing 737 MAX MCAS Software (2018-2019)
What Happened: Boeing’s 737 MAX featured a new automated system called MCAS (Maneuvering Characteristics Augmentation System), which pushed the nose down at high angles of attack so the aircraft would handle like earlier 737s. However, the system relied on a single angle-of-attack sensor, lacked redundancy, and was not adequately disclosed to pilots. When that sensor malfunctioned, MCAS repeatedly pushed the nose down, and pilots struggled to override it. Two crashes, Lion Air Flight 610 and Ethiopian Airlines Flight 302, killed 346 people.
Ethical Violations:
- Inadequate safety analysis and testing
- Prioritizing cost savings and schedule over safety
- Insufficient pilot training and documentation
- Lack of redundancy in safety-critical systems
- Management pressure that compromised engineering judgment
Consequences: The 737 MAX was grounded worldwide for 20 months. Boeing paid $2.5 billion in fines and compensation. The company faced criminal charges for fraud, and its reputation for safety was severely damaged.
Lessons for Engineers:
- Never compromise safety for cost or schedule pressures
- Insist on redundancy and fail-safes in safety-critical systems
- Speak up when management decisions threaten public safety
- Ensure adequate testing and validation before deployment
- Document all safety concerns in writing
Case Study 4: Therac-25 Radiation Therapy Machine (1985-1987)
What Happened: The Therac-25 was a radiation therapy machine controlled by software. Due to race conditions and inadequate safety interlocks, the machine delivered radiation doses 100 times higher than intended, killing three patients and seriously injuring three others. The software had been reused from previous models but without the hardware safety mechanisms those systems relied upon.
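The Therac-25's actual code is not reproduced here, but the class of flaw it contained, a race condition, can be shown with a small hypothetical sketch: when a mode change and a firing routine run concurrently without synchronization, the firing routine can act on a half-updated, unsafe combination of settings. The machine model below is invented purely for illustration.

```python
# Hypothetical race-condition sketch (illustrative only; not Therac-25's code).
# A mode-change routine and a firing routine share machine settings.
# Without synchronization, firing can observe a half-updated, unsafe state.
import threading
import time

settings = {"beam_power": "low", "filter_in_place": False}

def switch_to_high_power_mode():
    settings["beam_power"] = "high"        # write 1
    time.sleep(0.01)                       # window where the state is inconsistent
    settings["filter_in_place"] = True     # write 2: the filter that high power requires

def fire_beam():
    time.sleep(0.005)                      # happens to fire inside that window
    snapshot = dict(settings)
    if snapshot["beam_power"] == "high" and not snapshot["filter_in_place"]:
        print("UNSAFE interleaving observed:", snapshot)
    else:
        print("Fired with settings:", snapshot)

t1 = threading.Thread(target=switch_to_high_power_mode)
t2 = threading.Thread(target=fire_beam)
t1.start(); t2.start()
t1.join(); t2.join()
# A single threading.Lock around both the writes and the read removes this
# interleaving, but the Therac-25 lesson is broader: safety-critical systems
# also need independent hardware interlocks, not just careful software.
```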
Ethical Violations:
- Over-reliance on software without hardware safeguards
- Inadequate software testing for safety-critical applications
- Poor error handling and user interface design
- Slow response to early incident reports
- Lack of systematic software safety analysis
Lessons for Engineers:
- Software alone cannot be trusted for safety-critical functions without multiple layers of protection
- Reusing code in new contexts requires thorough re-validation
- Error messages must be clear and actionable
- Take all incident reports seriously and investigate immediately
- Apply formal methods and rigorous testing to medical and safety-critical software
6. Framework for Ethical Decision-Making
When faced with ethical dilemmas, engineers need systematic approaches to analyze situations and make sound decisions. Here is a comprehensive framework:
6.1 The Seven-Step Ethical Decision-Making Process
| Step | Action | Key Questions |
|---|---|---|
| 1. Recognize | Identify that an ethical issue exists | Does this decision affect people’s rights, safety, or welfare? Does it feel wrong? |
| 2. Gather Facts | Collect relevant information | What are the technical facts? What are the organizational constraints? What are the legal requirements? |
| 3. Identify Stakeholders | Determine who will be affected | Who benefits? Who is harmed? Who has rights in this situation? Are vulnerable populations affected? |
| 4. Consider Alternatives | Generate multiple possible courses of action | What are all possible options? Are there creative solutions that satisfy multiple interests? |
| 5. Evaluate Alternatives | Assess each option against ethical principles | What are the consequences? What are my duties? What would a virtuous person do? Is this fair to all? |
| 6. Make a Decision | Choose the most ethically sound option | Which option best aligns with professional codes? Can I defend this decision publicly? |
| 7. Reflect | Review the decision and its outcomes | What was the result? What did I learn? What would I do differently next time? |
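Because later sections emphasize documenting your reasoning, it can help to capture each pass through these seven steps as a written record, similar in spirit to an engineering decision log. The sketch below shows one possible, hypothetical structure; the field names are invented, and any real team would adapt them to its own practice.

```python
# Hypothetical structure for logging an ethical-decision review.
# Field names are invented for illustration; adapt to your team's practice.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class EthicalDecisionRecord:
    issue: str                                           # Step 1: recognize
    facts: list = field(default_factory=list)            # Step 2: gather facts
    stakeholders: list = field(default_factory=list)     # Step 3: identify stakeholders
    alternatives: list = field(default_factory=list)     # Step 4: consider alternatives
    evaluation: dict = field(default_factory=dict)       # Step 5: option -> assessment
    decision: str = ""                                    # Step 6: make a decision
    reflection: str = ""                                  # Step 7: filled in after the outcome
    recorded_on: date = field(default_factory=date.today)

record = EthicalDecisionRecord(
    issue="Pressure to launch before the security audit is complete",
    facts=["Audit roughly 60% complete", "Known critical vulnerabilities unfixed"],
    stakeholders=["Users", "Company", "Regulators"],
    alternatives=["Delay launch", "Limited closed beta", "Launch as scheduled"],
    evaluation={"Limited closed beta": "Reduces user exposure while the audit finishes"},
    decision="Recommend a limited closed beta until the audit and fixes are complete",
)
print(record.recorded_on, "-", record.issue, "->", record.decision)
```

The example values anticipate Scenario 3 in Section 8; the point is that each step leaves a written trace that can be revisited (Step 7) once the outcome is known.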
6.2 Ethical Tests and Questions
Before finalizing an ethical decision, apply these additional tests:
- The Publicity Test: Would I be comfortable if this decision was published on the front page of a newspaper?
- The Generalization Test: What if everyone in similar circumstances acted this way?
- The Reversibility Test: Would I think this choice was fair if I were among those adversely affected?
- The Professional Code Test: Does this align with ACM/IEEE codes of ethics?
- The Harm Test: Does this minimize harm to all stakeholders?
- The Rights Test: Does this respect the rights of all individuals?
- The Justice Test: Does this distribute benefits and burdens fairly?
6.3 When Ethical Principles Conflict
Often, ethical principles come into conflict. For example, privacy may conflict with security, or individual rights may conflict with public welfare. When this happens:
- Prioritize fundamental rights: Life, safety, and basic human rights generally take precedence
- Consider proportionality: The severity of harm should be proportional to the benefits
- Seek creative solutions: Often conflicts can be resolved through innovative technical solutions
- Ensure transparency: When trade-offs are made, they should be acknowledged and explained
- Provide opt-outs: When possible, allow individuals to make their own choices about trade-offs
7. Professional Responsibility and Whistleblowing
7.1 Scope of Professional Responsibility
Computer engineers have responsibilities to multiple stakeholders:
- To the Public: Prioritize public safety, health, and welfare; consider societal impacts
- To Clients and Employers: Act as faithful agents; protect confidential information; deliver quality work
- To the Profession: Maintain integrity; support colleagues; advance the field’s reputation
- To Colleagues: Provide fair treatment; support professional development; give credit appropriately
- To Yourself: Maintain competence; work within your expertise; practice self-care to avoid burnout that compromises judgment
7.2 When to Consider Whistleblowing
Whistleblowing—disclosing organizational wrongdoing to authorities or the public—is an extreme measure with serious consequences. It should be considered when:
- There is clear evidence of serious harm to the public
- Internal reporting channels have been exhausted or are unavailable
- The organization is violating laws or ethical standards
- You have documented evidence, not just suspicions
- The harm is imminent or ongoing
7.3 Responsible Whistleblowing Process
- Document everything: Keep detailed records of evidence, communications, and concerns
- Exhaust internal channels first: Report to supervisors, ethics officers, or ombudspersons
- Consult professional organizations: Seek guidance from ACM, IEEE, or professional societies
- Understand legal protections: Research whistleblower protection laws in your jurisdiction
- Consider anonymous reporting: If safe to do so, use anonymous reporting mechanisms
- Seek legal counsel: Consult an attorney before making external disclosures
- Be prepared for consequences: Whistleblowers often face retaliation despite legal protections
7.4 Whistleblowing Examples in Computing
Frances Haugen (Facebook, 2021): Former Facebook product manager who disclosed internal research showing Facebook knew Instagram harmed teen mental health, prioritized engagement over safety, and was used to incite violence. Her testimony led to increased regulatory scrutiny.
Edward Snowden (NSA, 2013): Disclosed classified information about mass surveillance programs. While controversial and resulting in criminal charges, his revelations sparked global debates about privacy, government surveillance, and the limits of state power.
Susan Fowler (Uber, 2017): Published a blog post detailing sexual harassment and discrimination at Uber. Her disclosure led to investigations, executive departures, and significant cultural changes in the tech industry regarding workplace conduct.
8. Practice Scenarios and Discussion Questions
Scenario 1: The Backdoor Request
Situation: You work for a messaging app company that emphasizes end-to-end encryption and user privacy. A government law enforcement agency approaches your company with a court order requesting that you build a backdoor into your encryption system to help them monitor suspected terrorists. Your manager asks you to implement this feature.
Discussion Questions:
- What are the competing ethical principles at stake (security vs. privacy, legal compliance vs. user trust)?
- Who are all the stakeholders, and how would they be affected?
- What technical alternatives might exist that could balance law enforcement needs with user privacy?
- What would the ACM/IEEE codes of ethics suggest you do?
- If you refused to implement this feature, what steps should you take?
- How would you apply the seven-step ethical decision-making process to this situation?
Scenario 2: The Biased Algorithm
Situation: Your team has developed a resume screening algorithm for a major recruitment platform used throughout the Philippines. During testing, you notice that the algorithm systematically ranks female candidates lower than male candidates with identical qualifications. When you raise this concern, your project manager says, “The algorithm is just reflecting real-world data patterns. If we change it, we’ll reduce overall accuracy, and our client won’t be happy.”
Discussion Questions:
- Is it ethical to deploy an algorithm that reflects historical biases, even if it’s “accurate” to training data?
- What are your obligations under the ACM Code of Ethics principle to “be fair and take action not to discriminate”?
- What technical approaches could address this bias while maintaining performance?
- How should you balance client satisfaction against ethical imperatives?
- If management insists on deploying the biased algorithm, what are your options?
- What documentation should you create to protect yourself and inform stakeholders?
Scenario 3: The Unfinished Security Audit
Situation: You’re a security engineer at a fintech startup in Manila that’s about to launch a mobile banking app. The CEO wants to launch next week to beat a competitor to market. However, your security audit is only 60% complete, and you’ve already found several critical vulnerabilities that haven’t been fixed. The CEO says, “We can patch security issues after launch. If we delay, we might lose our funding round.”
Discussion Questions:
- What are the potential harms of launching with known security vulnerabilities?
- How do you balance business pressures against your professional responsibility to users?
- What does the IEEE principle to “hold paramount the safety, health, and welfare of the public” require in this situation?
- What alternatives could you propose to the CEO?
- If the CEO overrides your objections, what should you do? Document? Resign? Report to regulators?
- What legal liabilities might you personally face if the app is hacked and user data is stolen?
Scenario 4: The Carbon Footprint Dilemma
Situation: Your company is developing a new cryptocurrency mining operation in the Philippines, taking advantage of relatively low electricity costs. You calculate that the operation will consume as much electricity as a small city and significantly increase carbon emissions in a country already vulnerable to climate change. However, the project will create jobs and bring investment to a poor region.
Discussion Questions:
- How do you weigh economic benefits against environmental harm?
- What responsibility do engineers have for the environmental impacts of their work?
- Are there technical solutions that could reduce the environmental impact?
- Should engineers refuse to work on projects they believe are environmentally harmful?
- How does the ACM principle to “contribute to society and human well-being” apply here?
- What role should considerations of climate justice play in engineering decisions?
Scenario 5: The Data Retention Question
Situation: You work for a social media company that collects detailed location data from users’ phones. The company’s current policy is to retain this location history indefinitely for “potential future analytics and personalization features.” You discover that this data could be subpoenaed in legal proceedings or accessed by hackers, potentially revealing sensitive information about users’ movements, including visits to medical clinics, political protests, or religious sites.
Discussion Questions:
- What are the privacy risks of indefinite data retention?
- How do you balance potential business value against privacy risks?
- What would a “privacy by design” approach suggest?
- Should you advocate for automated data deletion policies? After how long?
- How does the ACM principle to “respect privacy” inform this decision?
- What technical measures could minimize risk while retaining some business value? (A retention-policy sketch follows these questions.)
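One technical measure suggested by the last two questions is an automated retention policy: location records are deleted once they exceed a fixed age, so the data that could be breached or subpoenaed stays bounded. The sketch below is a minimal illustration; the 90-day window and record fields are assumptions rather than a recommended standard, and any real policy needs legal review under laws such as the Data Privacy Act of 2012 and the GDPR.

```python
# Minimal data-retention sketch: drop location records older than a cutoff.
# The 90-day window and record fields are assumptions for illustration;
# real policies need legal review and a documented business justification.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)

def enforce_retention(records, now=None):
    """Keep only records newer than the retention window.
    records: iterable of dicts with a 'timestamp' (timezone-aware datetime) key."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - RETENTION
    return [r for r in records if r["timestamp"] >= cutoff]

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    records = [
        {"user_id": "u1", "lat": 14.6, "lon": 121.0, "timestamp": now - timedelta(days=5)},
        {"user_id": "u1", "lat": 14.6, "lon": 121.0, "timestamp": now - timedelta(days=400)},
    ]
    kept = enforce_retention(records, now)
    print(len(kept), "of", len(records), "records retained")  # -> 1 of 2
```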
9. Conclusion: Developing Ethical Professional Identity
Ethics in computer engineering is not about memorizing rules or codes. It’s about developing the moral character, critical thinking skills, and courage to make sound decisions when facing complex dilemmas with no clear answers.
As you begin your careers as computer engineers in the Philippines and beyond, remember that every line of code you write, every system you design, and every technical decision you make has ethical dimensions. Your work will shape the digital infrastructure of society for decades to come. The algorithms you create may determine who gets loans, jobs, or medical treatment. The security measures you implement may protect or expose vulnerable populations. The data systems you build may empower or exploit users.
Key Takeaways for Professional Practice:
- Ethics is not optional: It’s a fundamental part of professional competence, just like technical skills
- Codes of ethics provide guidance: Familiarize yourself with ACM/IEEE codes and revisit them throughout your career
- Anticipate ethical issues: Don’t wait for dilemmas to arise; proactively consider ethical implications in design
- Speak up early: Ethical problems are easier to address early in development than after deployment
- Document your concerns: Keep records of ethical issues and your responses to them
- Build ethical culture: Support colleagues who raise ethical concerns; create environments where ethics can be discussed openly
- Continue learning: Technology evolves rapidly, bringing new ethical challenges; commit to lifelong ethical learning
- Remember the human impact: Behind every data point is a person; behind every system are communities affected by your decisions
The opening quote from this module, “No good should humble themselves in the presence of Evil,” reminds us that engineers have a moral obligation to stand firm in defense of ethical principles, even when facing pressure from employers, clients, or market forces. Your professional ethics are not negotiable. The trust society places in engineers depends on our collective commitment to placing public welfare above private gain, safety above speed, and integrity above expediency.
As you move forward in your studies and careers, cultivate not just technical excellence but moral excellence. Develop the wisdom to see ethical issues, the courage to address them, and the integrity to do what’s right even when it’s difficult. The future of technology—and the society it shapes—depends on it.
10. References
Professional Codes and Standards:
- ACM, “ACM Code of Ethics and Professional Conduct,” Association for Computing Machinery, 2018. [Online]. Available: https://www.acm.org/code-of-ethics
- IEEE, “IEEE Code of Ethics,” Institute of Electrical and Electronics Engineers, 2020. [Online]. Available: https://www.ieee.org/about/corporate/governance/p7-8.html
- Software Engineering Code of Ethics and Professional Practice, ACM/IEEE-CS Joint Task Force, Version 5.2, 1999.
Books and Monographs:
- M. J. Quinn, Ethics for the Information Age, 8th ed. Boston, MA: Pearson, 2020.
- S. Vallor, Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting. New York, NY: Oxford University Press, 2016.
- C. E. Harris, M. S. Pritchard, M. J. Rabins, R. James, and E. Englehardt, Engineering Ethics: Concepts and Cases, 6th ed. Boston, MA: Cengage Learning, 2019.
- D. G. Johnson, Computer Ethics, 4th ed. Upper Saddle River, NJ: Prentice Hall, 2009.
Journal Articles and Conference Papers:
- B. Friedman and H. Nissenbaum, “Bias in computer systems,” ACM Trans. Inf. Syst., vol. 14, no. 3, pp. 330-347, Jul. 1996, doi: 10.1145/230538.230561.
- N. Leveson and C. S. Turner, “An investigation of the Therac-25 accidents,” Computer, vol. 26, no. 7, pp. 18-41, Jul. 1993, doi: 10.1109/MC.1993.274940.
- D. Gotterbarn, K. Miller, and S. Rogerson, “Software engineering code of ethics,” Commun. ACM, vol. 40, no. 11, pp. 110-118, Nov. 1997, doi: 10.1145/265684.265699.
Case Studies and Reports:
- C. Cadwalladr and E. Graham-Harrison, “Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach,” The Guardian, Mar. 17, 2018.
- J. Ewing, “Volkswagen Says 11 Million Cars Worldwide Are Affected in Diesel Deception,” The New York Times, Sep. 22, 2015.
- House Committee on Transportation and Infrastructure, “Final Committee Report: The Boeing 737 MAX Aircraft,” 116th Congress, Sep. 2020.
- J. Dastin, “Amazon scraps secret AI recruiting tool that showed bias against women,” Reuters, Oct. 10, 2018.
- Z. Obermeyer et al., “Dissecting racial bias in an algorithm used to manage the health of populations,” Science, vol. 366, no. 6464, pp. 447-453, Oct. 2019, doi: 10.1126/science.aax2342.
Additional Resources:
- Online Ethics Center for Engineering and Science, National Academy of Engineering. [Online]. Available: https://onlineethics.org
- Markkula Center for Applied Ethics, Santa Clara University. [Online]. Available: https://www.scu.edu/ethics/
- Republic Act No. 10173, “Data Privacy Act of 2012,” Republic of the Philippines, 2012.
- European Parliament, “General Data Protection Regulation (GDPR),” Regulation (EU) 2016/679, Apr. 2016.
- D. C. Vladeck, “Machines without principals: Liability rules and artificial intelligence,” Washington Law Review, vol. 89, no. 1, pp. 117-150, 2014.
This educational module was prepared for computer engineering students in the Philippines. For questions, discussions, or case study contributions, please consult your course instructor or professional ethics advisor.
Last updated: November 2025
