Ethics refers to the principles of right and wrong that guide human behaviour. In computing, ethical issues arise when the use of technology affects people, society, or the environment in ways that may be harmful, unfair, or controversial.
As computer scientists, you need to understand how technology can be used responsibly and the ethical dilemmas that arise from its use.
An ethical issue in computing is a situation where the use of technology raises questions about what is morally right or wrong. Unlike legal issues (which are defined by law), ethical issues are often matters of personal judgement and debate — reasonable people may disagree.
Examples of ethical questions:
Technology makes it easy to collect, store, and analyse vast amounts of personal data. Every website visit, social media post, online purchase, and smartphone location creates a digital footprint.
The ethical question: How much personal data should organisations be allowed to collect, and how should it be used?
The internet enables free expression, but it also allows the spread of misinformation, hate speech, and harmful content.
Digital content (music, films, software, images) can be copied and distributed instantly at almost no cost. This raises questions about piracy, fair payment for creators, and how far copyright law should reach in the digital age.
Technology companies design products to be as engaging (some would say addictive) as possible, using techniques such as infinite scroll, autoplay, and push notifications to keep users on the platform.
As AI and robotics advance, many jobs may be automated, from driving and manufacturing to customer service roles. This raises questions about who benefits from automation and what happens to displaced workers.
An ethical analysis should always consider who is affected — the stakeholders:
| Stakeholder | Examples |
|---|---|
| Individuals | Users, employees, customers |
| Companies | Tech firms, employers, developers |
| Society | Communities, vulnerable groups, future generations |
| Government | Lawmakers, regulators, public services |
Exam Tip: When answering ethics questions, always consider multiple perspectives. A good answer discusses the benefits AND drawbacks and considers the impact on different stakeholders. Avoid one-sided arguments.
```mermaid
flowchart TD
    Q["Ethical Question:<br/>Is this use of tech right?"]
    Q --> ID[Identify the Issue]
    ID --> I1[Privacy]
    ID --> I2[Censorship]
    ID --> I3[Intellectual Property]
    ID --> I4[Digital Addiction]
    ID --> I5[Automation / Jobs]
    Q --> SH{"Who are the<br/>Stakeholders?"}
    SH --> SH1[Individuals]
    SH --> SH2[Companies]
    SH --> SH3[Society]
    SH --> SH4[Government]
    Q --> FW{Apply Framework}
    FW --> FW1["Utilitarian:<br/>Greatest Good"]
    FW --> FW2["Rights-based:<br/>Protect Rights"]
    FW --> FW3["Duty-based:<br/>Inherently Right or Wrong"]
    FW1 --> J[Balanced Judgement]
    FW2 --> J
    FW3 --> J
```
There are different ways to think about ethical problems:
| Framework | Key Idea |
|---|---|
| Utilitarian | The right action produces the greatest good for the greatest number |
| Rights-based | Every individual has fundamental rights that should not be violated |
| Duty-based | Some actions are inherently right or wrong, regardless of outcomes |
At GCSE, you do not need to name these frameworks, but showing that you can argue from different perspectives will improve your answers.
Organisations like the British Computer Society (BCS) and the Association for Computing Machinery (ACM) publish codes of conduct for IT professionals. Key principles include acting in the public interest, maintaining professional competence and integrity, and accepting accountability for the consequences of one's work.
Ethical debate does not take place in a vacuum. In the UK, a tightly interlocking suite of statutes governs how personal information, creative works, and computer systems may be used. A competent GCSE Computer Science candidate must be able to name the legislation, summarise its effect, and discuss the ethical tension it attempts to resolve.
The Data Protection Act 2018 (DPA 2018) enacts the UK GDPR (the retained version of the EU General Data Protection Regulation following the UK's departure from the European Union). Together they set out seven data protection principles: lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; integrity and confidentiality; and accountability. Every organisation that processes personal data — whether a multinational, a small business, or a local sports club — must demonstrate a lawful basis for doing so. The six lawful bases are consent, contract, legal obligation, vital interests, public task, and legitimate interests. Special category data (health, race, religion, sexual orientation, biometric identifiers, political opinion, trade union membership, genetic data) attracts a higher bar: processing requires explicit consent or another specific condition under Article 9.
A data controller decides the purposes and means of processing; a data processor acts on the controller's instructions. A school, for example, is a data controller for its pupil records, while a cloud supplier such as Google or Microsoft acts as its processor. Both share legal duties, but the controller carries the primary liability. Data subjects — you and I — hold enforceable rights: the right of access (the Subject Access Request), rectification, erasure ("right to be forgotten"), restriction, portability, objection, and rights relating to automated decision-making.
The Information Commissioner's Office (ICO) enforces the regime. Maximum fines reach £17.5 million or 4% of global annual turnover, whichever is higher. Real UK cases illustrate the scale: the ICO fined British Airways £20 million in 2020 after a Magecart-style skimming attack compromised 400,000 customer records; Marriott International was fined £18.4 million for a long-running breach of Starwood guest data; and TikTok was fined £12.7 million in 2023 for misuse of children's data. Ethical discussion around the DPA typically focuses on whether consent is ever truly informed when privacy notices are tens of thousands of words long, and whether data minimisation is compatible with the data-hungry business models of social media platforms.
The Computer Misuse Act 1990 (CMA) criminalises unauthorised interaction with computer systems. Section 1 covers unauthorised access (maximum two years). Section 2 covers unauthorised access with intent to commit a further offence (maximum five years). Section 3 covers unauthorised acts intended to impair the operation of a computer — covering malware, ransomware and denial-of-service attacks (maximum ten years). Section 3ZA, added in 2015, creates an offence of unauthorised acts causing or creating risk of serious damage, carrying a potential life sentence where national infrastructure is at stake. Section 3A (added in 2006) criminalises the making, supply, or obtaining of articles — for example exploit kits or stolen password lists — intended for use in CMA offences.
The Copyright, Designs and Patents Act 1988 (CDPA) gives automatic protection to original literary, dramatic, musical, and artistic works, including software source code and databases. No registration is needed; protection generally lasts for the life of the author plus 70 years. Proprietary licences reserve rights exclusively; open-source licences such as the GNU General Public Licence (GPL) permit copying and modification on condition of source-code distribution; Creative Commons (CC) licences allow granular sharing on terms such as BY, SA, NC, and ND.
The Freedom of Information Act 2000 (FOIA) gives any person the right to request information held by UK public authorities, subject to specific exemptions. The Investigatory Powers Act 2016 — sometimes called the "Snooper's Charter" — governs bulk interception, equipment interference, and the retention of internet connection records by communications service providers. These regimes set the legal boundary within which ethical arguments about transparency and surveillance must operate.
Exam-style question: "Modern technology creates ethical problems that cannot be solved by legislation alone." Discuss this statement with reference to at least two ethical issues and relevant UK law. [9 marks]
Grades 3-4 response. Technology can cause problems. Some companies take too much data and this is not fair. There is a law called the Data Protection Act that tries to stop this. People can also hack computers which is bad and against the Computer Misuse Act. Laws help but people still break them, so ethics matters as well.
Grades 5-6 response. Modern technology raises ethical problems that legislation tries to address. The Data Protection Act 2018 and UK GDPR require organisations to have a lawful basis for processing personal data and to follow principles such as data minimisation and storage limitation. However, users often give consent without reading privacy notices, so ethical responsibility remains with the data controller. The Computer Misuse Act 1990 criminalises unauthorised access, but international cybercriminals are hard to prosecute. Laws therefore set minimum standards but ethical behaviour by designers and users is still needed.
Grades 7-9 response. Legislation provides a necessary but insufficient framework. The Data Protection Act 2018, incorporating the UK GDPR, binds data controllers to seven principles and requires a lawful basis — for example informed consent or legitimate interests — before processing. Yet the "consent fatigue" identified by the ICO demonstrates that formal compliance does not equal ethical practice: users click through privacy notices they have not read, and platforms exploit the gap through dark patterns. Similarly, the Computer Misuse Act 1990 prohibits unauthorised access under section 1, but the TalkTalk breach (2015) — prosecuted under sections 1 and 3 — shows that criminalisation does not deter harm at scale. Environmental issues such as e-waste and datacentre carbon emissions are regulated only indirectly through the WEEE Regulations 2013 and general environmental law, leaving a large ethical residue. A rounded judgement is therefore that statute sets a floor, but professional codes (such as the BCS Code of Conduct), design ethics, and informed consumer choice must do the remaining work.
AQA alignment: This content is aligned with AQA GCSE Computer Science (8525) specification — specifically section 3.8 Ethical, legal and environmental impacts of digital technology on wider society (data protection, computer misuse, copyright, ethics, environment, culture). Assessed on Paper 2.