Software Rules the World—So Who Sets the Rules?
Why awareness and transparency are crucial for responsible software design.
As we have highlighted in our new software engineering book, any useful software affects a portion of the real world, thereby regulating some of its aspects. This raises several ethical questions, for instance: do the defined rules align with users’ expectations? To what extent are they arbitrarily decided by software designers rather than required by the external context? Are they acceptable for all stakeholders? Are they sustainable in the long run?
Ghezzi and co-authors recently published an interesting position paper on this subject [1]. They highlight that the ethical and societal consequences of software design decisions can no longer be treated as secondary concerns. More specifically, responsible design requires that software designers explicitly articulate and take responsibility for the intent behind the systems they create. Unlike natural objects, software artifacts are created deliberately; therefore, their purposes, assumptions, and anticipated effects must be made explicit, inspectable, and open to scrutiny. Responsible design demands awareness of the broader human and societal context in which systems operate and transparency about design decisions, trade-offs, and consequences.
The paper argues that such awareness can be achieved only by systematically integrating, within software engineering, not only knowledge from the software’s application domain but also perspectives from the humanities, social sciences, ethics, and law. Responsibility is collective, shared among all stakeholders involved in the design, deployment, regulation, and use of software.
Ethics is central to this process, not as an external checklist (as is currently required by many public and private bodies), but as an integral part of design deliberation. Ethical questions guide the assessment of what outcomes are desirable, acceptable, or harmful. Notably, responsibility extends beyond designers to decision-makers and users: systems should support informed consent, clarify accountability, and communicate how user actions translate into system behavior and consequences.
Awareness is a first step, but responsible design cannot be achieved without design transparency. Unfortunately, software complexity has long challenged transparency, and the incorporation of AI intensifies this opacity. Coping with this increased complexity requires significant research and practical effort to extend modeling, specification, and assurance techniques so that system intents, properties, and guarantees are understandable and verifiable by all stakeholders, including regulators and end users.
How to incorporate these aspects into the software design and execution processes remains unclear, but it should become a central challenge for future software engineering. Any oversimplification of this matter, often motivated by the need for technology to evolve unconstrained to promote human progress, is extremely risky and should be kept in check. Europe, frequently accused of adopting a conservative and overly restrictive stance on these issues, should seek an appropriate balance among competing tensions and strive to maintain its leading role in this context. The solution is neither clear nor straightforward, but it is essential to invest in three key directions:
Increase awareness of these critical issues.
Seek consensus and reasonable compromises on a case-by-case basis.
Generalize good practices from individual cases.
Although this approach may be perceived as slow or overly simplistic, it is the most reasonable and practical way to proceed at a stage where we still need to deepen our understanding of a complex and critical issue.
[1] Ghezzi, C., Ebrahimi, M., Isovic, D., Sirjani, M. (2026). Breaking Disciplinary Silos: The Case of Software Engineering. In: Hagedorn, L., Schmid, U., Winter, S., Woltran, S. (eds.) Digital Humanism. DIGHUM 2025. Lecture Notes in Computer Science, vol. 16319. Springer, Cham. https://doi.org/10.1007/978-3-032-11108-1_32

