Accountable Software in the Age of AI-Generated Code
AI systems can now generate large amounts of working software in seconds.
Yet when systems fail, the machine does not join the incident call.
The engineers do.
Responsibility for software systems remains human, even as machines increasingly participate in writing the implementation.
If code can now be produced faster than engineers can realistically inspect it, an important question emerges:
What exactly should humans review?
This book explores how software engineering might adapt to this shift. It examines the possibility that accountability moves away from raw implementation toward clearer, reviewable descriptions of system behaviour.
Concepts such as behavioural contracts and the Oracle are introduced as possible ways to preserve human responsibility in a world where machines write more and more of the code.
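To make the idea concrete, here is one plausible rendering of a behavioural contract, purely as an illustration: an executable, human-reviewed description of what a function must do, against which a machine-generated implementation can be checked. All names here (`sort_contract`, `ai_sort`) are hypothetical, and the book's own definition of a behavioural contract may differ.

```python
from collections import Counter

def sort_contract(xs: list[int], result: list[int]) -> bool:
    """Human-reviewed contract for a sorting routine:
    the output must contain exactly the input's elements,
    arranged in ascending order."""
    same_elements = Counter(xs) == Counter(result)
    ascending = all(a <= b for a, b in zip(result, result[1:]))
    return same_elements and ascending

# A possibly AI-generated implementation (hypothetical stand-in)...
def ai_sort(xs: list[int]) -> list[int]:
    return sorted(xs)

# ...is accepted only if it satisfies the contract the humans reviewed.
data = [3, 1, 2]
assert sort_contract(data, ai_sort(data))
```

On this reading, the engineers' review effort concentrates on `sort_contract`, a few lines describing behaviour, rather than on every line of the generated implementation.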