Picture, on one side, the heavy machinery of the Canadian federal administration. In 2016, the government rolls out "Phoenix", a system meant to modernise payroll for some 290,000 civil servants. The intention is technocratic and financial: centralise the systems, eliminate local compensation advisors — over a thousand positions cut — and save $70 million a year. The story becomes a national nightmare. The launch goes ahead despite known and flagged critical flaws. Thousands of civil servants go unpaid for months; some lose their homes or drain their savings, while others are overpaid and must reimburse amounts they no longer have. In 2019, Canada's Parliamentary Budget Officer estimated the total cost of the failure at $2.6 billion — a figure that has continued to rise since. This is not a story about a software bug. It is a story about men and women terrified of saying "No" to their hierarchy.
On the other side, step into the offices of Etsy, the online artisan marketplace, during its hypergrowth phase. The environment is technically unstable: engineers modify the site's live code between 30 and 100 times a day. The risk of "breaking" the shop is ever-present. Yet when a developer makes an error that takes the site offline, they are not called in for a reprimand. They may instead receive the "Three-Armed Sweater Award", presented each year for the most spectacular, and most instructive, mistake. This is the story of a company that decided error was not a moral failing, but a valuable data point to capture.
This face-off explores how fear can paralyse a technically competent organisation, while psychological safety can transform human fallibility into a driver of performance.
The Phoenix project aimed to replace ageing payroll systems with a centralised solution based on Oracle PeopleSoft commercial software, developed and deployed by IBM. The rollout ended in massive operational failure. The damning reports of Canada's Auditor General — published in 2018 — concluded it was "an incomprehensible failure of project management and oversight". The root cause identified was not technical: it was a management culture where middle managers did not dare escalate bad news — failed tests, untenable deadlines — to their superiors, for fear of reprisals or being seen as "blockers" of progress. Dashboard reports presented to ministers showed green while operational indicators were turning deep red.
The R6 analysis identifies a severe pathology along the posture and coordination axes:
Strategic level (S). The strategy rested on an aggressive logic of production and centralisation (S3a) — "do more with less" — disconnected from the reality on the ground. Decision-makers ignored warning signals to protect the political timetable, creating a distortion of reality. This is a failure of the lucidity posture (S1a): the organisation preferred the coherence of its plan to the truth of the facts.
Organisational level (O). The system suffered from a critical breakdown of informational coherence (O1a). The primary function of O1a is to ensure reliable upward flow of information to guarantee decisional clarity. Here, governance inadvertently installed a filter of façade compliance. Furthermore, by eliminating local compensation experts before the system was functional, the government destroyed the cooperation capacity (O2b) needed to manage the transition — removing precisely those who could have corrected errors in real time.
Individual level (I). In this climate, the accountability competency (I1a) was perverted. Instead of exercising ethical responsibility — flagging danger for the end user — individuals adopted a survival posture: follow orders. Fear inhibited the exercise of genuine accountability. The system transformed competent professionals into silent executors, unable to activate the alert loops.
Etsy operates in a DevOps environment, where development and operations are merged for maximum speed. To manage this structural risk, the company institutionalised "Just Culture" and the practice of the "Blameless Post-Mortem". When an incident occurs, the objective is never to find "who" made the error, but "how" the system allowed it to happen. If an engineer accidentally deletes a database, the question is not "why were you careless?", but "why did the tool make it so easy for you to do that?" The Three-Armed Sweater Award — an actual sweater knitted by an Etsy artisan, hung on the office wall — goes each year to the most spectacular error, not the most harmful. It is the most surprising accidents, those that reveal the widest and most instructive gap between what was expected and what actually happened, that are recognised.
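To make the question "why did the tool make it so easy?" concrete, here is a minimal sketch, in Python, of the kind of guardrail a blameless post-mortem typically produces. Everything in it is hypothetical (the function name, the confirmation convention); it is not Etsy's actual tooling, only an illustration of the systemic reflex: make the careless call harmless and the destructive call deliberate.

```python
# Hypothetical guardrail of the kind a blameless post-mortem produces.
# Not Etsy's actual tooling; an illustration of "fix the tool, not the person".

class ConfirmationError(Exception):
    """Raised when a destructive action lacks explicit confirmation."""

def drop_database(name: str, *, confirm: str = "", dry_run: bool = True) -> None:
    """Delete a database, making the dangerous path the hard one.

    dry_run defaults to True, so the careless call is harmless; to execute,
    the caller must retype the database name, proving deliberate intent.
    """
    if dry_run:
        print(f"[dry-run] would drop database {name!r}; pass dry_run=False to execute")
        return
    if confirm != name:
        raise ConfirmationError(
            f"refusing to drop {name!r}: pass confirm={name!r} to confirm intent"
        )
    print(f"dropping database {name!r}")  # the real deletion would happen here

drop_database("orders")                                   # safe by default: dry run
drop_database("orders", confirm="orders", dry_run=False)  # explicit, auditable intent
```

The lines that matter are the defaults: the output of the investigation is a change to the system, not a sanction for the engineer.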
The R6 analysis highlights exceptional mastery of learning loops along the posture axis:
Organisational level (O). Etsy uses incidents as the primary fuel for its transformation capacity (O1b). Every error triggers a formal systemic investigation that always results in an improvement to a process, a tool, or documentation. The organisation does not pursue stability by prohibiting error — a rigid and illusory approach — but through adaptive resilience. The Post-Mortem is a mechanism of continuous transformation: it converts an individual failure into a collective asset (a minimal sketch of such a record follows this analysis).
Individual level (I). This culture reconfigures the accountability competency (I1a). At Etsy, being accountable does not mean "never making a mistake", but "reporting your error immediately, without fear, and actively participating in its analysis". This psychological safety drastically accelerates resolution time. It also liberates the innovation capacity (I1b): engineers dare to test new approaches because they know the system will support them when something goes wrong.
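What does "converting an individual failure into a collective asset" look like as an artefact? Below is a minimal sketch, assuming a simple in-house incident record; all field names are illustrative, not the schema of any real tracker. Note the deliberate design choice: there are fields for systemic contributing factors and remediation actions, and no field for a culprit.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Illustrative blameless post-mortem record; field names are assumptions,
# not the schema of any real incident tracker.

@dataclass
class PostMortem:
    title: str
    occurred_at: datetime
    impact: str       # what users actually experienced
    detection: str    # how the incident was noticed
    timeline: list[str] = field(default_factory=list)
    contributing_factors: list[str] = field(default_factory=list)  # systemic "hows", never names
    remediations: list[str] = field(default_factory=list)          # changes to tools, process, docs
    # Deliberately absent: any "who caused it" field.

pm = PostMortem(
    title="Checkout outage after a schema migration",
    occurred_at=datetime(2024, 5, 3, 14, 12),
    impact="Checkout unavailable for 11 minutes",
    detection="Alert on the checkout error rate",
    contributing_factors=[
        "The migration tool targeted production by default",
        "No staging run was required before deploying",
    ],
    remediations=[
        "Make dry-run the migration tool's default",
        "Add a mandatory staging run to the deploy checklist",
    ],
)
print(f"{pm.title}: {len(pm.remediations)} system changes captured")
```

Each remediation feeds the transformation capacity (O1b); the record itself, searchable by the whole organisation, is the collective asset.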
The comparison between Phoenix and Etsy illustrates two opposing philosophies when facing complexity.
In the Phoenix case, the organisation attempted to guarantee control through hierarchical constraint and fear of failure. By seeking to suppress human error through authority, it ended up suppressing the information about the error. It is a system that smothers the signal. The cost of this rigidity was astronomical — billions of dollars, because problems were only addressed once they had become catastrophic and impossible to conceal. Silence is expensive.
Etsy, by contrast, built a system that amplifies the signal. By accepting human error as an inevitable property of any complex system, the company transformed a cost (the outage) into an investment (the learning). The "freedom to fail" is not permissiveness: it is a rigorous method for increasing the technical robustness of the system. Where Phoenix failed through silence and denial, Etsy succeeds through liberated speech and radical clarity.
Psychological safety is not a benevolent soft skill designed to please teams — it is a technical sine qua non of performance and financial security. A system where truth does not travel upward is a system flying blind, and it will invariably hit a wall.
The lesson is powerful and immediate: if you want technical reliability, you must build a human infrastructure of trust. Stop asking "who is to blame?" — which closes communication — and start asking "what does this incident teach us about our system?" — which opens it. Transform fear into curiosity, and you will move from an organisation that endures its risks to one that masters them.