Most safety observation programs look successful.
Thousands of observations logged.
Participation targets exceeded.
Dashboards trending upward.
Yet serious incidents still happen.
The same hazards repeat.
Supervisors still rely on instinct over insight.
Why? Because many observation programs are designed to generate activity, not intelligence.
They create noise.
Where we went wrong
Heinrich, and later Bird, shaped decades of safety thinking with the idea that serious incidents sit at the top of a pyramid built on minor events and unsafe acts.
Many organisations simplified that into an operational rule:
Increase observations → reduce serious incidents.
So volume became the goal.
More cards.
More walkabouts.
More behavioural observations.
More targets.
But major incidents rarely originate from the same causes as minor housekeeping issues.
They emerge from weak controls, degraded safeguards, poor planning, production pressure, and supervision gaps.
And the more you push for volume, the more your system drifts toward the easiest signals to collect.
Not the most important ones.
What observation targets actually produce
Once observation becomes a KPI, behaviour changes.
People report what is:
- Easy to spot.
- Quick to record.
- Low risk to raise.
- Unlikely to create tension.
So you get endless PPE reminders and housekeeping notes.
Important? Yes.
But the uncomfortable issues are harder to capture:
- Work starting without proper isolation.
- Shortcuts becoming normal practice.
- Pressure overriding permits.
- Supervisors stretched too thin.
- Risk assessments copied and reused.
- Maintenance delayed until failure.
These are systemic conditions.
They don’t fit neatly into a tick-box behavioural checklist.
So they disappear from the dataset.
Leadership sees activity.
The real risk picture stays blurred.
The safe behaviour illusion
Many programs reward “safe acts” because they feel positive and measurable.
But the more you measure reassurance, the less you capture reality.
A dashboard full of positive observations does not mean controls are strong.
It often means the system has become comfortable.
Here’s the irony: digital can make this worse
When organisations digitise observation cards without redesigning the system, they often accelerate the problem.
Digital makes it faster to submit low-value observations.
So you end up with more volume, more noise, and more dashboards.
But digital observations done properly can do something paper never could.
They can turn observation programs into a real prevention engine.
What digital observations should actually enable
The best digital safety observation systems shift the purpose of observation from reporting behaviour to capturing weak signals early.
Not by adding more forms.
By reducing friction and improving decision quality.
A well-designed digital observation approach can:
- Make reporting quick enough to happen in the moment.
- Capture photos and context, not vague descriptions.
- Identify repeat hazards across sites automatically.
- Show whether issues are being closed out properly.
- Highlight trends in controls degrading, not just behaviours drifting.
- Reveal where pressure is building operationally.
In other words, digital makes it possible to connect frontline signals into one consolidated risk picture.
That is where insight comes from.
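To make the idea concrete, here is a minimal sketch of what "identify repeat hazards across sites" and "show whether issues are closed out" could look like on structured observation data. The `Observation` shape and field names are hypothetical, not any particular product's data model.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    site: str
    hazard: str   # short hazard category, e.g. "isolation-not-verified"
    closed: bool  # whether the corrective action was completed

def repeat_hazards(observations, min_sites=2):
    """Return hazard categories reported at min_sites or more sites.

    A hazard appearing independently across sites is a systemic
    signal, not a local housekeeping issue.
    """
    sites_per_hazard = {}
    for obs in observations:
        sites_per_hazard.setdefault(obs.hazard, set()).add(obs.site)
    return {h: sorted(s) for h, s in sites_per_hazard.items()
            if len(s) >= min_sites}

def open_rate(observations):
    """Fraction of observations not yet closed out: a loop-closure metric."""
    if not observations:
        return 0.0
    return sum(1 for o in observations if not o.closed) / len(observations)

obs = [
    Observation("Site A", "isolation-not-verified", closed=False),
    Observation("Site B", "isolation-not-verified", closed=True),
    Observation("Site A", "housekeeping", closed=True),
]
print(repeat_hazards(obs))  # {'isolation-not-verified': ['Site A', 'Site B']}
print(round(open_rate(obs), 2))  # 0.33
```

The point is not the code. It is that once observations carry structure (site, category, closure status), cross-site patterns and unclosed actions become queries instead of guesswork.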
The difference is not the software. It’s the design
High-performing organisations use digital observations differently.
They do three things consistently.
They focus on conditions, not behaviours.
Work planning, isolation quality, access constraints, equipment condition, supervision coverage.
They remove friction.
Fast reporting, minimal fields, practical language. People report reality, not policy.
They close the loop visibly.
You said → we acted.
Actions tracked.
Learning shared.
Trends reviewed.
That’s when observation becomes prevention.
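One illustrative shape for the "you said → we acted" loop is a record that only counts as closed once the action taken is visible to the person who raised it. The class and field names below are assumptions for the sketch, not a reference implementation.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ObservationAction:
    raised_by: str
    issue: str
    action: str = ""
    closed_on: Optional[date] = None

    def close(self, action: str, on: date) -> None:
        # Record what was done and when, so the reporter sees
        # "you said -> we acted" rather than a silent inbox.
        self.action = action
        self.closed_on = on

    @property
    def is_closed(self) -> bool:
        return self.closed_on is not None

a = ObservationAction("J. Smith", "Permit pressure on night shift")
print(a.is_closed)  # False
a.close("Added second permit issuer to night shift", date(2024, 5, 1))
print(a.is_closed)  # True
```

Keeping the action text and closure date on the record itself is what makes "Actions tracked" and "Learning shared" auditable rather than aspirational.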
The uncomfortable reality
Observation data reflects what leadership rewards.
Reward volume → you get volume.
Reward compliance → you get paperwork.
Reward comfort → you get safe observations.
Reward clarity and action → you get truth.
Digital tools can amplify either outcome.
They can accelerate noise.
Or they can surface insight earlier than ever before.
The bottom line
The goal is not more observations.
The goal is earlier signal and better decisions.
Because the organisations that prevent serious incidents are not the ones with the most safety data.
They are the ones that can separate signal from noise and act before harm happens.
A final question
If you removed observation targets tomorrow, would the quality of reporting improve or collapse?
Your answer tells you whether your program is generating insight or simply maintaining numbers.
“Most observation programs don’t fail. They succeed at the wrong thing.”