Quality quote
“Quality is a perception of receiver.” — Lalit Kale
Six words. But unpack them and you find an entire product philosophy hiding inside.
We spend enormous energy defining quality in absolute terms: conformance to specification, absence of defects, performance benchmarks met. These are useful proxies. They give teams something concrete to measure and argue about in sprint reviews. But they all share a common flaw—they locate quality inside the artifact itself, as if it were a property you can inspect under a microscope.
It isn’t. Quality only exists in the moment a person encounters your work.
This isn’t relativism. It doesn’t mean every subjective complaint is equally valid or that you should ship broken software because “beauty is in the eye of the beholder.” It means the judgment call on quality belongs to the person receiving your work, not to the person producing it. Your checklist says the feature is done. Your user’s experience says otherwise. The user is right.
In practice, this distinction has teeth. A database query that returns results in 200ms feels instant to a developer running it locally against a tiny test dataset. The same query feels broken to a customer watching a spinner for three seconds over a mobile connection. Technically the same code. Entirely different quality perception. Closing that gap requires you to understand the receiver—their context, their expectations, their comparison points—not just your own engineering standards.
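A back-of-the-envelope model makes the gap concrete. The function and every number below are illustrative assumptions, not measurements: perceived latency is roughly server time plus network round trip plus payload transfer time.

```python
def perceived_latency_ms(query_ms, rtt_ms, payload_kb, bandwidth_kbps):
    """Rough model: server query time + round trip + transfer time.

    All inputs are hypothetical, chosen only to illustrate how the
    same query feels in two different contexts.
    """
    transfer_ms = payload_kb * 8 / bandwidth_kbps * 1000  # kilobits over kbps
    return query_ms + rtt_ms + transfer_ms

# Developer on localhost: negligible network cost, so the 200ms
# query dominates and the result feels instant.
local = perceived_latency_ms(query_ms=200, rtt_ms=1,
                             payload_kb=50, bandwidth_kbps=1_000_000)

# Customer on a slow mobile link: high round-trip time and low
# bandwidth swamp the query itself.
mobile = perceived_latency_ms(query_ms=200, rtt_ms=300,
                              payload_kb=50, bandwidth_kbps=200)

print(f"local: {local:.0f}ms, mobile: {mobile:.0f}ms")
```

With these made-up inputs, the identical query lands around 200ms locally and around 2.5 seconds on mobile: same code, an order-of-magnitude difference in what the receiver experiences.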
It also means quality is dynamic. What satisfied users two years ago may frustrate them today because alternatives have shifted their baseline expectations. Your software hasn’t changed; the perception has. Keeping quality high is therefore a moving target, which is uncomfortable for teams that want to ship a thing and call it finished.
The most useful thing about this framing is where it puts the burden of proof. You cannot declare your work high-quality. Only the receiver can. That keeps engineers honest, keeps product teams curious, and keeps everyone paying attention to the signal that matters most: what users actually experience, not what the test suite says they should.