My experience with Dexter Eng and the current automated enforcement process has left me deeply concerned about the fairness, integrity, and actual purpose of this system. Rather than serving as a safeguard for justice, it appears designed primarily to generate revenue for the city, often at the direct expense of ordinary residents. The automated machines, such as cameras, sensors, and other devices, were promoted as tools meant to deter violations and improve safety. In practice, however, they seem to function more like profit-generating mechanisms, churning out violations with little regard for context, accuracy, or actual public benefit.
This is where human oversight becomes essential. The role of an individual like Dexter Eng should be to step in where the machines fail—reviewing questionable citations, correcting errors, and ensuring that residents are not unfairly penalized. Instead, what I encountered was a process that felt dismissive, biased, and rooted in an agenda that prioritized upholding the machine-generated outcomes no matter how unreasonable or flawed they were.
Rather than exercising human judgment, Dexter Eng appeared more invested in reinforcing the automated system’s decisions, even when evidence clearly indicated an error. His demeanor suggested not neutrality or fairness, but rigidity—almost as though his primary mission was to validate the machine rather than to act as the corrective safeguard his position demands. It created a troubling impression that the process was less about justice and more about preserving a revenue stream.
When the person responsible for reviewing errors appears unwilling to acknowledge them, the entire purpose of having a “human reviewer” becomes meaningless. It creates an unjust loop: machines generate violations, residents challenge them, and the person meant to check for mistakes simply echoes the machine’s output without true consideration. This dynamic is not only unfair but undermines public trust in the system entirely.
The experience left me feeling that Dexter Eng operates with an adherence to the system's monetary goals that borders on malice, rather than any genuine commitment to fairness or accountability. A human reviewer should bring balance, understanding, and discretion, qualities the machines inherently lack. Instead, the review felt cold, predetermined, and dismissive of context or common-sense judgment.
A system built on automation should require strong, ethical human oversight to prevent misuse. When that oversight fails, or seems aligned with generating revenue rather than correcting injustice, it highlights a much deeper problem. Residents deserve a process that values accuracy, fairness, and accountability—not one that hides behind technology while using humans as rubber stamps.
At minimum, the review process must be transparent, unbiased, and genuinely open to acknowledging machine errors. What I experienced was the opposite. Until these issues are addressed, and until individuals in positions like Dexter Eng’s demonstrate a commitment to fairness rather than reinforcing flawed automated outputs, the system will continue to feel unjust, punitive, and fundamentally misaligned with its stated purpose.