Why I'm Moving into Data Analytics

By Darien Nguyen

Every day in the clinical lab, data moves fast. Results come off the analyzer, get flagged, reviewed, and reported — and then the cycle starts again. It’s a world built on precision and speed.

But somewhere in that rhythm, I started noticing something: we generate an enormous amount of data, and most of it disappears the moment a result is released. The instrument logs, the QC trends, the reagent consumption patterns, the correlation between test volumes and turnaround times — all of it quietly accumulating, rarely interrogated.

That bothered me.

I’ve spent years working hands-on with high-throughput platforms — Werfen’s NEO Iris and Gemini systems, Beckman’s AU680 and PK7400, Abbott’s Alinity s and c — and what I’ve learned is that these machines are only as powerful as the understanding we bring to their output. The hardware is impressive. But the insight lives in the data.

So I started asking: what happens to patients when labs run inefficiently? What does a spike in hemolyzed specimens tell us about a collection process? Can we predict instrument failures before they disrupt a shift? These aren’t just operational questions — they’re patient care questions.
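To make one of those questions concrete: take the hemolyzed-specimen example. A minimal sketch of how I imagine interrogating it — entirely hypothetical data and a deliberately simple rule, flagging any day that jumps more than two standard deviations above the prior week's baseline:

```python
from statistics import mean, stdev

def flag_hemolysis_spikes(daily_counts, window=7, z=2.0):
    """Flag days whose hemolyzed-specimen count exceeds the recent baseline.

    daily_counts: list of daily counts (hypothetical data, not real lab figures).
    A day is flagged when it exceeds the mean of the prior `window` days
    by more than `z` standard deviations.
    """
    flagged = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if daily_counts[i] > mu + z * sigma:
            flagged.append(i)
    return flagged

# Hypothetical counts: a steady baseline, then a spike on day 10
counts = [4, 5, 4, 6, 5, 4, 5, 6, 5, 4, 15]
print(flag_hemolysis_spikes(counts))  # → [10]
```

A spike flagged this way wouldn't answer the question by itself — but it would tell you exactly which day's collection logs to pull, which is where the clinical context comes in.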

Data has the power to paint a larger picture of patient outcomes. That’s the idea driving me toward analytics.

I’m building on a foundation I already have — Excel for tracking, Python from bioinformatics work, SQL and Tableau from graduate school — and pushing further. The goal is to bring a clinical lens to data work that often lacks it.

This site is where I’ll document that journey. More to come.