datahead's comments

datahead | 5 years ago | on: Open EMR

Hi Duff! Apologies for the mix-up; I should have seen that in your handle. No, I'm not part of CHL/TXR.

While not open source by any stretch, I see the personal EHR space being ushered along by companies following Apple's lead. Unfortunately, aside from complex patients, I think the generally healthy or mildly chronic person is uninterested in owning and managing their health data. Apple is contributing useful tools for understanding fundamental determinants of health, including cardiovascular health, sleep, and fitness, at massive scale. It just happens to come with your watch and phone, and their vision for health is starting to come into focus. This puts the patient in the position of generating the primary data (sensors, etc.) and sharing it with their care team on their terms (more or less). As telehealth becomes more prominent, I suspect the patient will be required to engage with their data more often, as it will become the means of conveying a shared understanding, versus observations recorded in the clinical setting and stashed away in centralized EHRs.

Furthermore, if labs and other diagnostics are available directly to consumers, it puts the individual in a position of ownership. I think the default position is that whoever generates the data owns it and determines how easy or hard it is to share with others. If the individual is empowered to generate information about themselves, this will start to swing toward "patient owned." I too look forward to more of this, but it will have to come with more direct-to-consumer and digital offerings.

One of the coolest examples I've seen of individuals taking ownership in open source med tech is openaps.org. I'm not one, but T1Ds are some of the most resourceful and resilient folks around. Good on them for building a community to solve real problems together. Shout out to the #wearenotwaiting crew.

datahead | 5 years ago | on: Open EMR

Hi Duff! Happy to see you on HN. EDIT: not Fred.

I work for a large hospital operations company and serve as the Director of Engineering for our clinical operations group. Hacking Healthcare is required reading for new members of my team. It serves as an excellent introduction (with a healthy amount of critique) to the dynamics of the healthcare technology ecosystem. Thank you for providing this perspective on the industry and its challenges with tech.

We've had success developing internally with open source technology. In fact, I take a fairly hard stance on disallowing proprietary healthcare-specific "solutions" from working their way into our stack (aside from the EHR itself; it has staying power). We're lucky in that we are positioned as somewhat of a startup within a larger org and are able to take that approach.

To avoid some of the issues you raise, we are generally working to reduce the surface area of the EHR so that it becomes simply the transactional backend, which is then mirrored to a larger ecosystem of custom apps. This has the effect of boxing in the regulated entity. We focus on data integration (by spending $$$$ on custom HL7 interfaces, which unfortunately not everyone can afford) to get outside of the walled garden. This means we can use the information/data for new and interesting purposes without worrying about the EHR vendor's roadblocks/tolls. More importantly to some people, we don't disrupt the billing cycle that originates from the EHR.

Do you notice any trends where healthcare operations/providers are starting to develop internal technology that integrates with the EHR to complement, vs. replace, the core transactional system?

datahead | 8 years ago | on: Datomic Cloud

For anyone interested in background on Datomic, these are fun and informative resources.

The talk where I realized Rich is a data head too, introducing his design goals and life experience with databases. https://www.infoq.com/presentations/The-Design-of-Datomic

"Clojure for the Brave and True" author, Dan Higginbotham, on key themes and overview. http://www.flyingmachinestudios.com/programming/datomic-for-...

Congrats to the Datomic team and Cognitect. I hope this move to the cloud opens Datomic's design principles and ideas to a wider user base.

datahead | 8 years ago | on: Datomic Cloud

Our team signed a contract with Cognitect for Datomic late 2017.

This clause was in place and stood out to me as well. I had a chance to ask their legal team about it. The clause is written in legalese, which always sounds overbearing.

I asked the question in the positive sense: "What if we have some really nice metrics from our use cases and want to talk about them at a conference?" They simply asked to be consulted and that written permission to share be requested. The intent, like others have noted, is to request (legally: insist) that Cognitect have a chance to review and point out potential implementation issues (good or bad) prior to customers making performance statements about their product.

The clause can/does put a damper on 'notes from the field' reports, which often help when deciding on tech direction. I look to community-based reports to reinforce perceptions of a tool (to a degree). Completely agree with OP: do your own performance testing.

One thing I will say is that it would be hard for someone who hasn't invested in learning the inner workings of Datomic's decoupled architecture to pick apart storage speed vs. transactor speed. For example, storage speed (SQL, DEV, Dynamo, etc.) is not a concern of Datomic, but it is a key dependency for measurable performance. This may change in the AWS service announced today and become more uniform on DynamoDB and S3 "storage resources". https://docs.datomic.com/cloud/whatis/architecture.html#stor...

Datomic is a unique product and there are many ways to make it sing (or blow up) depending on how you use it. We designed data models, streaming processes, and queries with Datomic in mind and have had success. Exactly how much success, I'm not at liberty to say just yet.

datahead | 9 years ago | on: A Little Known SQL Feature: Use Logical Windowing to Aggregate Sliding Ranges

This technique applies to many RDBMSs, not just Oracle (as others have noted). Teradata, PostgreSQL, and MS SQL all have 'analytical' functions like this. Analytical functions (OVER, PARTITION BY, etc.) are extremely powerful and can help simplify architecture/design for the data science and analytics communities.

One of the persistent issues on my team is the reliance upon a dataframe representation with R or Python to do this type of aggregation and windowing. Most people will eschew learning the 'advanced' SQL and instead bring data locally to do imperative-style munging on it.

This creates a few issues, mainly adding complexity to the analytical stack:

- Instead of querying the data and doing ETL/feature engineering in the db, you are moving data around (usually to less powerful machines, such as a laptop) for simple exploration.

- This wastes time and usually results in more dependencies (dplyr, for example; no hate, Hadley), sometimes even limiting you to single-threaded operations. Teradata, for example, is massively parallel and will perform these operations in short order. I've seen data scientists wait 6 hours for R to do the same thing a SQL query against a prod system returns in 3 minutes.

- Code is not portable. A query can be executed and results retrieved through ODBC, JDBC, or native connections. Without these, data engineers are often asked to install R (including libs) on some intermediate machine just to do munging/ETL/feature engineering. If SQL-driven, moving from quantitative exploration to operational is quite easy (maybe just a query tune).
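As a concrete sketch of keeping the windowing in the database: a 3-row sliding average via a window function, shown here with SQLite purely for illustration (the table and data are made up; the same OVER ... ROWS BETWEEN syntax works in PostgreSQL, Teradata, MS SQL, and Oracle, where RANGE gives the 'logical' variant the post covers):

```python
import sqlite3

# In-memory database with a hypothetical daily_sales table
# (table and column names are illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_sales (day INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO daily_sales VALUES (?, ?)",
    [(1, 10.0), (2, 20.0), (3, 30.0), (4, 40.0), (5, 50.0)],
)

# A 3-row sliding average computed in the database with an analytical
# (window) function, instead of pulling rows into a local dataframe.
rows = conn.execute("""
    SELECT day,
           AVG(amount) OVER (
               ORDER BY day
               ROWS BETWEEN 2 PRECEDING AND CURRENT ROW
           ) AS moving_avg
    FROM daily_sales
    ORDER BY day
""").fetchall()

for day, moving_avg in rows:
    print(day, moving_avg)
# → (1, 10.0), (2, 15.0), (3, 20.0), (4, 30.0), (5, 40.0)
```

The whole aggregation happens in one query, so moving it from exploration to an operational pipeline is just a matter of pointing the same SQL at the production connection.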

All that to say, I'm glad this post is highlighting some of the advanced SQL that I hope more people come to rely upon. All of these ideas are better articulated in MAD Skills [0].

[0] http://db.cs.berkeley.edu/papers/vldb09-madskills.pdf

datahead | 9 years ago | on: A machine that tracks basketball shots

Interesting that they cost about the same {"the-gun": 4750 , "noahlytics": 4800} + subscription @ $100/mo. Possibly a competitive pricing scenario for high school program adoption.

I also see a key feature being the tracking of performance over multiple sessions. Noahlytics will likely answer questions like: who is shooting the best this week? That may impact starting line-ups for coaches, etc.
