skwb | 1 year ago
I admit it sounds pedantic, but I'm not discussing IRB *exemptions* that are sometimes required by an institution, nor am I discussing BAAs. I was talking specifically about the IRB applications (which I've submitted and signed before) that the blog author was describing. Yes, HIPAA and other state and local regulations also govern what you can and cannot do with the data, but that's not what I argued.
Sorta off topic, but the FDA doesn't care so much about SAR unless you're directly programming the MR machine's pulse sequence. If you're just doing quantification of some brain structure for monitoring a biomarker, they're primarily concerned with whether 1) your product matches an existing predicate and 2) your product achieves the performance you say it does. That is why the marketing around most of the early DL/AI-based radiology startups was focused on language like "study prioritization" rather than more specific claims.
fluidcruft | 1 year ago
FDA and human subject protections come from different laws with different legislative authority. The regulations are not the same except to the extent that the agencies themselves work to harmonize things. If you are doing anything covered by FDA you must follow FDA's regulations in addition to any other applicable human subjects research regulations. And because MRI scanners are Class II regulated devices it means that people are being scanned with a doctor's permission, an IRB's permission or the FDA's direct permission.
FDA "doesn't care" about SAR to the extent that they have published guidance that if you operate an approved MRI scanner within normal operating mode (settings defined by IEC that do not necessitate medical supervision), then the FDA will not automatically consider use of the scanner itself to elevate a study's risk (in the way that using something like a CT scanner with ionizing radiation would). Risk determination goes beyond whether or not the MRI itself is a risk, though. For example, a research study that diverts patients to MRI in a way that delays care in an emergent situation (say, testing experimental sequences for stroke detection) is unlikely to qualify as minimal risk overall, even if the scanner operates in normal mode, because of the other non-MRI risks associated with the study procedures.
Retrospective use of de-identified or anonymized medical records that already exist is of course a different thing, because the risks to the patient are primarily privacy risks.
And you are correct, the actual FDA labels of all the AI crap that's coming out are jokes compared to what a lot of sales bullshitters promise. But you'd better believe all the data the MRI manufacturers submitted to the FDA to support their accelerated acquisitions that use deep-learning recons follow FDA's clinical trials regulations.
skwb | 1 year ago
That said, there's plenty of buying and selling of radiological images for industry development on the secondhand market. Now, where the line between "research" and "industrial" work falls, well, that's something I would leave to legal counsel. But as you said, anything that "alters" clinical outcomes, like DL-based recon, is clearly IRB-required territory.