It took over a decade to sequence the human genome for the first time, at a cost of between half a billion and a billion dollars. Now, however? An entire human genome can be sequenced for one or two thousand dollars, in a matter of days. But we are far from the point at which an individual’s genomic data can be routinely incorporated into their health care. As an initial matter, electronic health records systems are not yet equipped to handle genomic data. Moreover, the Food and Drug Administration (FDA) is still in the process of establishing an appropriate regulatory framework.
Under the Federal Food, Drug, and Cosmetic Act, in vitro diagnostic (IVD) products are medical devices, subject to FDA oversight.
For example, the agency regulates companion diagnostics, which provide information that is essential for the safe and effective use of a therapy. A companion diagnostic might be designed to identify patients who are likely to benefit from a therapy, or who are at risk for a serious side effect. It might also be used to monitor a patient’s response to a therapy, so that it can be adjusted as needed. The FDA also regulates genetic tests used for the diagnosis of certain diseases or conditions, for the identification of infectious agents, or for determining carrier status. To date, over 50 tests that query one (or multiple) genes have been approved. As a general matter, each test is approved for use in diagnosing a single disease, disorder, or infection.
But how should the agency approach new and emerging technologies that allow for the query of all genes? And what happens now that our ability to detect genetic variation outpaces our ability to understand what those variants mean?
This summer, the FDA released draft guidance documents to begin to address its regulation of Next Generation Sequencing (NGS)-based tests. It held a workshop in September, and has made the presentation slides and archived webcast available.
- Development and implementation of NGS standards
The draft guidance on standards specifies four performance metrics to be assessed: accuracy, reproducibility/repeatability, limit of detection, and analytical specificity. It also recommends minimum performance thresholds.
Relying upon standards might eliminate the need for test developers to submit data for each new test they develop. Instead, an NGS platform would just have to conform to an FDA-recognized, consensus standard. The agency would regularly review the standard, and update it as technology (and our knowledge base) advances.
- Use of genetic databases to support clinical validity
The use of genetic databases would be a departure from the typical approach the agency uses to review new IVDs. Regulations require that valid scientific evidence support the analytical and clinical performance of an IVD, which has historically meant “well-documented case histories conducted by qualified experts.” By contrast, genetic databases have some unique attributes, including evidence generation by multiple parties (think “crowdsourcing”) and an aggregate body of data providing a strong evidence base.
The draft guidance applies to the use of publicly accessible databases. It includes recommendations for transparency in database operations; the collection of certain information about the data and its sources; standard operating procedures (SOPs) for curation, aggregation, and interpretation; reliance upon expert personnel; and privacy, security, and data preservation measures.
At least one industry player has expressed some concern over the agency’s approach to genetic databases, noting that the draft guidance fails to indicate how databases will be used, and whether submissions that rely on databases would be subject to established timelines.
We can expect to see a lot of interest in the FDA’s next steps, which are sure to impact a rapidly growing market.
Follow me on Twitter: @sroberg_perez