
Integrated Data Analysis of in vitro Testing Approaches Advancing Biotech Product Development and Safety Evaluation

Erwin L. Roggen, Novozymes, Denmark

It is strongly advocated that the development of integrated testing strategies (ITS) is a logical next step to meet the simultaneous needs for faster throughput and reduced animal use (Berg et al., 2011). In an earlier report, Blaauboer et al. (1999) proposed that the integrated information may include physicochemical data, in vitro data, human and animal data, computational methods (e.g. Quantitative Structure-Activity Relationship (QSAR) models) and kinetic models. In practice, the definition and development of an ITS differ considerably among industry sectors, while similarities exist within a specific sector or company.

The objectives of REACH from a public health perspective are (i) to classify and label chemicals, and (ii) to perform appropriate risk assessment. ITS are still at an early stage of use in REACH chemical safety assessment, but the standard testing requirements formulated by the legislation clearly allow for flexibility. This presentation uses recombinant proteins intended for use in detergents, food or feed as examples.

In the first step, as much relevant information as possible is gathered from as many sources as possible to ensure a sufficient amount of useful data. Because it is extracted from multiple types of in vivo studies in a number of animal species, complemented with in vitro and computational approaches, the available data are often very heterogeneous in quality, reliability and relevance for human safety assessment. Historical data therefore have to be weighted carefully before being included in any form of testing strategy or risk assessment. If the information gathered is not sufficient to support a qualified decision, it should be considered whether waiving exposure-based animal testing can be accepted before any test is performed. In addition, the available information should be used to identify the most efficient strategy for filling gaps in the data and in our knowledge and understanding of toxicity.

To develop an efficient ITS tool, a possible approach is a statistics-based estimate of the probability of a categorical endpoint, e.g. "What is the probability that a prediction of sensitization is true?". The basic method therefore involves scoring the outcome as "yes" or "no" relative to a gold standard. This outcome is then combined with a quality factor derived from a quality assessment of the available testing information. The quality assessment should describe the performance of the specific test (e.g. specificity, sensitivity, predictivity) as well as the results obtained (e.g. quality of the in vivo data, number of animals used, number of compounds tested, the applicability domain of the test, and reporting).

The results of such an approach will then be modified according to the quality factors using a weighting methodology; however, how this weighting is best achieved is still under discussion.

 

Berg, N., De Wever, B., Fuchs, H.W., Gaca, M., Krul, C. and Roggen, E.L. (2011). Toxicology in the 21st Century – working our way towards a visionary reality. Toxicol. In Vitro, doi:10.1016/j.tiv.2011.02.008.
Blaauboer, B.J., Barratt, M.D. and Houston, J.B. (1999). ECVAM Workshop Report 37. ATLA 27, 539-577.
