The dermal route in systemic exposure

To evaluate risk from dermal exposure, the amount of material on the skin must first be measured. The potential for dermal uptake must then be assessed to determine the potential health effects from systemic exposure. No standard methods exist for studying these processes, and published data are not comparable because of the different techniques used. Future validated methodology should provide a sound scientific basis for risk assessment. Methods for measuring skin and surface contamination will require the development of reference contaminated surfaces and skin as part of quality control procedures. Biological monitoring is a valuable tool in the assessment of dermal absorption, in contributing to the validation of in vitro techniques, and in risk assessment and management. It will be necessary to conduct detailed investigations to support risk assessment for dermal exposure. Ultimately, predictive models will be established for exposure and for dermal absorption to support a generic approach and allow risk assessment strategies appropriate to actual workplace situations.


Exposure to hazardous substances in the workplace occurs by inhalation, ingestion, dermal contact, or some combination of these routes. Occupational hygienists and regulatory bodies have traditionally focused on inhalation exposure, and many methods have been developed to measure inhalation exposure levels. There is a clear understanding of how such levels should be interpreted as part of a risk reduction strategy, but the situation is less clear for dermal exposure. As occupational exposure limits and improved control measures lead to decreasing atmospheric exposures, so the likelihood increases that dermal exposure will make a significant contribution to the overall systemic exposure to chemical substances at work. For some substances that readily penetrate the skin, this may well be the major route of exposure, but we do not yet have the methodology to allow us to understand the consequences of, and control the risks from, the different sources. Any assumption that dermal uptake can simply be prevented by the use of gloves is probably inappropriate for many reasons, including the reluctance to wear protective equipment in some cultures and climates, the limited effectiveness of many types of gloves, possible deposition on unprotected areas such as the face, particularly for aerosols, and contamination of gloves (with the probability of enhanced absorption).
In response to the increasing recognition of the potential significance of dermal exposure (1), the European Commission funded a thematic network, under the Standards, Measurement and Testing Programme of Framework Programme Four, for a period of 3 years from November 1996 to October 1999. The Dermal Exposure Network (DEN) aims to create synergy between different aspects of dermal exposure so that a legislative, advisory, and operational framework can be established, based on a sound scientific foundation, for dealing with the problems created by dermal exposure. While appreciating the importance of skin irritation and sensitization, the DEN has taken the potential for systemic exposure via the dermal route as its terms of reference. In order to help focus attention on the key issues, a Delphi survey was conducted among the DEN members to identify the most important problems associated with dermal exposure, both currently and in the future (23 respondents out of the total of 30 participating organizations in 16 countries). An indication of the issues thought most likely to be of concern in the early years of the next millennium is shown in table 1. This paper presents a summary of the positions and research needs identified by the DEN in the areas related to skin and surface contamination, percutaneous penetration, biological monitoring, and risk assessment.

Skin and surface contamination
Practical measurement methods have been developed to assess dermal exposure, but existing measurement methods can be criticized because they determine the mass of contaminant either depositing on the skin or retained on the skin at the end of the exposure period. Hazardous substances on the dermal surface will be taken up continuously into the body through the stratum corneum, driven by the concentration gradient between the dermal surface and the perfused tissue. The risk arising from dermal exposure is thus first related to the time-dependent concentration of a substance on the dermal surface rather than to the mass of material on that surface at any given time. Contamination of the skin can arise in many different ways as a result of the transport of mass between compartments. The compartments can be summarized as source, air, surface contaminant layer, outer and inner clothing contaminant layers separated by the fabric having a buffer capacity to represent the mass residing inside the clothing that does not come into contact with surfaces or skin, and the skin contaminant layer consisting of contaminants, sweat, skin oil, and barrier cream (if applied). The skin contaminant layer is separated from perfused tissue by the stratum corneum, which acts as a rate-limiting barrier having a certain buffer capacity.
The transport processes, according to a terminology proposed by Schneider et al (2), are (i) the emission of substances into the air and onto surfaces, outer clothing, and the skin contaminant layer (immersion being an event whereby a part of the body is submerged into a substance); (ii) deposition of substances from the air to surfaces, outer clothing, and the skin contaminant layer; (iii) resuspension or evaporation of substances from surfaces, outer clothing, and the skin contaminant layer to the air, as particulates, vapors, or both; (iv) transfer of substances by direct contact between surface, skin, and outer and inner clothing contaminant layers in a direction towards the worker (removal being the corresponding transport in a direction away from a worker); (v) redistribution of substances between compartments of the same type (eg, redistribution of contaminants from one part of the skin contaminant layer to another as a result of touching the face with contaminated fingers); (vi) decontamination, which is the deliberate transport of contamination away from the entire system, for example, ventilation of room air, cleaning of room surfaces and outer clothing, or washing off the skin contaminant layer (in contrast, brushing dust off clothing is resuspension); and (vii) penetration and permeation, both of which involve transport of substances through the rate-limiting barriers, clothing, and stratum corneum.
Several transport processes are driven by the compartment concentration, in particular permeation from the skin contaminant layer through the stratum corneum. This layer has conventionally been represented as a 2-dimensional layer giving concentration as mass per surface area, without the concentration of the hazardous agent in the layer itself being specified. This representation has contributed to the confusion regarding the choice of measurement principles for, and interpretation of, dermal contamination in terms of dermal uptake. The concept of concentration is complicated for particles since they are discrete entities. Uptake of particles may be limited by the rate of dissolution rather than by diffusion through the stratum corneum. If skin loading continues until a multilayer forms, the additional mass will have less and less influence on uptake and is increasingly likely to dislodge and fall off. For this reason the mass of particles in the skin contaminant layer may have only limited relevance for uptake.
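The concentration-driven view described above can be made concrete with a small numerical sketch. This is an illustration only, assuming Fick's first law with a constant surface concentration and sink conditions in the perfused tissue; the permeability coefficient and exposure parameters are hypothetical:

```python
# Minimal sketch (not from the source): steady-state Fickian uptake driven by
# the concentration in the skin contaminant layer. Kp, the concentration,
# the exposed area, and the duration are all hypothetical values.

def dermal_flux(kp_cm_per_h, conc_mg_per_cm3):
    """Steady-state flux J = Kp * C (mg/cm^2/h), assuming sink conditions
    (negligible concentration) in the perfused tissue."""
    return kp_cm_per_h * conc_mg_per_cm3

def uptake_mg(kp_cm_per_h, conc_mg_per_cm3, area_cm2, hours):
    """Total mass absorbed over an exposure period, for a constant
    surface concentration."""
    return dermal_flux(kp_cm_per_h, conc_mg_per_cm3) * area_cm2 * hours

# Example: Kp = 1e-3 cm/h, 50 mg/cm^3 in the contaminant layer,
# 100 cm^2 of exposed skin, 4 h of exposure.
print(uptake_mg(1e-3, 50.0, 100.0, 4.0))  # 20.0 mg
```

In this picture the flux depends on the concentration in the layer, not on its total mass, which is why a measurement of surface mass alone is ambiguous with respect to uptake.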

Methods of measuring skin and surface contamination
Depending on the perspective from which the exposure scenario is investigated, a range of measurement approaches has been developed from the 1950s on. Several of these methods are used to assess both compartment mass (eg, total mass in the surface layer) and mass transport processes (eg, dislodgeable mass). A clear distinction is not always made, and this lack of distinction has created confusion regarding the determination and interpretation of sampling efficiencies. As an example, measurement methods targeting 100% sampling efficiency should be intended for describing compartment mass. On the other hand, a 100% sampling efficiency is not necessarily desirable for, say, a wipe test used to quantify transfer. The amount that can be transferred from surfaces depends on the actual type of surface, the contaminant, and the forces acting on the contaminant layer, rather than on how much mass is present in the compartment. Measurement of transport thus must be based on conventions such as the use of standardized instrumental methods. Deposition from air is more predictable, and it enables a definition of ocular and various dermal deposition rates.
There is a close analogy between the task of assessing all these transports and the task of assessing the transport of airborne particles to the various parts of the airways. Historically, one approach was to propose biologically relevant particle-size fractions, based on experimental data, that reflected average deposition efficiencies in the airways. The other approach was to define the fraction as that obtained by a given sampling instrument. Initially these size fractions were different. However, better experimental data on deposition efficiencies have resulted in the adoption of an international standard for the definition of the respirable, thoracic, and inhalable fractions, and recent development has resulted in samplers with sampling efficiencies that match the definitions of the fractions.
Surface contaminant layer. For surfaces, several in situ methods are available, but they can only quantify a limited range of substances, and never total mass. Methods based on removal, such as adhesive tape sampling, can have a high sampling efficiency, but there is no simple way of determining the sampling efficiency in field use.
Clothing contaminant layers. In clothing, contaminants are partitioned between an outer and an inner contaminant layer, separated by a buffer that represents the mass residing inside the fabric which does not come into contact with surfaces or skin. Measurement of the entire mass in the clothing would, in principle, be no problem and, if this were done for nonpermeable fabrics, theoretically the outer contaminant layer would be measured. A method which measures the inner contaminant layer of permeable fabrics needs to be developed.
Skin contaminant layer. Skin rinsing methods are used for assessing the mass in the skin contaminant layer. Such methods typically recover 40-90% of contaminants spiked onto the skin. However, spiking experiments have inherent problems, as it is unclear whether this result should be interpreted as a 40-90% sampling efficiency for the skin contaminant layer or whether it reflects partitioning between this compartment and the stratum corneum or perfused tissue. The ultraviolet (UV) method quantifies the amount of a fluorescent tracer previously added to the contaminant by illuminating the skin homogeneously with UV light. However, fluorescent tracers can bind to the cell proteins in the stratum corneum, and the method is not able to differentiate between the skin contaminant layer and the stratum corneum.
Deposition. Adhesive tape can measure particle deposition, unaffected by loss. This method has been used for particle deposition onto, for example, the face, but the method does not work for parts of the body that come in contact with other surfaces. Charcoal cloth has been used to measure skin exposure to volatile compounds. A problem is that the method does not differentiate between vapors from air and liquid splashes.
Resuspension or evaporation. Some methods for assessing resuspension have been developed, such as simulation of the mechanical impact caused by a person walking on a carpet or the impact caused by air movements. Evaporation of water vapor from the skin (transepidermal water loss) can be measured with simple-to-use instruments. A similar instrument could be developed for other vapors.
Transfer. Methods based on determining the increase in mass over time in the skin contaminant layer compartment only measure the net balance between transport to and from the compartment and thus are affected by deposition, resuspension or evaporation, removal, and uptake. Surrogate skin methods such as pads, gloves, and coveralls rely on the assumption that they approximate the transport of mass to the skin contaminant layer from surfaces. However, the materials used do not have the roughness, stickiness, and other properties of human skin and so may not meet this assumption. Transfer from the inner clothing contaminant layer to the skin contaminant layer is usually estimated by using a patch or underwear sampling approach. No method appears to have been developed specifically for measuring this transfer. Details of the measurement methods and references can be found in the report by Schneider et al (2).

Future research
Future research should address the following topics. Comparability between studies of dermal exposure or surface contamination is limited, and usually not all the relevant variables have been measured; this problem is in part explained by the lack of a structured approach and of a consistent, generally agreed terminology. Methods need to be developed for measuring concentration in the skin contaminant layer, along with surrogate skin samplers that better mimic the properties of skin; particles will pose a special challenge. The direct measurement of mass transport must be based on conventions, yet most existing methods differ both in the choice of factors to standardize and in their specification; therefore the standardization of methods should be initiated. Protocols for generating surfaces and artificial skin with known and evenly distributed amounts of contaminants should be established as a prerequisite for developing and validating measurement methods.

Use of percutaneous penetration data in the risk assessment process
The procedure for risk assessment consists of comparing exposure levels with no-observed-adverse-effect levels (NOAEL). It is commonly considered to consist of 4 steps (3): hazard identification, dose-response assessment (which allows NOAEL evaluation), exposure assessment, and risk characterization, based on the previous steps. The intensity, frequency, duration, and probability of exposure should be estimated. On the basis of the NOAEL and the predicted exposure, it is possible to calculate the margin of safety (MoS) according to the following equation: MoS = NOAEL / predicted systemic exposure. Determining the percutaneous penetration of toxicants to estimate systemic exposure is only required if the MoS, calculated assuming systemic exposure equal to dermal exposure (ie, 100% absorption), is low (4). For general populations a MoS greater than 100 is considered not to represent a significant risk (5).
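The screening logic described above can be sketched in a few lines. The NOAEL, dermal loading, and absorption fraction used here are hypothetical values chosen only to illustrate the arithmetic:

```python
# Sketch of the screening step described above; the NOAEL and exposure
# values are hypothetical. A worst-case MoS first assumes 100% dermal
# absorption; only if that MoS is low (< 100 for general populations)
# are percutaneous penetration data needed to refine the estimate.

def margin_of_safety(noael_mg_kg_day, systemic_dose_mg_kg_day):
    """MoS = NOAEL / predicted systemic exposure (same units)."""
    return noael_mg_kg_day / systemic_dose_mg_kg_day

noael = 50.0          # mg/kg/day, hypothetical NOAEL
dermal_loading = 2.0  # mg/kg/day reaching the skin, hypothetical

worst_case = margin_of_safety(noael, dermal_loading)  # assume 100% absorbed
print(worst_case)  # 25.0 -> below 100, so penetration data are warranted

refined = margin_of_safety(noael, dermal_loading * 0.10)  # 10% absorbed
print(refined > 100)  # True -> no significant risk indicated
```

The example shows how a measured penetration fraction can move a substance from "needs refinement" to "no significant risk" without any change in the measured skin loading.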
To assess dermal uptake, percutaneous absorption data, contamination levels, and the toxicokinetics of occupational toxicants would be required. When no literature is available, a value of 100% (worst-case scenario) has been assumed for the dermal absorption of chemicals for the hazard assessment process. This is a very conservative approach, which has been adopted to encourage studies on the percutaneous penetration of compounds. A less conservative approach to risk assessment could be used when penetration rates through human skin are known. In fact it is clear that 100% absorption via skin does not correspond to realistic exposure. Correction using percutaneous penetration data can have a major impact on regulatory toxicology, because the adjusted dose used in the risk calculation may be reduced significantly. These estimates should be as close as possible to real exposure conditions. To achieve this, experiments should be conducted under finite dose conditions, using vehicles, concentrations of chemicals, and periods of exposure which reflect in-use conditions.

In vivo methods
Results from human volunteer studies are the most relevant but are rarely available. Measurement of the diffusion gradient over the skin by quantifying serum concentrations in vivo is difficult because these concentrations are very low, often below the detection limits. The evaluation of urinary and fecal excretion after dermal uptake is hampered by several factors, depending on the metabolism, distribution, and kinetics of the substance. For cost and ethical reasons it is impossible to carry out experiments on volunteers for many compounds (and in some European countries they are prohibited). On the other hand, dermal absorption in rodents, commonly used in laboratories, is very different from human absorption. Animal models are very useful for toxicokinetic studies, but they frequently overestimate dermal uptake in humans.

In vitro methods
In vitro methods seem to present considerable promise. They are rapid and inexpensive to perform. Protocols are well described in the literature, including the adoption of human skin as the membrane. In vitro techniques are commonly used to test drugs and cosmetics because it is considered that they can predict the in vivo absorption rate with a reasonable approximation. Thus, with some limitations, penetration rates can be obtained by properly designed in vitro experiments. There is a general interest in in vitro techniques, demonstrated by the fact that guidelines of the Organization for Economic Cooperation and Development (OECD) have been proposed recently for in vitro methods (6). However some members of the OECD have questioned whether there is adequate documentation of the validation of the methods. They claimed that experimental conditions regarding the treatment of the skin membrane and the composition of the receptor fluid vary from one laboratory to another, and also within the same laboratory, making comparison between the results of different studies impossible. Moreover comparisons between the in vitro and in vivo results presented in the literature are not sufficient to validate the in vitro methods.
Obviously, in vitro estimates cannot be exact because of the number of factors that can influence dermal penetration (although this is also true for gastrointestinal and respiratory uptake), such as different sites of application and concentrations of substances. Values obtained for the permeability coefficient (Kp) seem to be affected by increasing the applied dose, probably because diffusion through the stratum corneum is a rate-limiting process. Metabolism, formation of the stratum corneum reservoir, and loss of sink conditions can also play a role. The simulation of biological processes always has limitations, and it is impossible to calculate dermal uptake without accepting some simplifications. However, in vitro percutaneous penetration data have a good scientific basis, in contrast to assumptions of default values such as 10% or 100% absorption.
In vitro experimental conditions should be carefully controlled to obtain relevant data for risk assessment. In such protocols, the donor phase should reproduce the real conditions of human skin contamination rather than use a convenient solvent such as acetone; the integrity of the membrane should be tested; and the receiving fluid should simulate the sink conditions of the peripheral blood flow. To understand the dermal uptake of chemicals bound to soil, dust, oil, water, solvents, and the like, information on the pure substance is helpful, but other factors must be taken into account (7). Exposure conditions such as the vehicle, concentration of the compound, area and site of exposure, and occlusion may be more relevant for risk assessment than penetration rates per se.
The standardization of in vitro tests and the comparison of results with in vivo data could produce internationally accepted penetration rates for the risk assessment process. In vitro systems should be standardized with a limited number of chemicals at different dose levels. In vitro and in vivo correlation studies should be conducted using human skin in vitro and volunteers for specifically chosen chemicals in order to support the acceptability of the approach adopted for generating in vitro data. In this way, extensive absorption data could be generated for a series of chemicals.

Modeling techniques
With the use of data for a representative chemical or group of chemicals, a quantitative structure-activity relationship (QSAR) could be developed for prediction from physicochemical properties.
Skin notation assignment can be based on the comparison of dermal uptake with respiratory uptake at inhalation exposure levels equal to the time-weighted average threshold limit value instead of using acute dermal toxicity (dermal LD50). Following this approach, the Dutch Expert Committee on Occupational Standards (DECOS) established that a skin notation should be assigned when the amount absorbed by the arms and forearms in 1 hour is more than 10% of the amount absorbed by inhalation in exposure to the occupational exposure limit for 8 hours (8). This approach is hampered by the fact that very little data are available in the literature concerning skin absorption. Fiserova-Bergerova et al (9) used absorption rates obtained with the Berner & Cooper model (10), and not experimental data, to assign the skin notation.
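The DECOS decision rule lends itself to a simple calculation. The sketch below is an illustration only: the exposed surface area, breathing volume, and retention fraction are assumptions introduced here, not parameters taken from the DECOS report:

```python
# Hedged sketch of the DECOS decision rule described above. The surface
# area, breathing volume, and retention below are illustrative assumptions,
# not parameters taken from the committee's report.

HANDS_FOREARMS_CM2 = 2000.0  # assumed exposed area of hands and forearms
INHALED_M3_PER_8H = 10.0     # conventional 8-h breathing volume, assumed
RETENTION = 1.0              # assume complete retention of inhaled vapour

def skin_notation(dermal_flux_mg_cm2_h, oel_mg_m3):
    """True if the amount absorbed dermally by hands and forearms in 1 h
    exceeds 10% of the amount absorbed by inhalation at the OEL for 8 h."""
    dermal_1h_mg = dermal_flux_mg_cm2_h * HANDS_FOREARMS_CM2 * 1.0
    inhaled_8h_mg = oel_mg_m3 * INHALED_M3_PER_8H * RETENTION
    return dermal_1h_mg > 0.10 * inhaled_8h_mg

# Example: a flux of 0.05 mg/cm^2/h against an OEL of 50 mg/m^3 gives
# 100 mg dermally in 1 h versus a 50 mg threshold (10% of 500 mg).
print(skin_notation(0.05, 50.0))  # True
```

The calculation makes plain why the approach is data-limited: the dermal flux, which is rarely available experimentally, is the decisive input.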
Guy & Potts (11) have demonstrated that a QSAR model based on the physicochemical properties of compounds and derived from experimental data gives better results. Sartorelli et al (12) obtained even better results using in vitro data obtained under the same experimental conditions. However, currently, the available percutaneous penetration data are derived from different experimental protocols and are too inconsistent to allow a general QSAR approach.
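As an illustration of the kind of physicochemical QSAR discussed here, the widely quoted Potts & Guy relationship between permeability, octanol-water partitioning, and molecular weight can be coded in a few lines. The coefficients below are those commonly cited for the 1992 aqueous-solution model and are given only as an example, not necessarily the parameterization used in the studies cited:

```python
# Illustration of a physicochemical QSAR of the kind discussed above.
# The coefficients are those commonly quoted for the Potts & Guy (1992)
# model for aqueous solutions; they are an example, not necessarily the
# parameterization used in the studies cited in the text.

def log_kp_potts_guy(log_kow, mw):
    """Predicted log10 permeability coefficient Kp (cm/h) from the
    octanol-water partition coefficient and molecular weight (daltons)."""
    return -2.72 + 0.71 * log_kow - 0.0061 * mw

# Example: a phenol-like compound with log Kow ~ 1.5 and MW ~ 94 daltons
log_kp = log_kp_potts_guy(1.5, 94.0)
kp_cm_h = 10.0 ** log_kp
print(round(log_kp, 2), f"{kp_cm_h:.1e}")
```

Such a relation predicts Kp from two routinely tabulated properties, which is precisely why consistent experimental Kp data are needed to fit and validate the coefficients.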
In the literature the following restrictions are reported for the application of these predictive models in risk assessment (13): (i) compounds should be neutral, of low molecular weight (<750 daltons), and moderately hydrophobic and (ii) the exposure period should correspond to pseudo steady-state conditions.
With the use of validated percutaneous penetration data, QSAR values could be used to predict the absorption of closely related compounds under similar exposure conditions. This approach could reduce time and costs.

Biological monitoring in risk assessment
Biological monitoring is the quantitative measurement of hazardous substances or their metabolites in biological media (usually blood, urine, or breath) from potentially exposed persons. It is related to, but distinct from, biochemical effect monitoring, which determines the portion of a hazardous substance or its metabolite that has reached a toxicologically relevant target, and biological effect monitoring, which looks for reversible changes (usually, but not always, biochemical) resulting from exposure.
Biological monitoring is an essential part of controlling exposure to lead, and there is a binding biological limit value for the European Union (Council Directive 98/24/EC of 7 April 1998). For many other substances, biological monitoring is a tool to help assess the occupational exposure of both persons and groups of workers with the objective of keeping exposure low enough to prevent damage to health. Unlike ambient air and skin monitoring, which only assess potential exposure through inhalation and the amount on the skin, biological monitoring can assess actual uptake by all routes: through the skin and by ingestion and inhalation. It can give an indication of the internal dose of a substance. It can help to assess the efficacy of the control procedures and equipment used to reduce exposure. For example, if the control of exposure relies on respiratory protective equipment (eg, masks) or dermal protection by gloves, then biological monitoring may be the only way to check these are working or being used correctly.
Regulators can use biological monitoring data in order to assign occupational exposure limits based on data from actual occupational exposures and internal doses in comparison with dose-response data from animal studies. Biological monitoring data can be used, for example, with physiologically based pharmacokinetic (PBPK) modeling, to predict target tissue doses and refine interspecies extrapolations in the toxicologic part of risk assessments.
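As a hedged sketch of the kind of kinetic link mentioned above, a one-compartment model with constant dermal uptake and first-order elimination (a deliberate simplification of full PBPK modeling) can be written as follows; all parameter values are hypothetical:

```python
import math

# Minimal one-compartment kinetic sketch, far simpler than a full PBPK
# model, illustrating how an uptake rate can be linked to an internal
# measure. All parameter values are hypothetical.

def body_burden_mg(uptake_mg_h, half_life_h, t_h):
    """Mass in the body after t hours of constant uptake with first-order
    elimination: dA/dt = uptake - k*A, with A(0) = 0."""
    k = math.log(2.0) / half_life_h
    return (uptake_mg_h / k) * (1.0 - math.exp(-k * t_h))

# Example: 1 mg/h absorbed through the skin over an 8-h shift,
# with a 6-h elimination half-life.
print(round(body_burden_mg(1.0, 6.0, 8.0), 2))  # ~5.22 mg at end of shift
```

A real PBPK model replaces the single compartment with physiologically parameterized tissues, but the direction of inference is the same: measured biological levels constrain the uptake rate, and hence the external exposure, by inverting a model of this kind.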
Biological monitoring can also help employers and employees reduce exposure and risks from hazardous substances in the workplace, either by comparing the results of biological monitoring with some form of health-based biological monitoring guidance values or by comparing them with hygiene-based guidance values. Appropriate guidance values can be derived from biological monitoring results obtained from workplaces employing good standards of occupational hygiene and with controls of exposure that are as good as is reasonably practical. It can help the employers or employees to see that their exposure controls are working or if they need improving.
The noninvasive forms of biological monitoring (based on urine or breath) offer the potential to be simple tools for small firms to check their exposure controls and target their (limited) resources to situations where they have problems. However, clearly defined guidelines would be required for such applications to be effective.

Problems associated with biological monitoring
Biological monitoring involves the collection of personal samples (blood, urine and breath) and their analysis to provide data. There are ethical issues however related to consent, use of data, and confidentiality. These are usually dealt with when the biological monitoring program is set up (often with the help of a physician). They involve giving the workers sufficient information about the nature and purpose of the program to enable them to give their informed consent and to say who should see the data generated. An important aspect of this process is to make it clear what the samples will be analyzed for and what they will not be analyzed for (alcohol, drugs, etc). It is also necessary to reassure the subjects that the data will not be used for any other purpose (14).
The collection of blood samples is usually considered invasive and, although tolerated by most workers, it is not popular. In practice, wherever possible, there has been a move away from blood sampling to less invasive samples based on urine or breath.
Advice on the correct sample to collect, when to collect it, the way to store it and transport it to a laboratory, and how to analyze it is available in reference books or from good laboratories (15-17).
One of the difficulties with biological monitoring is a lack of data to help interpret the results. There are biological monitoring guidance values for about 50 substances (18-20), but this is a very small proportion of the substances to which workers can be exposed. Part of the reason for the low number of biological monitoring guidance values is the large amount of data required to define a dose-response relationship which will allow a health-based biological value to be set. Part of the reason is historical: until recently biological monitoring was not considered as part of the occupational limit setting process. Part is due to a lack of appreciation of the utility of biological monitoring. It should be noted that many of the established guidance values are based on inhalation exposure data and inhalation exposure limits; dermal exposure is generally not considered. Biological monitoring data should preferably be based on pharmacokinetic or pharmacodynamic data obtained in volunteer studies. Such data take account of interindividual human differences and thus allow for more than a generic interpretation of exposure or health effects (ie, risk assessment). Many of these issues are now being addressed and, in particular, the development of biological guidance values linked to good practice and the control of exposure offers the potential for more guidance values in the future (20).
One of the strengths of biological monitoring is that it assesses exposure by all routes, but this feature can also be a problem when exposure occurs through several different routes. At the moment we cannot distinguish the route of exposure: there are no dermal-specific or inhalation-specific markers of exposure, and an elevated biological monitoring value may need careful investigation before the source can be identified. This may be a relatively simple matter for an occupational hygienist or health professional. The advantage of biological monitoring is that the effort can be targeted towards the areas or workers with elevated exposure rather than towards all areas or workers.

Future prospects for biological monitoring
Traditionally, biological monitoring has been done by occupational physicians and health professionals, and this is still the case in many European countries. In part this circumstance was due to the reliance on blood sampling and in part due to the links between biological monitoring and health surveillance. In addition, health professionals have been trained to deal with the ethical and consent issues. The move towards noninvasive sampling (mainly urine, but breath sampling is also possible for some substances) and guidance on the ethical issues, coupled with the growing awareness of the usefulness of biological monitoring as a tool for exposure assessment, has legitimately moved biological monitoring into the area of occupational hygienists and health and safety professionals. The comparative ease of biological sample collection makes it a simple procedure that small firms may find useful, for example, a urine sample collected at the end of the workday.
There is a growing use of biological monitoring in risk assessment, both by regulatory bodies and employers. The move towards noninvasive sampling will continue in many countries as all involved appreciate the benefits.
Although there is now more guidance for biological monitoring, both on when and how to do it and also with respect to the interpretation of the results, there is still a need for more. The 50 or so substances for which biological monitoring guidance is available are a small percentage of those to which workers may be exposed and which can be absorbed through the skin. Increased guidance will, in turn, lead to increasing use as part of both risk assessment and risk management. Biological monitoring guidance values linked to "good practice" should (i) help regulators set guidance values with fewer data, (ii) help employers demonstrate they are controlling exposure, and (iii) help workers by reducing their exposure to hazardous substances and give them (and their physicians) confidence that substances are being controlled and their health is not being harmed.
The overall trend towards reducing airborne occupational exposure levels means that skin as a route of entry will become relatively more significant, and, as a consequence, there will be a greater need for biological monitoring to monitor exposure.
The developments in molecular biology will bring greater potential for the phenotyping of the genetic polymorphisms in metabolism that give rise to some of the interindividual variations in biological monitoring results or in their impact on risk assessments. There are many ethical issues attached to this area, and it may take some time before clear guidelines emerge.
Most risk assessments are for single substances, but most workers are exposed to mixtures. Biological monitoring linked to PBPK modeling may offer ways of looking at mixtures in a structured way in order to improve risk assessments in the workplace.

Quality aspects
The quality of biological monitoring data depends on many processes, from sample collection to analysis. The literature includes well-established procedures and methods for ensuring the quality of results and a recent review of external quality control programs (21). In addition, there are several well-established, international quality assurance schemes available for most of the common biological monitoring analytes. The two largest schemes are operated by the University of Erlangen-Nürnberg in Germany and the Finnish Institute of Occupational Health. There is a need for these schemes to grow to match the number of new substances for which biological monitoring is helping to assess exposure and risk.

Risk assessment of dermal exposure
Knowledge of the relationship between skin contamination and the levels of internal exposure measured by biological monitoring is scant, in part because the methodology for assessing dermal exposure (22, 23) has not yet been validated. Nevertheless, large data bases on whole-body dermal exposure in occupational settings, using pads or coveralls together with monitoring gloves, have been constructed throughout the world, especially for agricultural pesticides. Some additional data have been collected using biological monitoring, and occasionally the two kinds of data have been obtained in conjunction with each other (24, 25). These data are widely used, again throughout the world, for the registration of agricultural pesticides.
Recently, similar approaches for the assessment of exposure have been used for biocides (26). For industrial chemicals, no substantial data on dermal exposures are available. The focus on exposure has previously been directed towards inhalation exposure, although health effects due to dermal exposure to, for example, coal tar and several pesticides, such as organophosphorus compounds, have been known for a long time.
For assessing dermal exposure to new and existing chemicals in the European Union, use is made of a dermal version of EASE (estimation and assessment of substance exposure) (27), which is based on expert judgment and a very limited set of experiments on the adherence of material to hands immersed in a few different liquids. Actual use of EASE in notifications and in European Union monographs on existing substances has shown that the estimated level of exposure of the skin is frequently unacceptably high. Further experimental verification is therefore needed of the levels of dermal exposure occurring during occupational tasks that may lead to appreciable deposition of chemicals on the clothing or skin of workers, or to the transfer of contamination to clothing or skin by contact.
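The arithmetic behind a band-based estimate of this kind is straightforward and can be sketched as follows. The band values and scenario names below are hypothetical placeholders, not the actual EASE tables; the point is only how a scenario band plus an exposed skin area yields a mass-per-day range.

```python
# Hypothetical exposure bands: potential dermal loading in mg/cm^2/day.
# These values are illustrative only and do NOT reproduce the EASE model.
EXPOSURE_BANDS = {
    "non-dispersive use, direct handling": (0.1, 1.0),
    "wide-dispersive use, direct handling": (1.0, 5.0),
}

def dermal_exposure_mg_per_day(scenario, exposed_area_cm2):
    """Return the (low, high) estimate of the mass deposited on the
    skin per day for a scenario, given the exposed skin area."""
    low, high = EXPOSURE_BANDS[scenario]
    return low * exposed_area_cm2, high * exposed_area_cm2

# Example: both hands, taken here as roughly 840 cm^2 of skin.
low, high = dermal_exposure_mg_per_day("non-dispersive use, direct handling", 840)
print(f"Estimated dermal exposure: {low:.0f}-{high:.0f} mg/day")
```

The wide span of such a range is precisely why the text calls for experimental verification: a factor-of-ten band propagates directly into the risk estimate.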
To fill this knowledge gap, research is needed to gather sufficient dermal exposure data to develop predictive exposure models that have theoretical and data-base structured components. An outline research program is proposed in the appendix.
The DEN will produce position papers on the use of biological monitoring data and strategies for risk assessment, both for single chemicals in notification and registration procedures and for the workplace, especially in small and medium-sized enterprises.
The risk assessment process itself, which compares exposure estimates with a health-based occupational exposure limit value, is rather complex for dermal exposure, since it requires many components that are, in most cases, not available (28). The major missing links are adequate data on dermal absorption relevant to the exposure scenarios involved (29) and a process by which dermal no-effect levels can be determined without dermal toxicity data. The route-to-route extrapolation (30, 31) required for this process is still insufficiently studied to ascertain such values with reasonable certainty, and it therefore requires assessment factors that may have to be relatively large.
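The logic of such a route-to-route extrapolation can be sketched numerically. The idea is to equate internal (absorbed) doses across routes: an oral no-effect level is converted to an internal dose via the oral absorption fraction, scaled back up by the dermal absorption fraction, and then divided by an assessment factor to cover the uncertainty the text describes. All values below are hypothetical.

```python
def dermal_noael(oral_noael_mg_kg, f_oral, f_dermal, assessment_factor):
    """Scale an oral NOAEL (mg/kg/day) to the dermal route by equating
    internal doses: internal = NOAEL_oral * f_oral, and the dermal dose
    delivering the same internal dose is internal / f_dermal. An
    assessment factor then covers extrapolation uncertainty."""
    internal_dose = oral_noael_mg_kg * f_oral
    return (internal_dose / f_dermal) / assessment_factor

# Hypothetical substance: oral NOAEL 10 mg/kg/day, 90 % absorbed orally,
# 10 % absorbed through the skin, assessment factor of 10.
print(dermal_noael(10.0, 0.9, 0.1, 10.0))  # prints 9.0 (mg/kg/day)
```

Note how sensitive the result is to the dermal absorption fraction; when that fraction is poorly known, the assessment factor may need to be much larger, which is exactly the weakness the text identifies.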
The large group of people that makes up the risk assessment subgroup of DEN, working in over 20 institutes or organizations from more than 10 European countries, indicates the great interest in, and the importance of, the risk assessment of dermal exposure.

Concluding remarks
Risk assessment in relation to the potential systemic effects of dermal exposure is much less sophisticated than that for ingestion or inhalation. This situation is largely due to limitations in the methodology, but also to previous poor recognition of the problems associated with dermal penetration. In order to evaluate risk it is necessary (i) to measure the amounts of material that may be deposited on the skin from direct contact with a chemical, from contact with contaminated surfaces and clothing, and from chemicals in the air; (ii) to assess dermal uptake into the systemic circulation; and (iii) to evaluate the potential health effects of the resultant internal exposure.
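The three steps above can be chained into a single screening calculation: mass on the skin, the absorbed fraction of that mass, and a comparison of the resulting internal dose with a health-based limit. The sketch below uses entirely hypothetical numbers and a hypothetical internal limit value; it shows the structure of the evaluation, not any established procedure.

```python
def systemic_dose_mg_kg(skin_loading_mg_cm2, exposed_area_cm2,
                        absorbed_fraction, body_weight_kg=70.0):
    """Chain steps (i) and (ii): mass deposited on the skin, then the
    absorbed share of it, expressed as a systemic dose in mg/kg/day."""
    deposited_mg = skin_loading_mg_cm2 * exposed_area_cm2   # step (i)
    absorbed_mg = deposited_mg * absorbed_fraction          # step (ii)
    return absorbed_mg / body_weight_kg

# Step (iii): compare the internal dose with a health-based limit.
# Scenario and limit are hypothetical.
dose = systemic_dose_mg_kg(0.5, 840, 0.05)
LIMIT_MG_KG = 1.0
print(f"dose = {dose:.2f} mg/kg/day; within limit: {dose <= LIMIT_MG_KG}")
```

Each factor in this chain corresponds to one of the methodological gaps discussed in the paper: surface and skin measurement for the loading term, validated absorption data for the absorbed fraction, and route-appropriate no-effect levels for the limit.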
Progress in the measurement of skin and surface contamination has been hindered by the lack of accepted terminology and methodology. DEN has paved the way for progress by agreeing on essential definitions, compiling a data base of accepted methodology, and identifying priorities for future research.
With respect to dermal penetration and absorption, there is a need for the generation of good-quality data, using validated methodology to support the comparison of different materials and using protocols of relevance to occupational exposure. Initially, such data are likely to be generated in vitro.
Biological monitoring involves the analysis of low levels of substances or their metabolites in urine or breath. It may be used in the validation of in vitro models of percutaneous penetration. Combined with an understanding of the metabolism, kinetics, and toxicology of the substance, it allows a significant improvement in risk assessment. Since well-developed biological monitoring has the advantage of estimating either the relevant uptake of a compound into the body or the relevant (early) health effect level, it is a powerful tool in risk assessment and risk management processes.
We believe there is a need to establish predictive exposure models for use in generic risk assessment and for risk assessment strategies appropriate to actual workplace situations. To this end, we are developing a comprehensive research program to incorporate data on percutaneous penetration validated by biological monitoring and physiologically based pharmacokinetic models.
Maximum value will be obtained from future research if investigations focus on priority substances wherever applicable, so that complete data sets can be incorporated into the developing models. The important emerging issues identified by DEN form a suitable basis for identifying such priorities.
We envisage that our paper will stimulate discussions to the benefit of the ongoing process of developing more adequate methodologies for the assessment and control of dermal exposure. While we have attempted to adopt a comprehensive approach to the subject, we recognize that the work instigated by DEN will inevitably raise further questions. Of note are the need to distinguish between different sources of exposure, how to deal with risk assessment for mixtures of chemical substances (for which some innocuous components may enhance dermal penetration of the more hazardous components), and other issues related to risk management.