Background Radiation

Michael I. Ojovan , ... Stepan N. Kalmykov , in An Introduction to Nuclear Waste Immobilisation (Third Edition), 2019

5.4 Background Radiation

Background radiation levels are not constant and vary worldwide from approximately 2 to 8×10³ μSv/yr (Fig. 5.2).

Figure 5.2. Average annual doses of radiation in different countries (in millisieverts per year).

Some areas with sizable populations have much higher than average background radiation levels. The highest are found primarily in Brazil, India and China and are due to high concentrations of radioactive minerals in the soil. In Brazil, for example, the monazite sand deposits along certain beaches result in external radiation levels of ~50 µGy/h. Natural radiation fields vary around certain average magnitudes everywhere. Temporal changes in background also occur over short to long time frames: hours to days, months to years, and centuries or more. In addition, there are changes in background from terrestrial and cosmic sources. Table 5.5 quantifies contributions to background radiation from various sources. Table 5.6 gives ranges and average doses from artificial sources of exposure.

Table 5.5. Annual effective natural background radiation doses, mSv

Source                   Dose range   Worldwide average
Inhalation (222Rn)       0.2–10       1.26
Cosmic rays              0.3–1        0.39
Terrestrial gamma rays   0.3–1        0.48
Ingestion (40K)          0.2–1        0.29
Total                    1–13         2.4
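The worldwide-average column of Table 5.5 can be cross-checked by summing its components; a minimal Python sketch with the values copied from the table:

```python
# Worldwide average annual effective doses from natural background (Table 5.5), in mSv
components = {
    "inhalation (Rn-222)": 1.26,
    "cosmic rays": 0.39,
    "terrestrial gamma rays": 0.48,
    "ingestion (K-40)": 0.29,
}

total = sum(components.values())
print(f"sum of components: {total:.2f} mSv")  # 2.42 mSv
```

The component sum, 2.42 mSv, is consistent with the quoted worldwide total of ~2.4 mSv.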

Table 5.6. Annual effective radiation doses of artificial origin, mSv

Source Dose range Worldwide average
Medical diagnosis From zero up to several tens 0.6
Atmospheric nuclear tests Some higher doses around test sites still occur 0.005
Occupational exposure From zero up to 20 0.005
Chernobyl accident In 1986 the average dose to >300,000 recovery workers was about 150   mSv and >350,000 other individuals received doses >10   mSv 0.002
Nuclear fuel cycle (public exposure) Doses are up to 0.02   mSv for critical groups at 1   km from some nuclear reactor sites 0.0002
Total From essentially zero to several tens 0.6

Fig. 5.3 shows the average annual radiation dose to the United Kingdom population from all sources. It is worth noting that 85% of the dose is from natural sources, largely associated with Rn emission from the ground. The dose arising from manmade radioactive discharges, both deliberate and accidental, is dominated by beneficial medical applications.

Figure 5.3. Average annual radiation dose to the UK population.

Courtesy of J. Plant, Imperial College, UK.

Air travel adds about 1.5–5.0   µSv per hour of flight to the average background dose. Exposure dose limits recommended by the ICRP are as follows: for the general public, 1   mSv/yr; and for nuclear workers, 20   mSv/yr averaged over 5 consecutive years.
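The air-travel contribution is simple arithmetic: hours flown multiplied by the in-flight dose rate, compared against the ICRP public limit. A minimal sketch; the 700 flight hours per year is an illustrative assumption (roughly aircrew-level flying), not a figure from the text:

```python
# Additional annual dose from air travel: hours flown x in-flight dose rate.
# The dose-rate range (1.5-5.0 uSv/h) is from the text; the flight hours
# are an illustrative assumption.
hours_per_year = 700            # assumed, a very frequent flyer or aircrew
rate_low, rate_high = 1.5, 5.0  # uSv/h at cruise altitude

dose_low = hours_per_year * rate_low / 1000.0    # convert uSv to mSv
dose_high = hours_per_year * rate_high / 1000.0

print(f"added dose: {dose_low:.2f}-{dose_high:.2f} mSv/yr")
print(f"exceeds ICRP public limit of 1 mSv/yr at the high rate: {dose_high > 1.0}")
```

At these assumed hours the flight contribution alone (1.05–3.5 mSv/yr) can exceed the 1 mSv/yr public limit, which is why aircrew are treated as occupationally exposed.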

URL:

https://www.sciencedirect.com/science/article/pii/B9780081027028000054

Resources in the Near-Surface Earth

P.G. Killeen , ... K.L. Ford , in Treatise on Geophysics (Second Edition), 2015

11.14.2.7.3.4 Background radiation

Background radiation is any radiation detected by the gamma ray spectrometer not originating from the source that is being analyzed, in this case the lithosphere (IAEA, 1976). In the case of a laboratory spectrometer, it includes radiation coming from or through the walls, ceiling and floor, and the lead counting chamber or shield. In the field, it includes the radiation from the vehicle, be it a human being carrying the spectrometer, a truck, or an aircraft. In addition, there is a cosmic ray component (most important for airborne surveys) and radioactivity in the atmosphere caused by radon and its daughter products and products from nuclear fallout. To the extent possible, the background should be minimized. In an airborne survey, for example, any flight instrument dials or emergency exit signs, etc. in the aircraft which are luminized with radium (found in older aircraft) should be replaced or removed. Calibration sources (as described in Section 11.14.2.7.3.1) should be removed completely or at least be shielded if they must be carried on the survey. Any remaining background should be measured accurately to enable its subtraction. The greatest problem with background is the variable or unknown component primarily caused by the radioactive decay of particles in the air. Background measurement has been reviewed in some detail, especially with reference to airborne gamma ray surveys (Grasty, 1979; Grasty et al., 1988; IAEA, 2003). Generally, the background radiation in a borehole gamma ray log is considered to be zero except in a few exceptional cases, since the detector is surrounded by the source. Cosmic ray effects are generally negligible, and the radioactivity of the air has no influence in a liquid-filled hole and negligible influence in an air-filled hole (except in uranium exploration holes) due to the small volume of air.
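The background subtraction mentioned above reduces to count-rate arithmetic with Poisson uncertainty propagation; a minimal sketch with illustrative (assumed) counts and counting times, not actual survey data:

```python
import math

# Gross spectrometer counts include the source plus background (vehicle,
# cosmic, airborne radon); the background is measured separately and subtracted.
gross_counts, t_gross = 12500, 60.0  # counts and live time in s (assumed)
bkg_counts, t_bkg = 1800, 300.0      # separate background run (assumed)

r_gross = gross_counts / t_gross
r_bkg = bkg_counts / t_bkg
r_net = r_gross - r_bkg              # net rate attributable to the source

# Poisson statistics: var(rate) = counts / t^2 for each independent measurement
sigma_net = math.sqrt(gross_counts / t_gross**2 + bkg_counts / t_bkg**2)
print(f"net rate: {r_net:.1f} +/- {sigma_net:.1f} counts/s")
```

A long, dedicated background run (here 300 s vs. 60 s) keeps the background's contribution to the net-rate uncertainty small, which is why the text stresses measuring any remaining background accurately.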

URL:

https://www.sciencedirect.com/science/article/pii/B9780444538024002098

Carcinogenesis

J.A. Jones , ... F. Karouia , in Comprehensive Toxicology, 2010

14.10.8.2 Dose Limitation

Human beings are exposed to natural background radiation every day from different sources, for example, the ground and the cosmos. Average annual exposures worldwide to natural radiation sources (both high and low LET) would generally be expected to be in the range of 1–10   mSv, with 2.4   mSv being the present estimate of the central value (UNSCEAR 2000). In the United States, the majority of exposure to background ionizing radiation comes from exposure to radon gas and its decay products, and as a result the annual background exposures are slightly higher, that is, 3   mSv (BEIR VII 2006). Figure 3 summarizes the approximate sources and relative amounts of high- and low-LET radiation that comprise the natural background exposure.

URL:

https://www.sciencedirect.com/science/article/pii/B9780080468846014111

Primary Cosmic Radiation

Peter K.F. Grieder , in Cosmic Rays at Earth, 2001

Diffuse Neutrino Sources and Spectra

Interactions of cosmic rays with the background radiation or with interstellar matter, in our own and other galaxies but also in intergalactic space, could produce a diffuse background with a spectrum that extends to the highest energies (Stecker, 1979; Volkova, 1980; Berezinsky and Learned, 1992; Yoshida and Teshima, 1993).

In addition, the large number of unresolved AGNs, GRBs and blazars are likely to produce a relatively strong diffuse flux of high energy neutrinos of all flavors (Stecker et al., 1992b). Finally, topological defects could be another, though highly speculative, source of very energetic neutrinos (Protheroe and Stanev, 1996; Sigl et al., 1997; Birkel and Sarkar, 1998; Yoshida et al., 1997; Protheroe, 1999).

Neutrino spectra of various diffuse sources from older calculations are illustrated in Fig. 5.74 together with the atmospheric background (Stecker, 1979 and 1992b; Berezinsky and Learned, 1992; Szabo and Protheroe, 1992 and 1994; Yoshida and Teshima, 1993). When comparing the neutrino intensity from unresolved AGNs shown in this figure with the flux of discrete sources shown in Fig. 5.73, it is readily recognized that the former could easily mask distant point sources.

Figure 5.74. Predicted spectra of some diffuse astrophysical high energy neutrino sources ( ν μ + ν ¯ μ ) . The spectrum of the diffuse neutrinos from unresolved AGNs is the integrated ν μ + ν ¯ μ intensity and is from a calculation of Stecker et al. (1991 and 1992a). Also shown is the background expected from photo-meson production of the extragalactic high energy cosmic rays with the cosmic background radiation for z = 0 and integrated over cosmic time to a redshift of z = 2.2 (Stecker et al., 1992b), and to galaxy formation according to Berezinsky et al. (1991), labeled "bright phase models" (see also Berezinsky, 1995). The horizontal atmospheric neutrino spectrum from high energy cosmic rays interacting in the Earth's atmosphere is also indicated (Stecker, 1979; Crouch et al., 1978; Volkova, 1980) (see also Berezinsky and Ozernoy, 1981).

The results of another set of calculations of the diffuse high energy neutrino flux from a variety of sources are summarized in Figs. 5.75 to 5.78. In Fig. 5.75 we show the muon neutrino ( ν μ + ν ¯ μ ) spectra from interactions of the cosmic radiation with the interstellar medium arriving from the direction of the galactic centre (l = 0°, b = 0°) and perpendicular to it (b = 90°) from calculations of Ingelman and Thunman (1996), Domokos et al. (1993) and Berezinsky et al. (1993).

Figure 5.75. Predicted diffuse muon neutrino and antineutrino intensities from cosmic ray interactions with the interstellar medium. Curves A and B are from the work of Ingelman and Thunman (1996), C and D from Domokos et al. (1993), and E and F from Berezinsky et al. (1993). Curves A, C and E are for the galactic coordinates l = 0°, b = 0°, curves B, D and F for b = 90°. The hatched area, K, represents the atmospheric neutrino background for the zenith angle range from θ = 0° (vertical, lower boundary of hatched area K) to θ = 90° (horizontal, upper boundary of K), after Lipari (1993). Curves G to J are neutrino intensities from cosmic ray interactions with the microwave background: G is for a maximum cosmic ray energy of 3 · 10^21 eV, H for Emax = 3 · 10^20 eV (Protheroe and Johnson, 1995 and 1996); I is from the work of Lee (1996) assuming that the highest energy cosmic rays are due to gamma ray bursters (GRB); and J is from the work of Hill and Schramm (1985) (after Protheroe, 1999).

Figure 5.76. Prediction of the diffuse muon neutrino and antineutrino intensities from blazars according to calculations of Mannheim (1995) for pγ (model A), curve A, and pp + pγ (model B) processes, curve B, and also Protheroe (1997), curves C and D, respectively. Curve E is from the work of Halzen and Zas (1997) for pγ only. Curve F shows the contribution from gamma ray bursters (GRB) according to Waxman and Bahcall (1997) (after Protheroe, 1999). The hatched area, K, shows the atmospheric neutrino background after Lipari (1993) over the entire zenith angle range from 0° (vertical, lower boundary of K) to 90° (horizontal, upper boundary of K).

Figure 5.77. Highly speculative muon neutrino and antineutrino intensities from model calculations of topological defects (for details see Protheroe, 1999 and references therein). Curve A, which is based on MXc² = 10^14.1 GeV and a magnetic field of B = 10^-9 G, is just ruled out according to Protheroe and Stanev (1996), and curve B for MXc² = 2 · 10^16 GeV and B = 10^-12 G is classified as just allowed according to Sigl et al. (1997). Curve C is from the work of Birkel and Sarkar (1998) for MXc² = 10^12 GeV, B = 0 G; and D is from Yoshida et al. (1997) for MXc² = 10^16 GeV and B = 0 G. The hatched area, E, indicates the atmospheric neutrino background from the calculation of Lipari (1993) for the full zenith angle range from 0° (vertical, lower limit of E) to 90° (horizontal, upper limit of E).

Figure 5.78. Muon neutrino ( ν μ + ν ¯ μ ) and electron neutrino ( ν e + ν ¯ e ) intensities from topological defects according to model calculations of Protheroe and Stanev (1996) using an injection spectrum that follows approximately an E^-1.5 dependence and extends up to ∼ MXc²/2, containing ∼3% nucleons and 97% pions. The curves labeled p, n, γ and e are production spectra of MX-decays and set bounds for the models and parameters (see text). The curves apply for MXc² = 10^14.1 GeV, a magnetic field of B = 10^-9 G and a model parameter of p = 2 (constant injection), where Q(t) = Q0(t/t0)^(-2+p) is the injection rate per evolving volume. The spectra of observable particles are normalized to the 3 · 10^11 GeV point of the cosmic ray spectrum, outlined by circles, as determined from air shower measurements (Bird et al., 1995). Indicated, too, are gamma ray intensities measured with the SAS-II (Thompson and Fichtel, 1982) and EGRET (Fichtel, 1996) satellite instruments and a HEGRA point at 100 TeV (Karle, 1995).

Also shown in the same figure are the spectra resulting from interactions of the cosmic radiation with the cosmic microwave background radiation (CMBR) for different cosmic ray spectral and source assumptions, as outlined in the captions (Protheroe and Johnson, 1995 and 1996; Lee, 1996; Hill and Schramm, 1985). The recent atmospheric neutrino spectrum of Lipari (1993), which represents the background for any extraterrestrial neutrino flux measurement, is also shown.

Another set of neutrino spectra, due to proton blazars and pγ and pp + pγ processes (Mannheim, 1995; Protheroe, 1997; Halzen and Zas, 1997), and from gamma ray bursters (Waxman and Bahcall, 1997), is shown in Fig. 5.76.

Topological defects could be another, though highly speculative, source of very high energy neutrinos. The results of four calculations using different sets of topological model parameters and assumptions, such as different X-particle masses and magnetic field strengths, specified in the caption, are presented in Fig. 5.77 (Protheroe and Stanev, 1996; Sigl et al., 1997; Birkel and Sarkar, 1998; Yoshida et al., 1997).

Spectra of so-called "visible particles" (p, n, γ, e) that are associated with the decay of massive X-particles from topological defects, resulting from the calculation of Protheroe and Stanev (1996) that produced the neutrino spectrum shown in Fig. 5.77 (curve A), are displayed in Fig. 5.78 together with the neutrino spectrum. Also indicated are gamma ray data in the GeV energy range from SAS-II and EGRET satellite measurements (Thompson and Fichtel, 1982; Fichtel, 1996), and from the HEGRA experiment at 1 TeV (Karle, 1995). These spectra set stringent limits on topological models and their many parameters.

URL:

https://www.sciencedirect.com/science/article/pii/B9780444507105500075

Medical Geology

In Developments in Earth and Environmental Sciences, 2004

Natural Radionuclides and Risks

In their paper "Two centuries since discovery of the chemical element uranium" (1989), V. Omaljev and A. Antonovic wrote as follows: "Uranium today is tasked with a heavy burden of guilt for current and future radioactive pollution of the planet. Especially worrying are uncertainties regarding possible (uncontrolled) mutagenic changes in living organisms, man above all. Systematic biological and medical investigations of this question for practical purposes began only after the atomic bombing of Japan, and a period of 45 years is too short to see the long-term risks." The authors ask whether this means the end of the rule of uranium.

Together with economic and other global questions, safety is certainly a pressing concern, and most people hold it to be the most alarming problem of the present day. In the event that nuclear power plants prove to be faulty for any reason and radioactive substances are emitted into the environment, or if nuclear tests are carried out and nuclear weapons used on a massive scale, the consequences could be catastrophic. At the present level of civilization, the modern world is increasingly destructive, and the consequences are unfortunately moving ever closer to a state of ending. We note only what was done recently to the population and environment of Iraq, Yugoslavia, and Bosnia and Herzegovina by tens of tons of depleted uranium used in the making of projectiles.

The anthropogenic component of the biosphere's radiation background is what has raised this global question. However, even after eventual cessation of the rule of uranium, there would remain a group of natural radionuclides with all potential risks to human health 54 . According to the results of studies performed to date, natural radiation of the lithosphere (biosphere) is responsible for one fourth of the total basic radiation load of all living beings. To be specific, the role of natural radionuclides and their radiation as sources of radiation risk to the biocenosis has been present since the beginning of life on Earth. Thus, natural radionuclides have through radiomutations not merely affected the evolution of living organisms, but also influenced the dynamics of equilibrium of ecological systems.

In contrast to the large doses of ionizing radiation produced predominantly by man-made radionuclides, it is fairly logical to maintain that in the case of so-called small doses, i.e., doses that do not cause clinical symptoms but which exert distinct bionegative action, natural sources of radioactive radiation assume the leading role. It is known that opinions among radiobiologists are profoundly divided about the effect of small doses. Nonetheless, many authors hold that a slight increase of absorbed radiation raises the probability of cataracts, increases the incidence of radiation illness, contributes to the formation of tumors, shortens life expectancy, and slows fetal development in the mother's uterus. Following the engagement of world experts on an international project during the period of 1980–1984, it was concluded (among other things) that: 1) small doses of ionizing radiation exert distinct bionegative action (boldface mine – M.M.); 2) the bionegative action of small doses of ionizing radiation conforms to a linear dependence with no threshold; 3) since it is impossible to isolate small doses of ionizing radiation as one of a large number of carcinogenic factors, it is necessary to prevent increase in the level of all of the indicated factors; and 4) we need to monitor constantly the radiation load of the population through the whole food chain, including cultivated plants, livestock, and foods of plant and animal origin.

Medical geology can make a great contribution in establishing natural preconditions for occurrence of small doses of ionizing radiation in rocks, soil, and water, and in planning measures designed to lower the level of carcinogenic factors. This is especially true because the content of radioactive elements in the natural environment is not everywhere the same, and organisms inhabiting a given region are exposed to radioactive radiation of varying strength: greater in some cases, lesser in others. We shall therefore dwell in the text to follow on certain geological factors that deserve special attention.

The first and very important step in investigating a region of interest is to isolate the uranium-bearing rocks and formations in it. We have already discussed the kinds of rocks (such as acidic granites) that are potential carriers of radionuclides. However, it was not stressed that uranium-bearing geological formations can potentially encompass enormous areas and thereby increase the natural radiation load of a large number of people. Thus, schists rich in organic substances and uraninite (for instance, the Chattanooga formation in the states of Alabama and Kentucky, covering an area of the order of 100 thousand km2) occur in many regions of the globe. In the United States, the Phosphoria formation (phosphate deposits with 0.003–0.03% uranium) and other formations on the Colorado Plateau and in Eastern Utah, Northeast Arizona, and Northwest New Mexico cover an even greater area. The Francevillien formation in Gabon covers an area of about 35 thousand km2. When it is taken into account that the Oklo deposit or phenomenon (a natural nuclear reactor) is found in that formation, it is easy to grasp the potential danger from constant reactions of fission in that region. Large amounts of thorium are found in sands of marine beaches in Southern India and (to some extent) Australia. By conducting regional geomedical investigation (mapping) of the natural radionuclides in rocks and soil in an area of interest to us, we are able to isolate radioecogeological ranges (regions) in which the biocenosis (including human beings) carries a certain radiation load. It is understandable that some regions of the earth are well known for high content of radioactive elements (the state of Kerala in India, the La Plata Plateau in Brazil, the wider region of the Oklo uranium mine in Gabon, etc.).

That man has been unable to adapt to natural radiation fields is shown by the results of international research indicating that regions with elevated content of natural radionuclides in rocks and soil are characterized by slow population increase, increase in the number of birth defects, increase in the number of organic diseases, and increased mortality (especially among children). Complex studies on the question of existence or non-existence of radiogeochemical endemias remain to be carried out. In their book "Radiation Hygiene in Biotechnology" (1991), B. Petrovic and R. Mitrovic state that there is research indicating decisive significance of radiogeochemical endemias, but also work that negates them. For example, English investigators maintain that elevated concentration of radon Rn222 in drinking water in the county of Devon (United Kingdom) represents the main cause of more frequent incidence of cancerous diseases among the population. Similar conclusions were reached by American investigators who carried out radioepidemiological studies on more than a million people in 111 cities in the states of Iowa and Illinois: they affirm that considerably elevated concentrations of the indicated radionuclide in drinking water above the average level lead to significant differences in mortality from malignant diseases of the bones (which are manifested particularly in persons more than 30 years old). Also, Soviet investigators reported results of measuring natural radioactive decay in the bones of persons who died of leukemia and ones who died of diverse injuries: the total activity of gamma-emitting radionuclides was more than two times greater in bones of those who died of leukemia than in bones of persons who died of traumatic causes.
From the results of his own investigations and reports of others, the French scientist Pincet concludes that there exists a significant correlation between the level of the radiation background in the biosphere and mortality due to malignant diseases. It follows from all that has been said that familiarity with the radiogeological regionalization of a certain area is very important in order to establish the radiation load of the population in it: doctors need to pay special attention to the most dangerous radioecogeological ranges.

Depending on the rocks from which it was formed and the content of its clay component, soil can be radioactively contaminated to a greater or lesser extent. Natural radionuclides accumulated in the soil are incorporated metabolically into plants and through contaminated food find their way to the organism of man and animals. The given danger to living beings understandably is especially expressed within radioecogeological ranges marked by an elevated background of natural radioactive elements. The problem is further complicated by the presence of strontium, cesium, and other artificial long-lived radionuclides, which will be discussed in a special chapter.

Uranium can be relatively easily transported, concentrated, and carried away from its principal deposit, depending on geological (hydrogeological) conditions. A very important role in this is played by groundwater, since uranium from the moment of its oxidation and dissolution in water moves almost freely through the Earth's crust (Fig. 2.37). Uranium by itself is not necessarily all that dangerous, but the products of its radioactivity not only threaten nature with radioactive radiation, they are also toxic as elements 55 . The role of aqueous solutions is therefore of primary significance for the creation of chemical dispersion aureoles of uranium and its decomposition in the surface dirt layer of river sediments. We have in mind here anomalous zones, with uranium content of up to 200 ppb and more (as much as 400 ppb and sometimes ten times higher in mine waters). Under favorable hydrogeological conditions (for example, the presence of extended fault zones), uranium can be transported over very great distances, which must be kept in mind. Thus, for example, the origin of high uranium content (50–100 ppb) in one spring in Colorado is linked with rocks at least 100 km away. For all of these reasons, in hydrogeological investigations of a region, special attention must be paid to the content of radioisotopes in drinking water and water used for production of food, so as to prevent them from entering the human organism in this way.

Fig. 2.37. Ratio of uranium concentration c (µg/l) and water mineralization m (mg/l) from the Ogallala formation in Texas (USA) (S.N. Davis, R.J.M. de Wiest, 1966). 1 – samples with HCO3 over 50% of all anions; 2 – samples with HCO3 less than 50% of all anions.

Pollution of the environment by radioactive elements is certainly at the very top of the list of all problems that have ever existed and called for protection of nature and mankind. The complexity and danger of pollution of nature by radioactive elements have forced man to study in the minutest detail processes involving radioactive elements. Natural and artificial radionuclides introduced into the organism by ingestion or inhalation are distributed to individual organs in keeping with the metabolism of the radionuclide itself and the sensitivity of the organ to radiation. All of this is taken into account in estimating the size of the contribution of radionuclides from drinking water, food products, and other sources of radiation to the total irradiation of the population.

Radiation that from the place of its generation enters inhabited regions or regions not directly connected with production or utilization of nuclear energy should not exceed 0.5 sieverts (Sv) 56 per person annually. In regions with widespread radioactive minerals, the dose of background irradiation a man receives can be 4 times greater than the given limit, which must have greater or lesser harmful effects on human health 57 . If some short-term radiation is involved, more considerable changes in the cardiovascular system can set in at a dose greater than 150–200 mSv, while a dose exceeding 700–1,000 mSv is usually fatal. The term maximal permissible concentration is used in practice, although it is highly questionable whether there exists a limit below which safety is guaranteed. Even if a maximal permissible dose of radiation is found, it remains for us to establish the action of individual radioisotopes, especially inasmuch as they emit different kinds of radiation and at different rates. Moreover, some elements have a tendency to accumulate in different parts of the human organism. Radioisotopes of such elements cause concentrated biological damage. For example, plutonium, radium, and strontium accumulate in the bones; iodine accumulates in the thyroid gland; and lead and water-soluble isotopes accumulate in the kidneys. The lungs are the critical organ in the case of radionuclides in undissolved form.

Differences in the biological danger caused by different radioisotopes are generally great. For example, if we compare Ra228 and H3 (both with a half-life of several years and emission of low-energy beta rays), we see that the time of their residence in the human organism is different. A large part of H3 disappears from the organism after several weeks, whereas Ra228 remains in the bones forever. Several other specificities in the behavior of the more important radioisotopes are examined below.
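The different residence times follow from the standard effective half-life relation, 1/T_eff = 1/T_phys + 1/T_bio, which combines physical decay with biological elimination; a minimal Python sketch (the ~12-day biological half-life assumed for tritium is an illustrative value; the 45-year biological half-life of Ra-226 is quoted in the text below):

```python
# Effective half-life: a radionuclide disappears from the body by both
# physical decay (T_phys) and biological elimination (T_bio).
def effective_half_life(t_phys: float, t_bio: float) -> float:
    return t_phys * t_bio / (t_phys + t_bio)

# Tritium decays physically in ~12.3 y but is eliminated biologically in
# days (assumed ~12 d here), so elimination dominates; for Ra-226
# (physical half-life 1600 y, biological 45 y) retention in bone dominates.
t_eff_h3 = effective_half_life(12.3, 12 / 365)  # years
t_eff_ra = effective_half_life(1600, 45)        # years

print(f"H-3 effective half-life:    {t_eff_h3 * 365:.1f} days")
print(f"Ra-226 effective half-life: {t_eff_ra:.1f} years")
```

The sketch shows why two nuclides with similar physical half-lives and radiation types can still pose very different risks: the effective half-life is dominated by whichever removal process is faster.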

The most widespread isotope of radium, Ra226, is the most toxic of all inorganic components and has very strong carcinogenic action. Its biological half-life is quite long, 45 years. Its chemical behavior is similar to that of calcium. By means of resorption from the soil, it readily enters plants and later reaches man by way of animals and food of animal and plant origin. Although the average concentration of this element introduced with food is 111–185 mBq/day, American investigators established great variability of its concentration in different kinds of food. For example, the very high activity and content of Ra226 in filberts as a natural phenomenon has occupied the attention of many investigators.

According to B. Petrovic and R. Mitrovic (1991), many workers have investigated the content of Ra226 in the human organism and attempted to determine the radiation load from this radionuclide. It has been established that in inhabitants of Europe and Central America, its average amount in bones is 0.37–0.55 mBq/g of ash (which for the average man corresponds to an activity of 1.11–1.48 mBq). This figure is somewhat higher in regions of North America and Asia.

The presence of radon in water is not dangerous to human health. However, radon in the gaseous state exerts undesirable influence. Radon is fixed in the lungs, the critical organ. The real danger of radon is from its short-lived products bismuth and polonium. Radon and its products emit predominantly alpha rays, which have a very limited range, but are energetically quite strong and within a diameter of less than 1 mm cause heavy harm to the lungs. The risk of cancer is directly dependent on the concentration of this gas in the space where man spends his time and is especially high for miners working in shafts in uranium deposits.

The positive influence of radon and unknown doses of alpha rays from products of its breakdown on the human organism is widely used in balneology. Radioactive (radon) waters alter the blood picture, lower blood pressure, affect certain allergic diseases, act on the performance of the central and vegetative nervous system, stimulate many compensatory-adaptive responses of the organism, etc.

URL:

https://www.sciencedirect.com/science/article/pii/S157191970480003X

Imaging Through the Atmosphere☆

N.S. Kopeika , ... Y. Yitzhaky , in Reference Module in Earth Systems and Environmental Sciences, 2014

Seeing Limit

The resolution impairment caused by the atmospheric background radiation can be understood in the following way. The decrease of the MTF with increasing spatial frequency signifies contrast degradation at higher spatial frequencies. At some relatively high spatial frequency, the system MTF has decreased to such a low contrast that it is below the threshold contrast function required by the observer at the output. This means that such high spatial frequency content of an image cannot be resolved by the observer because of its poor contrast.

The contrast degradation caused by atmospheric background causes the overall system MTF to be damped uniformly across the spatial frequency spectrum. As shown in Figure 8, this leads to a reduction in f r max or an increase in Δx in eqn [6], thus impairing resolution. For orientation, recognition, and identification requirements, as opposed to simple detection, f r max should be divided by the required number of TV lines or spatial frequencies (discussed in Biberman (1973), or Kopeika (1998), concerning the Johnson chart).
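Numerically, uniform damping multiplies the system MTF by a constant factor, which lowers the frequency at which it crosses the observer's threshold contrast. The sketch below assumes a Gaussian MTF and a constant 5% threshold; both forms and all numbers are illustrative assumptions, not values from the text:

```python
import math

# Illustrative Gaussian system MTF; f0 and the units (cycles/mrad) are assumed.
def mtf(f: float, f0: float = 50.0) -> float:
    return math.exp(-(f / f0) ** 2)

def max_resolvable_freq(damping: float, threshold: float = 0.05) -> float:
    # Scan upward for the highest frequency at which the uniformly damped
    # MTF still meets the observer's threshold contrast.
    f = 0.0
    while damping * mtf(f + 0.1) >= threshold:
        f += 0.1
    return f

f_clear = max_resolvable_freq(1.0)   # no atmospheric background damping
f_damped = max_resolvable_freq(0.3)  # uniform damping by atmospheric background
print(f"f r max drops from {f_clear:.1f} to {f_damped:.1f} cycles/mrad")
```

Dividing the damped limit by the Johnson-chart line counts then yields the reduced orientation, recognition, and identification performance the text describes.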

Figure 8. Reduction in usable resolution imposed by uniform damping of the modulation contrast function (MCF) by atmospheric background. MTF, modulation transfer function.

Read full chapter

URL:

https://www.sciencedirect.com/science/article/pii/B9780124095489090539

Measuring Temperature

Dario Camuffo, in Microclimate for Cultural Heritage (Third Edition), 2019

17.3.6 Good Practices and Misleading Interpretations

Generally speaking, environmental diagnostics based on the results of only one methodology may be misleading, and a good norm is to cross-compare findings derived from independent methodologies. The problem is that a number of different mechanisms may produce the same thermal image, and a thermogram may be useful, but if it is considered alone, it is neither necessary nor sufficient to identify which of the various mechanisms is at work. While thermograms constitute a useful investigation tool, they are not per se sufficient to draw conclusions without the support of other specific investigations that are always necessary for confirmation. In the following discussion, some useful examples are reported in order to explain this methodology.

Critical factors are cold spots, capillary rise, and water percolation. Most handbooks suggest that dampness in masonry, e.g. from capillary rise and water percolation, is easily recognized from the cold spot generated by evaporation from the damp surface. The reason is that the latent heat of evaporation is supplied partly by the air and partly by the surface, which is consequently cooled. This is mostly true, as shown in Fig. 17.24.

Fig. 17.24

Fig. 17.24. Visible and IR image of dampness due to capillary rise and gutter percolation at the corner. The damp area is contoured with the green line in the visible picture and appears as the colder part in the thermogram.

However, several examples have been found where the coldest part of the thermogram is not damp, or even where dampness is found in the warmest part of it, as shown in Fig. 17.25. In Grimani palace, Venice, unsealed windowsills allowed rainwater to percolate into the wall on both sides, forming damp spots (Camuffo et al., 2011). The spots were cold in the morning, as generally expected, but warm in the afternoon. The reason is that dampness in the wall constitutes a thermal bridge, connecting the inside with the outside surface. In the early morning (Fig. 17.25A), the damp spots were the coldest areas not only because of reduced evaporation but also because the inside heat was transported by conduction to the outside. In the afternoon (Fig. 17.25B), the heat flow was reversed: the external surface, facing east, was overheated during the whole morning and the external heat entering the room formed the hot spots. Dampness increases the thermal conductivity and the thermal capacity. When the conductivity increases, it forms a thermal bridge and the heat flow also increases, and this effect may largely dominate over weak evaporative cooling. Also, the window posts facing the solar beams and the glass panes were overheated and formed secondary warm bands inside.

Fig. 17.25

Fig. 17.25. Water infiltration: sometimes a colder spot, sometimes a warmer one. Temperature (°C) mapping of a room with damp spots on a wall. Damp spots of percolated water lie on both sides of the windowsills, on 14 May 2008. At 9:00 (A), the damp spots were the coldest areas; at 15:00 (B), they are warm because of the thermal bridge established with the exterior. Other warm spots correspond to heating from window posts and glass panes. Thermogram obtained by sampling with a nonimaging radiometer and plotting with computer mapping.

From Camuffo et al. (2011) by kind permission of Nardini Editore, see credits.

In the next thermogram (Fig. 17.26), windows of a historic palace are contoured by cold bands in winter. Possible hypotheses are: (1) the wooden frame is affected by leakage and cold air blows in, cooling the nearby masonry; (2) some water percolated into the wall from the window and the evaporation causes the cooling in the damp area; and (3) the outdoor temperature is lower, and heat naturally passes from higher to lower temperature levels, crossing the wall and the window pane, which is colder. Still, some heat crosses the wall diagonally via a shorter path going out through the windowsill and the two lateral posts, forming a cold ring all around the window. In this case, the vertical cold bands on both sides were due to heat transferred across the shorter diagonal path; the cold area under the window, where individual bricks are distinguishable, is due to water percolation that locally increases the heat conductivity across the masonry, forming a thermal bridge with different efficiency corresponding to bricks or mortar. The problem with thermograms is that different mechanisms may have the same appearance.

Fig. 17.26

Fig. 17.26. These windows are surrounded by cold bands. Cold bands may be caused by air leakage through the frame, water infiltration, and a shorter path for heat transfer. See text for explanation.

From Camuffo et al. (2011) by kind permission of Nardini Editore, see credits.

Two other examples of warmer evaporating surfaces: in winter, the water flowing in the Venice canals is mild, being continually exchanged with the sea, which acts as a huge thermal buffer. In contrast, the temperature of the air in this region is cold or very cold. Palaces built on the edge of canals have their basement immersed in water and continually receive heat by conduction. In the first example (Fig. 17.27), the basement is damp and characterized by a green–dark-brown belt of algal infestation. Although this band is damp and evaporates, it is warmer than the upper part of the building, which is dry but is in contact with the cold air without benefitting from the heat supplied by the seawater.

Fig. 17.27

Fig. 17.27. Sometimes rising damp is warmer. Grimani palace, Venice, mixed visible and IR picture. In winter, the water temperature in the canal is at 4°C (yellow–orange) and the damp band infested by algae on the building (orange) is at almost the same level. The upper part of the wall, dry, is colder (magenta). The heat supplied by the canal is dominant over the heat lost due to (small) evaporation. The evaporating band at the basement is warmer.

In the second example, a funeral monument in a Venice church has a basement in contact with the soil, which is mostly damp due to a variable water table fed by the tide level (Fig. 17.28). The underground water migrates, transporting heat. The damp funeral basement has modest indoor evaporation, but especially benefits from the heat supply. Again, this evaporating surface is warmer than the rest of the church.

Fig. 17.28

Fig. 17.28. Sometimes rising damp is warmer. The funeral monument of Canova, inside the Basilica dei Frari, Venice, is affected by capillary rise of water originating from the canal. However, the indoor evaporation occurs at a very slow rate and the heat transported by the mild canal water is dominant over the loss of latent heat due to evaporation.

Courtesy of Curia Patriarcale of Venice ©, used with permission.

In these examples, evaporation was occurring in the warmest part of the image. The next thermogram shows another potentially misleading case: a cold area, but not related to dampness. In the warm season, in this crypt (Fig. 17.29), warm air penetrates from outside, heating pillars, ceiling, and walls. However, corners and the border between floor and walls are hardly reached by the warm air entering the crypt.

Fig. 17.29

Fig. 17.29. Do colder areas always correspond to rising damp? A ventilated crypt in the warm season. The external air warms pillars and walls but hardly reaches corners and edges, which remain colder. The floor is in brick and the column in squared stone blocks, with higher conductivity. For this reason, the floor at the base of the column is colder. This is not capillary rise but higher conductivity compared with bricks (incidentally, bricks are more porous and favour capillary suction).

The result is that all corners and edges are colder, and sometimes this cold may cause elevated moisture levels and a favourable habitat for mould colonization. In this case, the cold areas are not due to capillary rise and evaporation but only to reduced ventilation; for this reason, they might be affected by condensation, i.e. the reverse of evaporation.

Glass is mainly transparent, partially reflecting, and slightly absorbing in the visible, while it is mainly absorbing, partially reflecting, and partially transparent in the near infrared (NIR) (i.e. partially transparent for wavelengths from 750 nm to 3 μm), and almost opaque to IR radiation above 3 μm. For this reason, radiometric readings monitor the radiant heat from the glass but are also affected by the reflection from bodies in front of the glass, e.g. the operator (Fig. 17.30A and B). An example of IR absorption is given in Fig. 17.30C, where a glass pane, 2 mm thick, is put in front of a cup of tea, covering the right-hand side, which becomes invisible to the thermal camera that operates in the 8-to-14-μm window. If a thermal camera operating in the 1.5-to-5.0-μm range had been used, a faint image of the half-cup would have appeared through the glass.

Fig. 17.30

Fig. 17.30. Problems of reflection and transmission encountered with glass. (A) Visible light and IR reflection on the glass panes of a glass-door bookcase. The reflected image of the operator apparently increases the glass temperature by some 4°C. (B) IR reflection on stained glass panes. The temperature difference between pointing at the forehead and just outside it is 2.8°C. The metal frames appear warm because they reflect the IR emitted by the operator. (C) A cup of tea, and the same with a thin glass pane (GP) placed in front of it, on the right-hand side. The IR absorption of the glass pane makes it invisible to the thermal camera operating in the 8-to-14-μm window.

(A,B) From Camuffo (2010), by kind permission of Nardini Editore, see credits.

Metals are good IR reflectors and behave as mirroring surfaces. For this reason, radiometers are not suitable to measure metal (or glass) temperature. A practical trick is to stick onto the metal (or glass) surface a piece of black vinyl electrical tape characterized by high emissivity (see later). In a few minutes, the tape will assume the temperature of the metal, making it possible to measure the temperature of this black surface while avoiding reflected or transmitted radiation.

A clever use of paper (emissivity ɛ = 93%) is to measure the air temperature. Air is transparent to IR, but a sheet of paper reaches equilibrium with the air temperature, and a radiometer, or a thermal imaging camera, may measure it. Unrolling a jumbo roll of thick Kraft wrapping paper, or a roll of thick paper towels, a long strip is obtained that may be raised vertically from the floor to the ceiling to measure the thermal layering of the air in the room. This method has the advantage of taking readings of both the surface and the air temperature using the same instrument, i.e. a thermal camera, thus avoiding errors due to the particular response (or scale) of different methodologies.

A very advantageous practice (Camuffo, 2010; Camuffo et al., 2010b) is to use a roll of paper to obtain vertical temperature profiles, as in Fig. 17.31, during a field survey in a church to control the efficiency and risks of a warm-air heating system. Once the paper roll is raised vertically, it is sufficient to take repeated thermograms at regular time intervals to monitor how heat is distributed and the resulting thermal layering.
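The pixel-sampling step behind such profiles can be sketched as follows; the tiny synthetic thermogram and the strip column index are hypothetical, chosen only to show the row-wise averaging:

```python
import numpy as np

def strip_profile(thermogram, col, half_width=1):
    """Average a narrow pixel band centred on the paper strip, row by
    row, giving the vertical air-temperature profile (top row first).
    Assumes the strip is vertical and centred on pixel column `col`."""
    lo, hi = max(col - half_width, 0), col + half_width + 1
    return thermogram[:, lo:hi].mean(axis=1)

# Synthetic 5-row thermogram (deg C) with warm air layered near the
# ceiling, as a warm-air heating system would produce.
frame = np.array([
    [24.9, 25.0, 25.1],   # ceiling level
    [22.0, 22.1, 21.9],
    [19.0, 19.1, 18.9],
    [17.0, 17.1, 16.9],
    [15.0, 15.1, 14.9],   # floor level
])
profile = strip_profile(frame, col=1)
print(profile)            # temperatures decrease from ceiling to floor
```

Repeating this extraction on thermograms taken at regular intervals gives the family of profiles shown in Fig. 17.31.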

Fig. 17.31

Fig. 17.31. Air and surface temperature simultaneously detected with a thermal imaging camera. Plaster, wood, and paper have almost the same emissivity. With a vertical strip of paper, it is possible to monitor both the surface and the air temperature in the same thermogram. The vertical temperature profile is obtained from the pixels sampled on the paper strip. On the right, the vertical temperature profiles made with repeated thermograms starting when the warm-air heating was turned on, i.e. at time 0 min (blue line), and after 60, 75, 90, 100, and 130 min (colour codes in the legend). After 2 h, the top level was heated by 10°C, the churchgoers' level by 5°C.

Source: Church in Agordo, Italian Dolomites; from Camuffo (2010), see credits.

The same thermograms show, in addition to the thermal profile in air, the direct impact of heating on objects, decorations, and structures, which respond in different ways depending on their heat conductivity. This method is particularly useful because the image provides at the same time the forcing factor (i.e. warm air) and the effect (i.e. the response of all objects). The thermogram is accurate because paper and the materials most commonly found inside historic buildings, i.e. mortar, bricks, plaster, frescoes, tapestry, and wood, all have almost the same emissivity, i.e. 0.90 < ɛ < 0.94. This methodology is useful for environmental diagnostics and preventive conservation purposes, as well as for tuning the heating system, e.g. lowering the warm-air temperature or increasing the air blowing rate to reduce internal layering and overheating in the upper part of the building.

In general, radiometers and thermal cameras are equipped with a knob to adjust the emissivity value. Let us suppose an extreme example where the target is an object made of stainless steel with ɛ = 7%. This means that only 7% of the radiation emitted by the target will contribute to the reading, while 93% of the radiation originates from other bodies and is reflected by the target. In this case, the target behaves as a mirror. The temperature of the object is irrelevant, and the instrumental reading will be high or low depending on the reflected radiation, i.e. whether the surrounding environment is warm or cold. To correct the instrument, one should both adjust the surface-emissivity knob and supply the surrounding temperature; still, the signal-to-noise ratio is so small that the reading is exceedingly uncertain. A metal temperature cannot be measured with a radiometer or a thermal camera. Conversely, if the object is a book with ɛ = 93%, only 7% of the measured radiation is reflected and the rest is signal, so it makes sense to correct the reading. This means adjusting the emissivity knob, but also specifying the temperature of the environment from which the perturbing radiation is emitted.
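The correction described above can be sketched with a total-radiation (T^4) approximation; a real camera inverts a band-limited calibration curve instead, so the function and the numbers below are illustrative only:

```python
def corrected_temperature_K(reading_K, emissivity, ambient_K):
    """Remove the reflected ambient contribution from a radiometric
    reading: measured ~ eps*T_obj**4 + (1 - eps)*T_amb**4 (sketch)."""
    signal = reading_K ** 4 - (1.0 - emissivity) * ambient_K ** 4
    if signal <= 0:
        raise ValueError("reading dominated by reflected background")
    return (signal / emissivity) ** 0.25

# Book (eps = 0.93): the correction barely moves the reading.
print(corrected_temperature_K(295.0, 0.93, 290.0))   # close to 295 K

# Stainless steel (eps = 0.07): a 1 K change in the reading swings the
# 'corrected' temperature by over 10 K, because the reading is almost
# entirely mirror-reflected environment -- the determination is useless.
print(corrected_temperature_K(291.0, 0.07, 290.0) -
      corrected_temperature_K(290.0, 0.07, 290.0))
```

The two cases make the signal-to-noise argument concrete: at high emissivity the correction is a small, stable adjustment; at very low emissivity it amplifies any reading error beyond usefulness, which is why the black-tape trick is preferred for metals.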

A popular solution to avoid the reflection from third bodies is to simulate a black body with the help of a black vinyl electrical tape (ASTM E1933-14, 2014). The method consists in applying a piece of black tape (95 < ɛ < 97%) on the target surface, waiting a few minutes so that the tape reaches thermal equilibrium, and then pointing at and taking a shot of the tape.

This black tape method is also used to determine the unknown emissivity of surfaces. The unknown emissivity may be determined in various ways, e.g. traditionally by comparison with a reference and with the help of some formulae and measurements based on: (1) a known reference emitter; (2) a reference temperature measured directly (e.g. with a thermocouple); (3) two different, known temperatures to which the target is raised; (4) reflectivity of an object at a different temperature (Madding, 1999); or with image processing software (Pitarma et al., 2016). The instruction manual of the thermal camera may provide further details about these or other possibilities.

If the surrounding environment includes emitting surfaces at different temperatures, e.g. hot or cold spots, incandescent lamps, windows, or persons, and the white-body temperature differs from the target temperature, it is necessary to perform a correction based on the target emissivity and the effective ambient temperature.

As opposed to the black tape method, which avoids the background radiation diffused in the environment, the background radiation may be determined with the help of a diffusive mirror, i.e. a 'white body'. This can be obtained with a finely crumpled foil of reflective polished aluminium (ɛ = 5%), where the crumpling transforms directional reflectivity into diffuse reflectivity. Pointing at and shooting this white-body surface positioned near the target, one obtains a reading that is 95% representative of the environment temperature and only 5% of the aluminium.

After these determinations, it is possible to adjust the emissivity knob of the IR thermal camera to the appropriate emissivity value and add the input of the effective temperature that characterizes the background radiation. This operation allows more precise temperature determinations of the target object. Nevertheless, it may be useful to clarify that a precise determination does not imply that the thermogram will provide a nice image. An example is shown of a thermogram of some objects taken in winter in a heated room and then repeated in summer. In winter (Fig. 17.32A), the wall temperature is around 20°C, but the air is a few degrees warmer because of the convector radiators, and the most exposed parts of the objects gain some heat and become distinguishable from the wall behind them. Books with gold titles on the spine are visible because they reflect the IR emitted by the operator. In summer (Fig. 17.32B), the room is not conditioned, and walls and objects are at the same temperature level, or with very small differences. This uniformity makes the objects indistinguishable from the background. In winter, the image is aesthetically nice for the contrast of temperatures; the temperature determination may require emissivity correction. In summer, the image is dull because everything is homogeneous; there is no need for correction and the temperature determination is very precise. This follows a principle that pleasant images with nice colour contrasts do not necessarily reflect precise temperature determinations and may need corrections, and vice versa.

Fig. 17.32

Fig. 17.32. (A) Thermogram of some objects in winter, in a heated room, when air and objects are a bit warmer than the wall and become distinguishable (left). (B) The same in summer, in a room without air conditioning, where the temperature is homogeneous and objects are indistinguishable from the wall (right).

Read full chapter

URL:

https://www.sciencedirect.com/science/article/pii/B9780444641069000171

Volume 3

T. Paunesku, G.E. Woloschak, in Encyclopedia of Environmental Health (Second Edition), 2014

Introduction

Ionizing radiation is a part of the natural environment humans are exposed to on Earth, with most background radiation coming from radionuclides naturally present in food and the air (radioactive potassium and radon). Additionally, different parts of the earth have different background radiation derived from pollutants and space radiation. In recent years, radiation from medical procedures (such as computed tomography (CT) scans, nuclear medicine procedures, and others) has been on the rise and now leads to probably 5 times as much radiation exposure to humans as it did 20 years ago. The yearly radiation dose to which an individual is exposed can be estimated using the dose chart on the website of the American Nuclear Society (www.ans.org/pi/resources/dosechart).

Several issues are of particular importance when considering the effects of ionizing radiation on organisms, including the quality of the radiation, the total dose, and the dose rate of the radiation exposure. Although any ionizing radiation causes ionizations in the matter it traverses, particulate radiations have high linear energy transfer (high LET) and can deposit larger amounts of damaging energy per unit gram of tissue. High LET radiations include alpha particles (the daughter products of radon gas, for instance) and neutrons (which result from nuclear bombs).

The main cellular target of ionizing radiation is nuclear deoxyribonucleic acid (DNA), and the two major, best-known, and most studied complications associated with radiation are mutations that lead to the development of cancer and mutations that occur in germ cells of irradiated organisms, causing hereditary mutations. In recent years it has become more apparent that the effects of radiation on nuclear DNA are heavily influenced by the cross-interaction between radiation-induced DNA lesions themselves and the current status of the damaged DNA site in the cell. An actively transcribed DNA sequence will be repaired differently from one that is inactive in a given cell type. In addition, the relative distance of the lesion from regulatory regions of the chromosome, such as telomeres and centromeres, also plays a large role in how the damage will be repaired. Availability and proximity of DNA repair protein machineries and the biochemical state of the DNA, such as DNA methylation, all play a role in events that deal with a new DNA lesion. In other words, although the primary DNA sequence is the molecule damaged by radiation, additional epigenetic factors are responsible for the final fixation of mutations. The ultimate outcome of DNA damage depends on the environment of the cell/tissue/organism, as well as cellular factors, overall cellular metabolism, and the state of the protein machinery dealing with reactive oxygen species (ROS). All of these factors create the environment encountered by the DNA damaging event (Figure 1), and all of them play a role in (1) the initiation and promotion of cancer, (2) 'fixation' of mutations in the germline cells, (3) teratogenic effects in the progeny, and (4) normal tissue toxicity. The first two categories of outcomes depend primarily on the DNA damage itself. Because DNA damage depends on a chance ionization event caused by irradiation, cancer and hereditary mutations display the stochastic effects of radiation. On the other hand, the second two outcomes depend mostly on inherent and relatively constant cellular biochemistry; therefore they showcase primarily the deterministic effects of irradiation.

Figure 1. Genetics, epigenetics, and environment modulate effects of irradiation on organisms. The spectra of possible outcomes of irradiation include teratogenic effects, tissue toxicities, hereditary effects, and the development of cancer.

At the level of a whole organism, any or all of these effects can be observed (Figure 2). Each individual cell experiencing DNA damage may either (1) repair the damage fully and return to normal homeostasis; (2) repair the damage partially and fail to return to a state of homeostasis, acquiring genomic instability, which leads to new DNA mutations; (3) repair the damage partially and differentiate; or (4) fail to repair the damage and die. Because irradiation of a whole organism affects many cells, any one of the four scenarios is likely to happen in some cells or tissues. Therefore, irradiation of an animal can be expected to cause tissue toxicities to a degree dependent on tissue type and irradiation parameters, mostly as an outcome of inappropriate cell differentiation and cell death. In an embryo the same situation can lead to teratogenic effects. Partial DNA repair and/or genomic instability, on the other hand, can result in hereditary mutations and the development of cancers.

Figure 2. An organism's reaction to radiation may favor cell survival, leading to a genomic instability scenario, with accumulation of mutations, epigenetic DNA changes, and transcription arrest. Long-term results of these events could include the development of cancer or the accumulation of hereditary mutations. Conversely, when a reaction to radiation triggers mechanisms for the maintenance of genomic stability, cell death or differentiation can occur; this can lead to the development of tissue toxicities or teratogenic effects.

Historically, the effect of ionizing radiation on organisms was not recognized until decades after the discovery of radionuclides and X-rays and their use both therapeutically and diagnostically. In fact, many early radiologists and radiation oncologists died of cancers caused by radiation effects, never realizing the extent to which radiation exposures induce cancer development. A more complete understanding of the possible spectra of radiation effects was reached only after the use of atomic bombs in Japan. This led to an era of extensive research in radiation biology, prompted primarily by cold war era concerns about nuclear warfare, as detailed recently by Abbott. While studies of bomb survivors and accidentally exposed humans continue to occupy much of radiation research, controlled radiation exposures of model animals became the main focus of government-sponsored radiation research programs. In the United States, Europe, and Asia, studies of the effects of ionizing radiation included the use of (1) external beam radiation (gamma rays and neutrons, similar in quality to fission spectrum neutrons) and (2) radionuclides (such as plutonium, cesium, etc.). Radiation exposures were carefully controlled and studied in different animals, most usually mice, rats, and dogs. It is worth noting that this work preceded knowledge of the DNA structure and the genetic code.

The first experiments looking at the effects of exposing DNA to radiation were conducted in the 1950s, completely in vitro, by Peacocke and others. The structure of DNA and its role as genetic material were uncovered during the same period. Later, the advent of molecular and cellular biology initiated studies of the molecular effects of radiation.

Meanwhile, medical research on radiation effects was progressing steadily, leading to the development of an intervention procedure that mitigates the bone marrow syndrome caused by radiation exposure. Thus, four victims of a radiation accident in Vinca (in the former Yugoslavia) in 1959 were saved by the first human bone marrow transplantations, performed by Dr. Mathe in France. This rescue intervention was based on a single successful animal experiment: bone marrow transplantation in hamsters. The victims of the Vinca accident received combined doses of 165 to 227 cGy neutrons and 158 to 209 cGy gamma rays. Whole body exposures at these doses are lethal without medical intervention, and the exposed four survived primarily because of the bone marrow transplantations. The health of these individuals was followed throughout their lives; nineteen years after the accident their lymphocytes still showed chromosomal aberrations such as dicentrics and ring chromosomes. Hematopoietic stem cell transplantation is still used today as one of the standard treatments after radiation injury, albeit with varying success.

The effects of radiation are so complex that, of necessity, most researchers focus only on one or a few of the effects of radiation in a single study. Presented below are results of recent investigations of the effects of irradiation at molecular, cellular, and whole-organism levels. The conclusions of these studies corroborate each of the four major types of irradiation damage to organisms. These effects are often overlapping and difficult to distinguish from one another.

Read total chapter

URL:

https://www.sciencedirect.com/science/article/pii/B9780124095489091223