earthquake prediction research: Topics by Science.gov

#### Sample records for earthquake prediction research

1. Earthquake Prediction is Coming

ERIC Educational Resources Information Center

MOSAIC, 1977

1977-01-01

Describes (1) several methods used in earthquake research, including P:S ratio velocity studies, dilatancy models; and (2) techniques for gathering base-line data for prediction using seismographs, tiltmeters, laser beams, magnetic field changes, folklore, animal behavior. The mysterious Palmdale (California) bulge is discussed. (CS)

2. Earthquakes: Predicting the unpredictable?

USGS Publications Warehouse

Hough, Susan E.

2005-01-01

The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists argued that earthquakes are fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: A moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988, but had failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" not only in polite company, but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

3. Geophysical Anomalies and Earthquake Prediction

Jackson, D. D.

2008-12-01

some understanding of their sources and the physical properties of the crust, which also vary from place to place and time to time. Anomalies are not necessarily due to stress or earthquake preparation, and separating the extraneous ones is a problem as daunting as understanding earthquake behavior itself. Fourth, the associations presented between anomalies and earthquakes are generally based on selected data. Validating a proposed association requires complete data on the earthquake record and the geophysical measurements over a large area and time, followed by prospective testing which allows no adjustment of parameters, criteria, etc. The Collaboratory for Study of Earthquake Predictability (CSEP) is dedicated to providing such prospective testing. Any serious proposal for prediction research should deal with the problems above, and anticipate the huge investment in time required to test hypotheses.

4. Can We Predict Earthquakes?

ScienceCinema

Johnson, Paul

2018-01-16

The only thing we know for sure about earthquakes is that one will happen again very soon. Earthquakes pose a vital yet puzzling set of research questions that have confounded scientists for decades, but new ways of looking at seismic information and innovative laboratory experiments are offering tantalizing clues to what triggers earthquakes, and when.

5. Sociological aspects of earthquake prediction

USGS Publications Warehouse

Spall, H.

1979-01-01

Henry Spall talked recently with Denis Mileti, who is in the Department of Sociology, Colorado State University, Fort Collins, Colo. Dr. Mileti is a sociologist involved with research programs that study the socioeconomic impact of earthquake prediction.

6. Prototype operational earthquake prediction system

USGS Publications Warehouse

Spall, Henry

1986-01-01

An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce, into all regions of the country that are subject to large and moderate earthquakes, systems for predicting earthquakes and assessing earthquake risk. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.

7. Earthquake prediction research at the Seismological Laboratory, California Institute of Technology

USGS Publications Warehouse

Spall, H.

1979-01-01

Nevertheless, basic earthquake-related information has always been of consuming interest to the public and the media in this part of California (fig. 2). So it is not surprising that earthquake prediction continues to be a significant research program at the laboratory. Several of the current spectrum of projects related to prediction are discussed below.

8. Earthquake prediction using extinct monogenetic volcanoes: A possible new research strategy

Szakács, Alexandru

2011-04-01

Volcanoes are extremely effective transmitters of matter, energy, and information from the deep Earth towards its surface, but their capacity as information carriers is far from fully exploited. Volcanic conduits can generally be viewed as rod-like or sheet-like vertical features of relatively homogeneous composition and structure that crosscut geological structures of far greater complexity and compositional heterogeneity. Information-carrying signals, such as earthquake precursor signals originating deep below the Earth's surface, are transmitted with much less loss of information through homogeneous, vertically extended structures than through the horizontally segmented, heterogeneous lithosphere or crust. Volcanic conduits can thus be viewed as upside-down "antennas" or waveguides: privileged pathways for any possible earthquake precursor signal. In particular, conduits of monogenetic volcanoes are promising transmitters of deep-Earth information to be received and decoded at surface monitoring stations, because their rock fill is expected to be more homogeneous than that of polygenetic volcanoes. Among monogenetic volcanoes, those with dominantly effusive activity appear to be the best candidates for privileged earthquake monitoring sites. More specifically, effusive monogenetic volcanic conduits filled with rocks of primitive parental magma composition, indicating direct ascent from sub-lithospheric magma-generating areas, are the most suitable. Further selection criteria may include the age of the volcanism and the presence of mantle xenoliths in surface volcanic products, which indicate a direct link between the deep lithospheric mantle and the surface through the conduit. Innovative earthquake prediction research strategies can be developed on these grounds by considering conduits of selected extinct monogenetic volcanoes and deep trans-crustal fractures as privileged emplacement sites for seismic monitoring stations.

9. On Earthquake Prediction in Japan

PubMed Central

UYEDA, Seiya

2013-01-01

Japan's National Project for Earthquake Prediction has been conducted since 1965 without success. An earthquake prediction should be a short-term prediction based on observable physical phenomena or precursors. The main reason for the lack of success is the failure to capture precursors. Most of the financial resources and manpower of the National Project have been devoted to strengthening the seismograph networks, which are not generally effective for detecting precursors, since many precursors are non-seismic. Precursor research has never been supported appropriately because the project has always been run by a group of seismologists who, in the present author's view, are mainly interested in securing funds for seismology, on the pretense of prediction. After the 1995 Kobe disaster, the project decided to give up short-term prediction, and this decision has been further fortified by the 2011 M9 Tohoku mega-quake. On top of the National Project, there are other government projects, not formally but vaguely related to earthquake prediction, that consume many orders of magnitude more funds. They are also uninterested in short-term prediction. Financially, they are giants and the National Project is a dwarf. Thus, in Japan now, there is practically no support for short-term prediction research. Recently, however, substantial progress has been made in real short-term prediction by scientists of diverse disciplines. Some promising signs are also arising from cooperation with the private sector. PMID:24213204

10. Testing an earthquake prediction algorithm

USGS Publications Warehouse

Kossobokov, V.G.; Healy, J.H.; Dewey, J.W.

1997-01-01

A test to evaluate earthquake prediction algorithms is being applied to a Russian algorithm known as M8. The M8 algorithm makes intermediate-term predictions for earthquakes to occur in a large circle, based on integral counts of transient seismicity in the circle. In a retroactive prediction for the period January 1, 1985 to July 1, 1991, the algorithm as configured for the forward test would have predicted eight of ten strong earthquakes in the test area. A null hypothesis, based on random assignment of predictions, predicts eight earthquakes in 2.87% of the trials. The forward test began July 1, 1991 and will run through December 31, 1997. As of July 1, 1995, the algorithm had forward predicted five out of nine earthquakes in the test area, a success ratio that would have been achieved in 53% of random trials under the null hypothesis.
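
The random-assignment null hypothesis described above can be sketched with a simple binomial calculation: if alarms are placed at random and cover a fraction p of the monitored space-time volume, the number of earthquakes caught by chance follows a binomial distribution. The coverage value below is purely illustrative; the abstract's 2.87% and 53% figures imply a specific alarm coverage that is not stated.

```python
from math import comb

def prob_at_least(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance that k or more of
    n earthquakes fall inside randomly placed alarms covering a
    fraction p of the monitored space-time volume."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical alarm coverage (illustrative only).
coverage = 0.4
print(f"P(>=8 of 10 by chance): {prob_at_least(8, 10, coverage):.4f}")
print(f"P(>=5 of 9 by chance):  {prob_at_least(5, 9, coverage):.4f}")
```

The smaller this chance probability, the stronger the evidence that the algorithm's hit rate reflects real predictive skill rather than broad alarm coverage.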

11. Collaboratory for the Study of Earthquake Predictability

Schorlemmer, D.; Jordan, T. H.; Zechar, J. D.; Gerstenberger, M. C.; Wiemer, S.; Maechling, P. J.

2006-12-01

Earthquake prediction is one of the most difficult problems in physical science and, owing to its societal implications, one of the most controversial. The study of earthquake predictability has been impeded by the lack of an adequate experimental infrastructure: the capability to conduct scientific prediction experiments under rigorous, controlled conditions and evaluate them using accepted criteria specified in advance. To remedy this deficiency, the Southern California Earthquake Center (SCEC) is working with its international partners, which include the European Union (through the Swiss Seismological Service) and New Zealand (through GNS Science), to develop a virtual, distributed laboratory with a cyberinfrastructure adequate to support a global program of research on earthquake predictability. This Collaboratory for the Study of Earthquake Predictability (CSEP) will extend the testing activities of SCEC's Working Group on Regional Earthquake Likelihood Models, from which we will present first results. CSEP will support rigorous procedures for registering prediction experiments on regional and global scales, community-endorsed standards for assessing probability-based and alarm-based predictions, access to authorized data sets and monitoring products from designated natural laboratories, and software to allow researchers to participate in prediction experiments. CSEP will encourage research on earthquake predictability by supporting an environment for scientific prediction experiments that allows the predictive skill of proposed algorithms to be rigorously compared with standardized reference methods and data sets. It will thereby reduce the controversies surrounding earthquake prediction, and it will allow the results of prediction experiments to be communicated to the scientific community, governmental agencies, and the general public in an appropriate research context.

12. The nature of earthquake prediction

USGS Publications Warehouse

Lindh, A.G.

1991-01-01

Earthquake prediction is inherently statistical. Although some people continue to think of earthquake prediction as the specification of the time, place, and magnitude of a future earthquake, it has been clear for at least a decade that this is an unrealistic and unreasonable definition. The reality is that earthquake prediction starts from long-term forecasts of place and magnitude, with very approximate time constraints, and progresses, at least in principle, to a gradual narrowing of the time window as data and understanding permit. Primitive long-term forecasts are clearly possible at this time on a few well-characterized fault systems. Tightly focused monitoring experiments aimed at short-term prediction are already underway in Parkfield, California, and in the Tokai region of Japan; only time will tell how much progress will be possible.

13. Intermediate-term earthquake prediction

USGS Publications Warehouse

Knopoff, L.

1990-01-01

The problems in predicting earthquakes have been attacked by phenomenological methods from prehistoric times to the present. The associations of presumed precursors with large earthquakes have often been remarked upon. The difficulty in identifying whether such correlations are due to chance coincidence or are real precursors is that usually one notes the associations only in the relatively short time intervals before the large events. Only rarely, if ever, is notice taken of whether the presumed precursor is to be found in the rather long intervals that follow large earthquakes, or is in fact absent in these post-earthquake intervals. Given enough examples, the presumed correlation fails as a precursor in the former case, while in the latter case the precursor would be verified. Unfortunately, the observer is usually not concerned with the 'uninteresting' intervals that have no large earthquakes.

14. Geochemical challenge to earthquake prediction.

PubMed Central

Wakita, H

1996-01-01

The current status of geochemical and groundwater observations for earthquake prediction in Japan is described. The development of the observations is discussed in relation to the progress of the earthquake prediction program in Japan. Three major findings obtained from our recent studies are outlined. (i) Long-term radon observation data over 18 years at the SKE (Suikoen) well indicate that the anomalous radon change before the 1978 Izu-Oshima-kinkai earthquake can with high probability be attributed to precursory changes. (ii) It is proposed that certain sensitive wells exist which have the potential to detect precursory changes. (iii) The appearance and nonappearance of coseismic radon drops at the KSM (Kashima) well reflect changes in the regional stress state of an observation area. In addition, some preliminary results of chemical changes of groundwater prior to the 1995 Kobe (Hyogo-ken nanbu) earthquake are presented. PMID:11607665

15. Hypothesis testing and earthquake prediction.

PubMed

Jackson, D D

1996-04-30

Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.
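
The three tests enumerated in the abstract can be sketched for the simplest case of a Poisson forecast over space-magnitude bins. This is a minimal illustration; the function names are invented here and the calculations are schematic rather than any testing centre's reference implementation.

```python
import math

def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k + 1))

def n_test(n_obs, n_forecast):
    """Test (i): the two tail probabilities of the observed earthquake
    count under the forecast rate. The forecast is inconsistent with
    the data when either tail probability is very small."""
    p_too_few = poisson_cdf(n_obs, n_forecast)
    p_too_many = 1.0 - poisson_cdf(n_obs - 1, n_forecast)
    return p_too_few, p_too_many

def log_likelihood(counts, rates):
    """Test (ii): joint Poisson log-likelihood of observed bin counts
    under forecast rates (one rate per space-magnitude bin)."""
    return sum(n * math.log(r) - r - math.log(math.factorial(n))
               for n, r in zip(counts, rates))

def log_likelihood_ratio(counts, rates_h1, rates_h0):
    """Test (iii): log-likelihood ratio of hypothesis H1 against the
    null hypothesis H0; positive values favour H1."""
    return log_likelihood(counts, rates_h1) - log_likelihood(counts, rates_h0)

# Toy example: 3 events observed in bin 1, none in bin 2.
counts = [3, 0]
h1 = [3.0, 0.5]    # hypothesis concentrating rate where the events occurred
h0 = [1.75, 1.75]  # spatially uniform null with the same total rate
print(n_test(sum(counts), sum(h0)))
print(log_likelihood_ratio(counts, h1, h0))
```

Tests (i) and (ii) only check self-consistency of a single forecast, as the abstract notes; only the ratio in test (iii) directly compares two hypotheses.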

17. Historical earthquake research in Austria

Hammerl, Christa

2017-12-01

Austria has moderate seismicity; on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average recurrence period is about 75 years. For this reason, historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the former Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and, as an example of a recently completed case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. Research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.
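
Under a simple Poisson (memoryless) recurrence assumption, the mean recurrence intervals quoted above translate directly into exceedance probabilities. This is a minimal sketch: the 75-year mean is taken from the abstract, while the exposure windows below are illustrative.

```python
import math

def poisson_exceedance(mean_recurrence_years, window_years):
    """Probability of at least one event within the window, assuming
    events arrive as a Poisson process with the given mean recurrence
    interval (rate = 1 / mean_recurrence_years)."""
    return 1.0 - math.exp(-window_years / mean_recurrence_years)

# Mean recurrence of ~75 years for heavily damaging Austrian earthquakes
# (from the abstract); window lengths are illustrative.
for window in (10, 50, 75):
    p = poisson_exceedance(75, window)
    print(f"P(at least one damaging event in {window} yr) = {p:.2f}")
```

Note that even over a full mean recurrence interval the Poisson model gives an exceedance probability of only about 63%, which is one reason completeness of the historical catalogue matters for hazard analysis.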

18. Using remote sensing to predict earthquake impacts

Fylaktos, Asimakis; Yfantidou, Anastasia

2017-09-01

Natural hazards like earthquakes can result in enormous property damage and human casualties in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in its central and southern regions. Last year, two seismic events occurred near Norcia (central Italy), which led to substantial loss of life and extensive damage to property, infrastructure, and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel-1A and optical imagery from Landsat 8 to examine differences in topography with the aid of the multi-temporal monitoring technique, which is well suited to observing surface deformation. This database is a cluster of information regarding the consequences of the earthquakes, grouped into categories such as property and infrastructure damage, regional rifts, cultivation loss, landslides, and surface deformations, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of such earthquakes. In the future, we can enrich this database with more regions and enhance the variety of its applications. For instance, we could predict the future impacts of any type of earthquake in several areas and design a preliminary model of emergency for immediate evacuation and quick recovery response. It is important to know how the surface moves in particular geographical regions like Italy, Cyprus, and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research, we may assess the damage that could be caused in the future.

19. Earthquake predictions using seismic velocity ratios

USGS Publications Warehouse

Sherburne, R. W.

1979-01-01

Since the beginning of modern seismology, seismologists have contemplated predicting earthquakes. The usefulness of earthquake predictions in reducing human and economic losses, and the value of long-range earthquake prediction to planning, is obvious. Not as clear are the long-range economic and social impacts of earthquake prediction on a specific area. The general consensus among scientists and government officials, however, is that the quest for earthquake prediction is a worthwhile goal and should be pursued with a sense of urgency.

20. Dim prospects for earthquake prediction

Geller, Robert J.

I was misquoted by C. Lomnitz's [1998] Forum letter (Eos, August 4, 1998, p. 373), which said: "I wonder whether Sasha Gusev [1998] actually believes that branding earthquake prediction a 'proven nonscience' [Geller, 1997a] is a paradigm for others to copy." Readers are invited to verify for themselves that neither "proven nonscience" nor any similar phrase was used by Geller [1997a].

1. A note on evaluating VAN earthquake predictions

Tselentis, G.-Akis; Melis, Nicos S.

The evaluation of the success level of an earthquake prediction method should not be based on approaches that apply generalized strict statistical laws and ignore the specific nature of the earthquake phenomenon. Fault rupture processes cannot be compared to gambling processes. The outcome of the present note is that even an ideal earthquake prediction method is still shown to be a matter of a "chancy" association between precursors and earthquakes if we apply the same procedure proposed by Mulargia and Gasperini [1992] in evaluating VAN earthquake predictions. Each individual VAN prediction has to be evaluated separately, always taking into account the specific circumstances and information available. The success level of epicenter prediction should depend on the earthquake magnitude, and magnitude and time predictions may depend on earthquake clustering and the tectonic regime, respectively.

2. Stigma in science: the case of earthquake prediction.

PubMed

Joffe, Helene; Rossetto, Tiziana; Bradley, Caroline; O'Connor, Cliodhna

2018-01-01

This paper explores how earthquake scientists conceptualise earthquake prediction, particularly given the conviction of six earthquake scientists for manslaughter (subsequently overturned) on 22 October 2012 for having given inappropriate advice to the public prior to the L'Aquila earthquake of 6 April 2009. In the first study of its kind, semi-structured interviews were conducted with 17 earthquake scientists and the transcribed interviews were analysed thematically. The scientists primarily denigrated earthquake prediction, showing strong emotive responses and distancing themselves from earthquake 'prediction' in favour of 'forecasting'. Earthquake prediction was regarded as impossible and harmful. The stigmatisation of the subject is discussed in the light of research on boundary work and stigma in science. The evaluation reveals how mitigation becomes the more favoured endeavour, creating a normative environment that disadvantages those who continue to pursue earthquake prediction research. Recommendations are made for communication with the public on earthquake risk, with a focus on how scientists portray uncertainty. © 2018 The Author(s). Disasters © Overseas Development Institute, 2018.

3. Signals of ENPEMF Used in Earthquake Prediction

Hao, G.; Dong, H.; Zeng, Z.; Wu, G.; Zabrodin, S. M.

2012-12-01

The signals of the Earth's natural pulse electromagnetic field (ENPEMF) are a combination of the abnormal crustal magnetic field pulses affected by the earthquake, the induced field of the Earth's endogenous magnetic field, the induced magnetic field of the exogenous variation magnetic field, geomagnetic pulsation disturbance, and other energy coupling processes between the Sun and the Earth. As an instantaneous disturbance of the variation field of natural geomagnetism, ENPEMF can be used to predict earthquakes. This theory was introduced by A. A. Vorobyov, who hypothesized that pulses can arise not only in the atmosphere but also within the Earth's crust due to processes of tectonic-to-electric energy conversion (Vorobyov, 1970; Vorobyov, 1979). The global field time scale of ENPEMF signals has specific stability. Although the wave curves may not overlap completely at different regions, the smoothed diurnal ENPEMF patterns always exhibit the same trend per month. This feature is a good reference for observing anomalies of the Earth's natural magnetic field in a specific region. The frequencies of ENPEMF signals generally lie in the kHz range, and frequencies within the 5-25 kHz range can be applied to monitor earthquakes. In Wuhan, the best observation frequency is 14.5 kHz. Two special devices are oriented along the N-S and E-W directions. Dramatic variation between the pulse waveforms obtained from the instruments and the normal reference envelope diagram should indicate a high possibility of an earthquake. The proposed ENPEMF-based earthquake detection method can improve geodynamic monitoring and can enrich earthquake prediction methods. We suggest that prospective further research concerns the exact source composition of ENPEMF signals, the distinction between noise and useful signals, and the effect of the Earth's gravity tide and solid tidal waves. This method may also provide a promising application in

4. Japanese earthquake predictability experiment with multiple runs before and after the 2011 Tohoku-oki earthquake

Hirata, N.; Tsuruoka, H.; Yokoi, S.

2011-12-01

The current Japanese national earthquake prediction program emphasizes the importance of modeling as well as monitoring for the sound scientific development of earthquake prediction research. One major focus of the current program is to move toward creating testable earthquake forecast models. For this purpose, in 2009 we joined the Collaboratory for the Study of Earthquake Predictability (CSEP) and installed, through an international collaboration, the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan. We started the Japanese earthquake predictability experiment on November 1, 2009. The experiment consists of 12 categories, with 4 testing classes with different time spans (1 day, 3 months, 1 year and 3 years) and 3 testing regions called 'All Japan,' 'Mainland,' and 'Kanto.' A total of 160 models, as of August 2013, have been submitted and are currently under the CSEP official suite of tests for evaluating the performance of forecasts. We will present results of prospective forecasting and testing for periods before and after the 2011 Tohoku-oki earthquake. Because seismic activity changed dramatically after the 2011 event, the performance of the models was strongly affected. In addition, because of a problem with the authorized catalogue related to the magnitude of completeness, most models did not pass the CSEP consistency tests. We will also discuss retrospective earthquake forecast experiments for aftershocks of the 2011 Tohoku-oki earthquake. Our aim is to describe what has turned out to be the first occasion for setting up a research environment for rigorous earthquake forecasting in Japan.

6. Earthquake prediction with electromagnetic phenomena

SciTech Connect

Hayakawa, Masashi, E-mail: hayakawa@hi-seismo-em.jp; Advanced Wireless & Communications Research Center, UEC, Chofu Tokyo; Earthquake Analysis Laboratory, Information Systems Inc., 4-8-15, Minami-aoyama, Minato-ku, Tokyo, 107-0062

Short-term earthquake (EQ) prediction is defined as prospective prediction on a time scale of about one week, which is considered one of the most important and urgent topics for human beings. If short-term prediction is realized, casualties will be drastically reduced. Unlike conventional seismic measurement, we have proposed the use of electromagnetic phenomena as precursors to EQs, and an extensive amount of progress has been achieved in the field of seismo-electromagnetics during the last two decades. This paper reviews short-term EQ prediction, including the impossibility myth of EQ prediction by seismometers, the reason why we are interested in electromagnetics, the history of seismo-electromagnetics, the ionospheric perturbation as the most promising candidate for EQ prediction, the future of EQ predictology from the two standpoints of a practical science and a pure science, and finally a brief summary.

7. Scoring annual earthquake predictions in China

Zhuang, Jiancang; Jiang, Changsheng

2012-02-01

The Annual Consultation Meeting on Earthquake Tendency in China is held by the China Earthquake Administration (CEA) in order to provide one-year earthquake predictions over most of China. In these predictions, regions of concern are denoted together with the corresponding magnitude range of the largest earthquake expected during the next year. Evaluating the performance of these earthquake predictions is rather difficult, especially for regions that are of no concern, because the predictions are made on arbitrary regions with flexible magnitude ranges. In the present study, the gambling score is used to evaluate the performance of these earthquake predictions. Based on a reference model, this scoring method rewards successful predictions and penalizes failures according to the risk (probability of failure) that the predictors have taken. Using the Poisson model, which is spatially inhomogeneous and temporally stationary, with the Gutenberg-Richter law for earthquake magnitudes as the reference model, we evaluate the CEA predictions based on 1) a partial score that evaluates whether the issuing of alarmed regions is based on information that differs from the reference model (knowledge of the average seismicity level) and 2) a complete score that evaluates whether the overall performance of the prediction is better than the reference model. The predictions made by the Annual Consultation Meetings on Earthquake Tendency from 1990 to 2003 are found to include significant precursory information, but their overall performance is close to that of the reference model.
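
The reward-and-penalty bookkeeping behind a gambling score can be sketched in a few lines. In one common formulation, a predictor stakes one point per prediction; a success in a region where the reference model gives chance probability p_ref pays (1 - p_ref) / p_ref points (fair odds, so the expected gain under the reference model is zero), while a failure costs the stake. The probabilities below are invented for illustration; in the study above they would come from the spatially inhomogeneous Poisson model with the Gutenberg-Richter magnitude law.

```python
def gambling_score(predictions):
    """Total score for a list of (success, p_ref) pairs, where p_ref is
    the reference-model probability that the prediction succeeds by
    chance. A success earns (1 - p_ref) / p_ref points, so hits in
    low-probability regions earn more; a failure costs the 1-point stake."""
    score = 0.0
    for success, p_ref in predictions:
        score += (1.0 - p_ref) / p_ref if success else -1.0
    return score

# Illustrative record: two hits in low-probability regions, one hit in
# an almost-certain region, and two misses.
preds = [(True, 0.10), (True, 0.25), (True, 0.90), (False, 0.10), (False, 0.50)]
print(gambling_score(preds))
```

Because the payoff is fair under the reference model, a total score near zero means the predictions carry little information beyond the average seismicity level, which is essentially the study's conclusion about the overall CEA performance.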

8. The 2004 Parkfield, CA Earthquake: A Teachable Moment for Exploring Earthquake Processes, Probability, and Earthquake Prediction

Kafka, A.; Barnett, M.; Ebel, J.; Bellegarde, H.; Campbell, L.

2004-12-01

The occurrence of the 2004 Parkfield earthquake provided a unique "teachable moment" for students in our science course for teacher education majors. The course uses seismology as a medium for teaching a wide variety of science topics appropriate for future teachers. The 2004 Parkfield earthquake occurred just 15 minutes after our students completed a lab on earthquake processes and earthquake prediction. That lab included a discussion of the Parkfield Earthquake Prediction Experiment as a motivation for the exercises they were working on that day. Furthermore, this earthquake was recorded on an AS1 seismograph right in their lab, just minutes after the students left. About an hour after we recorded the earthquake, the students were able to see their own seismogram of the event in the lecture part of the course, which provided an excellent teachable moment for a lecture/discussion on how the occurrence of the 2004 Parkfield earthquake might affect seismologists' ideas about earthquake prediction. The specific lab exercise that the students were working on just before we recorded this earthquake was a "sliding block" experiment that simulates earthquakes in the classroom. The experimental apparatus includes a flat board on top of which are blocks of wood attached to a bungee cord and a string wrapped around a hand crank. Plate motion is modeled by slowly turning the crank, and earthquakes are modeled as events in which the block slips ("blockquakes"). We scaled the earthquake data and the blockquake data (using how much the string moved as a proxy for time) so that we could compare blockquakes and earthquakes. This provided an opportunity to use interevent-time histograms to teach about earthquake processes, probability, and earthquake prediction, and to compare earthquake sequences with blockquake sequences. We were able to show the students, using data obtained directly from their own lab, how global earthquake data fit a Poisson exponential distribution better
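The comparison the lab builds toward — interevent-time histograms against a Poisson model — can be sketched numerically: for a Poisson process the interevent times are exponentially distributed, so their standard deviation roughly equals their mean (coefficient of variation ≈ 1). A minimal sketch with synthetic data (not the class's AS1 records; the rate is arbitrary):

```python
import random

def poisson_interevent_times(rate, n, seed=0):
    """Simulate the interevent times of a Poisson process with the given
    rate: independent exponential waiting times with mean 1/rate."""
    rng = random.Random(seed)
    return [rng.expovariate(rate) for _ in range(n)]

times = poisson_interevent_times(rate=0.5, n=10_000)
mean = sum(times) / len(times)
var = sum((t - mean) ** 2 for t in times) / len(times)
cv = var ** 0.5 / mean  # coefficient of variation; ~1 for an exponential
```

Binning `times` into a histogram and overlaying the exponential density with the same mean is exactly the kind of comparison that can then be repeated for blockquake and real earthquake catalogs.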

9. The October 1992 Parkfield, California, earthquake prediction

USGS Publications Warehouse

Langbein, J.

1992-01-01

A magnitude 4.7 earthquake occurred near Parkfield, California, on October 20, 1992, at 05:28 UTC (October 19 at 10:28 p.m. local, Pacific Daylight Time). This moderate shock, interpreted as a potential foreshock of a damaging earthquake on the San Andreas fault, triggered long-standing federal, state, and local government plans to issue a public warning of an imminent magnitude 6 earthquake near Parkfield. Although the predicted earthquake did not take place, the sophisticated suites of instruments deployed as part of the Parkfield Earthquake Prediction Experiment recorded valuable data associated with an unusual series of events. This article describes the geological aspects of these events, which occurred near Parkfield in October 1992. The accompanying article, an edited version of a press conference by Richard Andrews, the Director of the California Office of Emergency Services (OES), describes the governmental response to the prediction.

10. The U.S. Earthquake Prediction Program

USGS Publications Warehouse

Wesson, R.L.; Filson, J.R.

1981-01-01

There are two distinct motivations for earthquake prediction. The mechanistic approach aims to understand the processes leading to a large earthquake. The empirical approach is governed by the immediate need to protect lives and property. With our current lack of knowledge about the earthquake process, future progress cannot be made without gathering a large body of measurements. These are required not only for the empirical prediction of earthquakes, but also for the testing and development of hypotheses that further our understanding of the processes at work. The earthquake prediction program is basically a program of scientific inquiry, but one which is motivated by social, political, economic, and scientific reasons. It is a pursuit that cannot rely on empirical observations alone, nor can it be carried out solely on a blackboard or in a laboratory. Experiments must be carried out in the real Earth.

11. Earthquake prediction evaluation standards applied to the VAN Method

Jackson, David D.

Earthquake prediction research must meet certain standards before it can be suitably evaluated for potential application in decision making. For methods that result in a binary (on or off) alarm condition, requirements include (1) a quantitative description of observables that trigger an alarm, (2) a quantitative description, including ranges of time, location, and magnitude, of the predicted earthquakes, (3) documented evidence of all previous alarms, (4) a complete list of predicted earthquakes, and (5) a complete list of unpredicted earthquakes. The VAN technique [Varotsos and Lazaridou, 1991; Varotsos et al., 1996] has not yet been stated as a testable hypothesis. It fails criteria (1) and (2), so it is not ready to be evaluated properly. Although telegrams were transmitted in advance of claimed successes, these telegrams did not fully specify the predicted events, and all of the published statistical evaluations involve many subjective ex post facto decisions. Lacking a statistically demonstrated relationship to earthquakes, a candidate prediction technique should satisfy several plausibility criteria, including: (1) a reasonable relationship between the location of the candidate precursor and that of the predicted earthquake, (2) some demonstration that the candidate precursory observations are related to stress, strain, or other quantities related to earthquakes, and (3) the existence of co-seismic as well as pre-seismic variations of the candidate precursor. The VAN technique meets none of these criteria.

12. Discussion of the design of satellite-laser measurement stations in the eastern Mediterranean under the geological aspect. Contribution to the earthquake prediction research by the Wegener Group and to NASA's Crustal Dynamics Project

NASA Technical Reports Server (NTRS)

Paluska, A.; Pavoni, N.

1983-01-01

Research conducted for determining the location of stations for measuring crustal dynamics and predicting earthquakes is discussed. Procedural aspects, the extraregional kinematic tendencies, and regional tectonic deformation mechanisms are described.

13. Earthquake prediction: the interaction of public policy and science.

PubMed Central

Jones, L M

1996-01-01

Earthquake prediction research has searched for both informational phenomena, those that provide information about earthquake hazards useful to the public, and causal phenomena, causally related to the physical processes governing failure on a fault, to improve our understanding of those processes. Neither informational nor causal phenomena are a subset of the other. I propose a classification of potential earthquake predictors into informational, causal, and predictive phenomena, where predictors are causal phenomena that provide more accurate assessments of the earthquake hazard than can be obtained by assuming a random distribution. Achieving higher, more accurate probabilities than a random distribution requires much more information about the precursor than merely that it is causally related to the earthquake. PMID:11607656

14. Earthquake prediction; new studies yield promising results

USGS Publications Warehouse

Robinson, R.

1974-01-01

On August 3, 1973, a small earthquake (magnitude 2.5) occurred near Blue Mountain Lake in the Adirondack region of northern New York State. This seemingly unimportant event was of great significance, however, because it was predicted. Seismologists at the Lamont-Doherty Geological Observatory of Columbia University accurately foretold the time, place, and magnitude of the event. Their prediction was based on certain pre-earthquake processes that are best explained by a hypothesis known as "dilatancy," a concept that has injected new life and direction into the science of earthquake prediction. Although much more research must be accomplished before we can expect to predict potentially damaging earthquakes with any degree of consistency, results such as this indicate that we are on a promising road.

15. Strong ground motion prediction using virtual earthquakes.

PubMed

Denolle, M A; Dunham, E M; Prieto, G A; Beroza, G C

2014-01-24

Sedimentary basins increase the damaging effects of earthquakes by trapping and amplifying seismic waves. Simulations of seismic wave propagation in sedimentary basins capture this effect; however, there exists no method to validate these results for earthquakes that have not yet occurred. We present a new approach for ground motion prediction that uses the ambient seismic field. We apply our method to a suite of magnitude 7 scenario earthquakes on the southern San Andreas fault and compare our ground motion predictions with simulations. Both methods find strong amplification and coupling of source and structure effects, but they predict substantially different shaking patterns across the Los Angeles Basin. The virtual earthquake approach provides a new approach for predicting long-period strong ground motion.

16. Quantitative Earthquake Prediction on Global and Regional Scales

2006-03-01

The Earth is a hierarchy of volumes of different sizes. Driven by planetary convection, these volumes are involved in joint and relative movement. The movement is controlled by a wide variety of processes on and around the fractal mesh of boundary zones, and it produces earthquakes. This hierarchy of movable volumes composes a large non-linear dynamical system. Prediction of such a system, in the sense of extrapolation of a trajectory into the future, is futile. However, upon coarse-graining, integral empirical regularities emerge, opening possibilities of prediction in the sense of the commonly accepted consensus definition worked out in 1976 by the US National Research Council. Implications of understanding the hierarchical nature of the lithosphere and its dynamics, based on systematic monitoring and evidence of its unified space-energy similarity at different scales, help avoid basic errors in earthquake prediction claims. They suggest rules and recipes for adequate earthquake prediction classification, comparison, and optimization. The approach has already led to the design of a reproducible intermediate-term middle-range earthquake prediction technique. Its real-time testing, aimed at prediction of the largest earthquakes worldwide, has proved beyond any reasonable doubt the effectiveness of practical earthquake forecasting. In the first approximation, the accuracy is about 1-5 years and 5-10 times the anticipated source dimension. Further analysis allows reducing spatial uncertainty down to 1-3 source dimensions, although at the cost of additional failures-to-predict. Despite the limited accuracy, considerable damage could be prevented by timely, knowledgeable use of the existing predictions and earthquake prediction strategies. The December 26, 2004 Indian Ocean Disaster seems to be the first indication that the methodology, designed for prediction of M8.0+ earthquakes, can be rescaled for prediction of both smaller magnitude earthquakes (e.g., down to M5.5+ in Italy) and

17. Current affairs in earthquake prediction in Japan

Uyeda, Seiya

2015-12-01

As of mid-2014, the main organizations of the earthquake (EQ hereafter) prediction program, including the Seismological Society of Japan (SSJ) and the MEXT Headquarters for EQ Research Promotion, hold the official position that they neither can nor want to make any short-term prediction. This is an extraordinary stance for responsible authorities when the nation, after the devastating 2011 M9 Tohoku EQ, most urgently needs whatever information may exist on forthcoming EQs. Japan's national project for EQ prediction started in 1965, but it has had no success. The main reason is the failure to capture precursors. After the 1995 Kobe disaster, the project decided to give up short-term prediction, and this stance has been further fortified by the 2011 M9 Tohoku mega-quake. This paper tries to explain how this situation came about and suggests that it may in fact be a legitimate one which should have come a long time ago. Actually, substantial positive changes are taking place now. Some promising signs are arising even from cooperation of researchers with the private sector, and there is a move to establish an "EQ Prediction Society of Japan". From now on, maintaining high scientific standards in EQ prediction will be of crucial importance.

18. Triggering Factor of Strong Earthquakes and Its Prediction Verification

Ren, Z. Q.; Ren, S. H.

After 30 years' research, we have found that great earthquakes are triggered by the tide-generating force of the moon. It is not the tide-generating force in the classical sense, but a non-classical one, which we call TGFR (Tide-Generating Forces' Resonance). TGFR strongly depends on the tide-generating force at the times of strange astronomical points (SAP). The SAP mostly occur when the moon and another celestial body are arranged with the earth along a straight line (with the same apparent right ascension or a 180° difference); the other SAP are the turning points of the moon's motion relative to the earth. Moreover, TGFR has four different types of effective areas. Our study indicates that a majority of earthquakes are triggered by the rare superimposition of TGFR effective areas. In China, the great earthquakes in the plain area of Hebei Province, Taiwan, Yunnan Province, and Sichuan Province are triggered by decompression TGFR; other earthquakes, in Gansu Province, Ningxia Province, and northwest of Beijing, are triggered by compression TGFR. The great earthquakes in Japan, California, and southeastern Europe are also triggered by compression TGFR, while in other parts of the world, such as the Philippines, Central American countries, and West Asia, great earthquakes are triggered by decompression TGFR. We have carried out experimental imminent prediction combining the TGFR method with other earthquake-impending signals, such as those suggested by Professor Li Junzhi. The success ratio is about 40% (from our forecast reports to the China Seismological Administration). Thus we could say that great earthquakes can be predicted (including imminent earthquake prediction). Key words: imminent prediction; triggering factor; TGFR (Tide-Generating Forces' Resonance); TGFR compression; TGFR compression zone; TGFR decompression; TGFR decompression zone

19. Microearthquake networks and earthquake prediction

USGS Publications Warehouse

Lee, W.H.K.; Steward, S. W.

1979-01-01

A microearthquake network is a group of highly sensitive seismographic stations designed primarily to record local earthquakes of magnitudes less than 3. Depending on the application, a microearthquake network may consist of several stations or as many as a few hundred. They are usually classified as either permanent or temporary. In a permanent network, the seismic signal from each station is telemetered to a central recording site to cut down on operating costs and to allow more efficient and up-to-date processing of the data. However, telemetering can restrict the location of sites because of the line-of-sight requirement for radio transmission or the need for telephone lines. Temporary networks are designed to be extremely portable and completely self-contained so that they can be deployed very quickly. They are most valuable for recording aftershocks of a major earthquake or for studies in remote areas.

20. Risk and return: evaluating Reverse Tracing of Precursors earthquake predictions

Zechar, J. Douglas; Zhuang, Jiancang

2010-09-01

In 2003, the Reverse Tracing of Precursors (RTP) algorithm attracted the attention of seismologists and international news agencies when researchers claimed two successful predictions of large earthquakes. These researchers had begun applying RTP to seismicity in Japan, California, the eastern Mediterranean and Italy; they have since applied it to seismicity in the northern Pacific, Oregon and Nevada. RTP is a pattern recognition algorithm that uses earthquake catalogue data to declare alarms, and these alarms indicate that RTP expects a moderate to large earthquake in the following months. The spatial extent of alarms is highly variable and each alarm typically lasts 9 months, although the algorithm may extend alarms in time and space. We examined the record of alarms and outcomes since the prospective application of RTP began, and in this paper we report on the performance of RTP to date. To analyse these predictions, we used a recently developed approach based on a gambling score, and we used a simple reference model to estimate the prior probability of target earthquakes for each alarm. Formally, we believe that RTP investigators did not rigorously specify the first two 'successful' predictions in advance of the relevant earthquakes; because this issue is contentious, we consider analyses with and without these alarms. When we included contentious alarms, RTP predictions demonstrate statistically significant skill. Under a stricter interpretation, the predictions are marginally unsuccessful.

1. Earthquakes.

ERIC Educational Resources Information Center

Walter, Edward J.

1977-01-01

Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

2. The earthquake prediction experiment at Parkfield, California

USGS Publications Warehouse

Roeloffs, E.; Langbein, J.

1994-01-01

Since 1985, a focused earthquake prediction experiment has been in progress along the San Andreas fault near the town of Parkfield in central California. Parkfield has experienced six moderate earthquakes since 1857 at average intervals of 22 years, the most recent a magnitude 6 event in 1966. The probability of another moderate earthquake soon appears high, but studies assigning it a 95% chance of occurring before 1993 now appear to have been oversimplified. The identification of a Parkfield fault "segment" was initially based on geometric features in the surface trace of the San Andreas fault, but more recent microearthquake studies have demonstrated that those features do not extend to seismogenic depths. On the other hand, geodetic measurements are consistent with the existence of a "locked" patch on the fault beneath Parkfield that has presently accumulated a slip deficit equal to the slip in the 1966 earthquake. A magnitude 4.7 earthquake in October 1992 brought the Parkfield experiment to its highest level of alert, with a 72-hour public warning that there was a 37% chance of a magnitude 6 event. However, this warning proved to be a false alarm. Most data collected at Parkfield indicate that strain is accumulating at a constant rate on this part of the San Andreas fault, but some interesting departures from this behavior have been recorded. Here we outline the scientific arguments bearing on when the next Parkfield earthquake is likely to occur and summarize geophysical observations to date.
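The kind of time-dependent probability that underlay the Parkfield forecasts can be sketched with a simple recurrence-interval model. In the sketch below, the 22-year mean interval comes from the abstract, but the 5-year standard deviation and the 1985 evaluation date are assumptions for illustration; the actual 95%-by-1993 figure came from more elaborate (and, as the abstract notes, oversimplified) models:

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution, computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def conditional_prob(t_elapsed, t_window, mu, sigma):
    """P(event within the next t_window years, given that t_elapsed years
    have already passed without one) under a Gaussian recurrence-interval
    model with mean mu and standard deviation sigma (in years)."""
    p_by_now = normal_cdf(t_elapsed, mu, sigma)
    p_by_end = normal_cdf(t_elapsed + t_window, mu, sigma)
    return (p_by_end - p_by_now) / (1.0 - p_by_now)

# Last event 1966, mean interval 22 yr (from the abstract); the 5-yr
# sigma and the 1985 vantage point are hypothetical inputs.
p = conditional_prob(t_elapsed=19, t_window=8, mu=22, sigma=5)  # ~0.78
```

Tightening sigma pushes the conditional probability toward certainty, which is how a narrow recurrence model can yield a 95% forecast that later proves overconfident.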

3. One research from turkey on groundwater- level changes related earthquake

Kirmizitas, H.; Göktepe, G.

2003-04-01

(Kütahya-Gediz earthquake on March 28, 1970; Diyarbakir-Lice earthquake on September 6, 1975; Van-Muradiye earthquake on November 24, 1976; Erzurum-Kars earthquake on October 30, 1983; Gölcük earthquake on August 17, 1999; Afyon-Sultanhisar earthquake on February 3, 2002). Furthermore, the Iran earthquake of November 27, 1979 was measured and recorded in drilling wells in Turkey from thousands of kilometers away. Although there are many studies and much research on earthquake prediction and on groundwater-level changes related to earthquakes, it is still difficult to say that definitive results have been obtained on this subject. The importance of such research on earthquakes is now well known. To obtain definitive results on the relation between earthquakes and water-level changes, studies must continue along these lines.

4. On some methods for assessing earthquake predictions

Molchan, G.; Romashkova, L.; Peresan, A.

2017-09-01

A regional approach to the problem of assessing earthquake predictions inevitably faces a deficit of data. We point out some basic limits of assessment methods reported in the literature, considering the practical case of the performance of the CN pattern recognition method in the prediction of large Italian earthquakes. Along with the classical hypothesis testing, a new game approach, the so-called parimutuel gambling (PG) method, is examined. The PG, originally proposed for the evaluation of probabilistic earthquake forecasts, has recently been adapted for the case of 'alarm-based' CN prediction. The PG approach is a non-standard method; therefore it deserves careful examination and theoretical analysis. We show that the alarm-based PG version leads to an almost complete loss of information about predicted earthquakes (even for a large sample). As a result, any conclusions based on the alarm-based PG approach are not to be trusted. We also show that the original probabilistic PG approach does not necessarily identify the genuine forecast correctly among competing seismicity rate models, even when applied to extensive data.

5. Gambling score in earthquake prediction analysis

Molchan, G.; Romashkova, L.

2011-03-01

The number of successes and the space-time alarm rate are commonly used to characterize the strength of an earthquake prediction method and the significance of prediction results. It has been recently suggested to use a new characteristic to evaluate the forecaster's skill, the gambling score (GS), which incorporates the difficulty of guessing each target event by using different weights for different alarms. We expand parametrization of the GS and use the M8 prediction algorithm to illustrate difficulties of the new approach in the analysis of the prediction significance. We show that the level of significance strongly depends (1) on the choice of alarm weights, (2) on the partitioning of the entire alarm volume into component parts and (3) on the accuracy of the spatial rate measure of target events. These tools are at the disposal of the researcher and can affect the significance estimate. Formally, all reasonable GSs discussed here corroborate that the M8 method is non-trivial in the prediction of 8.0 ≤M < 8.5 events because the point estimates of the significance are in the range 0.5-5 per cent. However, the conservative estimate 3.7 per cent based on the number of successes seems preferable owing to two circumstances: (1) it is based on relative values of the spatial rate and hence is more stable and (2) the statistic of successes enables us to construct analytically an upper estimate of the significance taking into account the uncertainty of the spatial rate measure.

6. 76 FR 69761 - National Earthquake Prediction Evaluation Council (NEPEC)

Federal Register 2010, 2011, 2012, 2013, 2014

2011-11-09

... DEPARTMENT OF THE INTERIOR U.S. Geological Survey National Earthquake Prediction Evaluation... 96-472, the National Earthquake Prediction Evaluation Council (NEPEC) will hold a 1\\1/2\\-day meeting.... Geological Survey on proposed earthquake predictions, on the completeness and scientific validity of the...

7. 76 FR 19123 - National Earthquake Prediction Evaluation Council (NEPEC)

Federal Register 2010, 2011, 2012, 2013, 2014

2011-04-06

... Earthquake Prediction Evaluation Council (NEPEC) AGENCY: U.S. Geological Survey, Interior. ACTION: Notice of meeting. SUMMARY: Pursuant to Public Law 96-472, the National Earthquake Prediction Evaluation Council... proposed earthquake predictions, on the completeness and scientific validity of the available data related...

8. Earthquake prediction in Japan and natural time analysis of seismicity

Uyeda, S.; Varotsos, P.

2011-12-01

The M9 super-giant earthquake, with its huge tsunami, devastated East Japan on 11 March 2011, causing more than 20,000 casualties and serious damage to the Fukushima nuclear plant. This earthquake was predicted neither short-term nor long-term. Seismologists were shocked because it was not even considered possible at the East Japan subduction zone. However, it was not the only unpredicted earthquake: throughout several decades of the National Earthquake Prediction Project, not a single earthquake was predicted. In reality, practically no effective research has been conducted on the most important goal, short-term prediction. This happened because the Japanese national project was devoted to the construction of elaborate seismic networks, which was not the best way to achieve short-term prediction. After the Kobe disaster, in order to parry the mounting criticism of their history of no success, they defiantly changed their policy to "stop aiming at short-term prediction because it is impossible and concentrate resources on fundamental research", which meant obtaining more funding for no prediction research. The public were not, and are not, informed about this change. Obviously, earthquake prediction will be possible only when reliable precursory phenomena are caught, and we have insisted this would most likely be done through non-seismic means such as geochemical/hydrological and electromagnetic monitoring. Admittedly, the lack of convincing precursors for the M9 super-giant earthquake has an adverse effect for us, although its epicenter was far offshore, out of the range of the operating monitoring systems. In this presentation, we show a new possibility of finding remarkable precursory signals, ironically, from ordinary seismological catalogs. In the framework of the new time domain termed natural time, an order parameter of seismicity, κ1, has been introduced. This is the variance of natural time χ weighted by the normalised energy release at χ. In the case that Seismic Electric Signals

9. Large-Scale Earthquake Countermeasures Act and the Earthquake Prediction Council in Japan

SciTech Connect

Rikitake, T.

1979-08-07

The Large-Scale Earthquake Countermeasures Act was enacted in Japan in December 1978. This act aims at mitigating earthquake hazards by designating an area as an area under intensified measures against earthquake disaster, such designation being based on long-term earthquake prediction information, and by issuing an earthquake warning statement based on imminent prediction information, when possible. In an emergency as defined by the law, the prime minister will be empowered to take various actions which cannot be taken at ordinary times. For instance, he may ask the Self-Defense Force to come into the earthquake-threatened area before the earthquake occurs. A Prediction Council has been formed in order to evaluate premonitory effects that might be observed over the Tokai area, which was designated an area under intensified measures against earthquake disaster in June 1979. An extremely dense observation network has been constructed over the area.

10. Testing earthquake prediction algorithms: Statistically significant advance prediction of the largest earthquakes in the Circum-Pacific, 1992-1997

USGS Publications Warehouse

Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.

1999-01-01

Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first. Then, the areas of alarm are reduced by MSc, at the cost that some earthquakes are missed in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8, and MSc identified correctly the locations of four of them. The space-time volume of the alarms is 36% and 18%, respectively, when estimated with a normalized product measure of the empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% both for M8 and MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40% and five were predicted by M8-MSc in 13% of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reversed faults. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8, Phys. Earth and Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction, J. Geophys. Res., 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm, M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier
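The quoted significance figures can be reproduced, to a first approximation, by asking how often random guessing would do as well: if alarms cover a fraction τ of the normalized space-time volume, the chance that at least k of n target events fall inside them by luck is a binomial tail probability. A sketch (the 36%/18% volumes and the 5-of-5 and 4-of-5 counts are taken from the abstract; the independence assumption is the simplification):

```python
from math import comb

def binomial_tail(n, k, tau):
    """Probability that random guessing with alerted fraction tau catches
    at least k of n independent target events (binomial upper tail)."""
    return sum(comb(n, j) * tau**j * (1.0 - tau)**(n - j)
               for j in range(k, n + 1))

p_m8 = binomial_tail(5, 5, 0.36)   # all five M8.0+ events, 36% alerted volume
p_msc = binomial_tail(5, 4, 0.18)  # four of five located by MSc, 18% volume
```

Both tail probabilities come out below 1%, consistent with a confidence level beyond 99% for each algorithm.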

11. Earthquake Prediction in a Big Data World

Kossobokov, V. G.

2016-12-01

The digital revolution that started just about 15 years ago has already surpassed a global information storage capacity of more than 5000 exabytes (in optimally compressed bytes) per year. Open data in a Big Data World provide unprecedented opportunities for enhancing studies of the Earth system. However, they also open wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task, and it implies a delicate application of statistics. So far, none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Regretfully, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and of short-term earthquake forecasting (StEF), claims of a high potential of the method are based on a flawed application of statistics and, therefore, are hardly suitable for communication to decision makers. Self-testing must be done in advance of claiming prediction of hazardous areas and/or times. The necessity and possibility of applying simple tools of earthquake prediction strategies, in particular the error diagram, introduced by G.M. Molchan in the early 1990s, and the Seismic Roulette null hypothesis as a metric of the alerted space, are evident. The set of errors, i.e. the rates of failure and of the alerted space-time volume, can easily be compared to random guessing, and this comparison permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a given cost-benefit function. This and other information obtained in such simple testing may supply us with realistic estimates of the confidence and accuracy of SHA predictions and, if reliable but not necessarily perfect, with related recommendations on the level of risks for decision making in regard to engineering design, insurance
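The error diagram mentioned here plots the miss rate ν against the alerted space-time fraction τ; random guessing lies on the diagonal ν = 1 − τ, and a point significantly below it indicates skill. A minimal sketch (the counts below are illustrative numbers, not a real prediction record):

```python
def molchan_point(n_targets, n_missed, alerted_fraction):
    """One point on the Molchan error diagram: miss rate nu versus the
    alerted space-time fraction tau. Random guessing satisfies
    nu = 1 - tau, so a positive `skill` means better than guessing."""
    nu = n_missed / n_targets
    tau = alerted_fraction
    skill = 1.0 - tau - nu  # vertical distance below the diagonal
    return tau, nu, skill

# Hypothetical record: 19 targets, 9 missed, 40% of space-time alerted
tau, nu, skill = molchan_point(n_targets=19, n_missed=9, alerted_fraction=0.40)
```

Sweeping a method's tuning parameter traces a curve of such points, and the cost-benefit function mentioned above picks the operating point on that curve.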

12. The Ordered Network Structure and Prediction Summary for M≥7 Earthquakes in Xinjiang Region of China

Men, Ke-Pei; Zhao, Kai

2014-12-01

M ≥ 7 earthquakes have shown an obvious commensurability and orderliness in the Xinjiang region of China and its adjacent areas since 1800. The main orderly values are 30 a × k (k = 1, 2, 3), 11-12 a, 41-43 a, 18-19 a, and 5-6 a. Guided by the information forecasting theory of Wen-Bo Weng, and based on previous research results, we combine ordered network structure analysis with complex network technology, focus on the prediction summary of M ≥ 7 earthquakes by using the ordered network structure, and add new information to further optimize the network, hence constructing the 2D and 3D ordered network structures of M ≥ 7 earthquakes. The network structure fully reveals the regularity of seismic activity of M ≥ 7 earthquakes in the study region during the past 210 years. On this basis, the Karakorum M7.1 earthquake in 1996, the M7.9 earthquake on the frontier of Russia, Mongolia, and China in 2003, and the two Yutian M7.3 earthquakes in 2008 and 2014 were predicted successfully. At the same time, a new prediction opinion is presented: the next two M ≥ 7 earthquakes will probably occur around 2019-2020 and 2025-2026 in this region. The results show that large earthquakes occurring in a defined region can be predicted. The method of ordered network structure analysis produces satisfactory results for the mid- and long-term prediction of M ≥ 7 earthquakes.

13. Prediction of Earthquakes by Lunar Cycles

Rodriguez, G.

2007-05-01

Author: Guillermo Rodriguez Rodriguez. Affiliation: geophysicist and astrophysicist, retired. I have presented this idea at many meetings (EGS, UGS, IUGG 1995, AGU 2002 Washington and 2003 Nice) since the early 1980s. I work with three levels of temporal approximation. First, earthquakes happen on the same day of the year every 18 or 19 years (the Saros cycle), sometimes in the same place and sometimes very far away; at other times of the year the cycle can be 14, 26, or 32 years, or multiples of 18.61 years, especially 55, 93, 150, 224, and 300 years. This gives the day of the year. Second, over the cycle of one lunation (days counted from the date of the new moon, roughly one month), great earthquakes happen at characteristic intervals of days in successive lunations, as can be seen in the enclosed graphic. This gives the day of the month. Third, I find that approximately every 28 days earthquakes repeat at about the same hour and minute, and at the same longitude and latitude, including the small ones. This is important because the only precaution needed would be to wait in streets or squares. Sometimes the cycles are longer or shorter; this is my particular way of applying the scientific method. As a consequence of the first and second principles, one can look for correlations between years separated by cycles of the first type, for example 1984 and 2002 or 2003 and consecutive years, including 2007. I have examined the dates for 30 years. I sense the pattern, but I have not yet been able to cast it in a scientific formalism.

14. Material contrast does not predict earthquake rupture propagation direction

USGS Publications Warehouse

Harris, R.A.; Day, S.M.

2005-01-01

Earthquakes often occur on faults that juxtapose different rocks. The result is rupture behavior that differs from that of an earthquake occurring on a fault in a homogeneous material. Previous 2D numerical simulations have studied simple cases of earthquake rupture propagation where there is a material contrast across a fault and have come to two different conclusions: 1) earthquake rupture propagation direction can be predicted from the material contrast, and 2) earthquake rupture propagation direction cannot be predicted from the material contrast. In this paper we provide observational evidence from 70 years of earthquakes at Parkfield, CA, and new 3D numerical simulations. Both the observations and the numerical simulations demonstrate that earthquake rupture propagation direction is unlikely to be predictable on the basis of a material contrast. Copyright 2005 by the American Geophysical Union.

15. 78 FR 64973 - National Earthquake Prediction Evaluation Council (NEPEC)

Federal Register 2010, 2011, 2012, 2013, 2014

2013-10-30

... updates on past topics of discussion, including work with social and behavioral scientists on improving... probabilities; USGS collaborative work with the Collaboratory for Study of Earthquake Predictability (CSEP...

16. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models: 2. Laboratory earthquakes

Rubinstein, Justin L.; Ellsworth, William L.; Beeler, Nicholas M.; Kilgore, Brian D.; Lockner, David A.; Savage, Heather M.

2012-02-01

The behavior of individual stick-slip events observed in three different laboratory experimental configurations is better explained by a "memoryless" earthquake model with fixed inter-event time or fixed slip than it is by the time- and slip-predictable models for earthquake occurrence. We make similar findings in the companion manuscript for the behavior of natural repeating earthquakes. Taken together, these results allow us to conclude that the predictions of a characteristic earthquake model that assumes either fixed slip or fixed recurrence interval should be preferred to the predictions of the time- and slip-predictable models for all earthquakes. Given that the fixed slip and recurrence models are the preferred models for all of the experiments we examine, we infer that in an event-to-event sense the elastic rebound model underlying the time- and slip-predictable models does not explain earthquake behavior. This does not indicate that the elastic rebound model should be rejected in a long-term sense, but it should be rejected for short-term predictions. The time- and slip-predictable models likely offer worse predictions of earthquake behavior because they rely on assumptions that are too simple to explain the behavior of earthquakes. Specifically, the time-predictable model assumes a constant failure threshold and the slip-predictable model assumes that there is a constant minimum stress. There is experimental and field evidence that these assumptions are not valid for all earthquakes.
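
The model comparison can be illustrated on synthetic data. This sketch (toy numbers, not the paper's laboratory measurements) draws independent slips and inter-event times, then compares the prediction error of a time-predictable rule, in which the next interval is proportional to the previous slip, against a fixed-recurrence rule that always predicts the mean interval.

```python
import random

random.seed(0)
rate = 1.0  # hypothetical loading rate (slip units per time unit)

# Synthetic "memoryless" catalog: slips and inter-event times drawn
# independently (mean 10, s.d. 2), i.e. no time- or slip-predictability.
n = 200
slips = [random.gauss(10.0, 2.0) for _ in range(n)]
intervals = [random.gauss(10.0, 2.0) for _ in range(n)]

def rms(errors):
    return (sum(e * e for e in errors) / len(errors)) ** 0.5

# Time-predictable rule: interval i should equal slips[i-1] / rate.
tp_err = [intervals[i] - slips[i - 1] / rate for i in range(1, n)]
# Fixed-recurrence rule: every interval predicted by the mean interval.
mean_int = sum(intervals) / n
fr_err = [t - mean_int for t in intervals[1:]]

print(rms(tp_err), rms(fr_err))
# With independent slips and intervals the fixed-recurrence error is about
# one s.d., while the time-predictable error is about s.d. * sqrt(2):
# the simpler memoryless model wins, as the abstract reports.
```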

17. Gambling scores for earthquake predictions and forecasts

Zhuang, Jiancang

2010-04-01

This paper presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, a forecaster who makes a prediction or forecast is assumed to have bet some of those points. The reference model, which plays the role of the house, determines, according to a fair rule, how many reputation points the forecaster gains on success, and takes away the points bet by the forecaster on failure. This method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model, and the reference model is the Poisson model.
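
The scoring rule described above is easy to state in code. A minimal sketch follows, assuming the simplest fair rule for discrete bets: if the reference model assigns probability p0 to the predicted event, a bet of r points returns r(1-p0)/p0 on success and loses r on failure, so the expected gain is zero when the reference model is correct. The bet values below are toy numbers, not from the paper.

```python
def gambling_score(bets):
    """bets: (r, p0, hit) triples -- points wagered, the reference model's
    probability for the predicted event, and whether it occurred.
    Fair rule: win r * (1 - p0) / p0 on success, lose r on failure."""
    return sum(r * (1 - p0) / p0 if hit else -r for r, p0, hit in bets)

# Toy run: three 1-point bets against a reference model assigning each
# target a 20% chance; two successes, one failure.
bets = [(1, 0.2, True), (1, 0.2, True), (1, 0.2, False)]
score = gambling_score(bets)
print(score)  # 4 + 4 - 1 = 7.0 reputation points gained
```

A forecaster who merely echoes the reference model's probabilities gains nothing on average; only genuine skill accumulates reputation.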

18. Statistical short-term earthquake prediction.

PubMed

Kagan, Y Y; Knopoff, L

1987-06-19

A statistical procedure, derived from a theoretical model of fracture growth, is used to identify a foreshock sequence while it is in progress. As a predictor, the procedure reduces the average uncertainty in the rate of occurrence for a future strong earthquake by a factor of more than 1000 when compared with the Poisson rate of occurrence. About one-third of all main shocks with local magnitude greater than or equal to 4.0 in central California can be predicted in this way, starting from a 7-year database that has a lower magnitude cutoff of 1.5. The time scale of such predictions is of the order of a few hours to a few days for foreshocks in the magnitude range from 2.0 to 5.0.
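
A factor-of-1000 reduction in rate uncertainty can be read as a probability gain over the Poisson background. A toy calculation (the rates below are illustrative round numbers, not those of the paper):

```python
# Illustrative numbers only: a background of one M>=4 shock per ~10 years
# versus a conditional daily rate of 0.3 while a foreshock-based alarm is on.
background_rate = 1 / (10 * 365.25)  # events per day under the Poisson model
alarm_rate = 0.3                     # events per day during the alarm
gain = alarm_rate / background_rate
print(round(gain))  # ~1096-fold probability gain over the Poisson rate
```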

19. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

2015-06-01

The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge, the others being the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity; the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates from the onset date of significant earthquakes, the assumption being that each earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes (<6.5R) can be predicted within a very good accuracy window (±1 day). In this contribution we present an improved modification of the FDL method, the MFDL method, which performs better than the FDL. We use the FDL numbers to develop possible earthquake dates, but with the important difference that the starting seed date is a triggering planetary aspect prior to the earthquake. Typical planetary aspects are Moon conjunct Sun, Moon opposite Sun, and Moon conjunct or opposite the North or South Nodes. In order to test the improvement of the method, we used all earthquakes of magnitude 8R or larger recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds and examined the earthquake hit rates (for a window of 3 days, i.e. ±1 day of the target date) and for <6.5R. The successes are counted for each one of the 86 earthquake seeds, and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with
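
As the abstract describes it, the FDL construction amounts to adding Fibonacci and Lucas numbers, taken as day offsets, to a seed date. The following is a hedged sketch of that idea only; the authors' exact series, units, and seed handling are not given in the abstract, and nothing here bears on the method's validity.

```python
from datetime import date, timedelta

def fdl_dates(seed, n_terms):
    """Candidate dates: the seed plus Fibonacci and Lucas numbers taken as
    day offsets (an assumed reading of the construction; the authors'
    exact series, units, and windowing are not specified here)."""
    fib, luc = [1, 2], [1, 3]
    for _ in range(n_terms - 2):
        fib.append(fib[-1] + fib[-2])
        luc.append(luc[-1] + luc[-2])
    return [seed + timedelta(days=d) for d in sorted(set(fib + luc))]

dates = fdl_dates(date(2000, 1, 1), 8)
print([d.isoformat() for d in dates[:5]])
# ['2000-01-02', '2000-01-03', '2000-01-04', '2000-01-05', '2000-01-06']
```

Any such candidate-date generator must then be scored against a null model (e.g. random seed dates) before hit rates mean anything, which is the point of the comparison the authors report.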

20. A test to evaluate the earthquake prediction algorithm, M8

USGS Publications Warehouse

Healy, John H.; Kossobokov, Vladimir G.; Dewey, James W.

1992-01-01

A test of the algorithm M8 is described. The test is constructed to meet four rules, which we propose to be applicable to the test of any method for earthquake prediction:

1. An earthquake prediction technique should be presented as a well-documented, logical algorithm that can be used by investigators without restrictions.
2. The algorithm should be coded in a common programming language and implementable on widely available computer systems.
3. A test of the earthquake prediction technique should involve future predictions with a black-box version of the algorithm in which potentially adjustable parameters are fixed in advance. The source of the input data must be defined, and ambiguities in these data must be resolved automatically by the algorithm.
4. At least one reasonable null hypothesis should be stated in advance of testing the earthquake prediction method, and it should be stated how this null hypothesis will be used to estimate the statistical significance of the earthquake predictions.

The M8 algorithm has successfully predicted several destructive earthquakes, in the sense that the earthquakes occurred inside regions with linear dimensions from 384 to 854 km that the algorithm had identified as being in times of increased probability for strong earthquakes. In addition, M8 has successfully "post predicted" high percentages of strong earthquakes in regions to which it has been applied in retroactive studies. The statistical significance of previous predictions has not been established, however, and post-prediction studies in general are notoriously subject to success-enhancement through hindsight. Nor has it been determined how much more precise an M8 prediction might be than forecasts and probability-of-occurrence estimates made by other techniques. We view our test of M8 both as a means to better determine the effectiveness of M8 and as an experimental structure within which to make observations that might lead to improvements in the algorithm
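
Rule 4 can be made concrete with a simple null-hypothesis calculation: if alarms cover a fraction tau of the monitored space-time, a random-guessing null hits each target earthquake independently with probability tau, so the chance of matching the method's hit count follows a binomial tail. The numbers below are illustrative, not from the M8 test.

```python
from math import comb

def binomial_p_value(n_events, n_hits, tau):
    """P(X >= n_hits) for X ~ Binomial(n_events, tau): the chance that
    random alarms covering a fraction tau of space-time score at least
    as many hits as the method under test."""
    return sum(comb(n_events, k) * tau**k * (1 - tau)**(n_events - k)
               for k in range(n_hits, n_events + 1))

# Illustrative: 10 target earthquakes, 7 falling inside alarms that
# cover 30% of the monitored space-time.
p = binomial_p_value(10, 7, 0.3)
print(round(p, 4))  # 0.0106 -- unlikely under random guessing
```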

1. Testing prediction methods: Earthquake clustering versus the Poisson model

USGS Publications Warehouse

Michael, A.J.

1997-01-01

Testing earthquake prediction methods requires statistical techniques that compare observed success to random chance. One technique is to produce simulated earthquake catalogs and measure the relative success of predicting real and simulated earthquakes. The accuracy of these tests depends on the validity of the statistical model used to simulate the earthquakes. This study tests the effect of clustering in the statistical earthquake model on the results. Three simulation models were used to produce significance levels for a VLF earthquake prediction method. As the degree of simulated clustering increases, the statistical significance drops. Hence, the use of a seismicity model with insufficient clustering can lead to overly optimistic results. A successful method must pass the statistical tests with a model that fully replicates the observed clustering. However, a method can be rejected based on tests with a model that contains insufficient clustering. U.S. copyright. Published in 1997 by the American Geophysical Union.
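
The effect of clustering on such tests can be demonstrated with two toy catalogs of equal size, one Poissonian and one crudely clustered (a deliberately simple burst model for illustration, not ETAS and not the paper's simulations):

```python
import random

random.seed(1)
T = 1000.0  # catalog length (arbitrary time units)

def poisson_catalog(n):
    """n events placed uniformly at random: no clustering."""
    return sorted(random.uniform(0, T) for _ in range(n))

def clustered_catalog(n_main, n_after):
    """Crude burst model: each mainshock drags n_after aftershocks
    within one time unit of itself."""
    times = []
    for _ in range(n_main):
        t0 = random.uniform(0, T)
        times += [t0] + [t0 + random.uniform(0, 1.0) for _ in range(n_after)]
    return sorted(times)

def count_variance(times, width=10.0):
    """Variance of event counts over consecutive time windows."""
    counts = [sum(1 for x in times if t <= x < t + width)
              for t in [i * width for i in range(int(T / width))]]
    mean = sum(counts) / len(counts)
    return sum((c - mean) ** 2 for c in counts) / len(counts)

v_poisson = count_variance(poisson_catalog(500))
v_cluster = count_variance(clustered_catalog(100, 4))  # also ~500 events
print(v_poisson, v_cluster)
# Same expected count, but the clustered catalog's window counts are far
# more variable -- chance "successes" against it are much easier to score,
# which is why a Poisson null overstates a prediction method's significance.
```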

2. Prediction of earthquake-triggered landslide event sizes

Braun, Anika; Havenith, Hans-Balder; Schlögel, Romy

2016-04-01

Seismically induced landslides are a major environmental effect of earthquakes and may contribute significantly to related losses. Moreover, in paleoseismology, landslide event sizes are an important proxy for estimating the intensity and magnitude of past earthquakes, and thus allow improving seismic hazard assessment over longer terms. Not only earthquake intensity, but also factors such as fault characteristics, topography, climatic conditions and the geological environment have a major impact on the intensity and spatial distribution of earthquake-induced landslides. We present here a review of factors contributing to earthquake-triggered slope failures based on an "event-by-event" classification approach. The objective of this analysis is to enable short-term prediction of earthquake-triggered landslide event sizes, in terms of the number of landslides and the size of the affected area, right after an earthquake occurs. Five main factors, 'Intensity', 'Fault', 'Topographic energy', 'Climatic conditions' and 'Surface geology', were used to establish a relationship to the number and spatial extent of landslides triggered by an earthquake. The relative weight of these factors was extracted from published data for numerous past earthquakes; topographic inputs were checked in Google Earth and through geographic information systems. Based on well-documented recent earthquakes (e.g. Haiti 2010, Wenchuan 2008) and on older events for which reliable extensive information was available (e.g. Northridge 1994, Loma Prieta 1989, Guatemala 1976, Peru 1970), the combination and relative weight of the factors were calibrated. The calibrated factor combination was then applied to more than 20 earthquake events for which landslide distribution characteristics could be cross-checked. One of our main findings is that the 'Fault' factor, which is based on characteristics of the fault, the surface rupture and its location with respect to mountain areas, has the most important
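
A factor combination of this kind reduces to a weighted sum once each factor is scored. The sketch below uses placeholder scores and weights, not the calibrated values from the study:

```python
# Placeholder scores (0..1) and weights -- NOT the study's calibrated
# values; they only illustrate the shape of the factor combination.
scores = {"intensity": 0.9, "fault": 0.7, "topographic_energy": 0.6,
          "climatic_conditions": 0.3, "surface_geology": 0.5}
weights = {"intensity": 0.30, "fault": 0.35, "topographic_energy": 0.15,
           "climatic_conditions": 0.10, "surface_geology": 0.10}

landslide_index = sum(weights[k] * scores[k] for k in scores)
print(round(landslide_index, 3))  # 0.685 -- a score one could bin into event-size classes
```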

3. The initial subevent of the 1994 Northridge, California, earthquake: Is earthquake size predictable?

USGS Publications Warehouse

Kilb, Debi; Gomberg, J.

1999-01-01

We examine the initial subevent (ISE) of the M 6.7, 1994 Northridge, California, earthquake in order to discriminate between two end-member rupture initiation models: the 'preslip' and 'cascade' models. Final earthquake size may be predictable from an ISE's seismic signature in the preslip model but not in the cascade model. In the cascade model ISEs are simply small earthquakes that can be described as purely dynamic ruptures. In this model a large earthquake is triggered by smaller earthquakes; there is no size scaling between triggering and triggered events and a variety of stress transfer mechanisms are possible. Alternatively, in the preslip model, a large earthquake nucleates as an aseismically slipping patch in which the patch dimension grows and scales with the earthquake's ultimate size; the byproduct of this loading process is the ISE. In this model, the duration of the ISE signal scales with the ultimate size of the earthquake, suggesting that nucleation and earthquake size are determined by a more predictable, measurable, and organized process. To distinguish between these two end-member models we use short period seismograms recorded by the Southern California Seismic Network. We address questions regarding the similarity in hypocenter locations and focal mechanisms of the ISE and the mainshock. We also compare the ISE's waveform characteristics to those of small earthquakes and to the beginnings of earthquakes with a range of magnitudes. We find that the focal mechanisms of the ISE and mainshock are indistinguishable, and both events may have nucleated on and ruptured the same fault plane. These results satisfy the requirements for both models and thus do not discriminate between them. However, further tests show the ISE's waveform characteristics are similar to those of typical small earthquakes in the vicinity and more importantly, do not scale with the mainshock magnitude. These results are more consistent with the cascade model.

4. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

Chen, Q.; Wang, K.

2009-12-01

Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not common understanding in China when the M 7.9 Wenchuan earthquake struck the Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom of the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue imminent warning and evacuation order. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save life in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less

5. Introduction to the special issue on the 2004 Parkfield earthquake and the Parkfield earthquake prediction experiment

USGS Publications Warehouse

Harris, R.A.; Arrowsmith, J.R.

2006-01-01

The 28 September 2004 M 6.0 Parkfield earthquake, a long-anticipated event on the San Andreas fault, is the world's best recorded earthquake to date, with state-of-the-art data obtained from geologic, geodetic, seismic, magnetic, and electrical field networks. This has allowed the preearthquake and postearthquake states of the San Andreas fault in this region to be analyzed in detail. Analyses of these data provide views into the San Andreas fault that show a complex geologic history, fault geometry, rheology, and response of the nearby region to the earthquake-induced ground movement. Although aspects of San Andreas fault zone behavior in the Parkfield region can be modeled simply over geological time frames, the Parkfield Earthquake Prediction Experiment and the 2004 Parkfield earthquake indicate that predicting the fine details of future earthquakes is still a challenge. Instead of a deterministic approach, forecasting future damaging behavior, such as that caused by strong ground motions, will likely continue to require probabilistic methods. However, the Parkfield Earthquake Prediction Experiment and the 2004 Parkfield earthquake have provided ample data to understand most of what did occur in 2004, culminating in significant scientific advances.

6. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models 1: repeating earthquakes

USGS Publications Warehouse

Rubinstein, Justin L.; Ellsworth, William L.; Chen, Kate Huihsuan; Uchida, Naoki

2012-01-01

The behavior of individual events in repeating earthquake sequences in California, Taiwan and Japan is better predicted by a model with fixed inter-event time or fixed slip than it is by the time- and slip-predictable models for earthquake occurrence. Given that repeating earthquakes are highly regular in both inter-event time and seismic moment, the time- and slip-predictable models seem ideally suited to explain their behavior. Taken together with evidence from the companion manuscript that shows similar results for laboratory experiments, we conclude that the short-term predictions of the time- and slip-predictable models should be rejected in favor of earthquake models that assume either fixed slip or fixed recurrence interval. This implies that the elastic rebound model underlying the time- and slip-predictable models offers no additional value in describing earthquake behavior in an event-to-event sense, but its value in a long-term sense cannot be determined. These models likely fail because they rely on assumptions that oversimplify the earthquake cycle. We note that the time and slip of these events are predicted quite well by fixed slip and fixed recurrence models, so in some sense they are time- and slip-predictable. While fixed recurrence and slip models better predict repeating earthquake behavior than the time- and slip-predictable models, we observe a correlation between slip and the preceding recurrence time for many repeating earthquake sequences in Parkfield, California. This correlation is not found in other regions, and the sequences with the correlative slip-predictable behavior are not distinguishable from nearby earthquake sequences that do not exhibit this behavior.

7. Earthquake prediction in seismogenic areas of the Iberian Peninsula based on computational intelligence

Morales-Esteban, A.; Martínez-Álvarez, F.; Reyes, J.

2013-05-01

A method to predict earthquakes in two of the seismogenic areas of the Iberian Peninsula, based on Artificial Neural Networks (ANNs), is presented in this paper. ANNs have been widely used in many fields, but only very few and very recent studies have applied them to earthquake prediction. Two kinds of predictions are provided in this study: a) the probability of an earthquake of magnitude equal to or larger than a preset threshold occurring within the next 7 days; b) the probability of an earthquake within a limited magnitude interval occurring during the next 7 days. First, the physical fundamentals related to earthquake occurrence are explained. Second, the mathematical model underlying ANNs is explained and the chosen configuration is justified. The ANNs were then trained in both areas, the Alborán Sea and the Western Azores-Gibraltar fault, and subsequently tested in both areas for a period of time immediately following the training period. Statistical tests are provided showing meaningful results. Finally, the ANNs were compared to other well-known classifiers, showing quantitatively and qualitatively better results. The authors expect that the results obtained will encourage researchers to conduct further research on this topic. Highlights: development of a system capable of predicting earthquakes for the next seven days; application of ANNs that is particularly reliable for earthquake prediction; use of geophysical information modeling the soil behavior as the ANNs' input data; successful analysis of one region with large seismic activity.
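
As a minimal stand-in for the paper's ANNs, a single logistic unit trained by gradient descent shows the shape of such a 7-day probability classifier; the features and labels below are synthetic placeholders, not the Alborán Sea inputs, and the paper's actual network architecture is not reproduced here.

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def train(samples, epochs=2000, lr=0.5):
    """Plain gradient descent on logistic loss; two inputs, one output."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in samples:
            g = sigmoid(w[0] * x[0] + w[1] * x[1] + b) - y
            w[0] -= lr * g * x[0]
            w[1] -= lr * g * x[1]
            b -= lr * g
    return w, b

# Synthetic placeholders: label 1 = "M >= threshold within the next 7 days".
samples = [((0.9, 0.1), 1), ((0.8, 0.2), 1), ((0.1, 0.9), 0), ((0.2, 0.7), 0)]
w, b = train(samples)
p = sigmoid(w[0] * 0.85 + w[1] * 0.15 + b)
print(round(p, 2))  # near 1.0 for a feature vector resembling the positives
```

The output is a probability, which matches the paper's framing of prediction (a) as a 7-day exceedance probability rather than a yes/no alarm.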

8. From a physical approach to earthquake prediction, towards long and short term warnings ahead of large earthquakes

Stefansson, R.; Bonafede, M.

2012-04-01

For 20 years the South Iceland Seismic Zone (SISZ) was a test site for multinational earthquake prediction research, partly bridging the gap between laboratory test samples and the huge transform zones of the Earth. The approach was to explore the physics of the processes leading up to large earthquakes. The book Advances in Earthquake Prediction: Research and Risk Mitigation by R. Stefansson (2011), published by Springer/PRAXIS, and an article in the August issue of BSSA by Stefansson, M. Bonafede and G. Gudmundsson (2011) contain a good overview of the findings and further references, as well as examples of partially successful long- and short-term warnings based on such an approach. Significant findings are: earthquakes that occurred hundreds of years ago left scars in the crust, expressed in volumes of heterogeneity that demonstrate the size of their faults; rheology and stress heterogeneity within these volumes are significantly variable in time and space; crustal processes in and near such faults may be observed through microearthquake information decades before the sudden onset of a new large earthquake; and high-pressure fluids of mantle origin may, in response to strain, especially near plate boundaries, migrate upward into the brittle/elastic crust and play a significant role in modifying crustal conditions over both long and short terms. The preparatory processes of different earthquakes cannot be expected to be the same. We learn about an impending earthquake by observing long-term preparatory processes at the fault, finding a constitutive relationship that governs those processes, and then extrapolating that relationship into nearby space and the near future. This is a deterministic approach to earthquake prediction research. Such extrapolations contain many uncertainties; however, the long-term pattern of observations of the pre-earthquake fault process will help us put probability constraints on our extrapolations and our warnings. The approach described is different from the usual

9. Understanding earthquake from the granular physics point of view — Causes of earthquake, earthquake precursors and predictions

Lu, Kunquan; Hou, Meiying; Jiang, Zehui; Wang, Qiang; Sun, Gang; Liu, Jixing

2018-03-01

We treat the Earth's crust and mantle as large-scale discrete matter based on the principles of granular physics and existing experimental observations. The main outcomes are: a granular model of the structure and movement of the Earth's crust and mantle is established; the formation mechanism of the tectonic forces that cause earthquakes, and a model of propagation for precursory information, are proposed; the properties of seismic precursory information and its relevance to earthquake occurrence are illustrated, and principles for detecting effective seismic precursors are elaborated; and the mechanism of deep-focus earthquakes is explained by the jamming-unjamming transition of granular flow. Some earthquake phenomena that were previously difficult to understand are explained, and the predictability of earthquakes is discussed. Due to the discrete nature of the Earth's crust and mantle, continuum theory no longer applies during the quasi-static seismological process. In this paper, based on the principles of granular physics, we study the causes of earthquakes, earthquake precursors and predictions, and a new understanding, different from the traditional seismological viewpoint, is obtained.

10. Implications of fault constitutive properties for earthquake prediction.

PubMed

Dieterich, J H; Kilgore, B

1996-04-30

The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, scaling of Dc is presently an open question and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. With the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes including the eventual mainshock. With the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks.
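
The statement that large changes in earthquake probability follow from stress changes can be illustrated with the closed-form seismicity-rate equation of Dieterich (1994) for a sudden stress step, R(t) = r0 / (1 + (exp(-Δτ/aσ) - 1) exp(-t/t_a)); the parameter values below are illustrative, not fits to any data set.

```python
import math

def dieterich_rate(t, r0, stress_step, t_a):
    """Dieterich (1994): R(t) = r0 / (1 + (exp(-s) - 1) * exp(-t / t_a)),
    with s = delta_tau / (a * sigma) the normalized stress step,
    r0 the background rate, and t_a the aftershock duration."""
    return r0 / (1 + (math.exp(-stress_step) - 1) * math.exp(-t / t_a))

r0, s, t_a = 1.0, 5.0, 100.0   # illustrative values
rates = [dieterich_rate(t, r0, s, t_a) for t in (0.1, 1.0, 10.0, 100.0, 1000.0)]
print([round(r, 2) for r in rates])
# The rate jumps by up to exp(s) ~ 148x the background immediately after the
# step, then decays Omori-like back toward r0 over a few aftershock durations,
# reproducing the aftershock clustering behavior described above.
```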

11. Prospects for earthquake prediction and control

USGS Publications Warehouse

Healy, J.H.; Lee, W.H.K.; Pakiser, L.C.; Raleigh, C.B.; Wood, M.D.

1972-01-01

The San Andreas fault is viewed, according to the concepts of seafloor spreading and plate tectonics, as a transform fault that separates the Pacific and North American plates and along which relative movements of 2 to 6 cm/year have been taking place. The resulting strain can be released by creep, by earthquakes of moderate size, or (as near San Francisco and Los Angeles) by great earthquakes. Microearthquakes, as mapped by a dense seismograph network in central California, generally coincide with zones of the San Andreas fault system that are creeping. Microearthquakes are few and scattered in zones where elastic energy is being stored. Changes in the rate of strain, as recorded by tiltmeter arrays, have been observed before several earthquakes of about magnitude 4. Changes in fluid pressure may control timing of seismic activity and make it possible to control natural earthquakes by controlling variations in fluid pressure in fault zones. An experiment in earthquake control is underway at the Rangely oil field in Colorado, where the rates of fluid injection and withdrawal in experimental wells are being controlled. © 1972.

USGS Publications Warehouse

Wald, David J.

2009-01-01

What’s the best way to get alerted about the occurrence and potential impact of an earthquake? The answer to that question has changed dramatically of late, in part due to improvements in earthquake science, and in part by the implementation of new research in the delivery of earthquake information

13. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time- Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

Serata, S.

2006-12-01

The Serata Stressmeter has been developed to measure and monitor earthquake shear stress build-up along shallow active faults. The development work of the past 25 years has established the Stressmeter as an automatic stress measurement system for studying the timing of forthcoming major earthquakes, in support of current earthquake prediction studies based on statistical analysis of seismological observations. In early 1982, a series of major man-made earthquakes (magnitude 4.5-5.0) suddenly occurred in an area above a deep underground potash mine in Saskatchewan, Canada. By measuring the underground stress condition of the mine, the direct cause of the earthquakes was identified, and it was successfully eliminated by controlling the stress condition of the mine. The Japanese government was interested in this development, and the Stressmeter was introduced into the Japanese government research program for earthquake stress studies. In Japan the Stressmeter was first utilized for direct measurement of the intrinsic lateral tectonic stress gradient G. The measurement, conducted at the Mt. Fuji Underground Research Center of the Japanese government, disclosed constant natural gradients of maximum and minimum lateral stresses in excellent agreement with the theoretical value, G = 0.25. The conventional methods of overcoring, hydrofracturing and deformation, which were introduced to compete with the Serata method, all failed, demonstrating the fundamental difficulties of the conventional methods. The intrinsic lateral stress gradient determined by the Stressmeter for the Japanese government was found to be consistent with all the other measurements made by the Stressmeter in Japan. The stress measurement results obtained in the major international stress measurement work of the Hot Dry Rock Projects conducted in the USA, England and Germany are in good agreement with the Stressmeter results obtained in Japan. Based on this broad agreement, a solid geomechanical

14. A forecast experiment of earthquake activity in Japan under Collaboratory for the Study of Earthquake Predictability (CSEP)

Hirata, N.; Yokoi, S.; Nanjo, K. Z.; Tsuruoka, H.

2012-04-01

One major focus of the current Japanese earthquake prediction research program (2009-2013), which is now integrated with the research program for prediction of volcanic eruptions, is to move toward creating testable earthquake forecast models. For this purpose we started an experiment in forecasting earthquake activity in Japan under the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP) through an international collaboration. We established the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan and to conduct verifiable prospective tests of their model performance, and began the first earthquake forecast testing experiment in Japan within the CSEP framework. We use the earthquake catalogue maintained and provided by the Japan Meteorological Agency (JMA). The experiment consists of 12 categories: 4 testing classes with different time spans (1 day, 3 months, 1 year, and 3 years) and 3 testing regions called "All Japan," "Mainland," and "Kanto." A total of 105 models were submitted and are currently under the CSEP official suite of tests for evaluating forecast performance. The experiments have completed 92 rounds for the 1-day class, 6 rounds for the 3-month class, and 3 rounds for the 1-year class. For the 1-day testing class, all models passed all of CSEP's evaluation tests in more than 90% of rounds. The results of the 3-month testing class also gave us new knowledge concerning statistical forecasting models: all models performed well at forecasting magnitude, but the observed spatial distribution is hardly consistent with most models when many earthquakes occur at a single spot. We are now preparing a 3-D forecasting experiment with a depth range of 0 to 100 km in the Kanto region, and the testing center is improving the evaluation system for the 1-day class so that forecasting and testing results are finished within one day. The special issue of 1st part titled Earthquake Forecast
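
As an illustration of the kind of consistency test the CSEP suite applies, the sketch below implements a simple two-sided number test (N-test), which asks whether the observed earthquake count in a testing round is plausible under a model's forecast total rate. This is a minimal stand-in, not the official CSEP implementation; the significance level and the Poisson assumption are illustrative.

```python
from math import exp

def poisson_cdf(k, lam):
    """P(N <= k) for a Poisson random variable with mean lam."""
    if k < 0:
        return 0.0
    term, total = exp(-lam), 0.0
    for i in range(k + 1):
        total += term
        term *= lam / (i + 1)
    return total

def n_test(n_obs, n_fore, alpha=0.05):
    """Two-sided N-test: delta1 = P(N >= n_obs), delta2 = P(N <= n_obs)
    under the forecast's total expected count n_fore.  The forecast is
    rejected when either tail probability falls below alpha/2."""
    delta1 = 1.0 - poisson_cdf(n_obs - 1, n_fore)
    delta2 = poisson_cdf(n_obs, n_fore)
    return min(delta1, delta2) >= alpha / 2, delta1, delta2
```

For example, observing 12 earthquakes against a forecast of 10 is consistent with the model, while observing 30 would reject it.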

15. Earthquake prediction rumors can help in building earthquake awareness: the case of May the 11th 2011 in Rome (Italy)

Amato, A.; Arcoraci, L.; Casarotti, E.; Cultrera, G.; Di Stefano, R.; Margheriti, L.; Nostro, C.; Selvaggi, G.; May-11 Team

2012-04-01

16. Discussion of New Approaches to Medium-Short-Term Earthquake Forecast in Practice of The Earthquake Prediction in Yunnan

Hong, F.

2017-12-01

Looking back on years of practice of earthquake prediction in the Yunnan area, it is widely considered that fixed-point earthquake precursory anomalies mainly reflect field information. An increase in the amplitude and number of precursory anomalies can help determine the origin time of earthquakes; however, it is difficult to obtain the spatial relevance between earthquakes and precursory anomalies, so we can hardly predict the spatial locations of earthquakes using precursory anomalies alone. Past practice has shown that seismic activity is superior to precursory anomalies for predicting earthquake locations, since increased seismicity was observed before 80% of M≥6.0 earthquakes in the Yunnan area. Mobile geomagnetic anomalies have also turned out to be helpful in predicting earthquake locations in recent years; for instance, the forecast occurrence time and area derived from the 1-year-scale geomagnetic anomalies before the M6.5 Ludian earthquake in 2014 were shorter and smaller than those derived from the seismicity enhancement region. Based on this past work, the author believes that the medium-short-term earthquake forecast level, as well as objective understanding of seismogenic mechanisms, could be substantially improved by densely deploying observation arrays and capturing the dynamic process of physical-property changes in the enhancement region of medium to small earthquakes.

17. Implications of fault constitutive properties for earthquake prediction.

PubMed Central

Dieterich, J H; Kilgore, B

1996-01-01

The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip-rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, scaling of Dc is presently an open question, and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes in earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe the observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. In the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes, including the eventual mainshock. In the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks. PMID:11607666

18. Implications of fault constitutive properties for earthquake prediction

USGS Publications Warehouse

Dieterich, J.H.; Kilgore, B.

1996-01-01

The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance D(c), apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip-rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of D(c) apply to faults in nature. However, scaling of D(c) is presently an open question, and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes in earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe the observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. In the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes, including the eventual mainshock. In the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks.

19. Recent Achievements of the Collaboratory for the Study of Earthquake Predictability

Jordan, T. H.; Liukis, M.; Werner, M. J.; Schorlemmer, D.; Yu, J.; Maechling, P. J.; Jackson, D. D.; Rhoades, D. A.; Zechar, J. D.; Marzocchi, W.

2016-12-01

The Collaboratory for the Study of Earthquake Predictability (CSEP) supports a global program to conduct prospective earthquake forecasting experiments. CSEP testing centers are now operational in California, New Zealand, Japan, China, and Europe, with 442 models under evaluation. The California testing center, started by SCEC on Sept 1, 2007, currently hosts 30-minute, 1-day, 3-month, 1-year and 5-year forecasts, both alarm-based and probabilistic, for California, the Western Pacific, and worldwide. Our tests are now based on the hypocentral locations and magnitudes of cataloged earthquakes, but we plan to test focal mechanisms, seismic hazard models, ground motion forecasts, and finite rupture forecasts as well. We have increased computational efficiency for high-resolution global experiments, such as the evaluation of the Global Earthquake Activity Rate (GEAR) model, introduced Bayesian ensemble models, and implemented support for non-Poissonian simulation-based forecast models. We are currently developing formats and procedures to evaluate externally hosted forecasts and predictions. CSEP supports the USGS program in operational earthquake forecasting and a DHS project to register and test external forecast procedures from experts outside seismology. We found that earthquakes as small as magnitude 2.5 provide important information on subsequent earthquakes larger than magnitude 5. A retrospective experiment for the 2010-2012 Canterbury earthquake sequence showed that some physics-based and hybrid models outperform catalog-based (e.g., ETAS) models; this experiment also demonstrates the ability of the CSEP infrastructure to support retrospective forecast testing. Current CSEP development activities include adoption of the Comprehensive Earthquake Catalog (ComCat) as an authorized data source, retrospective testing of simulation-based forecasts, and support for additive ensemble methods. We describe the open-source CSEP software that is available to researchers as
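
The additive ensemble methods mentioned above combine per-cell rates from several forecast models into a single forecast. A minimal sketch of such a weighted combination (the weights here are hypothetical, e.g. derived from past model performance; this is not CSEP's actual code):

```python
def additive_ensemble(rate_grids, weights):
    """Weighted average of per-cell expected earthquake rates from
    several forecast models.  rate_grids is a list of equal-length
    lists of expected counts per space-magnitude cell; weights is one
    nonnegative weight per model."""
    wsum = sum(weights)
    n_cells = len(rate_grids[0])
    return [sum(w * grid[i] for w, grid in zip(weights, rate_grids)) / wsum
            for i in range(n_cells)]
```

With equal weights this reduces to the plain average of the member forecasts; performance-based weights shift the ensemble toward better-scoring models.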

20. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard

Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team

2011-12-01

1. Sun-earth environment study to understand earthquake prediction

Mukherjee, S.

2007-05-01

Earthquake prediction may be possible by looking into the location of active sunspots before they hurl energy toward Earth. Earth is a restless planet, and that restlessness occasionally turns deadly. Of all natural hazards, earthquakes are the most feared. For centuries, scientists working in seismically active regions have noted premonitory signals: changes in the thermosphere, ionosphere, atmosphere and hydrosphere are noted before changes in the geosphere. Historical records tell of changes in the water level in wells, of strange weather, of ground-hugging fog, and of unusual behaviour of animals (due to changes in the magnetic field of the Earth) that seem to sense the approach of a major earthquake. With the advent of modern science and technology, the understanding of these pre-earthquake signals has become strong enough to develop a methodology of earthquake prediction. A correlation of Earth-directed coronal mass ejections (CMEs) from active sunspots has been developed as a precursor of earthquakes. Occasional changes in the local magnetic field and planetary indices (Kp values) in the lower atmosphere are accompanied by the formation of haze and a reduction of moisture in the air. Large patches, often tens to hundreds of thousands of square kilometres in size, are seen in night-time infrared satellite images where the land-surface temperature seems to fluctuate rapidly. Perturbations in the ionosphere at 90-120 km altitude have been observed before the occurrence of earthquakes; these changes affect the transmission of radio waves, and radio blackouts have been observed due to CMEs. Another heliophysical parameter, electron flux (Eflux), has been monitored before the occurrence of earthquakes. More than a hundred case studies show that the atmospheric temperature increases and then suddenly drops before the occurrence of an earthquake. These changes are being monitored using the Solar and Heliospheric Observatory

2. Testing an Earthquake Prediction Algorithm: The 2016 New Zealand and Chile Earthquakes

2017-05-01

The 13 November 2016, M7.8 earthquake 54 km NNE of Amberley, New Zealand and the 25 December 2016, M7.6 earthquake 42 km SW of Puerto Quellon, Chile happened outside the area of the ongoing real-time global testing of the intermediate-term middle-range earthquake prediction algorithm M8, accepted in 1992 for the M7.5+ range. Naturally, over the past two decades the level of registration of earthquakes worldwide has grown significantly and is by now sufficient for diagnosis of times of increased probability (TIPs) by the M8 algorithm over the entire territory of New Zealand and southern Chile to below 40°S. The mid-2016 update of the M8 predictions determines TIPs in the additional circles of investigation (CIs) where the two earthquakes happened. Thus, after 50 semiannual updates in real-time prediction mode, we (1) confirm the statistically approved high confidence of the M8-MSc predictions and (2) conclude that the territory of the Global Test of the M8 and MSc algorithms can be expanded, in an apparently necessary revision of the 1992 settings.

3. Earthquakes

MedlinePlus

An earthquake is the sudden, rapid shaking of the earth, ... by the breaking and shifting of underground rock. Earthquakes can cause buildings to collapse and cause heavy ...

4. Measurement of neutron and charged particle fluxes toward earthquake prediction

Maksudov, Asatulla U.; Zufarov, Mars A.

2017-12-01

In this paper, we describe a possible method for predicting earthquakes, based on simultaneous recording of the intensity of fluxes of neutrons and charged particles by detectors commonly used in nuclear physics. These low-energy particles originate from radioactive nuclear processes in the Earth's crust, and variations in the particle flux intensity can be a precursor of an earthquake. A description is given of an electronic installation that records the fluxes of charged particles in the radial direction, which are a possible response to the accumulated tectonic stresses in the Earth's crust. The results obtained showed an increase in the intensity of the fluxes for 10 or more hours before the occurrence of an earthquake. The previous version of the installation was able to indicate the possibility of an earthquake (Maksudov et al. in Instrum Exp Tech 58:130-131, 2015) but did not give information about the direction of the epicenter location. The installation was therefore modified by adding eight directional detectors. With the upgraded setup, we have received both predictive signals and signals determining the direction of the location of the forthcoming earthquake, starting 2-3 days before its origin.
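
The paper does not spell out its detection criterion, but a flux-based precursor search of this kind could in principle reduce to flagging count rates that rise well above a running baseline. The sketch below is a hypothetical criterion, not the authors' method; the window length and z-score threshold are invented for illustration.

```python
import statistics

def flux_alarm(counts, window=24, z_thresh=3.0):
    """Return indices (e.g. hours) where the particle-count rate
    exceeds the mean of the preceding `window` samples by more than
    z_thresh standard deviations of that baseline."""
    alarms = []
    for i in range(window, len(counts)):
        base = counts[i - window:i]
        mu = statistics.fmean(base)
        sigma = statistics.pstdev(base) or 1.0  # guard against a flat baseline
        if (counts[i] - mu) / sigma > z_thresh:
            alarms.append(i)
    return alarms
```

A directional version would run the same test separately on each of the eight detectors and report which sector alarms first.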

5. Earthquake Prediction in Large-scale Faulting Experiments

Junger, J.; Kilgore, B.; Beeler, N.; Dieterich, J.

2004-12-01

6. 75 FR 63854 - National Earthquake Prediction Evaluation Council (NEPEC) Advisory Committee

Federal Register 2010, 2011, 2012, 2013, 2014

2010-10-18

... DEPARTMENT OF THE INTERIOR Geological Survey National Earthquake Prediction Evaluation Council...: Pursuant to Public Law 96-472, the National Earthquake Prediction Evaluation Council (NEPEC) will hold a 2... proposed earthquake predictions, on the completeness and scientific validity of the available data related...

7. Predictability of population displacement after the 2010 Haiti earthquake

PubMed Central

Lu, Xin; Bengtsson, Linus; Holme, Petter

2012-01-01

Most severe disasters cause large population movements. These movements make it difficult for relief organizations to efficiently reach people in need. Understanding and predicting the locations of affected people during disasters is key to effective humanitarian relief operations and to long-term societal reconstruction. We collaborated with the largest mobile phone operator in Haiti (Digicel) and analyzed the movements of 1.9 million mobile phone users during the period from 42 d before, to 341 d after the devastating Haiti earthquake of January 12, 2010. Nineteen days after the earthquake, population movements had caused the population of the capital Port-au-Prince to decrease by an estimated 23%. Both the travel distances and size of people’s movement trajectories grew after the earthquake. These findings, in combination with the disorder that was present after the disaster, suggest that people’s movements would have become less predictable. Instead, the predictability of people’s trajectories remained high and even increased slightly during the three-month period after the earthquake. Moreover, the destinations of people who left the capital during the first three weeks after the earthquake were highly correlated with their mobility patterns during normal times, and specifically with the locations in which people had significant social bonds. For the people who left Port-au-Prince, the duration of their stay outside the city, as well as the time for their return, all followed a skewed, fat-tailed distribution. The findings suggest that population movements during disasters may be significantly more predictable than previously thought. PMID:22711804
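
Predictability of trajectories in mobility studies of this kind is typically quantified with entropy measures over visited locations. As a hedged sketch (not the authors' exact measure), lower Shannon entropy of a user's visited-location distribution indicates more regular, hence more predictable, movement:

```python
from collections import Counter
from math import log2

def location_entropy(trajectory):
    """Shannon entropy (bits) of the distribution of visited locations
    in a sequence of sightings (e.g. cell-tower IDs).  Lower values
    mean the user concentrates time in fewer places."""
    counts = Counter(trajectory)
    n = len(trajectory)
    return -sum((c / n) * log2(c / n) for c in counts.values())
```

A user sighted mostly at one tower (say 8 of 10 sightings) has entropy well under 1 bit, while a user spread evenly over 10 towers has log2(10) ≈ 3.32 bits.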

8. Predicted liquefaction of East Bay fills during a repeat of the 1906 San Francisco earthquake

USGS Publications Warehouse

Holzer, T.L.; Blair, J.L.; Noce, T.E.; Bennett, M.J.

2006-01-01

Predicted conditional probabilities of surface manifestations of liquefaction during a repeat of the 1906 San Francisco (M7.8) earthquake range from 0.54 to 0.79 in the area underlain by the sandy artificial fills along the eastern shore of San Francisco Bay near Oakland, California. Despite widespread liquefaction in 1906 of sandy fills in San Francisco, most of the East Bay fills were emplaced after 1906 without soil improvement to increase their liquefaction resistance. They have yet to be shaken strongly. Probabilities are based on the liquefaction potential index computed from 82 CPT soundings using median (50th percentile) estimates of PGA based on a ground-motion prediction equation. Shaking estimates consider both distance from the San Andreas Fault and local site conditions. The high probabilities indicate extensive and damaging liquefaction will occur in East Bay fills during the next M ≥ 7.8 earthquake on the northern San Andreas Fault. © 2006, Earthquake Engineering Research Institute.
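
The liquefaction potential index (LPI) mentioned above is commonly computed in the Iwasaki form: a severity factor F(z) = 1 − FS (nonzero only where the factor of safety FS < 1) is integrated over the top 20 m with a linearly decreasing depth weight. A simplified discrete sketch, assuming uniform depth increments (this is the generic textbook form, not the specific implementation of the cited study):

```python
def liquefaction_potential_index(depths_m, factors_of_safety, dz=1.0):
    """Iwasaki-style LPI from factors of safety against liquefaction
    (e.g. CPT-derived) at uniform depth increments of dz metres,
    considering only the top 20 m."""
    lpi = 0.0
    for z, fs in zip(depths_m, factors_of_safety):
        if z > 20.0:
            break
        severity = max(0.0, 1.0 - fs)   # F(z): contributes only where FS < 1
        weight = 10.0 - 0.5 * z          # shallow layers weighted more heavily
        lpi += severity * weight * dz
    return lpi
```

On this scale LPI values above roughly 5 are conventionally associated with a high likelihood of surface manifestations of liquefaction.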

9. Earthquakes: Risk, Monitoring, Notification, and Research

DTIC Science & Technology

2008-06-19

Washington, Oregon, and Hawaii. The Rocky Mountain region, a portion of the central United States known as the New Madrid Seismic Zone, and portions ... California, Washington, Oregon, and Alaska and Hawaii. Alaska is the most earthquake-prone state, experiencing a magnitude 7 earthquake almost every ... [flattened table fragment of per-city dollar estimates omitted: Oakland, CA; Las Vegas, NV; San Francisco, CA; Anchorage, AK; San Jose, CA; Boston, MA; Orange, CA; Hilo, HI]

10. Earthquakes

MedlinePlus

An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

11. Feasibility study of short-term earthquake prediction using ionospheric anomalies immediately before large earthquakes

Heki, K.; He, L.

2017-12-01

We showed that positive and negative electron density anomalies emerge above a fault immediately before it ruptures: 40/20/10 minutes before Mw 9/8/7 earthquakes (Heki, 2011 GRL; Heki and Enomoto, 2013 JGR; He and Heki, 2017 JGR). These signals are stronger for earthquakes with larger Mw and under higher background vertical TEC (total electron content) (Heki and Enomoto, 2015 JGR). The epicenter and the positive and negative anomalies align along the local geomagnetic field (He and Heki, 2016 GRL), suggesting that electric fields within the ionosphere are responsible for making the anomalies (Kuo et al., 2014 JGR; Kelley et al., 2017 JGR). Here we consider the next Nankai Trough earthquake, which may occur within a few tens of years in southwest Japan, and discuss whether we can recognize its preseismic signatures in TEC by real-time observations with GNSS. During high geomagnetic activity, large-scale traveling ionospheric disturbances (LSTID) often propagate from the auroral ovals toward mid-latitude regions and leave signatures similar to preseismic anomalies. This is a main obstacle to using preseismic TEC changes for practical short-term earthquake prediction. In this presentation, we show that the same anomalies appeared 40 minutes before the mainshock above northern Australia, the geomagnetically conjugate point of the 2011 Tohoku-oki earthquake epicenter. This not only demonstrates that electric fields play a role in making the preseismic TEC anomalies, but also offers a possibility to discriminate preseismic anomalies from those caused by LSTID. By monitoring TEC in the conjugate areas in the two hemispheres, we can recognize anomalies with simultaneous onset as those caused by within-ionosphere electric fields (e.g. preseismic anomalies, night-time MSTID) and anomalies without simultaneous onset as gravity-wave-origin disturbances (e.g. LSTID, daytime MSTID).
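
Detecting a preseismic VTEC departure in real time amounts to comparing the latest samples against a reference level estimated from earlier, presumably undisturbed epochs. The sketch below uses a simple baseline mean rather than the polynomial reference curves fitted in the cited papers; the window lengths and the threshold (in TEC units) are illustrative only.

```python
def tec_departure(vtec, baseline_len=40, exclude=10, thresh=0.5):
    """Departure of the latest vertical-TEC sample from a baseline mean
    computed over earlier epochs, excluding the most recent `exclude`
    samples (which may already contain the anomaly).  Returns the
    departure in TEC units and whether it exceeds the threshold."""
    base = vtec[-(baseline_len + exclude):-exclude]
    mean = sum(base) / len(base)
    departure = vtec[-1] - mean
    return departure, abs(departure) > thresh
```

Running the same test on a conjugate-point station and requiring simultaneous onset, as the abstract proposes, would be one way to screen out LSTID-type false positives.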

12. Application of a time-magnitude prediction model for earthquakes

An, Weiping; Jin, Xueshen; Yang, Jialiang; Dong, Peng; Zhao, Jun; Zhang, He

2007-06-01

In this paper we discuss the physical meaning of the magnitude-time model parameters for earthquake prediction. The gestation process for strong earthquakes in all eleven seismic zones in China can be described by the magnitude-time prediction model by computing the parameters of the model. The average model parameter values for China are b = 0.383, c = 0.154, d = 0.035, B = 0.844, C = -0.209, and D = 0.188. The robustness of the model parameters is estimated from the variation in the minimum magnitude of the transformed data, the spatial extent, and the temporal period. Analysis of the spatial and temporal suitability of the model indicates that the computation unit size should be at least 4° × 4° for seismic zones in North China and at least 3° × 3° in Southwest and Northwest China, and that the time period should be as long as possible.

13. Welcome to Pacific Earthquake Engineering Research Center - PEER

Science.gov Websites


14. A Cooperative Test of the Load/Unload Response Ratio Proposed Method of Earthquake Prediction

Trotta, J. E.; Tullis, T. E.

2004-12-01

15. Real time numerical shake prediction incorporating attenuation structure: a case for the 2016 Kumamoto Earthquake

Ogiso, M.; Hoshiba, M.; Shito, A.; Matsumoto, S.

2016-12-01

Needless to say, heterogeneous attenuation structure is important for ground-motion prediction, including earthquake early warning, i.e., real-time ground-motion prediction. Hoshiba and Ogiso (2015, AGU Fall Meeting) showed that heterogeneous attenuation and scattering structure leads to earlier and more accurate ground-motion prediction in the numerical shake prediction scheme proposed by Hoshiba and Aoki (2015, BSSA). Hoshiba and Ogiso (2015) used an assumed heterogeneous structure; here we discuss its effect in the case of the 2016 Kumamoto Earthquake, using heterogeneous structure estimated from actual observation data. We conducted Multiple Lapse Time Window Analysis (Hoshiba, 1993, JGR) on the seismic stations located in western Japan to estimate heterogeneous attenuation and scattering structure. The characteristics are similar to the previous work of Carcole and Sato (2010, GJI), e.g. strong intrinsic and scattering attenuation around the volcanoes located in central Kyushu, and relatively weak heterogeneities elsewhere. A real-time ground-motion prediction simulation for the 2016 Kumamoto Earthquake was conducted using the numerical shake prediction scheme with 474 strong ground-motion stations. Comparing snapshots of the predicted and observed wavefields showed a tendency toward underprediction around the volcanic area in spite of the heterogeneous structure. These facts indicate the necessity of improving the heterogeneous structure for the numerical shake prediction scheme. In this study, we used the waveforms of Hi-net, K-NET, and KiK-net stations operated by NIED for estimating structure and conducting the ground-motion prediction simulation. Part of this study was supported by the Earthquake Research Institute, the University of Tokyo, cooperative research program and JSPS KAKENHI Grant Number 25282114.

16. Long-term predictability of regions and dates of strong earthquakes

Kubyshen, Alexander; Doda, Leonid; Shopin, Sergey

2016-04-01

Results on the long-term predictability of strong earthquakes are discussed. It is shown that dates of earthquakes with M>5.5 can be determined several months in advance of the event, and the magnitude and region of an approaching earthquake can be specified within a month before the event. Determination of the number of M6+ earthquakes expected to occur during the analyzed year is performed using a special sequence diagram of seismic activity for a century-long time frame; this analysis can be performed 15-20 years in advance and is verified by a monthly sequence diagram of seismic activity. The number of strong earthquakes expected to occur in the analyzed month is determined by several methods having different prediction horizons. Determination of days of potential earthquakes with M5.5+ is performed using astronomical data: earthquakes occur on days of oppositions of Solar System planets (arranged in a single line), and the strongest earthquakes occur when the vector "Sun-Solar System barycenter" lies in the ecliptic plane. Details of this astronomical multivariate indicator still require further research, but its practical significance is confirmed by practice. Another empirical indicator of an approaching M6+ earthquake is a synchronous variation of meteorological parameters: an abrupt decrease in minimum daily temperature, an increase in relative humidity, and an abrupt change in atmospheric pressure (RAMES method). The time difference between the predicted and actual dates is no more than one day. This indicator is registered 104 days before the earthquake, so it has been called Harmonic 104 or H-104. This fact looks paradoxical, but the works of A. Sytinskiy and V. Bokov on the correlation of global atmospheric circulation and seismic events give a physical basis for this empirical fact. Also, 104 days is a quarter of a Chandler period, which gives insight into the correlation between the anomalies of Earth orientation

17. Earthquakes.

ERIC Educational Resources Information Center

Pakiser, Louis C.

One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

18. Thermal IR satellite data application for earthquake research in Pakistan

2018-05-01

The scientific progress in space research indicates earthquake-related processes of surface-temperature growth, gas/aerosol exhalation, and electromagnetic disturbances in the ionosphere prior to seismic activity. Among them, surface-temperature growth calculated from satellite thermal infrared images carries valuable earthquake-precursory information for near and distant earthquakes. Previous studies have concluded that such information can appear a few days before the occurrence of an earthquake. The objective of this study is to use MODIS thermal imagery data for precursory analysis of the Kashmir (Oct 8, 2005; Mw 7.6; 26 km), Ziarat (Oct 28, 2008; Mw 6.4; 13 km) and Dalbandin (Jan 18, 2011; Mw 7.2; 69 km) earthquakes. Our results suggest an evident correlation of Land Surface Temperature (LST) anomalies with seismic activity. In particular, a rise of 3-10 °C in LST is observed 6, 4, and 14 days prior to the Kashmir, Ziarat, and Dalbandin earthquakes, respectively. To further elaborate our findings, we present a comparative and percentile analysis of daily and five-year-averaged LST for a selected time window with respect to the month of earthquake occurrence. Our comparative analyses of daily and five-year-averaged LST show a significant change of 6.5-7.9 °C for the Kashmir, 8.0-8.1 °C for the Ziarat, and 2.7-5.4 °C for the Dalbandin earthquakes. This significant change has high percentile values for the selected events, i.e. 70-100% for Kashmir, 87-100% for Ziarat, and 84-100% for Dalbandin. We expect that such consistent results may help in devising an optimal earthquake forecasting strategy and in mitigating the effects of associated seismic hazards.
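
The comparative analysis described (daily LST against the multi-year average for the same calendar days) can be sketched as a simple differencing step. The 3 °C flagging threshold below is illustrative, chosen from the low end of the 3-10 °C range the study reports; it is not a value taken from the paper.

```python
def lst_anomaly(daily_lst, multi_year_mean, thresh_c=3.0):
    """Per-day LST anomaly: the difference between a day's land-surface
    temperature and the multi-year average for the same calendar day.
    Returns the anomaly series and the indices of days whose positive
    anomaly meets the threshold (degrees Celsius)."""
    delta = [d - m for d, m in zip(daily_lst, multi_year_mean)]
    flagged = [i for i, dt in enumerate(delta) if dt >= thresh_c]
    return delta, flagged
```

A percentile analysis like the paper's would then rank each flagged anomaly against the distribution of anomalies in the reference years.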

19. Earthquakes: Risk, Detection, Warning, and Research

DTIC Science & Technology

2010-01-14

which affect taller, multi-story buildings. Ground motion that affects shorter buildings of a few stories, called short-period seismic waves, is ... places in a single fault, or jump between connected faults. Earthquakes that occur along the Sierra Madre fault in southern California, for example

20. Predictability of Landslide Timing From Quasi-Periodic Precursory Earthquakes

Bell, Andrew F.

2018-02-01

Accelerating rates of geophysical signals are observed before a range of material failure phenomena. They provide insights into the physical processes controlling failure and the basis for failure forecasts. However, examples of accelerating seismicity before landslides are rare, and their behavior and forecasting potential are largely unknown. Here I use a Bayesian methodology to apply a novel gamma point process model to investigate a sequence of quasiperiodic repeating earthquakes preceding a large landslide at Nuugaatsiaq in Greenland in June 2017. The evolution in earthquake rate is best explained by an inverse power law increase with time toward failure, as predicted by material failure theory. However, the commonly accepted power law exponent value of 1.0 is inconsistent with the data. Instead, the mean posterior value of 0.71 indicates a particularly rapid acceleration toward failure and suggests that only relatively short warning times may be possible for similar landslides in future.
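
The inverse power-law acceleration described here, with event rate λ(t) = k (t_f − t)^(−p), can be written down directly. The sketch below evaluates that rate and checks a numerically integrated expected event count against the closed-form integral; k and the time values are arbitrary illustration values, while p = 0.71 is the mean posterior exponent reported in the abstract.

```python
def accel_rate(t, t_f, k=1.0, p=0.71):
    """Event rate lambda(t) = k * (t_f - t)**(-p): an inverse power-law
    acceleration toward the failure time t_f (requires t < t_f)."""
    return k * (t_f - t) ** (-p)

def expected_count(t0, t1, t_f, k=1.0, p=0.71, n=10000):
    """Expected number of precursory events in [t0, t1] by trapezoidal
    integration of the rate (t1 must be strictly before t_f)."""
    dt = (t1 - t0) / n
    total = 0.5 * (accel_rate(t0, t_f, k, p) + accel_rate(t1, t_f, k, p))
    for i in range(1, n):
        total += accel_rate(t0 + i * dt, t_f, k, p)
    return total * dt

def expected_count_exact(t0, t1, t_f, k=1.0, p=0.71):
    """Closed-form integral of the rate for p != 1."""
    c = k / (1.0 - p)
    return c * ((t_f - t0) ** (1.0 - p) - (t_f - t1) ** (1.0 - p))
```

Note that with p < 1 the expected count stays finite as t1 approaches t_f, whereas the classical p = 1 case diverges logarithmically, which is one reason the fitted exponent matters for warning times.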

1. Possibility of Earthquake-prediction by analyzing VLF signals

Ray, Suman; Chakrabarti, Sandip Kumar; Sasmal, Sudipta

2016-07-01

Prediction of seismic events is one of the most challenging jobs for the scientific community. The conventional way to predict earthquakes is to monitor crustal structure movements, though this method has not yet yielded satisfactory results and fails to give any short-term prediction. It has been noticed that prior to a seismic event a huge amount of energy is released, which may create disturbances in the lower part of the D-layer/E-layer of the ionosphere. This ionospheric disturbance may be used as a precursor of earthquakes. Since VLF radio waves propagate inside the waveguide formed by the lower ionosphere and the Earth's surface, these signals may be used to identify ionospheric disturbances due to seismic activity. We have analyzed VLF signals to find correlations, if any, between VLF signal anomalies and seismic activity, using both case-by-case studies and a statistical analysis of a whole year of data. In both methods we found that the night-time amplitude of VLF signals fluctuated anomalously three days before seismic events. We also found that the terminator time of the VLF signals shifted anomalously toward night-time a few days before major seismic events. We calculated the D-layer preparation time and D-layer disappearance time from the VLF signals and observed that both become anomalously high 1-2 days before seismic events. We also found strong evidence indicating that it may be possible in future to predict the location of earthquake epicenters by analyzing VLF signals for multiple propagation paths.

2. Earthquakes: Risk, Monitoring, Notification, and Research

DTIC Science & Technology

2007-02-02

Global Seismic Network (GSN). The GSN is a system of broadband digital seismographs arrayed around the globe and designed to collect high-quality...39 states face some risk from earthquakes. Seismic hazards are greatest in the western United States, particularly California, Alaska, Washington...Oregon, and Hawaii. The Rocky Mountain region, a portion of the central United States known as the New Madrid Seismic Zone, and portions of the eastern

3. Study of Earthquake Disaster Prediction System of Langfang city Based on GIS

Huang, Meng; Zhang, Dian; Li, Pan; Zhang, YunHui; Zhang, RuoFei

2017-07-01

Given China's need to improve its earthquake disaster prevention capability, this paper presents an implementation plan for a GIS-based earthquake disaster prediction system for Langfang city. Built on a GIS spatial database and using coordinate transformation technology, GIS spatial analysis technology, and PHP development technology, the system applies a seismic damage factor algorithm to predict the damage to the city under earthquake disasters of different intensities. The system uses a B/S (browser/server) architecture and provides two-dimensional visualization of damage degree and spatial distribution, comprehensive query and analysis, and efficient decision support, so as to identify seismically weak areas of the city and enable rapid warning. The system has transformed the city's earthquake disaster reduction work from static planning to dynamic management and improved the city's earthquake and disaster prevention capability.

4. CSEP-Japan: The Japanese node of the collaboratory for the study of earthquake predictability

Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

2011-12-01

The Collaboratory for the Study of Earthquake Predictability (CSEP) is a global earthquake predictability research project whose final goal is to probe the intrinsic predictability of the earthquake rupture process through forecast testing experiments. The Earthquake Research Institute of the University of Tokyo joined CSEP and started the Japanese testing center, CSEP-Japan. The testing center provides open access for researchers contributing earthquake forecast models applied to Japan. A total of 91 earthquake forecast models were submitted to the prospective experiment starting 1 November 2009. The models are separated into four testing classes (1 day, 3 months, 1 year, and 3 years) and three testing regions: an area of Japan including offshore regions, the Japanese mainland, and the Kanto district. We evaluate the performance of the models with the official suite of tests defined by CSEP. The 1-day, 3-month, 1-year, and 3-year forecasting classes have completed 92, 4, 1, and 0 rounds (the last now in progress), respectively. The results of the 3-month class yielded new knowledge concerning statistical forecasting models. All models performed well in forecasting magnitudes; on the other hand, the observed spatial distribution was inconsistent with most models in some cases where many earthquakes occurred at the same spot. Throughout the experiment, it has become clear that some of CSEP's evaluation tests, such as the L-test, correlate strongly with the N-test. We are now building our own (cyber-)infrastructure to support the forecast experiment. Japanese seismicity has changed since the 2011 Tohoku earthquake, and a third call for forecasting models was announced in order to promote model improvement for forecasting earthquakes after this event; we provide the Japanese seismicity catalog maintained by JMA for modelers to study how seismicity
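The N-test mentioned above checks whether the number of observed target earthquakes is consistent with a forecast's expected count. A minimal sketch under the standard Poisson counting assumption (the full CSEP test suite has more machinery; the numbers here are illustrative):

```python
import math

def n_test(n_forecast, n_observed):
    """CSEP-style N-test: under a Poisson assumption with expectation
    n_forecast, return delta1 = P(N >= n_observed) and
    delta2 = P(N <= n_observed). Small delta1 means the forecast badly
    underpredicts the observed count; small delta2 means it overpredicts."""
    pmf = lambda i: math.exp(-n_forecast) * n_forecast ** i / math.factorial(i)
    cdf = lambda k: sum(pmf(i) for i in range(k + 1))
    delta1 = 1.0 - cdf(n_observed - 1) if n_observed > 0 else 1.0
    delta2 = cdf(n_observed)
    return delta1, delta2

d1, d2 = n_test(10.0, 10)    # observed count matches the forecast
d1_hi, _ = n_test(10.0, 25)  # far more events than forecast: tiny delta1
```

A forecast is typically rejected at a chosen significance level when either tail probability falls below it, which is why an observed count far above the forecast drives delta1 toward zero.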

5. Recent Achievements of the Collaboratory for the Study of Earthquake Predictability

Jackson, D. D.; Liukis, M.; Werner, M. J.; Schorlemmer, D.; Yu, J.; Maechling, P. J.; Zechar, J. D.; Jordan, T. H.

2015-12-01

Maria Liukis, SCEC, USC; Maximilian Werner, University of Bristol; Danijel Schorlemmer, GFZ Potsdam; John Yu, SCEC, USC; Philip Maechling, SCEC, USC; Jeremy Zechar, Swiss Seismological Service, ETH; Thomas H. Jordan, SCEC, USC, and the CSEP Working Group

The Collaboratory for the Study of Earthquake Predictability (CSEP) supports a global program to conduct prospective earthquake forecasting experiments. CSEP testing centers are now operational in California, New Zealand, Japan, China, and Europe with 435 models under evaluation. The California testing center, operated by SCEC, has been operational since Sept 1, 2007, and currently hosts 30-minute, 1-day, 3-month, 1-year and 5-year forecasts, both alarm-based and probabilistic, for California, the Western Pacific, and worldwide. We have reduced testing latency, implemented prototype evaluation of M8 forecasts, and are currently developing formats and procedures to evaluate externally-hosted forecasts and predictions. These efforts are related to CSEP support of the USGS program in operational earthquake forecasting and a DHS project to register and test external forecast procedures from experts outside seismology. A retrospective experiment for the 2010-2012 Canterbury earthquake sequence has been completed, and the results indicate that some physics-based and hybrid models outperform purely statistical (e.g., ETAS) models. The experiment also demonstrates the power of the CSEP cyberinfrastructure for retrospective testing. Our current development includes evaluation strategies that increase computational efficiency for high-resolution global experiments, such as the evaluation of the Global Earthquake Activity Rate (GEAR) model. We describe the open-source CSEP software that is available to researchers as they develop their forecast models (http://northridge.usc.edu/trac/csep/wiki/MiniCSEP). We also discuss applications of CSEP infrastructure to geodetic transient detection and how CSEP procedures are being

6. Space-Time Earthquake Prediction: The Error Diagrams

Molchan, G.

2010-08-01

The quality of earthquake prediction is usually characterized by a two-dimensional diagram of n versus τ, where n is the rate of failures-to-predict and τ is a characteristic of the space-time alarm. Unlike the time-prediction case, the quantity τ is not defined uniquely. We start from the case in which τ is a vector with components related to the local alarm times, and find a simple structure of the space-time diagram in terms of local time diagrams. This key result is used to analyze the usual two-dimensional error sets {n, τ_w}, in which τ_w is a weighted mean of the τ components and w is the weight vector. We suggest a simple algorithm to find the (n, τ_w) representation of all random guess strategies, the set D, and prove that there exists a unique choice of w for which D degenerates to the diagonal n + τ_w = 1. We also find a confidence zone of D on the (n, τ_w) plane for the case when the local target rates are known only roughly. These facts are important for the correct interpretation of (n, τ_w) diagrams when discussing the prediction capability of the data or of prediction methods.
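How a single (τ, n) error-diagram point is computed can be sketched with a thresholded alarm function over discretized space-time cells. This is an illustrative construction, not Molchan's analysis of the set D; the data are synthetic.

```python
import numpy as np

def error_diagram(alarm_score, event_cells, thresholds):
    """For each threshold, declare an alarm on cells scoring >= threshold,
    then record tau (fraction of space-time covered by the alarm) and
    nu (fraction of target events missed, i.e. outside the alarm set)."""
    alarm_score = np.asarray(alarm_score, float)
    points = []
    for th in thresholds:
        alarm = alarm_score >= th
        tau = alarm.mean()
        nu = 1.0 - alarm[event_cells].mean()
        points.append((tau, nu))
    return points

# A perfect alarm function: high score exactly on the cells holding events.
scores = np.zeros(100)
events = [3, 7, 42]
scores[events] = 1.0
pts = error_diagram(scores, events, thresholds=[0.5, 0.0])
```

At threshold 0.5 the perfect predictor misses nothing while alarming on 3% of cells; at threshold 0.0 the alarm covers everything (τ = 1, n = 0), one endpoint of the random-guess diagonal n + τ = 1 discussed in the abstract.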

7. Testing for the 'predictability' of dynamically triggered earthquakes in The Geysers geothermal field

Aiken, Chastity; Meng, Xiaofeng; Hardebeck, Jeanne

2018-03-01

The Geysers geothermal field is well known for being susceptible to dynamic triggering of earthquakes by large distant earthquakes, owing to the introduction of fluids for energy production. Yet it is unknown whether dynamic triggering of earthquakes is 'predictable' or whether dynamic triggering could lead to a potential hazard for energy production. In this paper, our goal is to investigate the characteristics of triggering and the physical conditions that promote triggering, in order to determine whether or not triggering is in any way foreseeable. We find that, at present, triggering at The Geysers is not easily 'predictable' in terms of when and where based on observable physical conditions. However, triggered earthquake magnitude positively correlates with peak imparted dynamic stress, and larger dynamic stresses tend to trigger sequences similar to mainshock-aftershock sequences. Thus, we may be able to 'predict' what size earthquakes to expect at The Geysers following a large distant earthquake.

8. Study on China’s Earthquake Prediction by Mathematical Analysis and its Application in Catastrophe Insurance

Jianjun, X.; Bingjie, Y.; Rongji, W.

2018-03-01

The purpose of this paper is to help improve the level of catastrophe insurance in China. First, earthquake predictions were carried out using a mathematical analysis method. Second, foreign catastrophe insurance policies and models were compared. Third, suggestions for catastrophe insurance in China were discussed. Further study should pay more attention to earthquake prediction by introducing big data.

9. NGA West 2 | Pacific Earthquake Engineering Research Center

Science.gov Websites

A multi-year research program to improve Next Generation Attenuation (NGA) models for earthquake engineering in active tectonic regions, including modeling of directivity and directionality; verification of NGA-West models' epistemic uncertainty; and evaluation of soil amplification factors in NGA models versus NEHRP site factors.

10. Are Earthquakes Predictable? A Study on Magnitude Correlations in Earthquake Catalog and Experimental Data

Stavrianaki, K.; Ross, G.; Sammonds, P. R.

2015-12-01

The clustering of earthquakes in time and space is widely accepted; however, the existence of correlations in earthquake magnitudes is more questionable. In standard models of seismic activity, it is usually assumed that magnitudes are independent and therefore in principle unpredictable. Our work seeks to test this assumption by analysing magnitude correlations between earthquakes and their aftershocks. To separate mainshocks from aftershocks, we perform stochastic declustering based on the widely used Epidemic Type Aftershock Sequence (ETAS) model, which allows us to compare the average magnitudes of aftershock sequences to those of their mainshocks. The results of the earthquake magnitude correlations were compared with acoustic emissions (AE) from laboratory analog experiments, as fracturing generates both AE at the laboratory scale and earthquakes on a crustal scale. Constant stress and constant strain rate experiments were performed on Darley Dale sandstone under confining pressure to simulate depth of burial. Microcracking activity inside the rock volume was analyzed with the AE technique as a proxy for earthquakes. Applying the ETAS model to the experimental data allowed us to validate our results and provide, for the first time, a holistic view of the correlation of earthquake magnitudes. Additionally, we investigate the relationship between the conditional intensity estimates of the ETAS model and the earthquake magnitudes; a positive relation would suggest the existence of magnitude correlations. The aim of this study is to observe any trends of dependency between the magnitudes of aftershock earthquakes and the earthquakes that trigger them.
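The ETAS conditional intensity used for the declustering above has a standard parametric form. A minimal sketch follows; the parameter values are illustrative defaults, not values fitted to the catalogs or experiments described.

```python
import numpy as np

def etas_intensity(t, times, mags,
                   mu=0.2, K=0.02, alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """ETAS conditional intensity at time t:
    lambda(t) = mu + sum over past events t_i < t of
                K * exp(alpha*(m_i - m0)) * (t - t_i + c)^(-p),
    i.e. a constant background rate plus Omori-law aftershock triggering
    scaled exponentially by each past event's magnitude above m0."""
    times = np.asarray(times, float)
    mags = np.asarray(mags, float)
    past = times < t
    trig = K * np.exp(alpha * (mags[past] - m0)) * (t - times[past] + c) ** (-p)
    return mu + trig.sum()

background = etas_intensity(1.0, [], [])       # no past events: just mu
after_m5 = etas_intensity(1.01, [1.0], [5.0])  # shortly after an M5 event
```

In stochastic declustering, the ratio of the background term mu to the full intensity at each event's time gives the probability that the event is a mainshock rather than a triggered aftershock.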

11. Construction of Source Model of Huge Subduction Earthquakes for Strong Ground Motion Prediction

Iwata, T.; Asano, K.; Kubo, H.

2013-12-01

Constructing source models of huge subduction earthquakes is a critically important issue for strong ground motion prediction. Iwata and Asano (2012, AGU) summarized the scaling relationships of the large-slip areas of heterogeneous slip models and total SMGA sizes with respect to seismic moment for subduction earthquakes, and found a systematic change with seismic moment in the ratio of SMGA to large-slip area. They concluded that this tendency would be caused by the difference in the period ranges of the source modeling analyses. In this paper, we try to establish a methodology for constructing source models of huge subduction earthquakes for strong ground motion prediction. Following the concept of the characterized source model for inland crustal earthquakes (Irikura and Miyake, 2001; 2011) and intra-slab earthquakes (Iwata and Asano, 2011), we introduce a prototype of the source model for huge subduction earthquakes and validate it by strong ground motion modeling.

12. The Virtual Quake Earthquake Simulator: Earthquake Probability Statistics for the El Mayor-Cucapah Region and Evidence of Predictability in Simulated Earthquake Sequences

Schultz, K.; Yoder, M. R.; Heien, E. M.; Rundle, J. B.; Turcotte, D. L.; Parker, J. W.; Donnellan, A.

2015-12-01

We introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert-based forecasting metric similar to those presented in Keilis-Borok (2002) and Molchan (1997), and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation-type versus quiescent-type earthquake triggering. We show that VQ exhibits both behaviors separately for independent fault sections; some fault sections exhibit activation-type triggering, while others are better characterized by quiescent-type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA, and northern Baja California, Mexico.

13. VAN method of short-term earthquake prediction shows promise

Uyeda, Seiya

Although optimism prevailed in the 1970s, the present consensus on earthquake prediction appears to be quite pessimistic. However, short-term prediction based on geoelectric potential monitoring has stood the test of time in Greece for more than a decade [Varotsos and Kulhanek, 1993; Lighthill, 1996]. The method used is called the VAN method. The geoelectric potential changes constantly due to causes such as magnetotelluric effects, lightning, rainfall, leakage from manmade sources, and electrochemical instabilities of electrodes. All of this noise must be eliminated before preseismic signals are identified, if they exist at all. The VAN group apparently accomplished this task for the first time. They installed multiple short (100-200 m) dipoles with different lengths in both north-south and east-west directions, and long (1-10 km) dipoles in appropriate orientations, at their stations (one of their mega-stations, Ioannina, for example, now has 137 dipoles in operation) and found that practically all of the noise could be eliminated by applying a set of criteria to the data.

14. Earthquake!

ERIC Educational Resources Information Center

Hernandez, Hildo

2000-01-01

Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and discusses the lessons learned in handling this emergency. The problem of loose asbestos is addressed. (GR)

15. INVESTIGATIVE RESEARCH PROJECTS RELATED TO THE TOHOKU EARTHQUAKE (THE GREAT EAST JAPAN EARTHQUAKE) CONDUCTED IN FUKUSHIMA

PubMed Central

YAMAMOTO, TOSHIYUKI; HASHIMOTO, YASUHIRO; YOSHIDA, MASAYUKI; OHNO, KIKUO; OHTO, HITOSHI; ABE, MASAFUMI

2015-01-01

ABSTRACT Backgrounds: On March 11th 2011, the Tohoku region of Japan was struck by catastrophic disasters. Thousands of people were killed due to a magnitude 9.0 earthquake and its subsequent tsunami. Furthermore, a serious nuclear crisis occurred in Fukushima Prefecture as a result of the disasters, and an emergency evacuation was ordered to people living near the nuclear power plants. There was a lot of anxiety regarding lost families as well as the influences of radioactivity on the health of people and their children. Based on these urgent and uncertain situations, a number of research projects were developed at many institutes both inside and outside Fukushima. Methods: We herein report the investigative research projects related to the Tohoku Earthquake (The Great East Japan Earthquake) conducted after the disasters. The research projects were reviewed by the Institutional Review Board in Fukushima Medical University during the two years following the disasters. The research projects conducted in universities other than Fukushima Medical University were also examined using questionnaire analysis. Results: Among the research projects conducted in Fukushima Medical University (n=424), 7% (n=32) were disaster-related investigative research. The mean duration planned to pursue the projects was 25.5 months. Among these projects, those focusing on the health of Fukushima citizens were most common (n=9), followed by the influence of chronic exposure of radiation on chronic inflammatory disorders (n=6), and the mental health of Fukushima citizens (n=5). They were carefully reviewed for the purpose, suitability, and necessity from ethical as well as scientific viewpoints. The majority of the research projects focused on the effects of the Tohoku Earthquake and/or chronic exposure to low-dose radioactivity on the health of children and pregnant women, as well as on various disorders, such as mental health and chronic inflammatory diseases. On the other hand, among 58

16. INVESTIGATIVE RESEARCH PROJECTS RELATED TO THE TOHOKU EARTHQUAKE (THE GREAT EAST JAPAN EARTHQUAKE) CONDUCTED IN FUKUSHIMA.

PubMed

Yamamoto, Toshiyuki; Hashimoto, Yasuhiro; Yoshida, Masayuki; Ohno, Kikuo; Ohto, Hitoshi; Abe, Masafumi

2015-01-01

On March 11th 2011, the Tohoku region of Japan was struck by catastrophic disasters. Thousands of people were killed due to a magnitude 9.0 earthquake and its subsequent tsunami. Furthermore, a serious nuclear crisis occurred in Fukushima Prefecture as a result of the disasters, and an emergency evacuation was ordered to people living near the nuclear power plants. There was a lot of anxiety regarding lost families as well as the influences of radioactivity on the health of people and their children. Based on these urgent and uncertain situations, a number of research projects were developed at many institutes both inside and outside Fukushima. We herein report the investigative research projects related to the Tohoku Earthquake (The Great East Japan Earthquake) conducted after the disasters. The research projects were reviewed by the Institutional Review Board in Fukushima Medical University during the two years following the disasters. The research projects conducted in universities other than Fukushima Medical University were also examined using questionnaire analysis. Among the research projects conducted in Fukushima Medical University (n=424), 7% (n=32) were disaster-related investigative research. The mean duration planned to pursue the projects was 25.5 months. Among these projects, those focusing on the health of Fukushima citizens were most common (n=9), followed by the influence of chronic exposure of radiation on chronic inflammatory disorders (n=6), and the mental health of Fukushima citizens (n=5). They were carefully reviewed for the purpose, suitability, and necessity from ethical as well as scientific viewpoints. The majority of the research projects focused on the effects of the Tohoku Earthquake and/or chronic exposure to low-dose radioactivity on the health of children and pregnant women, as well as on various disorders, such as mental health and chronic inflammatory diseases. On the other hand, among 58 projects we collected from 22

17. Events | Pacific Earthquake Engineering Research Center

Science.gov Websites

Calendar of PEER and other events; PEER Events Archive.

18. Predicting earthquake effects—Learning from Northridge and Loma Prieta

USGS Publications Warehouse

Holzer, Thomas L.

1994-01-01

The continental United States has been rocked by two particularly damaging earthquakes in the last 4.5 years, Loma Prieta in northern California in 1989 and Northridge in southern California in 1994. Combined losses from these two earthquakes approached $30 billion. Approximately half these losses were reimbursed by the federal government. Because large earthquakes typically overwhelm state resources and place unplanned burdens on the federal government, it is important to learn from these earthquakes how to reduce future losses. My purpose here is to explore a potential implication of the Northridge and Loma Prieta earthquakes for hazard-mitigation strategies: earth scientists should increase their efforts to map hazardous areas within urban regions. 19. U.S.-Japan Quake Prediction Research NASA Astrophysics Data System (ADS) Kisslinger, Carl; Mikumo, Takeshi; Kanamori, Hiroo For the seventh time since 1964, a seminar on earthquake prediction has been convened under the U.S.-Japan Cooperation in Science Program. The purpose of the seminar was to provide an opportunity for researchers from the two countries to share recent progress and future plans in the continuing effort to develop the scientific basis for predicting earthquakes and practical means for implementing prediction technology as it emerges. Thirty-six contributors, 15 from Japan and 21 from the U.S., met in Morro Bay, Calif.September 12-14. The following day they traveled to nearby sections of the San Andreas fault, including the site of the Parkfield prediction experiment. The conveners of the seminar were Hiroo Kanamori, Seismological Laboratory, California Institute of Technology (Caltech), for the U.S., and Takeshi Mikumo, Disaster Prevention Research Institute, Kyoto University, for Japan . Funding for the participants came from the U.S. National Science Foundation and the Japan Society forthe Promotion of Science, supplemented by other agencies in both countries. 20. 
Earthquake mechanism and predictability shown by a laboratory fault USGS Publications Warehouse King, C.-Y. 1994-01-01 Slip events generated in a laboratory fault model consisting of a circulinear chain of eight spring-connected blocks of approximately equal weight elastically driven to slide on a frictional surface are studied. It is found that most of the input strain energy is released by a relatively few large events, which are approximately time predictable. A large event tends to roughen stress distribution along the fault, whereas the subsequent smaller events tend to smooth the stress distribution and prepare a condition of simultaneous criticality for the occurrence of the next large event. The frequency-size distribution resembles the Gutenberg-Richter relation for earthquakes, except for a falloff for the largest events due to the finite energy-storage capacity of the fault system. Slip distributions, in different events are commonly dissimilar. Stress drop, slip velocity, and rupture velocity all tend to increase with event size. Rupture-initiation locations are usually not close to the maximum-slip locations. ?? 1994 Birkha??user Verlag. 1. Predicting earthquakes by analyzing accelerating precursory seismic activity USGS Publications Warehouse Varnes, D.J. 1989-01-01 During 11 sequences of earthquakes that in retrospect can be classed as foreshocks, the accelerating rate at which seismic moment is released follows, at least in part, a simple equation. This equation (1) is {Mathematical expression},where {Mathematical expression} is the cumulative sum until time, t, of the square roots of seismic moments of individual foreshocks computed from reported magnitudes;C and n are constants; and tfis a limiting time at which the rate of seismic moment accumulation becomes infinite. 
The possible time of a major foreshock or main shock, tf,is found by the best fit of equation (1), or its integral, to step-like plots of {Mathematical expression} versus time using successive estimates of tfin linearized regressions until the maximum coefficient of determination, r2,is obtained. Analyzed examples include sequences preceding earthquakes at Cremasta, Greece, 2/5/66; Haicheng, China 2/4/75; Oaxaca, Mexico, 11/29/78; Petatlan, Mexico, 3/14/79; and Central Chile, 3/3/85. In 29 estimates of main-shock time, made as the sequences developed, the errors in 20 were less than one-half and in 9 less than one tenth the time remaining between the time of the last data used and the main shock. Some precursory sequences, or parts of them, yield no solution. Two sequences appear to include in their first parts the aftershocks of a previous event; plots using the integral of equation (1) show that the sequences are easily separable into aftershock and foreshock segments. Synthetic seismic sequences of shocks at equal time intervals were constructed to follow equation (1), using four values of n. In each series the resulting distributions of magnitudes closely follow the linear Gutenberg-Richter relation log N=a-bM, and the product n times b for each series is the same constant. In various forms and for decades, equation (1) has been used successfully to predict failure times of stressed metals and ceramics, landslides in soil and rock slopes, and volcanic 2. A numerical simulation strategy on occupant evacuation behaviors and casualty prediction in a building during earthquakes NASA Astrophysics Data System (ADS) Li, Shuang; Yu, Xiaohui; Zhang, Yanjuan; Zhai, Changhai 2018-01-01 Casualty prediction in a building during earthquakes benefits to implement the economic loss estimation in the performance-based earthquake engineering methodology. 
Although after-earthquake observations reveal that the evacuation has effects on the quantity of occupant casualties during earthquakes, few current studies consider occupant movements in the building in casualty prediction procedures. To bridge this knowledge gap, a numerical simulation method using refined cellular automata model is presented, which can describe various occupant dynamic behaviors and building dimensions. The simulation on the occupant evacuation is verified by a recorded evacuation process from a school classroom in real-life 2013 Ya'an earthquake in China. The occupant casualties in the building under earthquakes are evaluated by coupling the building collapse process simulation by finite element method, the occupant evacuation simulation, and the casualty occurrence criteria with time and space synchronization. A case study of casualty prediction in a building during an earthquake is provided to demonstrate the effect of occupant movements on casualty prediction. 3. Probabilistic Tsunami Hazard Assessment along Nankai Trough (1) An assessment based on the information of the forthcoming earthquake that Earthquake Research Committee(2013) evaluated NASA Astrophysics Data System (ADS) Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N. 2015-12-01 The Earthquake Research Committee(ERC)/HERP, Government of Japan (2013) revised their long-term evaluation of the forthcoming large earthquake along the Nankai Trough; the next earthquake is estimated M8 to 9 class, and the probability (P30) that the next earthquake will occur within the next 30 years (from Jan. 1, 2013) is 60% to 70%. 
In this study, we assess tsunami hazards (maximum coastal tsunami heights) in the near future, in terms of a probabilistic approach, from the next earthquake along Nankai Trough, on the basis of ERC(2013)'s report. The probabilistic tsunami hazard assessment that we applied is as follows; (1) Characterized earthquake fault models (CEFMs) are constructed on each of the 15 hypothetical source areas (HSA) that ERC(2013) showed. The characterization rule follows Toyama et al.(2015, JpGU). As results, we obtained total of 1441 CEFMs. (2) We calculate tsunamis due to CEFMs by solving nonlinear, finite-amplitude, long-wave equations with advection and bottom friction terms by finite-difference method. Run-up computation on land is included. (3) A time predictable model predicts the recurrent interval of the present seismic cycle is T=88.2 years (ERC,2013). We fix P30 = 67% by applying the renewal process based on BPT distribution with T and alpha=0.24 as its aperiodicity. (4) We divide the probability P30 into P30(i) for i-th subgroup consisting of the earthquakes occurring in each of 15 HSA by following a probability re-distribution concept (ERC,2014). Then each earthquake (CEFM) in i-th subgroup is assigned a probability P30(i)/N where N is the number of CEFMs in each sub-group. Note that such re-distribution concept of the probability is nothing but tentative because the present seismology cannot give deep knowledge enough to do it. Epistemic logic-tree approach may be required in future. (5) We synthesize a number of tsunami hazard curves at every evaluation points on coasts by integrating the information about 30 years occurrence 4. Empirical models for the prediction of ground motion duration for intraplate earthquakes NASA Astrophysics Data System (ADS) Anbazhagan, P.; Neaz Sheikh, M.; Bajaj, Ketan; Mariya Dayana, P. J.; Madhura, H.; Reddy, G. R. 
2017-07-01 Many empirical relationships for the earthquake ground motion duration were developed for interplate region, whereas only a very limited number of empirical relationships exist for intraplate region. Also, the existing relationships were developed based mostly on the scaled recorded interplate earthquakes to represent intraplate earthquakes. To the author's knowledge, none of the existing relationships for the intraplate regions were developed using only the data from intraplate regions. Therefore, an attempt is made in this study to develop empirical predictive relationships of earthquake ground motion duration (i.e., significant and bracketed) with earthquake magnitude, hypocentral distance, and site conditions (i.e., rock and soil sites) using the data compiled from intraplate regions of Canada, Australia, Peninsular India, and the central and southern parts of the USA. The compiled earthquake ground motion data consists of 600 records with moment magnitudes ranging from 3.0 to 6.5 and hypocentral distances ranging from 4 to 1000 km. The non-linear mixed-effect (NLMEs) and logistic regression techniques (to account for zero duration) were used to fit predictive models to the duration data. The bracketed duration was found to be decreased with an increase in the hypocentral distance and increased with an increase in the magnitude of the earthquake. The significant duration was found to be increased with the increase in the magnitude and hypocentral distance of the earthquake. Both significant and bracketed durations were predicted higher in rock sites than in soil sites. The predictive relationships developed herein are compared with the existing relationships for interplate and intraplate regions. The developed relationship for bracketed duration predicts lower durations for rock and soil sites. 
However, the developed relationship for a significant duration predicts lower durations up to a certain distance and thereafter predicts higher durations compared to the 5. Predicting the Maximum Earthquake Magnitude from Seismic Data in Israel and Its Neighboring Countries PubMed Central 2016-01-01 This paper explores several data mining and time series analysis methods for predicting the magnitude of the largest seismic event in the next year based on the previously recorded seismic events in the same region. The methods are evaluated on a catalog of 9,042 earthquake events, which took place between 01/01/1983 and 31/12/2010 in the area of Israel and its neighboring countries. The data was obtained from the Geophysical Institute of Israel. Each earthquake record in the catalog is associated with one of 33 seismic regions. The data was cleaned by removing foreshocks and aftershocks. In our study, we have focused on ten most active regions, which account for more than 80% of the total number of earthquakes in the area. The goal is to predict whether the maximum earthquake magnitude in the following year will exceed the median of maximum yearly magnitudes in the same region. Since the analyzed catalog includes only 28 years of complete data, the last five annual records of each region (referring to the years 2006–2010) are kept for testing while using the previous annual records for training. The predictive features are based on the Gutenberg-Richter Ratio as well as on some new seismic indicators based on the moving averages of the number of earthquakes in each area. The new predictive features prove to be much more useful than the indicators traditionally used in the earthquake prediction literature. The most accurate result (AUC = 0.698) is reached by the Multi-Objective Info-Fuzzy Network (M-IFN) algorithm, which takes into account the association between two target variables: the number of earthquakes and the maximum earthquake magnitude during the same year. 
PMID:26812351

6. Predicting the Maximum Earthquake Magnitude from Seismic Data in Israel and Its Neighboring Countries

PubMed

Last, Mark; Rabinowitz, Nitzan; Leonard, Gideon

2016-01-01

7.
Moment Magnitudes and Local Magnitudes for Small Earthquakes: Implications for Ground-Motion Prediction and b-values

NASA Astrophysics Data System (ADS)

Baltay, A.; Hanks, T. C.; Vernon, F.

2016-12-01

We illustrate two essential consequences of the systematic difference between moment magnitude and local magnitude for small earthquakes, illuminating the underlying earthquake physics. Moment magnitude, M ∝ (2/3) log M0, is uniformly valid for all earthquake sizes [Hanks and Kanamori, 1979]. However, the relationship between local magnitude ML and moment is itself magnitude dependent. For moderate events, 3 < M < 7, M and ML are coincident; for earthquakes smaller than M3, ML ∝ log M0 [Hanks and Boore, 1984]. This is a consequence of the saturation of the apparent corner frequency fc as it becomes greater than the largest observable frequency, fmax; in this regime, stress drop no longer controls ground motion. This implies that ML and M differ by a factor of 1.5 for these small events. While this idea is not new, its implications are important as more small-magnitude data are incorporated into earthquake hazard research. With a large dataset of M < 3 earthquakes recorded on the ANZA network, we demonstrate striking consequences of the difference between M and ML. ML scales as the log of peak ground motion (e.g., PGA or PGV) for these small earthquakes, which yields log PGA ∝ log M0 [Boore, 1986]. We plot nearly 15,000 records of PGA and PGV at close stations, adjusted for site conditions and for geometrical spreading to 10 km. The slope of the log of ground motion is 1.0*ML, or 1.5*M, confirming the relationship and that fc >> fmax. Just as importantly, if this relation is overlooked, prediction of large-magnitude ground motion from small earthquakes will be misguided. We also consider the effect of this magnitude-scale difference on b-value. The oft-cited b-value of 1 should hold for small magnitudes, given M.
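The factor-of-1.5 argument can be checked numerically with a synthetic Gutenberg-Richter catalog and Aki's maximum-likelihood b-value estimator, b = log10(e) / (mean(M) - Mc). The catalog below is simulated, not ANZA data, and the offset c in the ML conversion is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
b_true, Mc = 1.0, 2.0

# Gutenberg-Richter magnitudes above completeness Mc (exponential tail),
# drawn by inverse-CDF sampling.
u = rng.random(100_000)
M = Mc - np.log10(u) / b_true           # moment magnitudes, b = 1

def aki_b(mags, m_min):
    """Aki (1965) maximum-likelihood b-value estimate."""
    return np.log10(np.e) / (np.mean(mags) - m_min)

b_M = aki_b(M, Mc)                      # close to 1.0

# For small events ML grows 1.5x faster with log M0 than M does, so
# (up to a constant c) ML = 1.5 * M + c; re-estimating b on the ML
# scale divides it by 1.5, giving b = 2/3.
c = -1.0                                # arbitrary illustrative offset
ML = 1.5 * M + c
b_ML = aki_b(ML, 1.5 * Mc + c)          # close to 2/3
```

The rescaling stretches the magnitude axis by 1.5, so the exponential decay rate, and hence the apparent b-value, shrinks by the same factor.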
Use of ML necessitates b = 2/3 for the same data set; use of mixed, or unknown, magnitudes complicates the matter further. This is of particular import when estimating the rate of large earthquakes from limited data on their recurrence, as is the case for

8. The Parkfield earthquake prediction of October 1992; the emergency services response

USGS Publications Warehouse

Andrews, R.

1992-01-01

The science of earthquake prediction is interesting and worthy of support. In many respects the ultimate payoff of earthquake prediction or earthquake forecasting is how the information can be used to enhance public safety and public preparedness. This is a particularly important issue here in California, where we have such a high level of seismic risk, historically and currently, as a consequence of activity in 1989 in the San Francisco Bay Area, in Humboldt County in April of this year (1992), and in southern California in the Landers-Big Bear area in late June of this year (1992). We are currently very concerned about the possibility of one or more major earthquakes happening close to one of our metropolitan areas. Within that context, the Parkfield experiment becomes very important.

9. Four Examples of Short-Term and Imminent Prediction of Earthquakes

NASA Astrophysics Data System (ADS)

Zeng, Zuoxun; Liu, Genshen; Wu, Dabin; Sibgatulin, Victor

2014-05-01

We show here four examples of short-term and imminent prediction of earthquakes in China last year.
They are the Nima earthquake (Ms 5.2), the Minxian earthquake (Ms 6.6), the Nantou earthquake (Ms 6.7), and the Dujiangyan earthquake (Ms 4.1). Imminent prediction of the Nima earthquake (Ms 5.2): Based on a comprehensive analysis of the prediction by Victor Sibgatulin using natural electromagnetic pulse anomalies, the prediction by Song Song and Song Kefu using observation of a precursory halo, and his own observation of the locations of degasification of the earth in Naqu, Tibet, the first author predicted an earthquake of around Ms 6 within 10 days in the area of the degasification point (31.5 N, 89.0 E) at 0:54 on May 8th, 2013. He supplied another degasification point (31 N, 86 E) for the epicenter prediction at 8:34 of the same day. At 18:54:30 on May 15th, 2013, an earthquake of Ms 5.2 occurred in Nima County, Naqu, China. Imminent prediction of the Minxian earthquake (Ms 6.6): At 7:45 on July 22nd, 2013, an earthquake of magnitude Ms 6.6 occurred at the border between Minxian and Zhangxian of Dingxi City (34.5 N, 104.2 E), Gansu Province. We review the imminent prediction process and its basis for this earthquake using the fingerprint method. Anomalous component-time curves for 9 or 15 channels can be output from the SW monitor for earthquake precursors. These components include geomagnetism, geoelectricity, crustal stresses, resonance, and crustal inclination. When we compress the time axis, the output curves become different geometric images. The precursor images differ for earthquakes in different regions, while alike or similar images correspond to earthquakes in a certain region. According to seven years of observation of the precursor images and their corresponding earthquakes, we usually obtain the fingerprint six days before the corresponding earthquake. The magnitude prediction requires comparison between the amplitudes of the fingerprints from the same

10.
Applications of the gambling score in evaluating earthquake predictions and forecasts

NASA Astrophysics Data System (ADS)

Zhuang, Jiancang; Zechar, Jeremy D.; Jiang, Changsheng; Console, Rodolfo; Murru, Maura; Falcone, Giuseppe

2010-05-01

This study presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular scheme of forecasts and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some points of his reputation. The reference model, which plays the role of the house, determines how many reputation points the forecaster can gain if he succeeds, according to a fair rule, and takes away the reputation points bet by the forecaster if he loses. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. For discrete predictions, we apply this method to evaluate the performance of Shebalin's predictions, made using the Reverse Tracing of Precursors (RTP) algorithm, and of the predictions from the Annual Consultation Meeting on Earthquake Tendency held by the China Earthquake Administration. For the continuous case, we use it to compare probability forecasts of seismicity in the Abruzzo region before and after the L'Aquila earthquake based on the ETAS model and the PPE model.

11.
The susceptibility analysis of landslides induced by earthquake in Aso volcanic area, Japan, scoping the prediction

NASA Astrophysics Data System (ADS)

Kubota, Tetsuya; Takeda, Tsuyoshi

2017-04-01

The Kumamoto earthquake of April 16th, 2016, in Kumamoto Prefecture, Kyushu Island, Japan, with an intense seismic scale of M7.3 (maximum acceleration of 1316 gal in the Aso volcanic region), yielded countless landslides and debris flows that caused serious damage and casualties in the area, especially in the Aso volcanic mountain range. Hence, field investigation and numerical slope stability analyses were conducted to delve into the characteristics, or the prediction factors, of the landslides induced by this earthquake. For the numerical analysis, the Finite Element Method (FEM) and CSSDP (Critical Slip Surface analysis by Dynamic Programming theory, based on the limit equilibrium method) were applied to landslide slopes for which seismic accelerations were observed. These numerical methods can automatically detect the landslide slip surface that has the minimum factor of safety Fs. The results and information obtained through this investigation and analysis were integrated to predict the landslide-susceptible slopes in volcanic areas induced by earthquakes and the rainfalls of their aftermath, considering geologic-geomorphologic features, geotechnical characteristics of the landslides, and vegetation effects on slope stability. Based on the FEM and CSSDP results, the landslides that occurred in this earthquake on mild-gradient slopes on ridges have safety factors of approximately Fs = 2.20 without rainfall or earthquake (Fs >= 1.0 corresponds to a stable slope without landsliding) and Fs = 1.78-2.10 with the most severe rainfall in the past, while they have approximately Fs = 0.40 under the seismic forces of this earthquake (818 gal horizontal and -320 gal vertical, as observed in the earthquake).
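The stability contrast reported above (Fs well above 1 statically, Fs below 1 under strong shaking) can be illustrated with a simple pseudo-static limit-equilibrium block model. This is a generic sketch with invented slope parameters, not the FEM or CSSDP analysis of the study:

```python
import math

def pseudo_static_fs(W, L, beta_deg, c, phi_deg, kh=0.0):
    """Factor of safety of a rigid block on a planar slip surface.

    W       : block weight per unit width (kN/m)
    L       : slip-surface length (m)
    beta_deg: slip-surface inclination (degrees)
    c       : cohesion (kPa); phi_deg: friction angle (degrees)
    kh      : horizontal seismic coefficient (inertia force = kh * W)
    """
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    driving = W * math.sin(beta) + kh * W * math.cos(beta)   # along slope
    normal = W * math.cos(beta) - kh * W * math.sin(beta)    # across slope
    return (c * L + normal * math.tan(phi)) / driving

# Invented mild-gradient slope: stable statically, unstable when shaken.
fs_static = pseudo_static_fs(W=100, L=5, beta_deg=20, c=10, phi_deg=30)
fs_seismic = pseudo_static_fs(W=100, L=5, beta_deg=20, c=10, phi_deg=30, kh=0.8)
```

Here kh = 0.8 roughly mimics the ~800 gal horizontal acceleration quoted above; the static case gives Fs ≈ 3 while the seismic case gives Fs < 1, reproducing the qualitative result that mild slopes fail only under earthquake loading.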
This indicates that only in the case of earthquakes are landslides in volcanic sediment apt to occur on mild-gradient slopes and on ridges with convex cross sections. Consequently, the following results are obtained: 1) At volcanic

12. Volunteers in the earthquake hazard reduction program

USGS Publications Warehouse

Ward, P.L.

1978-01-01

With this in mind, I organized a small workshop for approximately 30 people on February 2 and 3, 1978, in Menlo Park, Calif. The purpose of the meeting was to discuss methods of involving volunteers in a meaningful way in earthquake research and in educating the public about earthquake hazards. The emphasis was on earthquake prediction research, but the discussions covered the whole earthquake hazard reduction program. Representatives attended from the earthquake research community, from groups doing socioeconomic research on earthquake matters, and from a wide variety of organizations that might sponsor volunteers.

13. Electromagnetic earthquake triggering phenomena: State-of-the-art research and future developments

NASA Astrophysics Data System (ADS)

Zeigarnik, Vladimir; Novikov, Victor

2014-05-01

Unique pulsed-power systems based on solid-propellant magnetohydrodynamic (MHD) generators, developed in Russia in the 1970s, with an output of 10-500 MW and an operation duration of 10 to 15 s, were applied to active electromagnetic monitoring of the Earth's crust: exploration of its deep structure, electrical prospecting for oil and gas, and geophysical studies for earthquake prediction, owing to their high specific power, portability, and capability of operating under harsh climatic conditions.
The most interesting and promising results were obtained during geophysical experiments at test sites located in the Pamir and Northern Tien Shan mountains, where, after injection of a 1.5-2.5 kA electric current into the Earth's crust through a 4-km-long emitting dipole, variations of regional seismicity were observed (an increase in the number of weak earthquakes within a week). Laboratory experiments performed by different teams of the Institute of Physics of the Earth, the Joint Institute for High Temperatures, and the Research Station of the Russian Academy of Sciences, observing the acoustic-emission behavior of stressed rock samples processed by electric pulses, demonstrated similar patterns: a burst of acoustic emission (formation of cracks) after application of a current pulse to the sample. Based on the field and laboratory studies, it was supposed that a new kind of earthquake triggering had been observed, namely electromagnetic initiation of weak seismic events, which may be used for man-made, safe electromagnetic release of accumulated tectonic stresses and, consequently, for earthquake hazard mitigation. For verification of this hypothesis, additional field experiments were carried out at the Bishkek geodynamic proving ground with the pulsed ERGU-600 facility, which provides a 600 A electric current in the emitting dipole. An analysis of the spatio-temporal redistribution of weak regional seismicity after ERGU-600 pulses, as well as a response

14. Research on response spectrum of dam based on scenario earthquake

NASA Astrophysics Data System (ADS)

Zhang, Xiaoliang; Zhang, Yushan

2017-10-01

Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. Firstly, the potential source with the greatest contribution to the site is determined on the basis of the results of probabilistic seismic hazard analysis (PSHA).
Secondly, the magnitude and epicentral distance of the scenario earthquake are calculated according to the main faults and historical earthquakes of the potential seismic source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The response spectrum based on the scenario-earthquake method is lower than the probability-consistent response spectrum obtained by the PSHA method. The empirical analysis shows that the scenario-earthquake response spectrum accounts for both the probability level and structural factors, combining the advantages of the deterministic and probabilistic seismic hazard analysis methods. It is easy for people to accept and provides a basis for the seismic engineering of hydraulic structures.

15. Can an earthquake prediction and warning system be developed?

USGS Publications Warehouse

Ambraseys, N.N.

1990-01-01

Over the last 20 years, natural disasters have killed nearly 3 million people and disrupted the lives of over 800 million others. In 2 years there were more than 50 serious natural disasters, including landslides in Italy, France, and Colombia; a typhoon in Korea; wildfires in China and the United States; a windstorm in England; grasshopper plagues in Africa's horn and the Sahel; tornadoes in Canada; devastating earthquakes in Soviet Armenia and Tadzhikistan; infestations in Africa; landslides in Brazil; and tornadoes in the United States.

16. Landscape scale prediction of earthquake-induced landsliding based on seismological and geomorphological parameters

NASA Astrophysics Data System (ADS)

Marc, O.; Hovius, N.; Meunier, P.; Rault, C.

2017-12-01

In tectonically active areas, earthquakes are an important trigger of landslides, with significant impact on hillslope and river evolution.
However, detailed prediction of landslide locations and properties for a given earthquake remains difficult. In contrast, we propose landscape-scale analytical prediction of bulk coseismic landsliding, that is, total landslide area and volume (Marc et al., 2016a), as well as the regional area within which most landslides must be distributed (Marc et al., 2017). The prediction is based on a limited number of seismological (seismic moment, source depth) and geomorphological (landscape steepness, threshold acceleration) parameters, and therefore could be implemented in landscape evolution models aiming to engage with erosion dynamics at the scale of the seismic cycle. To assess the model, we have compiled and normalized estimates of total landslide volume, total landslide area, and regional area affected by landslides for 40, 17, and 83 earthquakes, respectively. We found that low landscape steepness systematically leads to overprediction of the total area and volume of landslides. When this effect is accounted for, the model is able to predict the landslide areas and associated volumes within a factor of 2 for about 70% of the cases in our databases. The prediction of the regional area affected does not require calibration for landscape steepness and gives a prediction within a factor of 2 for 60% of the database. For 7 out of 10 comprehensive inventories, we show that our prediction compares well with the smallest region around the fault containing 95% of the total landslide area. This is a significant improvement on a previously published empirical expression based only on earthquake moment. Some of the outliers seem related to exceptional rock-mass strength in the epicentral area, or to shaking duration and other seismic-source complexities ignored by the model. Applications include prediction of the mass balance of earthquakes and

17. Predicting Posttraumatic Stress Symptom Prevalence and Local Distribution after an Earthquake with Scarce Data
PubMed

Dussaillant, Francisca; Apablaza, Mauricio

2017-08-01

After a major earthquake, the assignment of scarce mental-health emergency personnel to different geographic areas is crucial to the effective management of the crisis. The scarce information that is available in the aftermath of a disaster may be valuable in helping predict which populations are in most need. The objectives of this study were to derive algorithms to predict posttraumatic stress (PTS) symptom prevalence and local distribution after an earthquake, and to test whether there are algorithms that require few input data and are still reasonably predictive. A rich database of PTS symptoms, collected after Chile's 2010 earthquake and tsunami, was used. Several model specifications for the mean and centiles of the distribution of PTS symptoms, together with posttraumatic stress disorder (PTSD) prevalence, were estimated via linear and quantile regressions. The models varied in the set of covariates included. Adjusted R2 for the most liberal specifications (in terms of the number of covariates included) ranged from 0.62 to 0.74, depending on the outcome. When including only peak ground acceleration (PGA), poverty rate, and household damage in linear and quadratic form, predictive capacity was still good (adjusted R2 from 0.59 to 0.67). Information about local poverty, household damage, and PGA can be used as an aid to predict PTS symptom prevalence and local distribution after an earthquake. This can help improve the assignment of mental-health personnel to the affected localities. Dussaillant F, Apablaza M. Predicting posttraumatic stress symptom prevalence and local distribution after an earthquake with scarce data. Prehosp Disaster Med. 2017;32(4):357-367.

18. New predictive equations for Arias intensity from crustal earthquakes in New Zealand

NASA Astrophysics Data System (ADS)

Stafford, Peter J.; Berrill, John B.; Pettinga, Jarg R.
2009-01-01

Arias Intensity (Arias, MIT Press, Cambridge, MA, pp 438-483, 1970) is an important measure of the strength of a ground motion, as it is able to simultaneously reflect multiple characteristics of the motion in question. Recently, the effectiveness of Arias Intensity as a predictor of the likelihood of damage to short-period structures has been demonstrated, reinforcing the utility of Arias Intensity for use in both structural and geotechnical applications. In light of this utility, Arias Intensity has begun to be considered as a ground-motion measure suitable for use in probabilistic seismic hazard analysis (PSHA) and earthquake loss estimation. It is therefore timely to develop predictive equations for this ground-motion measure. In this study, a suite of four predictive equations, each using a different functional form, is derived for the prediction of Arias Intensity from crustal earthquakes in New Zealand. A suite of models is provided to allow epistemic uncertainty to be considered within a PSHA framework. Coefficients are presented for four different horizontal-component definitions for each of the four models. The ground-motion dataset from which the equations are derived includes records from New Zealand crustal earthquakes as well as near-field records from worldwide crustal earthquakes. The predictive equations may be used to estimate Arias Intensity for moment magnitudes between 5.1 and 7.5 and for distances (both rjb and rrup) up to 300 km.

19. Shaking table test and dynamic response prediction on an earthquake-damaged RC building

NASA Astrophysics Data System (ADS)

Ye, Xianguo; Qian, Jiaru; Li, Kangning

2004-12-01

This paper presents the results of shaking table tests of a one-tenth-scale reinforced concrete (RC) building model. The test model represents a prototype building that was seriously damaged during the 1985 Mexico earthquake.
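Arias Intensity, the ground-motion measure predicted in item 18, is defined from an accelerogram a(t) as Ia = (pi / 2g) * integral of a(t)^2 dt. A minimal numerical evaluation, using a synthetic decaying-sinusoid record rather than any of the New Zealand data:

```python
import numpy as np

def arias_intensity(acc, dt, g=9.81):
    """Arias intensity (m/s) from acceleration samples (m/s^2) at spacing dt,
    via trapezoidal integration of a(t)^2."""
    a2 = np.asarray(acc, float) ** 2
    return np.pi / (2.0 * g) * dt * np.sum((a2[:-1] + a2[1:]) / 2.0)

# Synthetic 10 s record standing in for a strong-motion trace.
dt = 0.005
t = np.arange(0.0, 10.0, dt)
acc = 2.0 * np.exp(-0.3 * t) * np.sin(2.0 * np.pi * 2.0 * t)

Ia = arias_intensity(acc, dt)
```

A quick check: a constant 1 m/s^2 record lasting 10 s gives Ia = pi / (2 * 9.81) * 10, about 1.6 m/s, which the trapezoidal sum reproduces.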
The input ground excitation used during the tests was from records obtained near the site of the prototype building during the 1985 and 1995 Mexico earthquakes. The tests showed that the damage pattern of the test model agreed well with that of the prototype building. An analytical prediction of earthquake response was conducted for the prototype building using a sophisticated 3-D frame model. The input motion used for the dynamic analysis was the shaking-table test measurements with similarity transformation. The comparison of the analytical results and the shaking table test results indicates that the response of the RC building to minor and moderate earthquakes can be predicted well; however, there are differences between the prediction and the actual response to the major earthquake.

20. A global earthquake discrimination scheme to optimize ground-motion prediction equation selection

USGS Publications Warehouse

Garcia, Daniel; Wald, David J.; Hearne, Michael

2012-01-01

We present a new automatic earthquake discrimination procedure to determine in near-real time the tectonic regime and seismotectonic domain of an earthquake, its most likely source type, and the corresponding ground-motion prediction equation (GMPE) class to be used in the U.S. Geological Survey (USGS) Global ShakeMap system. This method makes use of the Flinn-Engdahl regionalization scheme, seismotectonic information (plate boundaries, global geology, seismicity catalogs, and regional and local studies), and the source parameters available from the USGS National Earthquake Information Center in the minutes following an earthquake to give the best estimate of the setting and mechanism of the event. Depending on the tectonic setting, additional criteria based on hypocentral depth, style of faulting, and regional seismicity may be applied.
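The regime-dependent selection logic can be pictured as a small decision rule. The regime names, the 50 km depth threshold, and the class labels below are illustrative placeholders only, not the actual Garcia et al. criteria or USGS ShakeMap values:

```python
def gmpe_class(regime: str, depth_km: float) -> str:
    """Toy mapping from tectonic regime and hypocentral depth to a GMPE class."""
    if regime == "subduction":
        # Real schemes also use focal mechanisms and interface-geometry
        # models to separate outer-rise, upper-plate, interface, and
        # intraslab events; depth alone is a crude stand-in.
        return "subduction_interface" if depth_km <= 50.0 else "subduction_intraslab"
    if regime == "stable_continental":
        return "stable_continental"
    return "active_shallow_crustal"
```

A production system would key these rules off the Flinn-Engdahl region and seismotectonic databases named above rather than a single string argument.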
For subduction zones, these criteria include the use of focal-mechanism information and detailed interface models to discriminate among outer-rise, upper-plate, interface, and intraslab seismicity. The scheme is validated against a large database of recent historical earthquakes. Though developed to assess GMPE selection in Global ShakeMap operations, we anticipate a variety of uses for this strategy, from real-time processing systems to any analysis involving tectonic classification of sources from seismic catalogs.

1. Spatio-Temporal Fluctuations of the Earthquake Magnitude Distribution: Robust Estimation and Predictive Power

NASA Astrophysics Data System (ADS)

Olsen, S.; Zaliapin, I.

2008-12-01

We establish a positive correlation between local spatio-temporal fluctuations of the earthquake magnitude distribution and the occurrence of regional earthquakes. To accomplish this goal, we develop a sequential Bayesian statistical estimation framework for the b-value (the slope of the Gutenberg-Richter exponential approximation to the observed magnitude distribution) and for the ratio a(t) between the earthquake intensities in two non-overlapping magnitude intervals. The time-dependent dynamics of these parameters are analyzed using Markov Chain Models (MCM). The main advantage of this approach over traditional window-based estimation is its "soft" parameterization, which allows one to obtain stable results with realistically small samples. We furthermore discuss a statistical methodology for establishing lagged correlations between continuous and point processes. The developed methods are applied to the observed seismicity of California, Nevada, and Japan on different temporal and spatial scales. We report an oscillatory dynamics of the estimated parameters, and find that the detected oscillations are positively correlated with the occurrence of large regional earthquakes, as well as with small events with magnitudes as low as 2.5.
The reported results have important implications for the further development of earthquake prediction and seismic hazard assessment methods.

2. Source Model of Huge Subduction Earthquakes for Strong Ground Motion Prediction

NASA Astrophysics Data System (ADS)

Iwata, T.; Asano, K.

2012-12-01

Constructing source models of huge subduction earthquakes is a quite important issue for strong ground motion prediction. Irikura and Miyake (2001, 2011) proposed the characterized source model for strong ground motion prediction, which consists of plural strong-motion generation area (SMGA; Miyake et al., 2003) patches on the source fault. We obtained SMGA source models for many events using the empirical Green's function method and found that SMGA size has an empirical scaling relationship with seismic moment. Therefore, the SMGA size can be estimated from that empirical relation, given the seismic moment of an anticipated earthquake. Concerning the positioning of the SMGAs, information on fault segmentation is useful for inland crustal earthquakes. For the 1995 Kobe earthquake, three SMGA patches were obtained, with the Nojima, Suma, and Suwayama segments each having one SMGA in the SMGA modeling (e.g., Kamae and Irikura, 1998). For the 2011 Tohoku earthquake, Asano and Iwata (2012) estimated the SMGA source model and obtained four SMGA patches on the source fault. The total SMGA area follows the extension of the empirical scaling relationship between seismic moment and SMGA area for subduction plate-boundary earthquakes, which shows the applicability of the empirical scaling relationship for SMGAs. Two of the SMGAs are in the Miyagi-Oki segment, and the other two are in the Fukushima-Oki and Ibaraki-Oki segments, respectively. Asano and Iwata (2012) also pointed out that all SMGAs correspond to the historical source areas of the 1930s.
Those SMGAs do not overlap the huge-slip area in the shallower part of the source fault that was estimated from teleseismic data, long-period strong-motion data, and/or geodetic data during the 2011 mainshock. This fact shows that the huge-slip area does not contribute to strong ground motion generation (periods of 0.1-10 s). The information on fault segmentation in the subduction zone, or

3. Research in seismology and earthquake engineering in Venezuela

USGS Publications Warehouse

Urbina, L.; Grases, J.

1983-01-01

After the July 29, 1967, damaging earthquake (with a moderate magnitude of 6.3) caused widespread damage to the northern coastal area of Venezuela and to the Caracas Valley, the Venezuelan Government decided to establish a Presidential Earthquake Commission. This commission undertook the task of coordinating the efforts to study the after-effects of the earthquake. The July 1967 earthquake claimed numerous lives and caused extensive damage to the capital of Venezuela. In 1968, the U.S. Geological Survey conducted a seismological field study in the northern coastal area and in the Caracas Valley of Venezuela. The objective was to study the areas that sustained severe, moderate, and no damage to structures. A report entitled "Ground Amplification Studies in Earthquake Damage Areas: The Caracas Earthquake of 1967" documented, for the first time, short-period seismic-wave ground-motion amplifications in the Caracas Valley. Figure 1 shows the area of severe damage in the Los Palos Grandes suburb and its correlation with the depth of alluvium; the arabic numbers denote the ground amplification factor at each site in the area. The Venezuelan Government initiated many programs to study in detail the damage sustained and to investigate ongoing construction practices. These actions motivated professionals in the academic, private, and government sectors to develop further capabilities and self-sufficiency in the fields of engineering and seismology.
Funds were allocated to assist in training professionals and technicians and in developing new seismological stations and new national-level programs in earthquake engineering and seismology. A brief description of the ongoing programs in Venezuela is listed below. These programs are being performed by FUNVISIS and by other national organizations listed at the end of this article.

4. Empirical prediction for travel distance of channelized rock avalanches in the Wenchuan earthquake area

NASA Astrophysics Data System (ADS)

Zhan, Weiwei; Fan, Xuanmei; Huang, Runqiu; Pei, Xiangjun; Xu, Qiang; Li, Weile

2017-06-01

Rock avalanches are extremely rapid, massive, flow-like movements of fragmented rock. The travel path of a rock avalanche may in some cases be confined by channels; such cases are referred to as channelized rock avalanches. Channelized rock avalanches are particularly dangerous because their travel distance is difficult to predict. In this study, we constructed a dataset with detailed characteristic parameters of 38 channelized rock avalanches triggered by the 2008 Wenchuan earthquake, using visual interpretation of remote-sensing imagery, field investigation, and literature review. Based on this dataset, we assessed the influence of different factors on the runout distance and developed prediction models for channelized rock avalanches using the multivariate regression method. The results suggest that the movement of channelized rock avalanches is dominated by landslide volume, total relief, and channel gradient. The performance of both models was then tested with an independent validation dataset of eight rock avalanches induced by the 2008 Wenchuan earthquake, the Ms 7.0 Lushan earthquake, and heavy rainfall in 2013, showing acceptably good prediction results.
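Runout validation of this kind is often summarized as the fraction of events predicted within a multiplicative factor of the observation (compare the factor-of-2 scores quoted in item 16 above). A small helper, with invented runout distances rather than the study's eight validation events:

```python
import numpy as np

def frac_within_factor(pred, obs, factor=2.0):
    """Fraction of predictions within a multiplicative `factor` of observations."""
    ratio = np.abs(np.log(np.asarray(pred, float) / np.asarray(obs, float)))
    return float(np.mean(ratio <= np.log(factor)))

# Invented runout distances (m) for an eight-event validation set.
predicted = [900, 1500, 400, 2600, 700, 1200, 3100, 500]
observed  = [1100, 1300, 900, 2400, 650, 1000, 2800, 480]
score = frac_within_factor(predicted, observed)
```

Working in log space makes the criterion symmetric: a prediction twice the observation and one half the observation are penalized equally.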
Therefore, the travel-distance prediction models for channelized rock avalanches constructed in this study are applicable and reliable for predicting the runout of similar rock avalanches in other regions.

5. Foreshock sequences and short-term earthquake predictability on East Pacific Rise transform faults

PubMed

McGuire, Jeffrey J.; Boettcher, Margaret S.; Jordan, Thomas H.

2005-03-24

East Pacific Rise transform faults are characterized by high slip rates (more than ten centimetres a year), predominantly aseismic slip, and maximum earthquake magnitudes of about 6.5. Using recordings from a hydroacoustic array deployed by the National Oceanic and Atmospheric Administration, we show here that East Pacific Rise transform faults also have a low number of aftershocks and high foreshock rates compared to continental strike-slip faults. The high ratio of foreshocks to aftershocks implies that such transform-fault seismicity cannot be explained by seismic triggering models in which there is no fundamental distinction between foreshocks, mainshocks, and aftershocks. The foreshock sequences on East Pacific Rise transform faults can be used to predict (retrospectively) earthquakes of magnitude 5.4 or greater, in narrow spatial and temporal windows and with a high probability gain. The predictability of such transform earthquakes is consistent with a model in which slow slip transients trigger earthquakes, enrich their low-frequency radiation, and accommodate much of the aseismic plate motion.

6. Predicted Attenuation Relation and Observed Ground Motion of Gorkha Nepal Earthquake of 25 April 2015

NASA Astrophysics Data System (ADS)

Singh, R. P.; Ahmad, R.

2015-12-01

A comparison of observed ground-motion parameters of the recent Gorkha, Nepal, earthquake of 25 April 2015 (Mw 7.8) with the parameters predicted using existing attenuation relations for the Himalayan region will be presented.
The recent earthquake took about 8,000 lives and destroyed thousands of poor-quality buildings, and it was felt by millions of people living in Nepal, China, India, Bangladesh, and Bhutan. Knowledge of ground parameters is very important in developing seismic codes for earthquake-prone regions like the Himalaya and in designing safer buildings. The ground parameters recorded in the recent earthquake and its aftershocks are compared with attenuation relations for the Himalayan region; the predicted ground motion parameters show good correlation with the observed ones. The results will be of great use to civil engineers in updating existing building codes in the Himalayan and surrounding regions and in evaluating seismic hazards. The results clearly show that only attenuation relations developed for the Himalayan region should be used; attenuation relations based on other regions fail to provide good estimates of the observed ground motion parameters. 7. Research on Collection of Earthquake Disaster Information from the Crowd NASA Astrophysics Data System (ADS) Nian, Z. 2017-12-01 In China, the assessment of earthquake disaster information is mainly based on inversion of the seismic source mechanism and on pre-calculated population data models; the actual earthquake disaster information is usually collected through government departments, and both the accuracy and the speed of this process need to be improved. In a massive earthquake like the one in Mexico, the telecommunications infrastructure on the ground was damaged, and the quake zone was difficult to observe by satellites and aircraft in the bad weather. Only a little information was sent out, through the maritime satellite of another country. Thus, the timely and effective delivery of disaster relief was seriously affected.
Now that Chinese communication satellites are in orbit, people no longer rely solely on ground telecom base stations to keep in communication with the outside world, open web pages, log in to social networking sites, release information, and transmit images and videos. This paper establishes an earthquake information collection system in which the public can participate. Through popular social platforms and other information sources, the public can take part in the collection of earthquake information and supply quake-zone information, including photos, video, etc., especially material captured by unmanned aerial vehicles (UAVs) after an earthquake; the public can use computers, portable terminals, or mobile text messages to participate in the earthquake information collection. In the system, the information will be divided into basic earthquake-zone information, earthquake disaster reduction information, earthquake site information, post-disaster reconstruction information, etc., and will be processed and put into a database. The quality of the data is analyzed against multi-source information and checked against local public opinion, in order to supplement the data collected by government departments in a timely manner and to calibrate simulation results, which will better guide 8. Earthquake fragility assessment of curved and skewed bridges in Mountain West region : research brief. DOT National Transportation Integrated Search 2016-09-01 Earthquake Fragility Assessment of Curved and Skewed Bridges in Mountain West Region: Reinforced concrete bridges with both skew and curvature are common in areas with complex terrains. These bridges are irregular ... 9. Satellite relay telemetry of seismic data in earthquake prediction and control USGS Publications Warehouse Jackson, Wayne H.; Eaton, Jerry P.
1971-01-01 The Satellite Telemetry Earthquake Monitoring Program was started in FY 1968 to evaluate the applicability of satellite relay telemetry in the collection of seismic data from a large number of dense seismograph clusters laid out along the major fault systems of western North America. Prototype clusters utilizing phone-line telemetry were then being installed by the National Center for Earthquake Research (NCER) in 3 regions along the San Andreas fault in central California; and the experience of installing and operating the clusters and in reducing and analyzing the seismic data from them was to provide the raw materials for evaluation in the satellite relay telemetry project. 10. A Hybrid Ground-Motion Prediction Equation for Earthquakes in Western Alberta NASA Astrophysics Data System (ADS) Spriggs, N.; Yenier, E.; Law, A.; Moores, A. O. 2015-12-01 Estimation of ground-motion amplitudes that may be produced by future earthquakes constitutes the foundation of seismic hazard assessment and earthquake-resistant structural design. This is typically done by using a prediction equation that quantifies amplitudes as a function of key seismological variables such as magnitude, distance and site condition. In this study, we develop a hybrid empirical prediction equation for earthquakes in western Alberta, where evaluation of seismic hazard associated with induced seismicity is of particular interest. We use peak ground motions and response spectra from recorded seismic events to model the regional source and attenuation attributes. The available empirical data is limited in the magnitude range of engineering interest (M>4). Therefore, we combine empirical data with a simulation-based model in order to obtain seismologically informed predictions for moderate-to-large magnitude events. The methodology is two-fold. First, we investigate the shape of geometrical spreading in Alberta. 
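The general shape of a ground-motion prediction equation of the kind this hybrid study calibrates can be illustrated with a toy point-source form; the functional form and coefficients below are placeholders for exposition, not the calibrated Alberta model:

```python
import numpy as np

def gmpe_ln_pga(mag, rhyp, c0=-1.5, c1=1.2, gamma=0.004, vs30=760.0, cs=-0.3):
    """Illustrative point-source GMPE:
    ln(PGA) = c0 + c1*M - ln(R) - gamma*R + cs*ln(vs30/760).
    The -ln(R) term is simple geometrical spreading, gamma*R is anelastic
    attenuation, and the cs term is a crude site-condition adjustment.
    All coefficients are hypothetical."""
    r = np.asarray(rhyp, dtype=float)
    return c0 + c1 * mag - np.log(r) - gamma * r + cs * np.log(vs30 / 760.0)
```

The sketch captures the qualitative behaviour a calibrated equation must have: amplitudes grow with magnitude, decay with distance, and are amplified on softer (lower-VS30) sites.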
We supplement the seismic data with ground motions obtained from mining/quarry blasts, in order to gain insights into the regional attenuation over a wide distance range. A comparison of ground-motion amplitudes for earthquakes and mining/quarry blasts shows that both event types decay at similar rates with distance and demonstrate a significant Moho-bounce effect. In the second stage, we calibrate the source and attenuation parameters of a simulation-based prediction equation to match the available amplitude data from seismic events. We model the geometrical spreading using a trilinear function with attenuation rates obtained from the first stage, and calculate coefficients of anelastic attenuation and site amplification via regression analysis. This provides a hybrid ground-motion prediction equation that is calibrated for observed motions in western Alberta and is applicable to moderate-to-large magnitude events. 11. Earthquake Predictability: Results From Aggregating Seismicity Data And Assessment Of Theoretical Individual Cases Via Synthetic Data NASA Astrophysics Data System (ADS) Adamaki, A.; Roberts, R. 2016-12-01 For many years an important aim in seismological studies has been forecasting the occurrence of large earthquakes. Despite some well-established statistical behavior of earthquake sequences, expressed by e.g. the Omori law for aftershock sequences and the Gutenberg-Richter distribution of event magnitudes, purely statistical approaches to short-term earthquake prediction have in general not been successful. It seems that better understanding of the processes leading to critical stress build-up prior to larger events is necessary to identify useful precursory activity, if this exists, and statistical analyses are an important tool in this context. There has been considerable debate on the usefulness or otherwise of foreshock studies for short-term earthquake prediction.
We investigate generic patterns of foreshock activity using aggregated data, studying not only strong but also moderate magnitude events. Aggregating empirical local seismicity time series prior to larger events observed in and around Greece reveals a statistically significant increasing rate of seismicity over 20 days prior to M>3.5 earthquakes. This increase cannot be explained by tempo-spatial clustering models such as ETAS, implying genuine changes in the mechanical situation just prior to larger events and thus the possible existence of useful precursory information. Because of tempo-spatial clustering, in which events may themselves be aftershocks of earlier foreshocks, even if such generic behavior exists it does not necessarily follow that foreshocks have the potential to provide useful precursory information for individual larger events. Using synthetic catalogs produced with different clustering models and different presumed system sensitivities, we are now investigating to what extent the apparently established generic foreshock rate acceleration may or may not imply that foreshocks have potential in the context of routine forecasting of larger events. Preliminary results suggest that this is the case, but 12. Earthquake prediction in California using regression algorithms and cloud-based big data infrastructure NASA Astrophysics Data System (ADS) Asencio-Cortés, G.; Morales-Esteban, A.; Shang, X.; Martínez-Álvarez, F. 2018-06-01 Earthquake magnitude prediction is a challenging problem that has been widely studied during the last decades. Statistical, geophysical and machine learning approaches can be found in the literature, with no particularly satisfactory results. In recent years, powerful computational techniques to analyze big data have emerged, making possible the analysis of massive datasets. These new methods make use of physical resources like cloud-based architectures.
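The aggregation step described in the foreshock study above, stacking pre-mainshock event counts across many mainshocks, can be sketched as follows. This is a simplified illustration with no spatial windowing, declustering, or significance testing:

```python
import numpy as np

def stacked_foreshock_rate(times, mags, main_mag=3.5, window=20.0, nbins=20):
    """Stack seismicity counts preceding every event with magnitude >= main_mag.
    times: event times in days (any order of magnitude entries is fine);
    returns (bin_edges, counts) where counts[k] is the number of events,
    summed over all mainshocks, that fell in the k-th pre-mainshock time bin."""
    times = np.asarray(times, float)
    mags = np.asarray(mags, float)
    edges = np.linspace(-window, 0.0, nbins + 1)
    stack = np.zeros(nbins)
    for t0 in times[mags >= main_mag]:
        dt = times - t0
        sel = dt[(dt >= -window) & (dt < 0.0)]  # strictly before the mainshock
        stack += np.histogram(sel, bins=edges)[0]
    return edges, stack
```

A rising trend in the stacked counts toward the right-hand (most recent) bins is the kind of generic rate acceleration the study reports; whether it carries per-event predictive value is the separate question the synthetic catalogs address.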
California is known for being one of the regions with the highest seismic activity in the world, and many data are available. In this work, the use of several regression algorithms combined with ensemble learning is explored in the context of big data (a 1 GB catalog is used), in order to predict earthquake magnitudes within the next seven days. The Apache Spark framework, the H2O library in the R language and Amazon cloud infrastructure were used, reporting very promising results. 13. The Virtual Quake earthquake simulator: a simulation-based forecast of the El Mayor-Cucapah region and evidence of predictability in simulated earthquake sequences NASA Astrophysics Data System (ADS) Yoder, Mark R.; Schultz, Kasey W.; Heien, Eric M.; Rundle, John B.; Turcotte, Donald L.; Parker, Jay W.; Donnellan, Andrea 2015-12-01 In this manuscript, we introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert-based forecasting metric, and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation versus quiescent type earthquake triggering. We show that VQ exhibits both behaviours separately for independent fault sections; some fault sections exhibit activation type triggering, while others are better characterized by quiescent type triggering.
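An alert-based skill measure of the general kind mentioned above can be illustrated simply: declare an alert whenever an activity measure exceeds a threshold, then compare the fraction of large events caught against the fraction of time spent on alert. This is a generic sketch, not VQ's actual metric:

```python
import numpy as np

def alert_information_gain(rate, main_idx, threshold):
    """Generic alert-based forecast skill.
    rate: activity measure per time step; alerts are steps with rate >= threshold.
    main_idx: indices of the time steps at which large events occurred.
    Returns (hit_rate, alert_fraction, gain); gain > 1 means the alerts catch
    events more efficiently than random alarms occupying the same total time."""
    rate = np.asarray(rate, float)
    alerts = rate >= threshold
    hit_rate = float(np.mean(alerts[np.asarray(main_idx)]))
    alert_fraction = float(alerts.mean())
    gain = hit_rate / alert_fraction if alert_fraction > 0 else float("inf")
    return hit_rate, alert_fraction, gain
```

Sweeping the threshold traces out the usual trade-off: tighter thresholds reduce time on alert but miss more events, which is how an optimal operating point is chosen.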
We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA and northern Baja California, Mexico. 14. Predicting the spatial extent of liquefaction from geospatial and earthquake specific parameters USGS Publications Warehouse Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.; Wald, David J.; Knudsen, Keith L.; Deodatis, George; Ellingwood, Bruce R.; Frangopol, Dan M. 2014-01-01 The spatially extensive damage from the 2010-2011 Christchurch, New Zealand earthquake events is a reminder of the need for liquefaction hazard maps for anticipating damage from future earthquakes. Liquefaction hazard mapping has traditionally relied on detailed geologic mapping and expensive site studies. These traditional techniques are difficult to apply globally for rapid response or loss estimation. We have developed a logistic regression model to predict the probability of liquefaction occurrence in coastal sedimentary areas as a function of simple and globally available geospatial features (e.g., derived from digital elevation models) and standard earthquake-specific intensity data (e.g., peak ground acceleration). Some of the geospatial explanatory variables that we consider are taken from the hydrology community, which has a long tradition of using remotely sensed data as proxies for subsurface parameters. As a result of using high resolution, remotely sensed, and spatially continuous data as proxies for important subsurface parameters such as soil density and soil saturation, and by using a probabilistic modeling framework, our liquefaction model inherently includes the natural spatial variability of liquefaction occurrence and provides an estimate of the spatial extent of liquefaction for a given earthquake.
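The core of such a logistic model is compact. The predictor layout and coefficients below are hypothetical stand-ins for the geospatial and intensity variables the study uses:

```python
import numpy as np

def liquefaction_probability(features, coef, intercept):
    """Logistic model P(liquefaction) = 1 / (1 + exp(-(b0 + b . x))).
    features: (n, k) array of predictors per map cell (e.g. a slope-derived
    Vs30 proxy, distance to water, ln(PGA)); coef: (k,) fitted coefficients.
    All names and values here are illustrative, not the published model."""
    z = intercept + np.asarray(features, float) @ np.asarray(coef, float)
    return 1.0 / (1.0 + np.exp(-z))
```

Evaluating this over a grid of cells yields the probability surface whose contours the study compares against observed liquefaction extent.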
To provide a quantitative check on how the predicted probabilities relate to the spatial extent of liquefaction, we report the frequency of observed liquefaction features within a range of predicted probabilities. The percentage of liquefaction is the areal extent of observed liquefaction within a given probability contour. The regional model and the results show that there is a strong relationship between the predicted probability and the observed percentage of liquefaction. Visual inspection of the probability contours for each event also indicates that the pattern of liquefaction is well represented by the model. 15. Comparison of Ground Motion Prediction Equations (GMPE) for Chile and Canada With Recent Chilean Megathrust Earthquakes NASA Astrophysics Data System (ADS) Herrera, C.; Cassidy, J. F.; Dosso, S. E. 2017-12-01 Ground shaking assessment allows quantification of the hazards associated with the occurrence of earthquakes. Chile and western Canada are two areas that have experienced, and remain susceptible to, large crustal, in-slab and megathrust earthquakes that can affect the population significantly. In this context, we compare the current GMPEs used in the 2015 National Building Code of Canada, and the most recent GMPEs calculated for Chile, with the observed accelerations generated by four recent Chilean megathrust earthquakes (MW ≥ 7.7) of the past decade; such comparisons are essential to quantify how well current models predict observations of major events. We collected the 3-component waveform data of more than 90 stations from the Centro Sismologico Nacional and the Universidad de Chile, and processed them by removing the trend and applying a band-pass filter. Then, for each station, we obtained the Peak Ground Acceleration (PGA), and, using a damped response spectrum, we calculated the Pseudo-Spectral Acceleration (PSA). Finally, we compared those observations with the most recent Chilean and Canadian GMPEs.
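A comparison of this kind typically reduces to log residuals between observed and predicted amplitudes; a minimal sketch, with the sign convention that a positive mean residual means the GMPE underpredicts on average:

```python
import numpy as np

def gmpe_residuals(obs, pred):
    """Natural-log residuals ln(obs/pred) between observed and GMPE-predicted
    amplitudes (PGA or PSA). Returns (mean, sample std); a positive mean
    indicates systematic underprediction by the model."""
    res = np.log(np.asarray(obs, float)) - np.log(np.asarray(pred, float))
    return float(res.mean()), float(res.std(ddof=1))
```

Binning these residuals by distance or period is the standard way to expose the kind of short-period underprediction the study reports for the Canadian GMPEs.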
Given the lack of geotechnical information for most of the Chilean stations, we also used a new method to obtain VS30 by inverting the H/V ratios using a trans-dimensional Bayesian inversion, which allows us to improve the correction of observations according to soil conditions. As expected, our results show a good fit between observations and the Chilean GMPEs, but we observe that although the shape of the Canadian GMPEs is coherent with the distribution of observations, in general they underpredict the observations for PGA and for PSA at shorter periods for most of the considered earthquakes. An example of this can be seen in the attached figure for the case of the 2014 Iquique earthquake. These results have important implications for the hazards associated with large earthquakes, especially for western Canada, where the probability of a 16. Conditional spectrum computation incorporating multiple causal earthquakes and ground-motion prediction models USGS Publications Warehouse Lin, Ting; Harmsen, Stephen C.; Baker, Jack W.; Luco, Nicolas 2013-01-01 The conditional spectrum (CS) is a target spectrum (with conditional mean and conditional standard deviation) that links seismic hazard information with ground-motion selection for nonlinear dynamic analysis. Probabilistic seismic hazard analysis (PSHA) estimates the ground-motion hazard by incorporating the aleatory uncertainties in all earthquake scenarios and resulting ground motions, as well as the epistemic uncertainties in ground-motion prediction models (GMPMs) and seismic source models. Typical CS calculations to date are produced for a single earthquake scenario using a single GMPM, but more precise use requires consideration of at least multiple causal earthquakes and multiple GMPMs that are often considered in a PSHA computation. This paper presents the mathematics underlying these more precise CS calculations.
Despite requiring more effort to compute than approximate calculations using a single causal earthquake and GMPM, the proposed approach produces an exact output that has a theoretical basis. To demonstrate the results of this approach and compare the exact and approximate calculations, several example calculations are performed for real sites in the western United States. The results also provide some insights regarding the circumstances under which approximate results are likely to closely match more exact results. To facilitate these more precise calculations for real applications, the exact CS calculations can now be performed for real sites in the United States using new deaggregation features in the U.S. Geological Survey hazard mapping tools. Details regarding this implementation are discussed in this paper. 17. Magnitude Estimation for the 2011 Tohoku-Oki Earthquake Based on Ground Motion Prediction Equations NASA Astrophysics Data System (ADS) Eshaghi, Attieh; Tiampo, Kristy F.; Ghofrani, Hadi; Atkinson, Gail M. 2015-08-01 This study investigates whether real-time strong ground motion data from seismic stations could have been used to provide an accurate estimate of the magnitude of the 2011 Tohoku-Oki earthquake in Japan. Ultimately, such an estimate could be used as input data for a tsunami forecast and would lead to more robust earthquake and tsunami early warning. We collected the strong motion accelerograms recorded by borehole and free-field (surface) Kiban Kyoshin network stations that registered this mega-thrust earthquake in order to perform an off-line test to estimate the magnitude based on ground motion prediction equations (GMPEs). GMPEs for peak ground acceleration and peak ground velocity (PGV) from a previous study by Eshaghi et al. (Bulletin of the Seismological Society of America 103, 2013), derived using events with moment magnitude (M) ≥ 5.0 from 1998-2010, were used to estimate the magnitude of this event.
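Estimating magnitude from observed amplitudes amounts to inverting a GMPE for M. The sketch below uses an illustrative functional form and placeholder coefficients, not the published Eshaghi et al. equations:

```python
import numpy as np

def estimate_magnitude(pgv, rhyp, c0=-4.0, c1=1.0, gamma=0.002):
    """Invert an illustrative GMPE, ln(PGV) = c0 + c1*M - ln(R) - gamma*R,
    for moment magnitude, averaging the station-wise estimates.
    pgv: observed peak ground velocities; rhyp: hypocentral distances (km).
    Coefficients are hypothetical placeholders."""
    pgv = np.asarray(pgv, float)
    r = np.asarray(rhyp, float)
    m_i = (np.log(pgv) - c0 + np.log(r) + gamma * r) / c1
    return float(m_i.mean())
```

Because the estimate uses the full amplitude-distance relation rather than a fixed-period proxy, it does not saturate the way some rapid magnitude estimates do for great earthquakes, which is the property the study exploits.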
We developed new GMPEs using a more complete database (1998-2011), which added only 1 year but approximately twice as much data to the initial catalog (including important large events), to improve the determination of attenuation parameters and magnitude scaling. These new GMPEs were used to estimate the magnitude of the Tohoku-Oki event. The estimates obtained were compared with real-time magnitude estimates provided by the existing earthquake early warning system in Japan. Unlike the current operational magnitude estimation methods, our method did not saturate and can provide robust estimates of moment magnitude within ~100 s after earthquake onset for both catalogs. It was found that correcting for average shear-wave velocity in the uppermost 30 m (VS30) improved the accuracy of magnitude estimates from surface recordings, particularly magnitude estimates from PGV (Mpgv). The new GMPEs also were used to estimate the magnitude of all earthquakes in the new catalog with at least 20 records. Results show that the magnitude estimate from PGV values using 18. ERTS Applications in earthquake research and mineral exploration in California NASA Technical Reports Server (NTRS) Abdel-Gawad, M.; Silverstein, J. 1973-01-01 Examples are presented showing that ERTS imagery can be effectively utilized to identify, locate, and map faults which show geomorphic evidence of geologically recent breakage. Several important faults not previously known have been identified. By plotting epicenters of historic earthquakes in parts of California, Sonora, Mexico, Arizona, and Nevada, we found that areas known for historic seismicity are often characterized by abundant evidence of recent fault and crustal movements. There are many examples of seismically quiet areas where outstanding evidence of recent fault movements is observed.
One application is clear: ERTS-1 imagery could be effectively utilized to delineate areas susceptible to earthquake recurrence which, on the basis of seismic data alone, may be misleadingly considered safe. ERTS data can also be utilized in planning new sites in the geophysical network of fault movement monitoring and strain and tilt measurements. 19. Reduction of earthquake risk in the United States: Bridging the gap between research and practice USGS Publications Warehouse Hays, W.W. 1998-01-01 Continuing efforts under the auspices of the National Earthquake Hazards Reduction Program are under way to improve earthquake risk assessment and risk management in earthquake-prone regions of Alaska, California, Nevada, Washington, Oregon, Arizona, Utah, Wyoming, and Idaho; the New Madrid and Wabash Valley seismic zones in the central United States; the southeastern and northeastern United States; Puerto Rico; the Virgin Islands; Guam; and Hawaii. Geologists, geophysicists, seismologists, architects, engineers, urban planners, emergency managers, health care specialists, and policymakers are having to work at the margins of their disciplines to bridge the gap between research and practice and to provide a social, technical, administrative, political, legal, and economic basis for changing public policies and professional practices in communities where the earthquake risk is unacceptable. © 1998 IEEE. 20. Assessing the capability of numerical methods to predict earthquake ground motion: the Euroseistest verification and validation project NASA Astrophysics Data System (ADS) Chaljub, E. O.; Bard, P.; Tsuno, S.; Kristek, J.; Moczo, P.; Franek, P.; Hollender, F.; Manakou, M.; Raptakis, D.; Pitilakis, K. 2009-12-01 During the last decades, an important effort has been dedicated to developing accurate and computationally efficient numerical methods to predict earthquake ground motion in heterogeneous 3D media.
The progress in methods and increasing capability of computers have made it technically feasible to calculate realistic seismograms for frequencies of interest in seismic design applications. In order to foster the use of numerical simulation in practical prediction, it is important to (1) evaluate the accuracy of current numerical methods when applied to realistic 3D applications where no reference solution exists (verification) and (2) quantify the agreement between recorded and numerically simulated earthquake ground motion (validation). Here we report the results of the Euroseistest verification and validation project - an ongoing international collaborative work organized jointly by the Aristotle University of Thessaloniki, Greece, the Cashima research project (supported by the French nuclear agency, CEA, and the Laue-Langevin institute, ILL, Grenoble), and the Joseph Fourier University, Grenoble, France. The project involves more than 10 international teams from Europe, Japan and USA. The teams employ the Finite Difference Method (FDM), the Finite Element Method (FEM), the Global Pseudospectral Method (GPSM), the Spectral Element Method (SEM) and the Discrete Element Method (DEM). The project makes use of a new detailed 3D model of the Mygdonian basin (about 5 km wide, 15 km long, sediments reach about 400 m depth, surface S-wave velocity is 200 m/s). The prime target is to simulate 8 local earthquakes with magnitude from 3 to 5. In the verification, numerical predictions for frequencies up to 4 Hz for a series of models with increasing structural and rheological complexity are analyzed and compared using quantitative time-frequency goodness-of-fit criteria. Predictions obtained by one FDM team and the SEM team are close and different from other predictions 1. 
Prediction of the area affected by earthquake-induced landsliding based on seismological parameters NASA Astrophysics Data System (ADS) Marc, Odin; Meunier, Patrick; Hovius, Niels 2017-07-01 We present an analytical, seismologically consistent expression for the surface area of the region within which most landslides triggered by an earthquake are located (landslide distribution area). This expression is based on scaling laws relating seismic moment, source depth, and focal mechanism with ground shaking and fault rupture length and assumes a globally constant threshold of acceleration for onset of systematic mass wasting. The seismological assumptions are identical to those recently used to propose a seismologically consistent expression for the total volume and area of landslides triggered by an earthquake. To test the accuracy of the model we gathered geophysical information and estimates of the landslide distribution area for 83 earthquakes. To reduce uncertainties and inconsistencies in the estimation of the landslide distribution area, we propose an objective definition based on the shortest distance from the seismic wave emission line containing 95 % of the total landslide area. Without any empirical calibration the model explains 56 % of the variance in our dataset, and predicts 35 to 49 out of 83 cases within a factor of 2, depending on how we account for uncertainties on the seismic source depth. For most cases with comprehensive landslide inventories we show that our prediction compares well with the smallest region around the fault containing 95 % of the total landslide area. Aspects ignored by the model that could explain the residuals include local variations of the threshold of acceleration and processes modulating the surface ground shaking, such as the distribution of seismic energy release on the fault plane, the dynamic stress drop, and rupture directivity. 
Nevertheless, its simplicity and first-order accuracy suggest that the model can yield plausible and useful estimates of the landslide distribution area in near-real time, with earthquake parameters issued by standard detection routines. 2. Earthquake-triggered liquefaction in Southern Siberia and surroundings: a base for predictive models and seismic hazard estimation NASA Astrophysics Data System (ADS) Lunina, Oksana 2016-04-01 The forms and location patterns of soil liquefaction induced by earthquakes in southern Siberia, Mongolia, and northern Kazakhstan from 1950 through 2014 have been investigated, using field methods and a database of coseismic effects created as a GIS MapInfo application, with a handy input box for large data arrays. Statistical analysis of the data has revealed regional relationships between the magnitude (Ms) of an earthquake and the maximum distances of its environmental effects from the epicenter and from the causative fault (Lunina et al., 2014). For the largest event (Ms = 8.1), the estimated limit distance from the fault is 130 km, about 3.5 times shorter than the limit distance from the epicenter, which is 450 km. Moreover, the greater the distance from the fault, the fewer liquefaction cases occur: 93% of them lie within 40 km of the causative fault. Analysis of liquefaction locations relative to the nearest faults in southern East Siberia shows the distances to be within 8 km, with 69% of all cases within 1 km. As a result, predictive models have been created for locations of seismic liquefaction, assuming a fault pattern for some parts of the Baikal rift zone. Based on our field and worldwide data, equations have been suggested that relate the maximum sizes of liquefaction-induced clastic dikes (maximum width, visible maximum height and intensity index of clastic dikes) to Ms and to the local shaking intensity on the MSK-64 macroseismic intensity scale (Lunina and Gladkov, 2015).
The obtained results provide a basis for modeling the distribution of this geohazard for prediction purposes and for estimating earthquake parameters from liquefaction-induced clastic dikes. The author would like to thank the Institute of the Earth's Crust, Siberian Branch of the Russian Academy of Sciences, for providing the laboratory in which this research was carried out, and the Russian Scientific Foundation for financial support (Grant 14-17-00007). 3. The Himalayan Seismogenic Zone: A New Frontier for Earthquake Research NASA Astrophysics Data System (ADS) Brown, Larry; Hubbard, Judith; Karplus, Marianne; Klemperer, Simon; Sato, Hiroshi 2016-04-01 The Mw 7.8 Gorkha, Nepal, earthquake that occurred on April 25 of this year was a dramatic reminder that great earthquakes are not restricted to the large seismogenic zones associated with subduction of oceanic lithosphere. Not only does Himalayan seismogenesis represent important scientific and societal issues in its own right, it constitutes a reference for evaluating general models of the earthquake cycle derived from studies of oceanic subduction systems. This presentation reports results of a Mini-Workshop sponsored by the GeoPrisms project, held in conjunction with the American Geophysical Union meeting on December 15, 2015, and designed to organize a new initiative to study the great Himalayan earthquake machine. The Himalayan seismogenic zone shares with its oceanic counterparts a number of fundamental questions, including: a) What controls the updip and downdip limits of rupture? b) What controls the lateral segmentation of rupture zones (and hence magnitude)? c) What is the role of fluids in facilitating slip and/or rupture? d) What nucleates rupture (e.g., asperities)? e) What physical properties can be monitored as precursors to future events? f) How effectively can the radiation pattern of future events be modeled?
g) How can a better understanding of Himalayan rupture be translated into more cost-effective preparations for the next major event in this region? However, the underthrusting of continental, as opposed to oceanic, lithosphere in the Himalayas frames these questions in a very different context: h) How does the greater thickness and weaker rheology of continental crust/lithosphere affect locking of the seismogenic zone? i) How does the different thermal structure of continental vs oceanic crust affect earthquake geodynamics? j) Are fluids a significant factor in intercontinental thrusting? k) How does the basement morphology of underthrust continental crust affect locking/creep, and how does it differ from the oceanic case? l) What is the 4. Real-time 3-D space numerical shake prediction for earthquake early warning NASA Astrophysics Data System (ADS) Wang, Tianyun; Jin, Xing; Huang, Yandan; Wei, Yongxiang 2017-12-01 In earthquake early warning systems, real-time shake prediction through wave propagation simulation is a promising approach. Compared with traditional methods, it does not suffer from inaccurate estimation of source parameters. For computational efficiency, these methods assume that the wave propagates on the 2-D surface of the earth. In fact, since the seismic wave propagates in the 3-D sphere of the earth, 2-D space modeling of the wave direction results in inaccurate wave estimation. In this paper, we propose a 3-D space numerical shake prediction method, which simulates wave propagation in 3-D space using radiative transfer theory, and incorporates a data assimilation technique to estimate the distribution of wave energy. The 2011 Tohoku earthquake is studied as an example to show the validity of the proposed model.
The 2-D and 3-D space models are compared in this article, and the prediction results show that numerical shake prediction based on the 3-D space model can estimate real-time ground motion precisely, and that overprediction is alleviated when the 3-D space model is used. 5. An Integrated and Interdisciplinary Model for Predicting the Risk of Injury and Death in Future Earthquakes. PubMed Shapira, Stav; Novack, Lena; Bar-Dayan, Yaron; Aharonson-Daniel, Limor 2016-01-01 A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and to the built environment in order to improve communities' preparedness and response capabilities and to mitigate future consequences. An innovative model was formulated based on a widely used loss estimation model (HAZUS) by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model's algorithm using logistic regression equations. Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. The integrated model outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher rates of at-risk populations were found to be more vulnerable in this regard. The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling.
Investing efforts in reducing human vulnerability and increasing resilience before an earthquake occurs could lead to a possible decrease in the expected number of casualties. 6. Potential utilization of the NASA/George C. Marshall Space Flight Center in earthquake engineering research NASA Technical Reports Server (NTRS) Scholl, R. E. (Editor) 1979-01-01 Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large-scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near- and long-term test programs to supplement current earthquake research activities are suggested. 7. Satellite Relay Telemetry of Seismic Data in Earthquake Prediction and Control NASA Technical Reports Server (NTRS) Jackson, W. H.; Eaton, J. P. 1971-01-01 The Satellite Telemetry Earthquake Monitoring Program was started to evaluate the applicability of satellite relay telemetry to the collection of seismic data from a large number of dense seismograph clusters laid out along the major fault systems of western North America.
Prototype clusters utilizing phone-line telemetry were then being installed by the National Center for Earthquake Research in 3 regions along the San Andreas fault in central California; the experience gained in installing and operating the clusters and in reducing and analyzing their seismic data was to provide the raw material for evaluation in the satellite relay telemetry project. The principal advantages of the satellite relay system over commercial telephone or microwave systems were: (1) it could be made less prone to massive failure during a major earthquake; (2) it could be extended readily into undeveloped regions; and (3) it could provide flexible, uniform communications over large sections of major global tectonic zones. Fundamental characteristics of a communications system to cope with the large volume of raw data collected by a short-period seismograph network are discussed. 8. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness NASA Astrophysics Data System (ADS) Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M. 2008-12-01 Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward Earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely fault in the Bay Area (with a 31 percent probability) to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org).
The Alliance sponsored many activities, including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary also prompted the Silver Sentinel, an earthquake response exercise conducted by Bay Area County Offices of Emergency Services based on the scenario of an earthquake on the Hayward Fault; 60 other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos designed for school classrooms promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U.S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion 9. Modeling, Forecasting and Mitigating Extreme Earthquakes NASA Astrophysics Data System (ADS) Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A. 2012-12-01 Recent earthquake disasters have highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations, together with comprehensive modeling of earthquakes and forecasting of extreme events.
Extreme earthquakes (large-magnitude and rare events) are manifestations of the complex behavior of the lithosphere, structured as a hierarchical system of blocks of different sizes. Understanding of the physics and dynamics of extreme events comes from observations, measurements and modeling. A quantitative approach to simulating earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow study of extreme events and of the influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of the predictability of large earthquakes (how well can large earthquakes be predicted today?) will also be discussed, along with possibilities for mitigation of earthquake disasters (e.g., 'inverse' forensic investigations of earthquake disasters). 10. Postseismic Deformation after the 1964 Great Alaskan Earthquake: Collaborative Research with Goddard Space Flight Center NASA Technical Reports Server (NTRS) Freymueller, Jeffrey T. 1999-01-01 The purpose of this project was to carry out GPS observations on the Kenai Peninsula, southern Alaska, in order to study the postseismic and contemporary deformation following the 1964 Alaska earthquake. All of the research supported by this grant was carried out in collaboration with Dr. Steven Cohen of Goddard Space Flight Center. The research funding from this grant primarily supported GPS fieldwork, along with the acquisition of computer equipment to allow analysis and modeling of the GPS data. A minor amount of salary support was provided for the PI, but the great majority of the salary support was provided by the Geophysical Institute.
After the expiration of this grant, additional funding was obtained from the National Science Foundation to continue the work. This grant supported GPS field campaigns in August 1995, June 1996, May-June and September 1997, and May-June 1998. We initially began the work by surveying leveling benchmarks on the Kenai Peninsula that had been surveyed after the 1964 earthquake. Changes in height from the 1964 leveling data to the 1995+ GPS data, corrected for the geoid-ellipsoid separation, give the total elevation change since the earthquake. Beginning in 1995, we also identified or established sites that were suitable for long-term surveying using GPS. In the subsequent annual GPS campaigns, we made regular measurements at these GPS marks and steadily enhanced our set of points for which cumulative postseismic uplift data were available. From 4 years of Global Positioning System (GPS) measurements, we find significant spatial variations in present-day deformation between the eastern and western Kenai Peninsula, Alaska. Sites in the eastern Kenai Peninsula and Prince William Sound move to the NNW relative to North America, in the direction of Pacific-North America relative plate motion. Velocities decrease in magnitude from nearly the full plate rate in southern Prince William Sound to about 30 mm/yr at Seward and to about 5 mm 11. Earthquake and Tsunami Disaster Mitigation in the Marmara Region and Disaster Education in Turkey, Part 2 Yoshiyuki KANEDA Nagoya University Japan Agency for Marine-Earth Science and Technology (JAMSTEC) Haluk OZENER Boğaziçi University, Earthquake Research Institute (KOERI) and Members of SATREPS Japan-Turkey project NASA Astrophysics Data System (ADS) Kaneda, Y.; Ozener, H. 2015-12-01 The 1999 Izmit earthquake, a destructive event, occurred near the Marmara Sea. The Marmara Sea region deserves particular attention because of a seismic gap in the North Anatolian fault.
Istanbul is located on the Marmara Sea, so if the next earthquake occurs near Istanbul, catastrophic damage can be expected. Japan and Turkey can share their experiences of past damaging earthquakes and prepare for future large earthquakes in cooperation with each other. Because both the Tokyo and Istanbul areas face destructive earthquakes near high-population cities, they share common disaster research themes and countermeasures. For disaster mitigation, we are pursuing multidisciplinary research. The goals of this SATREPS project are as follows: to develop disaster mitigation policy and strategies based on multidisciplinary research activities; to provide decision makers with newly found knowledge for its implementation in current regulations; to organize disaster education programs in order to increase disaster awareness in Turkey; and to contribute to the evaluation of active fault studies in Japan. This project is composed of four research groups. The first is the Marmara earthquake source region observational research group, which has 4 sub-themes: seismicity, geodesy, electromagnetics and trench analyses. The second group focuses on scenario research of earthquake occurrence along the North Anatolian fault and precise tsunami simulation in the Marmara region. The aims of the third group are improvement and construction of seismic characterizations and damage predictions based on observational research and precise simulations. The fourth group promotes disaster education using visualizations of research results. In this SATREPS project, we will integrate these research results for disaster mitigation in the Marmara region and disaster education in Turkey. We will present the updated results of this SATREPS project. 12. Scientific Research Database of the 2008 Ms8.0 Wenchuan Earthquake NASA Astrophysics Data System (ADS) Liang, C.; Yang, Y.; Yu, Y.
2013-12-01 Nearly 5 years after the 2008 Ms8.0 Wenchuan Earthquake, the Ms7.0 Lushan earthquake struck 70 km away along the same fault system. Given the tremendous loss of life and property damage, as well as the short time and distance intervals between the two large-magnitude events, scientific probing into their causative factors and into future seismic activity in the nearby region will continue to be at the center of earthquake research in China, and even the world, for years to come. In the past five years, scientists have made significant efforts to study the Wenchuan earthquake from various aspects using different datasets and methods. Their studies cover a variety of topics including seismogenic environment, earthquake precursors, rupture process, co-seismic phenomena, hazard relief, reservoir-induced seismicity and more. These studies have been published in numerous journals in Chinese, English and many other languages. In addition, 54 books regarding this earthquake have been published. The extremely diversified nature of all these publications makes it very difficult and time-consuming, if not impossible, for an individual researcher to sort out the information needed in an efficient way. An information platform that collects relevant scientific information and makes it accessible in various ways can be very handy. With this mission in mind, the Earthquake Research Group at the Chengdu University of Technology has developed a website, www.wceq.org, to address this need: (1) articles published by major journals and books are recorded into a database.
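The faceted search described, by topic, journal, date, author, and keyword, maps naturally onto a small relational schema. The sketch below is a hypothetical illustration: the table names, columns, and sample row are invented and are not the actual www.wceq.org schema.

```python
import sqlite3

# Illustrative schema only; the real database layout is not published in
# the abstract, so every name here is an assumption.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE articles (
        id        INTEGER PRIMARY KEY,
        title     TEXT,
        journal   TEXT,
        published TEXT,   -- ISO date for easy range queries
        topic     TEXT
    );
    CREATE TABLE article_keywords (
        article_id INTEGER REFERENCES articles(id),
        keyword    TEXT
    );
""")
conn.execute("INSERT INTO articles VALUES (1, 'Example rupture-process study', "
             "'Example Journal', '2009-05-01', 'rupture process')")
conn.execute("INSERT INTO article_keywords VALUES (1, 'co-seismic slip')")

# facet query: articles on a given topic published after a given date
rows = conn.execute(
    "SELECT title FROM articles WHERE topic = ? AND published >= ?",
    ("rupture process", "2009-01-01"),
).fetchall()
```

The "last 90/180/365 days" links described in the abstract would reduce to the same kind of range query on the `published` column with a computed cutoff date.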
Researchers are able to find articles by topic, journal, publication date, author, keyword, etc., with a few clicks; (2) to fast-track the latest developments, researchers can also follow updates from the current month or the last 90, 180, or 365 days by clicking the corresponding links; (3) modern communication tools such as Facebook, Twitter and their Chinese counterparts are accommodated on this site to share 13. Earthquake Hazards. ERIC Educational Resources Information Center Donovan, Neville 1979-01-01 Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT) 14. Earthquake Forecasting Methodology Catalogue - A collection and comparison of the state-of-the-art in earthquake forecasting and prediction methodologies NASA Astrophysics Data System (ADS) Schaefer, Andreas; Daniell, James; Wenzel, Friedemann 2015-04-01 Earthquake forecasting and prediction has been one of the key struggles of modern geosciences for the last few decades. A large number of approaches for various time periods have been developed for different locations around the world. A categorization and review of more than 20 new and old methods was undertaken to develop a state-of-the-art catalogue of forecasting algorithms and methodologies. The different methods have been categorized into time-independent, time-dependent and hybrid methods, of which the last group comprises methods that use additional data beyond historical earthquake statistics. Such a categorization is needed to distinguish purely statistical approaches, in which historical earthquake data are the only direct data source, from algorithms that incorporate further information, e.g.
spatial data of fault distributions, or that incorporate physical models such as static triggering to indicate future earthquakes. Furthermore, the location of application has been taken into account to identify methods which can be applied, e.g., in active tectonic regions like California or in less active continental regions. In general, most of the methods cover well-known high-seismicity regions like Italy, Japan or California. Many more elements have been reviewed, including the application of established theories and methods, e.g. for the determination of the completeness magnitude, or whether the modified Omori law was used. Target temporal scales are identified as well as the publication history. All these different aspects have been reviewed and catalogued to provide an easy-to-use tool for the development of earthquake forecasting algorithms and an overview of the state of the art. 15. An Integrated and Interdisciplinary Model for Predicting the Risk of Injury and Death in Future Earthquakes PubMed Central Shapira, Stav; Novack, Lena; Bar-Dayan, Yaron; Aharonson-Daniel, Limor 2016-01-01 Background A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and to the built environment in order to improve communities’ preparedness and response capabilities and to mitigate future consequences. Methods An innovative model was formulated based on a widely used loss estimation model (HAZUS) by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model’s algorithm using logistic regression equations.
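The integration step described, folding epidemiological effect measures into an existing loss model via logistic regression, can be sketched as an odds-ratio adjustment of a baseline casualty probability. The baseline probability and odds ratios below are invented for illustration; they are not the HAZUS coefficients or the meta-analysis estimates.

```python
import math

def adjusted_casualty_probability(p_baseline, odds_ratios):
    """Adjust a structural-model casualty probability for human-related
    risk factors expressed as odds ratios, logistic-regression style."""
    logit = math.log(p_baseline / (1.0 - p_baseline))  # baseline log-odds
    for odds_ratio in odds_ratios:
        logit += math.log(odds_ratio)  # each factor shifts the log-odds
    return 1.0 / (1.0 + math.exp(-logit))

# example: 5% baseline risk and two aggravating factors (hypothetical
# odds ratios standing in for, say, advanced age and physical disability)
p = adjusted_casualty_probability(0.05, [1.8, 1.4])
```

Working on the log-odds scale is what makes the factors composable: independent effect measures multiply as odds ratios, so they add as log-odds before being mapped back to a probability.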
Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. Results The integrated model outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher rates of at-risk populations were found to be more vulnerable in this regard. Conclusion The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling. Investing efforts in reducing human vulnerability and increasing resilience before an earthquake occurs could lead to a possible decrease in the expected number of casualties. PMID:26959647 16. Risk Communication on Earthquake Prediction Studies - "No L'Aquila quake risk" experts probed in Italy in June 2010 NASA Astrophysics Data System (ADS) Oki, S.; Koketsu, K.; Kuwabara, E.; Tomari, J. 2010-12-01 For the 6 months preceding the L'Aquila earthquake, which occurred on 6th April 2009, the seismicity in that region had been active. After it became even more active and reached a magnitude 4 earthquake on 30th March, the government convened the Major Risks Committee, which is part of the Civil Protection Department and is tasked with forecasting possible risks by collating and analyzing data from a variety of sources and making preventive recommendations. At the press conference immediately after the committee meeting, they reported that "The scientific community tells us there is no danger, because there is an ongoing discharge of energy. The situation looks favorable." Six days later, a magnitude 6.3 earthquake struck L'Aquila and killed 308 people.
On 3rd June of the following year, prosecutors opened an investigation after complaints from victims that far more people would have fled their homes that night had there been no reassurances from the Major Risks Committee the previous week. The issue became widely known to the seismological community, especially after an email titled "Letter of Support for Italian Earthquake Scientists" from seismologists at the National Geophysics and Volcanology Institute (INGV) was sent worldwide. It said that the L'Aquila prosecutor's office had indicted the members of the Major Risks Committee for manslaughter, and that the charges were for failing to provide a short-term alarm to the population before the earthquake struck. It is true that there is no generalized method to predict earthquakes, but the failure to issue a short-term alarm was not the reason for the investigation of the scientists. The chief prosecutor stated that "the committee could have provided the people with better advice", and "it wasn't the case that they did not receive any warnings, because there had been tremors". The email also requested sign-on support for the open letter to the president of Italy from Earth sciences colleagues from all over the world, and it collected more than 5000 signatures 17. Prediction of maximum earthquake intensities for the San Francisco Bay region USGS Publications Warehouse Borcherdt, Roger D.; Gibbs, James F. 1975-01-01 The intensity data for the California earthquake of April 18, 1906, are strongly dependent on distance from the zone of surface faulting and the geological character of the ground. Considering only those sites (approximately one square city block in size) for which there is good evidence for the degree of ascribed intensity, the empirical relation derived between 1906 intensities and distance perpendicular to the fault for 917 sites underlain by rocks of the Franciscan Formation is: Intensity = 2.69 - 1.90 log (Distance) (km).
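Taking the relation above at face value (with a base-10 logarithm assumed, as is conventional in such attenuation fits), the predicted 1906 intensity at a given fault-perpendicular distance can be evaluated directly:

```python
import math

def intensity_1906_franciscan(distance_km):
    """Empirical 1906 intensity vs. fault-perpendicular distance (km) for
    sites on the Franciscan Formation: Intensity = 2.69 - 1.90 log(D)."""
    return 2.69 - 1.90 * math.log10(distance_km)

# intensity falls off with the logarithm of distance from the rupture:
near = intensity_1906_franciscan(1.0)   # 2.69 at 1 km
far = intensity_1906_franciscan(10.0)   # one log unit farther: 2.69 - 1.90
```

Each factor-of-ten increase in distance lowers the predicted intensity by the same 1.90 units, which is what a log-linear attenuation relation means in practice.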
For sites on other geologic units, intensity increments, derived with respect to this empirical relation, correlate strongly with the Average Horizontal Spectral Amplifications (AHSA) determined from 99 three-component recordings of ground motion generated by nuclear explosions in Nevada. The resulting empirical relation is: Intensity Increment = 0.27 + 2.70 log (AHSA), and average intensity increments for the various geologic units are -0.29 for granite, 0.19 for Franciscan Formation, 0.64 for the Great Valley Sequence, 0.82 for Santa Clara Formation, 1.34 for alluvium, and 2.43 for bay mud. The maximum intensity map predicted from these empirical relations delineates areas in the San Francisco Bay region of potentially high intensity from future earthquakes on either the San Andreas fault or the Hayward fault. 18. A Trial for Earthquake Prediction by Precise Monitoring of Deep Ground Water Temperature NASA Astrophysics Data System (ADS) Nasuhara, Y.; Otsuki, K.; Yamauchi, T. 2006-12-01 A large earthquake is estimated to occur off Miyagi Prefecture, northeast Japan, within 20 years with a probability of about 80%. In order to predict this earthquake, we have observed groundwater temperature in a borehole at Sendai city, 100 km west of the asperity. This borehole penetrates the fault zone of the NE-trending active reverse fault, the Nagamachi-Rifu fault zone, at 820 m depth. Our concept of the groundwater observation is that fault zones are natural amplifiers of crustal strain, and hence at 820 m depth we set a very precise quartz temperature sensor with a resolution of 0.0002 deg. C. We confirmed that our observation system works normally by both pumping tests and the systematic temperature changes at different depths. Since the observation started on June 20, 2004, we have found mysterious intermittent temperature fluctuations of two types; one is of a period of 5-10 days and an amplitude of ca. 0.1 deg.
C, and the other is of a period of 11-21 days and an amplitude of ca. 0.2 deg. C. Examination using the product of the Grashof and Prandtl numbers shows that natural convection of water can occur in the borehole. However, since these temperature fluctuations are observed only at depths around 820 m, it is likely that they represent hydrological behavior specific to the Nagamachi-Rifu fault zone. It is noteworthy that small temperature changes correlated with the earth tide are superposed on the long-term, large-amplitude fluctuations. The amplitude on the days of the full moon and new moon is ca. 0.001 deg. C. The bottoms of these temperature fluctuations always lag about 6 hours behind the peaks of the earth tide. This is interpreted as water in the borehole being drawn into the fault zone, on which tensional normal stress acts on the days of the full moon and new moon. The amplitude of the crustal strain due to the earth tide was measured at ca. 2×10^-8 strain near our observation site. High frequency temperature noise of 19. Prediction of Strong Earthquake Ground Motion for the M=7.4 and M=7.2 1999, Turkey Earthquakes based upon Geological Structure Modeling and Local Earthquake Recordings NASA Astrophysics Data System (ADS) Gok, R.; Hutchings, L. 2004-05-01 We test a means to predict strong ground motion using the Mw=7.4 and Mw=7.2 1999 Izmit and Duzce, Turkey earthquakes. We generate 100 rupture scenarios for each earthquake, constrained by prior knowledge, and use these to synthesize strong ground motion and make the prediction. Ground motion is synthesized with the representation relation using impulsive point source Green's functions and synthetic source models. We synthesize the earthquakes from DC to 25 Hz. We demonstrate how to incorporate this approach into standard probabilistic seismic hazard analyses (PSHA). The synthesis of earthquakes is based upon analysis of over 3,000 aftershocks recorded by several seismic networks.
The analysis provides source parameters of the aftershocks; records available for use as empirical Green's functions; and a three-dimensional velocity structure from tomographic inversion. The velocity model is linked to a finite difference wave propagation code (E3D, Larsen 1998) to generate synthetic Green's functions (DC < f < 0.5 Hz). We performed the simultaneous inversion for hypocenter locations and three-dimensional P-wave velocity structure of the Marmara region using SIMULPS14 along with 2,500 events. We also obtained source moment and corner frequency and individual station attenuation parameter estimates for over 500 events by performing a simultaneous inversion to fit these parameters with a Brune source model. We used the results of the source inversion to deconvolve a Brune model from small- to moderate-sized earthquake (M<4.0) recordings to obtain empirical Green's functions for the higher frequency range of ground motion (0.5 < f < 25.0 Hz). Work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract W-7405-ENG-48. 20. Predictive factors of depression symptoms among adolescents in the 18-month follow-up after Wenchuan earthquake in China. PubMed Chui, Cheryl H K; Ran, Mao-Sheng; Li, Rong-Hui; Fan, Mei; Zhang, Zhen; Li, Yuan-Hao; Ou, Guo Jing; Jiang, Zhe; Tong, Yu-Zhen; Fang, Ding-Zhi 2017-02-01 Little is known about the change in depression and its risk factors among adolescent survivors after an earthquake. This study aimed to explore the change in depression, and to identify the predictive factors of depression, among adolescent survivors after the 2008 Wenchuan earthquake in China. Depression among high school students at 6, 12 and 18 months after the Wenchuan earthquake was investigated. The Beck Depression Inventory (BDI) was used in this study to assess the severity of depression. Subjects included 548 student survivors in an affected high school.
The rates of depression among the adolescent survivors at 6, 12 and 18 months after the earthquake were 27.3%, 42.9% and 33.3%, respectively, for males, and 42.9%, 61.9% and 53.4%, respectively, for females. Depression symptoms, trauma-related self-injury, suicidal ideation and PTSD symptoms at the 6-month follow-up were significant predictive factors for depression at the 18-month time point following the earthquake. This study highlights the need to consider disaster-related psychological sequelae and risk factors for depression symptoms in the planning and implementation of mental health services. Long-term mental and psychological support for victims of natural disasters is imperative. 1. Earthquake prediction analysis based on empirical seismic rate: the M8 algorithm NASA Astrophysics Data System (ADS) Molchan, G.; Romashkova, L. 2010-12-01 The quality of space-time earthquake prediction is usually characterized by a 2-D error diagram (n, τ), where n is the fraction of failures-to-predict and τ is the local rate of alarm averaged in space. The most reasonable averaging measure for analysis of a prediction strategy is the normalized rate of target events λ(dg) in a subarea dg. In that case the quantity H = 1 - (n + τ) determines the prediction capability of the strategy. The uncertainty of λ(dg) causes difficulties in estimating H and the statistical significance, α, of prediction results. We investigate this problem theoretically and show how the uncertainty of the measure can be taken into account in two situations, viz., the estimation of α and the construction of a confidence zone for the (n, τ)-parameters of random strategies. We use our approach to analyse the results from prediction of M >= 8.0 events by the M8 method for the period 1985-2009 (the M8.0+ test). The model of λ(dg) based on events of Mw >= 5.5, 1977-2004, and the magnitude range of target events 8.0 <= M < 8.5 are taken as the baseline for this M8 analysis.
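The error-diagram summary above can be computed directly from the outcome counts of a prediction experiment; a minimal sketch (the example numbers are illustrative, not the results of the M8.0+ test):

```python
def prediction_capability(n_missed, n_targets, tau):
    """H = 1 - (n + tau), where n is the fraction of failures-to-predict
    and tau the measure-weighted rate of alarm. H near 0 indicates a
    trivial strategy; H near 1, a highly informative one."""
    n = n_missed / n_targets
    return 1.0 - (n + tau)

# e.g. 2 of 10 target events missed while 30% of space-time was on alarm
H = prediction_capability(2, 10, 0.30)

# a strategy that alarms everywhere misses nothing but still earns H = 0
H_trivial = prediction_capability(0, 10, 1.0)
```

The trivial cases bracket the diagram: declaring a permanent alarm (n = 0, τ = 1) and declaring no alarm at all (n = 1, τ = 0) both give H = 0, which is why H measures skill beyond chance rather than raw hit rate.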
We find the point and upper estimates of α and show that they are still unstable because the number of target events in the experiment is small. However, our results argue in favour of the non-triviality of the M8 prediction algorithm. 2. A new scoring method for evaluating the performance of earthquake forecasts and predictions NASA Astrophysics Data System (ADS) Zhuang, J. 2009-12-01 This study presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular scheme of forecasts and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. A fair scoring scheme should reward success in a way that is compatible with the risk taken. Suppose that we have a reference model, usually the Poisson model in ordinary cases or the Omori-Utsu formula in the case of forecasting aftershocks, which gives the probability p0 that at least 1 event occurs in a given space-time-magnitude window. The forecaster, similar to a gambler, starts with a certain number of reputation points and bets 1 reputation point on "Yes" or "No" according to his forecast, or bets nothing if he makes an NA-prediction. If the forecaster bets 1 reputation point on "Yes" and loses, the number of his reputation points is reduced by 1; if his forecast is successful, he is rewarded (1-p0)/p0 reputation points. The quantity (1-p0)/p0 is the return (reward/bet) ratio for bets on "Yes". In this way, if the reference model is correct, the expected return that he gains from this bet is 0. This rule also applies to probability forecasts. Suppose that p is the occurrence probability of an earthquake given by the forecaster. We can regard the forecaster as splitting 1 reputation point by betting p on "Yes" and 1-p on "No". In this way, the forecaster's expected pay-off based on the reference model is still 0.
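The betting rule described can be written out directly for the binary case; a minimal sketch (the reference probabilities in the example are invented for illustration):

```python
def gambling_score(bets):
    """Accumulate reputation points over a sequence of binary forecasts.

    bets: iterable of (forecast_yes, event_occurred, p0), where p0 is the
    reference-model probability that at least one target event occurs.
    A winning "Yes" bet returns (1 - p0)/p0; a winning "No" bet returns
    p0/(1 - p0), by the same fairness argument; a losing bet of either
    kind costs the 1 point staked. Under the reference model the
    expected gain of every bet is 0.
    """
    score = 0.0
    for forecast_yes, occurred, p0 in bets:
        if forecast_yes:
            score += (1.0 - p0) / p0 if occurred else -1.0
        else:
            score += p0 / (1.0 - p0) if not occurred else -1.0
    return score

# a risky "Yes" (p0 = 0.1) that succeeds earns 9 points; one that
# fails loses only the single point staked
risky_win = gambling_score([(True, True, 0.1)])
risky_loss = gambling_score([(True, False, 0.1)])
```

The asymmetry is the point of the scheme: correctly calling an unlikely event pays far more than correctly calling a near-certain one, so a forecaster cannot inflate the score by predicting only easy targets.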
From the viewpoints of both the reference model and the forecaster, the rule for rewarding and punishment is fair. This method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when 3. Update of the Graizer-Kalkan ground-motion prediction equations for shallow crustal continental earthquakes USGS Publications Warehouse Graizer, Vladimir; Kalkan, Erol 2015-01-01 A ground-motion prediction equation (GMPE) for computing medians and standard deviations of peak ground acceleration and 5-percent damped pseudo spectral acceleration response ordinates of maximum horizontal component of randomly oriented ground motions was developed by Graizer and Kalkan (2007, 2009) to be used for seismic hazard analyses and engineering applications. This GMPE was derived from the greatly expanded Next Generation of Attenuation (NGA)-West1 database. In this study, Graizer and Kalkan’s GMPE is revised to include (1) an anelastic attenuation term as a function of quality factor (Q0) in order to capture regional differences in large-distance attenuation and (2) a new frequency-dependent sedimentary-basin scaling term as a function of depth to the 1.5-km/s shear-wave velocity isosurface to improve ground-motion predictions for sites on deep sedimentary basins. The new model (GK15), developed to be simple, is applicable to the western United States and other regions with shallow continental crust in active tectonic environments and may be used for earthquakes with moment magnitudes 5.0–8.0, distances 0–250 km, average shear-wave velocities 200–1,300 m/s, and spectral periods 0.01–5 s. Directivity effects are not explicitly modeled but are included through the variability of the data. 
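The applicability limits quoted for GK15 can be encoded as a simple range check before evaluating the model. This is only a sketch of input validation using the ranges stated above; the function name is illustrative and the GK15 functional form itself is not reproduced here.

```python
def gk15_applicable(magnitude, distance_km, vs30_mps, period_s):
    """True if inputs fall inside the quoted GK15 applicability ranges:
    M 5.0-8.0, distance 0-250 km, Vs30 200-1300 m/s, period 0.01-5 s."""
    return (5.0 <= magnitude <= 8.0
            and 0.0 <= distance_km <= 250.0
            and 200.0 <= vs30_mps <= 1300.0
            and 0.01 <= period_s <= 5.0)
```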
Our aleatory variability model captures inter-event variability, which decreases with magnitude and increases with distance. The mixed-effects residual analysis shows that GK15 exhibits no trend with respect to the independent parameters. The GK15 is a significant improvement over Graizer and Kalkan (2007, 2009), and provides a demonstrable, reliable description of ground-motion amplitudes recorded from shallow crustal earthquakes in active tectonic regions over a wide range of magnitudes, distances, and site conditions. 4. Uncertainty, variability, and earthquake physics in ground-motion prediction equations USGS Publications Warehouse Baltay, Annemarie S.; Hanks, Thomas C.; Abrahamson, Norm A. 2017-01-01 Residuals between ground-motion data and ground-motion prediction equations (GMPEs) can be decomposed into terms representing earthquake source, path, and site effects. These terms can be cast in terms of repeatable (epistemic) residuals and random (aleatory) components. Identifying the repeatable residuals leads to a GMPE with reduced uncertainty for a specific source, site, or path location, which in turn can yield a lower hazard level at small probabilities of exceedance. We illustrate a schematic framework for this residual partitioning with a dataset from the ANZA network, which straddles the central San Jacinto fault in southern California. The dataset consists of more than 3200 1.15≤M≤3 earthquakes and their peak ground accelerations (PGAs), recorded at close distances (R≤20 km). We construct a small-magnitude GMPE for these PGA data, incorporating VS30 site conditions and geometrical spreading. Identification and removal of the repeatable source, path, and site terms yield an overall reduction in the standard deviation from 0.97 (in ln units) to 0.44, for a nonergodic assumption, that is, for a single-source location, single site, and single path. 
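The residual partitioning described in the Baltay et al. abstract can be approximated with successive group means: an event term is the mean residual of each earthquake, and a site term is the mean of what remains at each station. This is a simplified moment-based stand-in for the mixed-effects fit the study actually uses, with invented names and a toy data layout.

```python
import statistics
from collections import defaultdict

def partition_residuals(records):
    """records: iterable of (event_id, site_id, total_residual).

    Returns (event_terms, site_terms, remaining), where `remaining` is the
    residual left after subtracting the repeatable event and site means.
    A simple stand-in for a mixed-effects decomposition.
    """
    by_event = defaultdict(list)
    for ev, st, r in records:
        by_event[ev].append(r)
    event_terms = {ev: statistics.mean(rs) for ev, rs in by_event.items()}

    by_site = defaultdict(list)
    for ev, st, r in records:
        by_site[st].append(r - event_terms[ev])
    site_terms = {st: statistics.mean(rs) for st, rs in by_site.items()}

    remaining = [r - event_terms[ev] - site_terms[st] for ev, st, r in records]
    return event_terms, site_terms, remaining
```

On data built from known event and site offsets, the group means recover those offsets exactly and the remaining residual vanishes; on real data the remaining term is the reduced single-source, single-site aleatory component.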
We give examples of relationships between independent seismological observables and the repeatable terms. We find a correlation between location-based source terms and stress drops in the San Jacinto fault zone region; an explanation of the site term as a function of kappa, the near-site attenuation parameter; and a suggestion that the path component can be related directly to elastic structure. These correlations allow the repeatable source location, site, and path terms to be determined a priori using independent geophysical relationships. Those terms could be incorporated into location-specific GMPEs for more accurate and precise ground-motion prediction. 5. The 26 January 2001 M 7.6 Bhuj, India, earthquake: Observed and predicted ground motions USGS Publications Warehouse Hough, S.E.; Martin, S.; Bilham, R.; Atkinson, G.M. 2002-01-01 Although local and regional instrumental recordings of the devastating 26 January 2001 Bhuj earthquake are sparse, the distribution of macroseismic effects can provide important constraints on the mainshock ground motions. We compiled available news accounts describing damage and other effects and interpreted them to obtain modified Mercalli intensities (MMIs) at >200 locations throughout the Indian subcontinent. These values were then used to map the intensity distribution throughout the subcontinent using a simple mathematical interpolation method. Although preliminary, the maps reveal several interesting features. Within the Kachchh region, the most heavily damaged villages are concentrated toward the western edge of the inferred fault, consistent with western directivity. Significant sediment-induced amplification is also suggested at a number of locations around the Gulf of Kachchh to the south of the epicenter. Away from the Kachchh region, intensities were clearly amplified in areas that are along rivers, within deltas, or on coastal alluvium, such as mudflats and salt pans. 
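The Bhuj abstract does not specify its "simple mathematical interpolation method"; inverse-distance weighting is one common such scheme for mapping scattered intensity observations, sketched below with illustrative names.

```python
def idw_intensity(obs, x, y, power=2.0):
    """Inverse-distance-weighted intensity estimate at map point (x, y).

    obs: list of (xi, yi, mmi) observation tuples, e.g. MMI values
    interpreted from news accounts at known locations.  IDW is only one
    possible choice of simple interpolation scheme.
    """
    num = den = 0.0
    for xi, yi, mmi in obs:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return mmi  # exactly at an observation point
        w = 1.0 / d2 ** (power / 2.0)  # weight falls off as 1 / distance^power
        num += w * mmi
        den += w
    return num / den
```

Evaluating on a regular grid of (x, y) points yields a contourable intensity map; halfway between two equal-distance observations the estimate is simply their mean.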
In addition, we use fault-rupture parameters inferred from teleseismic data to predict shaking intensity at distances of 0-1000 km. We then convert the predicted hard-rock ground-motion parameters to MMI by using a relationship (derived from Internet-based intensity surveys) that assigns MMI based on the average effects in a region. The predicted MMIs are typically lower by 1-3 units than those estimated from news accounts, although they do predict near-field ground motions of approximately 80%g and potentially damaging ground motions on hard-rock sites to distances of approximately 300 km. For the most part, this discrepancy is consistent with the expected effect of sediment response, but it could also reflect other factors, such as unusually high building vulnerability in the Bhuj region and a tendency for media accounts to focus on the most dramatic damage, rather than 6. Reflections from the interface between seismological research and earthquake risk reduction NASA Astrophysics Data System (ADS) Sargeant, S. 2012-04-01 Scientific understanding of earthquakes and their attendant hazards is vital for the development of effective earthquake risk reduction strategies. Within the global disaster reduction policy framework (the Hyogo Framework for Action, overseen by the UN International Strategy for Disaster Reduction), the anticipated role of science and scientists is clear, with respect to risk assessment, loss estimation, space-based observation, early warning and forecasting. The importance of information sharing and cooperation, cross-disciplinary networks and developing technical and institutional capacity for effective disaster management is also highlighted. In practice, the degree to which seismological information is successfully delivered to and applied by individuals, groups or organisations working to manage or reduce the risk from earthquakes is variable. 
The challenge for scientists is to provide fit-for-purpose information that can be integrated simply into decision-making and risk reduction activities at all levels of governance and at different geographic scales, often by a non-technical audience (i.e. people without any seismological/earthquake engineering training). The interface between seismological research and earthquake risk reduction (defined here in terms of both the relationship between the science and its application, and the scientist and other risk stakeholders) is complex. This complexity is a function of a range of issues relating to communication, multidisciplinary working, politics, organisational practices, inter-organisational collaboration, working practices, sectoral cultures, individual and organisational values, worldviews and expectations. These factors can present significant obstacles to scientific information being incorporated into the decision-making process. The purpose of this paper is to present some personal reflections on the nature of the interface between the worlds of seismological research and risk reduction, and the 7. Tohoku Earthquake-associated Marine Sciences: the research project for the Great East Japan Earthquake on March 11, 2011 NASA Astrophysics Data System (ADS) Kitazato, Hiroshi; Kijima, Akihiro; Kogure, Kazuhiro; Hara, Motoyuki; Nagata, Toshi; Fujikura, Kasunori; Sonoda, Akira 2015-04-01 At 2:46 pm on March 11, 2011, a huge earthquake (M 9.0) occurred off the Pacific coast of the Tohoku Region, Japan. The subsequent tsunamis hit the coasts and seriously damaged fishing villages and towns in the area. The Tohoku Region faces the northwestern Pacific, one of the most productive oceans on Earth. What, then, happened to the marine ecosystems in the Tohoku Region? What happened to the fishery bioresources? What is the mechanism that sustains high productivity in the region? Is the ecosystem recovering after 4 years? 
What is required for the recovery of fisheries in the area? To answer these questions, the 10-year research project TEAMS (Tohoku Ecosystem-Associated Marine Sciences) was launched in January 2012, funded by MEXT (Ministry of Education, Culture, Sports, Science and Technology, Japan), to conduct comprehensive research on the area. Tohoku University (TU), the Atmosphere and Ocean Research Institute of the University of Tokyo (AORI-UT), the Japan Agency for Marine-Earth Science and Technology (JAMSTEC), and 25 other institutions are conducting research for this project in close association with local governments and fishery communities. Currently, approximately 400 people (200 scientists, 160 students and others) covering physical, chemical, biological, and geological sciences, including modeling, take part in the project from all over Japan. MEXT also supports TEAMS through the construction of R/V Shinsei Maru in 2013 for oceanic investigations in the region. In this report, an overview of the ecosystem before and after the disaster and the major findings and challenges of TEAMS are described. 8. EU H2020 SERA: Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe NASA Astrophysics Data System (ADS) Giardini, Domenico; Saleh, Kauzar; SERA Consortium, the 2017-04-01 SERA - Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe - is a new infrastructure project awarded in the last Horizon 2020 call for Integrating Activities for Advanced Communities (INFRAIA-01-2016-2017). Building on precursor projects like NERA, SHARE, NERIES, SERIES, etc., SERA is expected to contribute significantly to access to data, services and research infrastructures, and to develop innovative solutions in seismology and earthquake engineering, with the overall objective of reducing exposure to risks associated with natural and anthropogenic earthquakes. 
For instance, SERA will revise the European Seismic Hazard reference model for input to the current revision of Eurocode 8 on Seismic Design of Buildings; we also plan to develop the first comprehensive framework for seismic risk modeling at the European scale, and new standards for future experimental observations and instruments for earthquake engineering and seismology. To that end, SERA is engaging 31 institutions across Europe with leading expertise in the operation of research facilities, monitoring infrastructures, data repositories and experimental facilities in the fields of seismology, anthropogenic hazards and earthquake engineering. SERA comprises 26 activities, including 5 Networking Activities (NA) to improve the availability and accessibility of data through enhanced community coordination and pooling of resources, 6 Joint Research Activities (JRA) aimed at creating new European standards for the optimal use of the data collected by the European infrastructures, Virtual Access (VA) to the 5 main European services for seismology and engineering seismology, and Trans-national Access (TA) to 10 high-class experimental facilities for earthquake engineering and seismology in Europe. In fact, around 50% of the SERA resources will be dedicated to virtual and transnational access. SERA and EPOS (European Plate Observing System, a European Research 9. Proceedings of the 11th United States-Japan natural resources panel for earthquake research, Napa Valley, California, November 16–18, 2016 USGS Publications Warehouse Detweiler, Shane; Pollitz, Fred 2017-10-18 The UJNR Panel on Earthquake Research promotes advanced research toward a more fundamental understanding of the earthquake process and hazard estimation. 
The Eleventh Joint meeting was extremely beneficial in furthering cooperation and deepening understanding of problems common to both Japan and the United States.The meeting included productive exchanges of information on approaches to systematic observation and modeling of earthquake processes. Regarding the earthquake and tsunami of March 2011 off the Pacific coast of Tohoku and the 2016 Kumamoto earthquake sequence, the Panel recognizes that further efforts are necessary to achieve our common goal of reducing earthquake risk through close collaboration and focused discussions at the 12th UJNR meeting. 10. Predictability of catastrophic events: Material rupture, earthquakes, turbulence, financial crashes, and human birth PubMed Central Sornette, Didier 2002-01-01 We propose that catastrophic events are “outliers” with statistically different properties than the rest of the population and result from mechanisms involving amplifying critical cascades. We describe a unifying approach for modeling and predicting these catastrophic events or “ruptures,” that is, sudden transitions from a quiescent state to a crisis. Such ruptures involve interactions between structures at many different scales. Applications and the potential for prediction are discussed in relation to the rupture of composite materials, great earthquakes, turbulence, and abrupt changes of weather regimes, financial crashes, and human parturition (birth). Future improvements will involve combining ideas and tools from statistical physics and artificial/computational intelligence, to identify and classify possible universal structures that occur at different scales, and to develop application-specific methodologies to use these structures for prediction of the “crises” known to arise in each application of interest. 
We live on a planet and in a society with intermittent dynamics rather than a state of equilibrium, and so there is a growing and urgent need to sensitize students and citizens to the importance and impacts of ruptures in their multiple forms. PMID:11875205 11. NGA East | Pacific Earthquake Engineering Research Center (PEER) Science.gov Websites the Geotechnical and Vertical WGs shown in Figure 1. The different groups and participants essentially play the role of Resource Experts, and the sub-award researchers and contractors play the role of Specialty Contractors. Some individuals from these two groups will also play a Proponent Expert role at 12. In-situ fluid-pressure measurements for earthquake prediction: An example from a deep well at Hi Vista, California USGS Publications Warehouse Healy, J.H.; Urban, T.C. 1985-01-01 Short-term earthquake prediction requires sensitive instruments for measuring the small anomalous changes in stress and strain that precede earthquakes. Instruments installed at or near the surface have proven too noisy for measuring anomalies of the size expected to occur, and it is now recognized that even the possibility of a reliable earthquake-prediction system will require instruments installed in drill holes at depths sufficient to reduce the background noise to a level below that of the expected premonitory signals. We are conducting experiments to determine the maximum signal-to-noise improvement that can be obtained in drill holes. In a 592 m well in the Mojave Desert near Hi Vista, California, we measured water-level changes with amplitudes greater than 10 cm, induced by earth tides. By removing the effects of barometric pressure and the stress related to earth tides, we have achieved a sensitivity to volumetric strain rates of 10⁻⁹ to 10⁻¹⁰ per day. 
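The correction the Hi Vista abstract describes, removing barometric and tidal signals from the water-level record, amounts to subtracting a best-fitting linear combination of those two reference series. The sketch below does this by ordinary least squares with hand-solved 2x2 normal equations; variable names are illustrative, and a real analysis would also account for phase lags and frequency-dependent response.

```python
def remove_responses(water_level, barometric, tidal):
    """Subtract the best-fitting linear combination of a barometric-pressure
    series and a tidal reference series from a water-level time series.

    All three inputs are equal-length lists sampled at the same times.
    Returns the demeaned residual series (the candidate strain signal).
    """
    n = len(water_level)
    mw = sum(water_level) / n
    mb = sum(barometric) / n
    mt = sum(tidal) / n
    w = [x - mw for x in water_level]   # demeaned water level
    b = [x - mb for x in barometric]    # demeaned barometric pressure
    t = [x - mt for x in tidal]         # demeaned tidal reference

    # Normal equations for coefficients (a, c) in w ~ a*b + c*t.
    bb = sum(x * x for x in b)
    tt = sum(x * x for x in t)
    bt = sum(x * y for x, y in zip(b, t))
    wb = sum(x * y for x, y in zip(w, b))
    wt = sum(x * y for x, y in zip(w, t))
    det = bb * tt - bt * bt
    a = (wb * tt - wt * bt) / det
    c = (bb * wt - bt * wb) / det
    return [wi - a * bi - c * ti for wi, bi, ti in zip(w, b, t)]
```

If the water level were exactly a linear response to pressure and tide, the residual would be identically zero; in practice what remains is the noise floor against which premonitory strain signals would have to be detected.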
Further improvement may be possible, and it appears that a successful earthquake-prediction capability may be achieved with an array of instruments installed in drill holes at depths of about 1 km, assuming that the premonitory strain signals are, in fact, present. © 1985 Birkhäuser Verlag. 13. Proceedings of the 9th U.S.-Japan natural resources panel for earthquake research USGS Publications Warehouse Detweiler, Shane T.; Ellsworth, William L. 2015-01-01 The Panel strongly urges that the appropriate agencies in the U.S. and Japan that are represented on this panel work together with the academic sector to support and coordinate scientific work in these areas of cooperation. The Panel recognizes the importance of promoting the exchange of scientific personnel, exchange of data, and fundamental studies to advance progress in earthquake research. The U.S. and Japan should promote these exchanges throughout the world. The Panel endorses continuation of these activities. 14. Ground Motion Prediction for M7+ scenarios on the San Andreas Fault using the Virtual Earthquake Approach NASA Astrophysics Data System (ADS) Denolle, M.; Dunham, E. M.; Prieto, G.; Beroza, G. C. 2013-05-01 There is no clearer example of the increase in hazard due to prolonged and amplified shaking in sedimentary basins than the case of Mexico City in the 1985 Michoacan earthquake. It is critically important to identify what other cities might be susceptible to similar basin amplification effects. Physics-based simulations in 3D crustal structure can be used to model and anticipate those effects, but they rely on our knowledge of the complexity of the medium. We propose a parallel approach to validate ground motion simulations using the ambient seismic field. We compute the Earth's impulse response by combining the ambient seismic field and coda waves, enforcing causality and symmetry constraints. 
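The impulse-response retrieval underlying the Virtual Earthquake Approach rests on cross-correlating long ambient-noise records between station pairs and stacking; the symmetry constraint amounts to averaging the causal and acausal sides of the correlation. The sketch below shows only that core operation with invented names, omitting the coda-wave weighting and the source corrections the authors apply.

```python
def noise_cross_correlation(trace_a, trace_b, max_lag):
    """Time-domain cross-correlation of two noise records for lags
    -max_lag..+max_lag (in samples).  Stacked over many windows, this
    approximates the inter-station impulse response (Green's function)."""
    n = len(trace_a)
    ccf = []
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                s += trace_a[i] * trace_b[j]
        ccf.append(s)
    return ccf

def symmetrize(ccf):
    """Average the causal and acausal sides of a correlation function,
    one simple way of enforcing the expected symmetry of the response."""
    mid = len(ccf) // 2
    return [0.5 * (ccf[mid + k] + ccf[mid - k]) for k in range(mid + 1)]
```

Correlating a record with itself peaks at zero lag, as expected; for two different stations the lag of the peak carries the inter-station travel time.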
We correct the surface impulse responses to account for the source depth, mechanism and duration using a 1D approximation of the local surface-wave excitation. We call the new responses virtual earthquakes. We validate the ground motion predicted from the virtual earthquakes against moderate earthquakes in southern California. We then combine temporary seismic stations on the southern San Andreas Fault and extend the point source approximation of the Virtual Earthquake Approach to model finite kinematic ruptures. We confirm the coupling between source directivity and amplification in downtown Los Angeles seen in simulations. 15. Application of space technology to crustal dynamics and earthquake research NASA Technical Reports Server (NTRS) 1979-01-01 In cooperation with other Federal government agencies, and the governments of other countries, NASA is undertaking a program of research in geodynamics. The present program activities and plans for extension of these activities in the time period 1979-1985 are described. The program includes operation of observatories for laser ranging to the Moon and to artificial satellites, and radio observatories for very long baseline microwave interferometry (VLBI). These observatories are used to measure polar motion, earth rotation, and tectonic plate movement, and serve as base stations for mobile facilities. The mobile laser ranging and VLBI facilities are used to measure crustal deformation in tectonically active areas. 16. Media exposure related to the 2008 Sichuan Earthquake predicted probable PTSD among Chinese adolescents in Kunming, China: A longitudinal study. 
PubMed Yeung, Nelson C Y; Lau, Joseph T F; Yu, Nancy Xiaonan; Zhang, Jianping; Xu, Zhening; Choi, Kai Chow; Zhang, Qi; Mak, Winnie W S; Lui, Wacy W S 2018-03-01 This study examined the prevalence and the psychosocial predictors of probable PTSD among Chinese adolescents in Kunming (approximately 444 miles from the epicenter), China, who were indirectly exposed to the Sichuan Earthquake in 2008. Using a longitudinal study design, primary and secondary school students (N = 3577) in Kunming completed questionnaires at baseline (June 2008) and 6 months afterward (December 2008) in classroom settings. Participants' exposure to earthquake-related imagery and content, perceptions and emotional reactions related to the earthquake, and posttraumatic stress symptoms were measured. Univariate and forward stepwise multivariable logistic regression models were fit to identify significant predictors of probable PTSD at the 6-month follow-up. Prevalences of probable PTSD (with a Children's Revised Impact of Event Scale score ≥30) among the participants at baseline and 6-month follow-up were 16.9% and 11.1%, respectively. In the multivariable analysis, those who were frequently exposed to distressful imagery, had experienced at least two types of negative life events, perceived that teachers were distressed due to the earthquake, believed that the earthquake resulted from damage to the ecosystem, and felt apprehensive and emotionally disturbed due to the earthquake reported a higher risk of probable PTSD at 6-month follow-up (all ps < .05). Exposure to distressful media images, emotional responses, and disaster-related perceptions at baseline were found to be predictive of probable PTSD several months after indirect exposure to the event. Parents, teachers, and the mass media should be aware of the negative impacts of disaster-related media exposure on adolescents' psychological health. (PsycINFO Database Record (c) 2018 APA, all rights reserved). 17. 
Earthquake watch USGS Publications Warehouse Hill, M. 1976-01-01 When the time comes that earthquakes can be predicted accurately, what shall we do with the knowledge? This was the theme of a November 1975 conference on earthquake warning and response, held in San Francisco and called by Assistant Secretary of the Interior Jack W. Carlson. Invited were officials of State and local governments from Alaska, California, Hawaii, Idaho, Montana, Nevada, Utah, Washington, and Wyoming, and representatives of the news media. 18. Strain buildup and release, earthquake prediction and selection of VBL sites for margins of the north Pacific NASA Technical Reports Server (NTRS) Scholz, C. H.; Bilham, R.; Johnson, T. L. 1981-01-01 During the past year, the grant supported research on several aspects of crustal deformation. The relation between earthquake displacements and fault dimensions was studied in an effort to find scaling laws that relate static parameters such as slip and stress drop to the dimensions of the rupture. Several implications of the static relations for the dynamic properties of earthquakes such as rupture velocity and dynamic stress drop were proposed. A theoretical basis for earthquake related phenomena associated with slow rupture growth or propagation, such as delayed multiple events, was developed using the stress intensity factor defined in fracture mechanics and experimental evidence from studies of crack growth by stress corrosion. Finally, extensive studies by Japanese geologists have established the offset across numerous faults in Japan over the last one hundred thousand years. These observations of intraplate faulting are being used to establish the spatial variations of the average strain rate of subregions in southern Japan. 19. Earthquakes: hydrogeochemical precursors USGS Publications Warehouse Ingebritsen, Steven E.; Manga, Michael 2014-01-01 Earthquake prediction is a long-sought goal. 
Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets. 20. Automated Magnitude Measures, Earthquake Source Modeling, VFM Discriminant Testing and Summary of Current Research. DTIC Science & Technology 1979-02-01 jm.. W 112.11111 * I 120 11 11111.258 MICROCOPY RESOLUTION TEST CHART NATIONAL BUREAU OF STANOARDS-19b3-A 0 - SYSTEMS, SCIENCE AND SOFTWARE * SSS-R-79...3933 0AUTOMATED MAGNITUDE MEASURES, EARTHQUAKE SOURCE MODELING, VFM DISCRIMINANT TESTING AND SUMMARY OF CURRENT RESEARCH T. C. BACHE S. M. DAY J. M...VFM DISCRIMINANT . PERFORMING ORG. REPORT NUMBER TESTING AND SUMMARY OF CURRENT RESEARCH SSS-R-79-3933 7. AUTmOR(s) 8. CONTRACT OR GRANT NUMBERtSi T 1. Integrated Program of Multidisciplinary Education and Research in Mechanics and Physics of Earthquakes NASA Astrophysics Data System (ADS) Lapusta, N. 2011-12-01 Studying earthquake source processes is a multidisciplinary endeavor involving a number of subjects, from geophysics to engineering. As a solid mechanician interested in understanding earthquakes through physics-based computational modeling and comparison with observations, I need to educate and attract students from diverse areas. My CAREER award has provided the crucial support for the initiation of this effort. Applying for the award made me to go through careful initial planning in consultation with my colleagues and administration from two divisions, an important component of the eventual success of my path to tenure. Then, the long-term support directed at my program as a whole - and not a specific year-long task or subject area - allowed for the flexibility required for a start-up of a multidisciplinary undertaking. My research is directed towards formulating realistic fault models that incorporate state-of-the-art experimental studies, field observations, and analytical models. 
The goal is to compare the model response - in terms of long-term fault behavior that includes both sequences of simulated earthquakes and aseismic phenomena - with observations, to identify appropriate constitutive laws and parameter ranges. CAREER funding has enabled my group to develop a sophisticated 3D modeling approach that we have used to understand patterns of seismic and aseismic fault slip on the Sunda megathrust in Sumatra, investigate the effect of variable hydraulic properties on fault behavior, with application to the Chi-Chi and Tohoku earthquakes, create a model of the Parkfield segment of the San Andreas fault that reproduces both long-term and short-term features of the M6 earthquake sequence there, and design experiments with laboratory earthquakes, among several other studies. A critical ingredient in this research program has been the fully integrated educational component that allowed me, on the one hand, to expose students from different backgrounds to the 2. 7th U.S. / Japan Natural Resources (UJNR) Panel on Earthquake Research: Abstract Volume and Technical Program USGS Publications Warehouse Detweiler, Shane T.; Ellsworth, William L. 2008-01-01 The U.S. / Japan Natural Resources (UJNR) Panel on Earthquake Research promotes advanced study toward a more fundamental understanding of the earthquake process and hazard estimation. The Panel promotes basic and applied research to improve our understanding of the causes and effects of earthquakes and to facilitate the transmission of research results to those who implement hazard reduction measures on both sides of the Pacific and around the world. Meetings are held every other year and alternate between the two countries, with short presentations on current research and local field trips as the highlights. The 5th Joint Panel meeting was held at Asilomar, California in October, 2004. 
The technical sessions featured reports on the September 28, 2004, Parkfield, California, earthquake; progress on earthquake early warning and rapid post-event assessment technology; probabilistic earthquake forecasting; and the newly discovered phenomenon of nonvolcanic tremor. The Panel visited the epicentral region of the M 6.0 Parkfield earthquake and viewed the surface ruptures along the San Andreas Fault. They also visited the San Andreas Fault Observatory at Depth (SAFOD), which had just completed the first phase of drilling into the fault. The 6th Joint Panel meeting was held in Tokushima, Japan in November, 2006. The meeting included very productive exchanges of information on approaches to systematic observation of earthquake processes. Sixty-eight technical papers were presented during the meeting on a wide range of subjects, including interplate earthquakes in subduction zones, slow slip and nonvolcanic tremor, crustal deformation, recent earthquake activity and hazard mapping. Through our discussions, we reaffirmed the benefits of working together to achieve our common goal of reducing earthquake hazard, and continued cooperation on issues involving densification of observation networks and the open exchange of data among scientific communities. We also reaffirmed the importance of 3. Predicted Surface Displacements for Scenario Earthquakes in the San Francisco Bay Region USGS Publications Warehouse Murray-Moraleda, Jessica R. 2008-01-01 In the immediate aftermath of a major earthquake, the U.S. Geological Survey (USGS) will be called upon to provide information on the characteristics of the event to emergency responders and the media. One such piece of information is the expected surface displacement due to the earthquake. 
In conducting probabilistic hazard analyses for the San Francisco Bay Region, the Working Group on California Earthquake Probabilities (WGCEP) identified a series of scenario earthquakes involving the major faults of the region, and these were used in their 2003 report (hereafter referred to as WG03) and the recently released 2008 Uniform California Earthquake Rupture Forecast (UCERF). Here I present a collection of maps depicting the expected surface displacement resulting from those scenario earthquakes. The USGS has conducted frequent Global Positioning System (GPS) surveys throughout northern California for nearly two decades, generating a solid baseline of interseismic measurements. Following an earthquake, temporary GPS deployments at these sites will be important to augment the spatial coverage provided by continuous GPS sites for recording postseismic deformation, as will the acquisition of Interferometric Synthetic Aperture Radar (InSAR) scenes. The information provided in this report allows one to anticipate, for a given event, where the largest displacements are likely to occur. This information is valuable both for assessing the need for further spatial densification of GPS coverage before an event and prioritizing sites to resurvey and InSAR data to acquire in the immediate aftermath of the earthquake. In addition, these maps are envisioned to be a resource for scientists in communicating with emergency responders and members of the press, particularly during the time immediately after a major earthquake before displacements recorded by continuous GPS stations are available. 4. Prediction of Research Self-Efficacy and Future Research Involvement. ERIC Educational Resources Information Center Bishop, Rosean M.; And Others Although graduate programs hope that their students will be committed to research in their careers, most students express ambivalence towards research. Identifying the variables that predict involvement in research thus seems crucial. 
In this study, 136 doctoral students from a wide range of disciplines completed the Research Self-Efficacy Scale… 5. Thermal Infrared Anomalies of Several Strong Earthquakes PubMed Central Wei, Congxin; Guo, Xiao; Qin, Manzhong 2013-01-01 In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies, which have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes with magnitude up to Ms7.0 by using satellite infrared remote sensing information. We used new types of data and methods to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after the earthquakes in all cases. The overall behavior of the anomalies includes two main stages: expanding first and narrowing later. We readily extracted and identified such seismic anomalies by the method of "time-frequency relative power spectrum." (2) There exist evident and distinct characteristic periods and magnitudes of anomalous thermal radiation for each case. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation has distinctive characteristics in anomaly duration, range, and morphology. In summary, earthquake thermal infrared anomalies can serve as a useful precursor in earthquake prediction and forecasting. PMID:24222728 6. Thermal infrared anomalies of several strong earthquakes. 
PubMed Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying 2013-01-01 7. On a report that the 2012 M 6.0 earthquake in Italy was predicted after seeing an unusual cloud formation USGS Publications Warehouse Thomas, J.N.; Masci, F; Love, Jeffrey J. 2015-01-01 Several recently published reports have suggested that semi-stationary linear-cloud formations might be causally precursory to earthquakes.
We examine the report of Guangmeng and Jie (2013), who claim to have predicted the 2012 M 6.0 earthquake in the Po Valley of northern Italy after seeing a satellite photograph (a digital image) showing a linear-cloud formation over the eastern Apennine Mountains of central Italy. From inspection of 4 years of satellite images we find numerous examples of linear-cloud formations over Italy. A simple test shows no obvious statistical relationship between the occurrence of these cloud formations and earthquakes that occurred in and around Italy. All of the linear-cloud formations we have identified in satellite images, including that which Guangmeng and Jie (2013) claim to have used to predict the 2012 earthquake, appear to be orographic – formed by the interaction of moisture-laden wind flowing over mountains. Guangmeng and Jie (2013) have not clearly stated how linear-cloud formations can be used to predict the size, location, and time of an earthquake, and they have not published an account of all of their predictions (including any unsuccessful predictions). We are skeptical of the validity of the claim by Guangmeng and Jie (2013) that they have managed to predict any earthquakes. 8. Important Earthquake Engineering Resources Science.gov Websites PEER logo Pacific Earthquake Engineering Research Center home about peer news events research Engineering Resources Site Map Search Important Earthquake Engineering Resources - American Concrete Institute Motion Observation Systems (COSMOS) - Consortium of Universities for Research in Earthquake Engineering 9. Design and Optimization of a Telemetric system for appliance in earthquake prediction NASA Astrophysics Data System (ADS) Bogdos, G.; Tassoulas, E.; Vereses, A.; Papapanagiotou, A.; Filippi, K.; Koulouras, G.; Nomicos, C. 
2009-04-01 This project aims to design a telemetric system that collects data from a digitizer, transforms it into an appropriate form, and transfers it to a central system where it is processed on line. On-line mathematical processing (fractal analysis) of pre-seismic electromagnetic signals, with immediate display, may lead to reliable earthquake prediction methodologies. The core network consists of ad-hoc connections and heterogeneous topologies, with wired and wireless links cooperating for accurate and timely transmission. Because the data are considered highly sensitive, transmission must be immediate. All stations are situated in rural locations to avoid electromagnetic interference; this requires continuous monitoring and the provision of backup data links. The central stations collect the data from every station and store them in a predefined database. Special software processes the incoming data mathematically and renders them graphically. The development work included digitizer design, workstation software design, study and simulation of the transmission protocol on OPNET, database programming, mathematical data processing, and software for graphical representation. The whole package was tested under both laboratory and field conditions. The project's main value lies in the considerable interest it holds for the scientific community should the platform eventually be implemented and deployed at large scale across the Greek countryside. The platform is designed so that data-mining techniques and mathematical processing are possible and any extension can be accommodated. The project's distinguishing feature is that these mechanisms and mathematical transformations can be applied to live data, enabling rapid interpretation of the measured and stored data.
The primary intention of this study is to support and streamline the analysis process 10. CyberShake-derived ground-motion prediction models for the Los Angeles region with application to earthquake early warning USGS Publications Warehouse Bose, Maren; Graves, Robert; Gill, David; Callaghan, Scott; Maechling, Phillip J. 2014-01-01 Real-time applications such as earthquake early warning (EEW) typically use empirical ground-motion prediction equations (GMPEs) along with event magnitude and source-to-site distances to estimate expected shaking levels. In this simplified approach, effects due to finite-fault geometry, directivity and site and basin response are often generalized, which may lead to a significant under- or overestimation of shaking from large earthquakes (M > 6.5) in some locations. For enhanced site-specific ground-motion predictions considering 3-D wave-propagation effects, we develop support vector regression (SVR) models from the SCEC CyberShake low-frequency (<0.5 Hz) and broad-band (0–10 Hz) data sets. CyberShake encompasses 3-D wave-propagation simulations of >415 000 finite-fault rupture scenarios (6.5 ≤ M ≤ 8.5) for southern California defined in UCERF 2.0. We use CyberShake to demonstrate the application of synthetic waveform data to EEW as a ‘proof of concept’, being aware that these simulations are not yet fully validated and might not appropriately sample the range of rupture uncertainty. Our regression models predict the maximum and the temporal evolution of instrumental intensity (MMI) at 71 selected test sites using only the hypocentre, magnitude and rupture ratio, which characterizes uni- and bilateral rupture propagation. Our regression approach is completely data-driven (where here the CyberShake simulations are considered data) and does not enforce pre-defined functional forms or dependencies among input parameters.
The models were established from a subset (∼20 per cent) of CyberShake simulations, but can explain MMI values of all >400 000 rupture scenarios with a standard deviation of about 0.4 intensity units. We apply our models to determine threshold magnitudes (and warning times) for various active faults in southern California that earthquakes need to exceed to cause at least ‘moderate’, ‘strong’ or ‘very strong’ shaking 11. Computational approaches for predicting biomedical research collaborations. PubMed Zhang, Qing; Yu, Hong 2014-01-01 Biomedical research is increasingly collaborative, and successful collaborations often produce high impact work. Computational approaches can be developed for automatically predicting biomedical research collaborations. Previous work on collaboration prediction has mainly explored the topological structures of research collaboration networks, leaving out rich semantic information from the publications themselves. In this paper, we propose supervised machine learning approaches to predict research collaborations in the biomedical field. We explored both the semantic features extracted from author research interest profiles and the author network topological features. We found that the most informative semantic features for author collaborations are related to research interest, including similarity of out-citing citations and similarity of abstracts. Of the four supervised machine learning models (naïve Bayes, naïve Bayes multinomial, SVMs, and logistic regression), the best performing model is logistic regression, with ROC values ranging from 0.766 to 0.980 on different datasets. To our knowledge we are the first to study in depth how research interests and productivity can be used for collaboration prediction. Our approach is computationally efficient, scalable and yet simple to implement. The datasets of this study are available at https://github.com/qingzhanggithub/medline-collaboration-datasets. 12.
Large Historical Tsunamigenic Earthquakes in Italy: The Neglected Tsunami Research Point of View NASA Astrophysics Data System (ADS) Armigliato, A.; Tinti, S.; Pagnoni, G.; Zaniboni, F. 2015-12-01 It is known that tsunamis are rather rare events, especially when compared to earthquakes, and the Italian coasts are no exception. Nonetheless, striking evidence is that 6 of the 10 earthquakes that occurred in Italy in the last thousand years with equivalent moment magnitude equal to or larger than 7 were accompanied by destructive or heavily damaging tsunamis. If we extend the lower limit of the equivalent moment magnitude down to 6.5 the percentage decreases (around 40%), but is still significant. Famous events such as those of 30 July 1627 in Gargano, 11 January 1693 in eastern Sicily, and 28 December 1908 in the Messina Straits are part of this list: they were all characterized by maximum run-ups of several meters (13 m for the 1908 tsunami), significant maximum inundation distances, and large (although not precisely quantifiable) numbers of victims. Further evidence provided in the last decade by paleo-tsunami deposit analyses helps to better characterize the tsunami impact and confirms that none of the cited events can be reduced to local or secondary effects. Proper analysis and simulation of available tsunami data would then appear to be an obvious part of the correct definition of the sources responsible for the largest Italian tsunamigenic earthquakes, in a process in which datasets analyzed by different disciplines must be reconciled rather than put into contrast with each other. Unfortunately, macroseismic, seismic and geological/geomorphological observations and data are typically assigned much heavier weights, and in-land faults are often given larger credit than offshore ones, even when tsunami simulations provide evidence that they are not at all capable of justifying the observed tsunami effects.
Tsunami generation is instead attributed a priori to merely supposed, and sometimes non-existent, submarine landslides. We try to summarize the tsunami research point of view on the largest Italian historical tsunamigenic 13. A landslide susceptibility prediction on a sample slope in Kathmandu Nepal associated with the 2015's Gorkha Earthquake NASA Astrophysics Data System (ADS) Kubota, Tetsuya; Prasad Paudel, Prem 2016-04-01 In 2013, several landslides induced by heavy rainfall occurred in the southern suburbs of Kathmandu, the capital of Nepal. These landslide slopes were hit by the strong Gorkha Earthquake in April 2015 and appeared to be destabilized again. To clarify their landslide susceptibility under earthquake loading, the slope stability of one of these slopes was analyzed with CSSDP (Critical Slip Surface analysis by Dynamic Programming, a limit-equilibrium method based on the Janbu method) for the various seismic accelerations observed around Kathmandu during the Gorkha Earthquake. The CSSDP automatically detects the landslide slip surface with the minimum Fs (factor of safety) using dynamic programming theory. The geology in this area consists mainly of fragile schist and is prone to landslide occurrence. A field survey was conducted to obtain topographic data such as the ground-surface and slip-surface cross sections, and soil parameters obtained from geotechnical tests on field samples were applied. Consequently, the slope shows the following distinctive slope-stability characteristics: (1) Under heavy rainfall it collapsed, with a factor of safety Fs < 1.0 (0.654 or slightly more). (2) With a seismic acceleration of 0.15 G (147 gal), as observed around Kathmandu, it has Fs = 1.34. (3) With a possible local seismic acceleration of 0.35 G (343 gal) estimated at Kathmandu, it has Fs = 0.989. If it were a very shallow landslide covered with cedars, it could have Fs = 1.055 owing to the root reinforcement of the soil strength.
(4) Without seismic acceleration and under no-rainfall conditions, it has Fs = 1.75. These results are consistent with the actual landslide occurrence in this area, given the maximum seismic acceleration of about 0.15 G estimated in the vicinity of Kathmandu during the Gorkha Earthquake, and thus indicate the landslide susceptibility of slopes in this area under strong earthquakes. On this basis it is possible to predict 14. ShakeMap-based prediction of earthquake-induced mass movements in Switzerland calibrated on historical observations USGS Publications Warehouse Cauzzi, Carlo; Fah, Donat; Wald, David J.; Clinton, John; Losey, Stephane; Wiemer, Stefan 2018-01-01 In Switzerland, nearly all historical Mw ~ 6 earthquakes have induced damaging landslides, rockslides and snow avalanches that, in some cases, also resulted in damage to infrastructure and loss of lives. We describe the customisation to Swiss conditions of a globally calibrated statistical approach originally developed to rapidly assess earthquake-induced landslide likelihoods worldwide. The probability of occurrence of such earthquake-induced effects is modelled through a set of geospatial susceptibility proxies and peak ground acceleration. The predictive model is tuned to capture the observations from past events and optimised for near-real-time estimates based on USGS-style ShakeMaps routinely produced by the Swiss Seismological Service. Our emphasis is on the use of high-resolution geospatial datasets along with additional local information on ground failure susceptibility. Even if calibrated on historic events with moderate magnitudes, the methodology presented in this paper yields sensible results also for recent low-magnitude events. The model is integrated in the Swiss ShakeMap framework.
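The pseudostatic treatment of seismic acceleration in the Gorkha landslide study above (record 13) can be illustrated with a simplified planar-slip limit-equilibrium sketch. This is not the authors' CSSDP/Janbu implementation, and every parameter value below is an illustrative assumption, but it shows the mechanism the abstract describes: the factor of safety Fs falls as the horizontal seismic coefficient k rises.

```python
import math

def pseudostatic_fs(weight, beta_deg, cohesion, phi_deg, slip_length, k):
    """Pseudostatic factor of safety for a planar slip surface.

    weight      : sliding-block weight per unit width (kN/m)
    beta_deg    : slip-surface inclination (degrees)
    cohesion    : soil cohesion (kPa)
    phi_deg     : internal friction angle (degrees)
    slip_length : slip-surface length (m)
    k           : horizontal seismic coefficient (fraction of g)
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    # The horizontal inertia force k*W adds to the driving force and
    # reduces the normal force (and hence the frictional resistance).
    driving = weight * (math.sin(beta) + k * math.cos(beta))
    resisting = cohesion * slip_length + \
        (weight * math.cos(beta) - k * weight * math.sin(beta)) * math.tan(phi)
    return resisting / driving

# Illustrative values only (not taken from the paper): the same slope is
# stable without shaking but fails once k approaches 0.35.
for k in (0.0, 0.15, 0.35):
    print(f"k = {k:.2f}  Fs = {pseudostatic_fs(2000, 35, 15, 30, 40, k):.3f}")
```

The qualitative pattern matches the Fs values reported in the abstract: stable under static conditions, marginal to unstable as the seismic coefficient approaches the locally estimated 0.35 G.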
This study has a high practical relevance to many Swiss ShakeMap stakeholders, especially those managing lifeline systems, and to other global users interested in conducting a similar customisation for their region of interest. 15. Prospectively Evaluating the Collaboratory for the Study of Earthquake Predictability: An Evaluation of the UCERF2 and Updated Five-Year RELM Forecasts NASA Astrophysics Data System (ADS) Strader, Anne; Schneider, Max; Schorlemmer, Danijel; Liukis, Maria 2016-04-01 The Collaboratory for the Study of Earthquake Predictability (CSEP) was developed to rigorously test earthquake forecasts retrospectively and prospectively through reproducible, completely transparent experiments within a controlled environment (Zechar et al., 2010). During 2006-2011, thirteen five-year time-invariant prospective earthquake mainshock forecasts developed by the Regional Earthquake Likelihood Models (RELM) working group were evaluated through the CSEP testing center (Schorlemmer and Gerstenberger, 2007). The number, spatial, and magnitude components of the forecasts were compared to the respective observed seismicity components using a set of consistency tests (Schorlemmer et al., 2007, Zechar et al., 2010). In the initial experiment, all but three forecast models passed every test at the 95% significance level, with all forecasts displaying consistent log-likelihoods (L-test) and magnitude distributions (M-test) with the observed seismicity. In the ten-year RELM experiment update, we reevaluate these earthquake forecasts over an eight-year period from 2008-2016, to determine the consistency of previous likelihood testing results over longer time intervals. Additionally, we test the Uniform California Earthquake Rupture Forecast (UCERF2), developed by the U.S. Geological Survey (USGS), and the earthquake rate model developed by the California Geological Survey (CGS) and the USGS for the National Seismic Hazard Mapping Program (NSHMP) against the RELM forecasts. 
Both the UCERF2 and NSHMP forecasts pass all consistency tests, though the Helmstetter et al. (2007) and Shen et al. (2007) models exhibit greater information gain per earthquake according to the T- and W-tests (Rhoades et al., 2011). Though all but three RELM forecasts pass the spatial likelihood test (S-test), multiple forecasts fail the M-test due to overprediction of the number of earthquakes during the target period. Though there is no significant difference between the UCERF2 and NSHMP 16. Implications of next generation attenuation ground motion prediction equations for site coefficients used in earthquake resistant design USGS Publications Warehouse Borcherdt, Roger D. 2014-01-01 Proposals are developed to update Tables 11.4-1 and 11.4-2 of Minimum Design Loads for Buildings and Other Structures published as American Society of Civil Engineers Structural Engineering Institute standard 7-10 (ASCE/SEI 7–10). The updates are mean next generation attenuation (NGA) site coefficients inferred directly from the four NGA ground motion prediction equations used to derive the maximum considered earthquake response maps adopted in ASCE/SEI 7–10. Proposals include the recommendation to use straight-line interpolation to infer site coefficients at intermediate values of vS30 (average shear velocity to 30-m depth). The NGA coefficients are shown to agree well with adopted site coefficients at low levels of input motion (0.1 g) and with those observed from the Loma Prieta earthquake. For higher levels of input motion, the majority of the adopted values are within the 95% epistemic-uncertainty limits implied by the NGA estimates, with the exceptions being the mid-period site coefficient, Fv, for site class D and the short-period coefficient, Fa, for site class C, both of which are slightly less than the corresponding 95% limit.
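The CSEP consistency testing applied to the RELM forecasts above can be made concrete with the simplest of the tests, the number (N-) test: the observed earthquake count is compared against a Poisson distribution whose mean is the forecast's expected count. The sketch below is a minimal illustration of that idea, not the official CSEP implementation.

```python
import math

def poisson_cdf(k, lam):
    """P(N <= k) for a Poisson random variable with mean lam."""
    return sum(math.exp(-lam) * lam ** i / math.factorial(i) for i in range(k + 1))

def n_test(n_forecast, n_observed):
    """Simplified CSEP-style N-test.

    Returns (delta1, delta2): the probabilities of observing at least,
    and at most, n_observed events under a Poisson model with mean
    n_forecast. A very small delta1 indicates the forecast underpredicted
    the count; a very small delta2 indicates it overpredicted.
    """
    delta1 = 1.0 - poisson_cdf(n_observed - 1, n_forecast)  # P(N >= n_obs)
    delta2 = poisson_cdf(n_observed, n_forecast)            # P(N <= n_obs)
    return delta1, delta2

# A forecast of 20 events confronted with 35 observed events: delta1 falls
# below 0.025, so the forecast fails a two-sided test at the 5% level.
d1, d2 = n_test(20.0, 35)
```

The actual RELM evaluations also test the spatial (S-test), magnitude (M-test) and joint likelihood (L-test) components, which require the full gridded forecast rather than a single count.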
The NGA database shows that the median value of 913 m/s for site class B is more typical than 760 m/s as a value to characterize firm to hard rock sites as the uniform ground condition for future maximum considered earthquake response ground motion estimates. Future updates of NGA ground motion prediction equations can be incorporated easily into future adjustments of adopted site coefficients using the procedures presented herein. 17. A prototype of the procedure of strong ground motion prediction for intraslab earthquake based on characterized source model NASA Astrophysics Data System (ADS) Iwata, T.; Asano, K.; Sekiguchi, H. 2011-12-01 We propose a prototype procedure for constructing source models for strong motion prediction during intraslab earthquakes based on the characterized source model (Irikura and Miyake, 2011). The key is the characterized source model, which is based on empirical scaling relationships for intraslab earthquakes and on the correspondence between the SMGA (strong motion generation area, Miyake et al., 2003) and the asperity (large-slip area). Iwata and Asano (2011) obtained empirical relationships of the rupture area (S) and the total asperity area (Sa) to the seismic moment (Mo), assuming a 2/3-power dependence of S and Sa on Mo: S (km^2) = 6.57 × 10^-11 × Mo^(2/3) (1) and Sa (km^2) = 1.04 × 10^-11 × Mo^(2/3) (2), with Mo in N·m. Iwata and Asano (2011) also pointed out that the position and size of the SMGA approximately correspond to the asperity area for several intraslab events. Based on these empirical relationships, we give a procedure for constructing source models of intraslab earthquakes for strong motion prediction. [1] Give the seismic moment, Mo. [2] Obtain the total rupture area and the total asperity area from the empirical scaling relationships (1) and (2). [3] Assume square rupture areas and asperities.
[4] The source mechanism is assumed to be the same as that of small events in the source region. [5] Multiple scenarios with varying numbers of asperities and rupture starting points are prepared. We apply this procedure by simulating strong ground motions for several observed events to confirm the methodology. 18. First Results of the Regional Earthquake Likelihood Models Experiment NASA Astrophysics Data System (ADS) Schorlemmer, Danijel; Zechar, J. Douglas; Werner, Maximilian J.; Field, Edward H.; Jackson, David D.; Jordan, Thomas H. 2010-08-01 The ability to successfully predict the future behavior of a system is a strong indication that the system is well understood. Certainly many details of the earthquake system remain obscure, but several hypotheses related to earthquake occurrence and seismic hazard have been proffered, and predicting earthquake behavior is a worthy goal and demanded by society. Along these lines, one of the primary objectives of the Regional Earthquake Likelihood Models (RELM) working group was to formalize earthquake occurrence hypotheses in the form of prospective earthquake rate forecasts in California. RELM members, working in small research groups, developed more than a dozen 5-year forecasts; they also outlined a performance evaluation method and provided a conceptual description of a Testing Center in which to perform predictability experiments. Subsequently, researchers working within the Collaboratory for the Study of Earthquake Predictability (CSEP) have begun implementing Testing Centers in different locations worldwide, and the RELM predictability experiment—a truly prospective earthquake prediction effort—is underway within the U.S. branch of CSEP. The experiment, designed to compare time-invariant 5-year earthquake rate forecasts, is now approximately halfway to its completion. In this paper, we describe the models under evaluation and present, for the first time, preliminary results of this unique experiment.
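Steps [1] and [2] of the procedure in record 17 amount to a direct evaluation of equations (1) and (2). The sketch below does exactly that; the Hanks-Kanamori conversion from moment magnitude Mw to seismic moment Mo is an added assumption for the example, not part of the abstract.

```python
def intraslab_scaling(m0_nm):
    """Iwata and Asano (2011) empirical scaling for intraslab earthquakes.

    Given the seismic moment m0_nm in N·m, returns (S, Sa) in km^2:
    the total rupture area and total asperity area, per equations (1)-(2).
    """
    s = 6.57e-11 * m0_nm ** (2.0 / 3.0)   # equation (1)
    sa = 1.04e-11 * m0_nm ** (2.0 / 3.0)  # equation (2)
    return s, sa

def moment_from_mw(mw):
    """Hanks-Kanamori relation (an assumption added for this sketch)."""
    return 10.0 ** (1.5 * mw + 9.1)

# Step [1]: choose Mo (here via Mw 7.0); step [2]: evaluate the scaling.
# The asperity fraction Sa/S is fixed by the relations at 1.04/6.57, ~16%.
s, sa = intraslab_scaling(moment_from_mw(7.0))
```

For Mw 7.0 this gives a rupture area of roughly 770 km² and a total asperity area of roughly 120 km²; step [3] would then lay these out as squares on the fault plane.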
While these results are preliminary—the forecasts were meant for an application of 5 years—we find interesting results: most of the models are consistent with the observation and one model forecasts the distribution of earthquakes best. We discuss the observed sample of target earthquakes in the context of historical seismicity within the testing region, highlight potential pitfalls of the current tests, and suggest plans for future revisions to experiments such as this one. 19. First Results of the Regional Earthquake Likelihood Models Experiment USGS Publications Warehouse Schorlemmer, D.; Zechar, J.D.; Werner, M.J.; Field, E.H.; Jackson, D.D.; Jordan, T.H. 2010-01-01 © 2010 The Author(s). 20. Issues on the Japanese Earthquake Hazard Evaluation NASA Astrophysics Data System (ADS) Hashimoto, M.; Fukushima, Y.; Sagiya, T. 2013-12-01 The 2011 Great East Japan Earthquake forced Japan to change its policy of countermeasures against earthquake disasters, including earthquake hazard evaluation. Before March 11, Japanese earthquake hazard evaluation was based on the history of repeating earthquakes and the characteristic earthquake model: the source region of an earthquake was identified, its occurrence history was revealed, and the conditional probability was estimated using a renewal model. After the megathrust earthquake in 2011, however, the Japanese authorities changed the policy so that the largest earthquake in a specific seismic zone should be assumed on the basis of available scientific knowledge. Under this policy, three important reports have been issued in the past two years. First, the Central Disaster Management Council issued a new estimate of the damage from a hypothetical Mw 9 earthquake along the Nankai trough during 2011 and 2012. The model predicts, at maximum, a 34 m high tsunami on the southern Shikoku coast and intensity 6 or higher on the JMA scale in most areas of southwest Japan.
Next, the Earthquake Research Council revised the long-term hazard evaluation of earthquakes along the Nankai trough in May 2013; the revision discarded the characteristic earthquake model, put much emphasis on the diversity of earthquakes, and rejected the so-called 'Tokai' earthquake. Finally, another CDMC report concluded that, given the diversity of earthquake phenomena and the current state of knowledge, it is hard to predict the occurrence of large earthquakes along the Nankai trough with present techniques. These reports caused a sensation throughout the country, and local governments are struggling to prepare countermeasures. The reports note, near their ends, the large uncertainty in their evaluations, but are these messages being transmitted properly to the public? Earthquake scientists, including the authors, are involved in 1. Ground Motions Due to Earthquakes on Creeping Faults NASA Astrophysics Data System (ADS) Harris, R.; Abrahamson, N. A. 2014-12-01 We investigate the peak ground motions from the largest well-recorded earthquakes on creeping strike-slip faults in active-tectonic continental regions. Our goal is to evaluate if the strong ground motions from earthquakes on creeping faults are smaller than the strong ground motions from earthquakes on locked faults. Smaller ground motions might be expected from earthquakes on creeping faults if the fault sections that strongly radiate energy are surrounded by patches of fault that predominantly absorb energy. For our study we used the ground motion data available in the PEER NGA-West2 database, and the ground motion prediction equations that were developed from the PEER NGA-West2 dataset. We analyzed data for the eleven largest well-recorded creeping-fault earthquakes, which ranged in magnitude from M 5.0 to 6.5.
Our findings are that these earthquakes produced peak ground motions that are statistically indistinguishable from the peak ground motions produced by similar-magnitude earthquakes on locked faults. These findings may be implemented in earthquake hazard estimates for moderate-size earthquakes in creeping-fault regions. Further investigation is necessary to determine if this result will also apply to larger earthquakes on creeping faults. Please also see: Harris, R.A., and N.A. Abrahamson (2014), Strong ground motions generated by earthquakes on creeping faults, Geophysical Research Letters, vol. 41, doi:10.1002/2014GL060228. 2. Risk communication on earthquake prediction studies: Possible pitfalls of science communication NASA Astrophysics Data System (ADS) Oki, S.; Koketsu, K. 2012-04-01 The ANSA web news item titled "'No L'Aquila quake risk' experts probed in Italy", published in June 2010, came as a shock to the Japanese seismological community. For the 6 months preceding the L'Aquila earthquake of 6th April 2009, seismicity in that region had been elevated. When activity increased further and reached magnitude 4 on 30th March, the government convened the Major Risks Committee, a part of the Civil Protection Department tasked with forecasting possible risks by collating and analyzing data from a variety of sources and making preventative recommendations. According to the ANSA report, the committee did not stress the risk of a damaging earthquake at the press conference held after its meeting. Six days later, however, a magnitude 6.3 earthquake struck L'Aquila and killed 308 people. On 3rd June the following year, prosecutors opened an investigation after complaints from the victims that far more people would have fled their homes that night had there been no reassurances from the Major Risks Committee the previous week. The lessons from this affair are of great importance.
Science communication is now in vogue, and growing efforts are made to reach out to the public and policy makers. When we deal with disaster sciences, however, the task involves a much larger proportion of risk communication. A similar incident happened with the outbreak of BSE back in the late 1980s. Many of the measures taken on the advice of the Southwood Committee were laudable, but for one: science at the time could not show whether or not the disease was contagious to humans, and the committee minutes state that "it is unlikely to infect humans". Read thoroughly, the minutes do refer to the risk, but since it was not stressed, the government started a campaign saying that "UK beef is safe". In the presentation, we review the L'Aquila affair, drawing on our interviews with some of the committee members and the Civil Protection Department, and also introduce 3. Using prediction markets to forecast research evaluations. PubMed Munafo, Marcus R; Pfeiffer, Thomas; Altmejd, Adam; Heikensten, Emma; Almenberg, Johan; Bird, Alexander; Chen, Yiling; Wilson, Brad; Johannesson, Magnus; Dreber, Anna 2015-10-01 The 2014 Research Excellence Framework (REF2014) was conducted to assess the quality of research carried out at higher education institutions in the UK over a 6 year period. However, the process was criticized for being expensive and bureaucratic, and it was argued that similar information could be obtained more simply from various existing metrics. We were interested in whether a prediction market on the outcome of REF2014 for 33 chemistry departments in the UK would provide information similar to that obtained during the REF2014 process. Prediction markets have become increasingly popular as a means of capturing what is colloquially known as the 'wisdom of crowds', and enable individuals to trade 'bets' on whether a specific outcome will occur or not. These have been shown to be successful at predicting various outcomes in a number of domains (e.g.
sport, entertainment and politics), but have rarely been tested against outcomes based on expert judgements such as those that formed the basis of REF2014. 4. Using prediction markets to forecast research evaluations PubMed Central Munafo, Marcus R.; Pfeiffer, Thomas; Altmejd, Adam; Heikensten, Emma; Almenberg, Johan; Bird, Alexander; Chen, Yiling; Wilson, Brad; Johannesson, Magnus; Dreber, Anna 2015-01-01 The 2014 Research Excellence Framework (REF2014) was conducted to assess the quality of research carried out at higher education institutions in the UK over a 6 year period. However, the process was criticized for being expensive and bureaucratic, and it was argued that similar information could be obtained more simply from various existing metrics. We were interested in whether a prediction market on the outcome of REF2014 for 33 chemistry departments in the UK would provide information similar to that obtained during the REF2014 process. Prediction markets have become increasingly popular as a means of capturing what is colloquially known as the ‘wisdom of crowds’, and enable individuals to trade ‘bets’ on whether a specific outcome will occur or not. These have been shown to be successful at predicting various outcomes in a number of domains (e.g. sport, entertainment and politics), but have rarely been tested against outcomes based on expert judgements such as those that formed the basis of REF2014. PMID:26587243 5. The Iquique earthquake sequence of April 2014: Bayesian modeling accounting for prediction uncertainty USGS Publications Warehouse Duputel, Zacharie; Jiang, Junle; Jolivet, Romain; Simons, Mark; Rivera, Luis; Ampuero, Jean-Paul; Riel, Bryan; Owen, Susan E; Moore, Angelyn W; Samsonov, Sergey V; Ortega Culaciati, Francisco; Minson, Sarah E. 2016-01-01 The subduction zone in northern Chile is a well-identified seismic gap that last ruptured in 1877. 
On 1 April 2014, this region was struck by a large earthquake following a two-week-long series of foreshocks. This study combines a wide range of observations, including geodetic, tsunami, and seismic data, to produce a reliable kinematic slip model of the Mw=8.1 main shock and a static slip model of the Mw=7.7 aftershock. We use a novel Bayesian modeling approach that accounts for uncertainty in the Green's functions, both static and dynamic, while avoiding nonphysical regularization. The results reveal a sharp slip zone, more compact than previously thought, located downdip of the foreshock sequence and updip of high-frequency sources inferred by back-projection analysis. Neither the main shock nor the Mw=7.7 aftershock ruptured to the trench, and most of the seismic gap was left unbroken, leaving open the possibility of a future large earthquake in the region. 6. Development of a borehole stress meter for studying earthquake predictions and rock mechanics, and stress seismograms of the 2011 Tohoku earthquake ( M 9.0) NASA Astrophysics Data System (ADS) Ishii, Hiroshi; Asai, Yasuhiro 2015-02-01 Although precursory signs can occur before an earthquake, it is difficult to observe them with precision, especially on Earth's surface, where artificial noise and other factors complicate signal detection. One possible solution to this problem is to install monitoring instruments in the deep bedrock where earthquakes are likely to begin. When evaluating earthquake occurrence, it is necessary to elucidate how stress accumulates in a medium and is then released as a fault (crack) is generated, and to do so, the stress must be observed continuously. However, continuous observations of stress have not yet been implemented in earthquake monitoring programs. Strain is a secondary physical quantity whose variation depends on the elastic coefficients of the medium, and it too can yield valuable information.
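The distinction drawn here between stress and strain can be made concrete with uniaxial Hooke's law; a minimal sketch, with illustrative moduli rather than values from the instrument sites:

```python
# Uniaxial Hooke's law: strain = stress / E. The same stress step produces
# different strain in different rocks, which is one reason to measure stress
# directly. The Young's moduli below are illustrative textbook-order values.
STRESS_STEP_PA = 1.0e4  # a 10 kPa stress change

youngs_modulus = {  # illustrative Young's moduli, Pa
    "soft sediment": 1.0e9,
    "sandstone": 2.0e10,
    "granite": 5.0e10,
}

for rock, E in youngs_modulus.items():
    strain = STRESS_STEP_PA / E  # dimensionless strain
    print(f"{rock}: strain = {strain:.1e}")
```

The medium-dependence is explicit: the stiffer the rock, the smaller the strain produced by the same stress change.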
This article describes the development of a borehole stress meter that is capable of recording both stress and strain continuously at a depth of about 1 km. Specifically, this paper introduces the design principles of the stress meter as well as its actual structure. It also describes a newly developed calibration procedure and the results obtained to date for stress and strain studies of deep boreholes at three locations in Japan. As examples of the observations, records of stress seismograms generated by the 2011 Tohoku earthquake ( M 9.0) are presented. The results demonstrate that the stress meter data have sufficient precision and reliability. 7. IMPROVEMENT SUPPORT RESEARCH OF LOCAL DISASTER PREVENTION POWER USING THE FIRE SPREADING SIMULATION SYSTEM IN CASE OF A BIG EARTHQUAKE NASA Astrophysics Data System (ADS) Futagami, Toru; Omoto, Shohei; Hamamoto, Kenichirou This research describes risk communication aimed at improving local disaster prevention capability in Gobusho town, Marugame city, the only high-density urban area in Kagawa Prefecture. Specifically, the key persons of the area and the authors report practical research on improving local disaster prevention capability through the area's PDCA cycle, including the formation of local voluntary disaster management organizations and the implementation of emergency drills, applying a fire-spreading simulation system for a large earthquake. The paper also describes the role of the fire-spreading simulation system the authors are developing, and the issues it has addressed, as a support system for the local community's business continuity planning (BCP). 8. Dementia Research: Populations, Progress, Problems, and Predictions. PubMed Hunter, Sally; Smailagic, Nadja; Brayne, Carol 2018-05-16 Alzheimer's disease (AD) is a clinicopathologically defined syndrome leading to cognitive impairment.
Following the recent failures of amyloid-based randomized controlled trials to change the course of AD, there are growing calls for a re-evaluation of basic AD research. Epidemiology offers one approach to integrating the available evidence. Here we examine relationships between evidence from population-based, clinicopathological studies of brain aging and a range of hypotheses from all areas of AD research. We identify various problems, including a lack of systematic approach to measurement of clinical and neuropathological factors associated with dementia in experimental and clinical settings, poor understanding of the strengths and weaknesses of different observational and experimental designs, a lack of clarity in relation to disease definitions from the clinical, neuropathological, and molecular perspectives, inadequate characterization of brain aging in the human population, difficulties in translation between laboratory-based and population-based evidence bases, and a lack of communication between different sections of the dementia research community. Population studies highlight complexity and predict that therapeutic approaches based on single disease features will not be successful. Better characterization of brain aging in the human population is urgently required to select biomarkers and therapeutic targets that are meaningful to human disease. The generation of detailed and reliable evidence must be addressed before progress toward therapeutic interventions can be made. 9. Operational earthquake forecasting can enhance earthquake preparedness USGS Publications Warehouse Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C. 
2014-01-01 We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA). 10. Seismo-induced effects in the near-earth space: Combined ground and space investigations as a contribution to earthquake prediction NASA Astrophysics Data System (ADS) Sgrigna, V.; Buzzi, A.; Conti, L.; Picozza, P.; Stagni, C.; Zilpimiani, D. 2007-02-01 The paper aims at giving a few methodological suggestions for deterministic earthquake prediction studies based on combined ground-based and space observations of earthquake precursors. What has been lacking up to now is the demonstration of a causal relationship, with explained physical processes, between earthquakes and candidate precursors, established by correlating data gathered simultaneously and continuously by space observations and ground-based measurements. Coordinated space and ground-based observations require test sites on the Earth's surface where ground data, collected by appropriate networks of instruments, can be correlated with measurements made on board LEO satellites.
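Correlating simultaneously sampled ground and space time series typically begins with a lagged cross-correlation; a minimal sketch on synthetic data (the series names and the 5-sample delay are illustrative, not a real data format):

```python
import numpy as np

def lagged_correlation(ground, space, max_lag):
    """Correlation of ground[t] with space[t + k] for k in [-max_lag, max_lag].

    `ground` and `space` stand for any pair of simultaneously sampled,
    equally spaced precursor time series (hypothetical names).
    """
    g = (ground - ground.mean()) / ground.std()
    s = (space - space.mean()) / space.std()
    n = len(g)
    corr = {}
    for k in range(-max_lag, max_lag + 1):
        if k >= 0:
            corr[k] = np.dot(g[:n - k], s[k:]) / (n - k)
        else:
            corr[k] = np.dot(g[-k:], s[:n + k]) / (n + k)
    return corr

# Synthetic check: the "space" series repeats the "ground" series 5 samples later.
rng = np.random.default_rng(1)
g = rng.standard_normal(1000)
s = np.roll(g, 5)
corr = lagged_correlation(g, s, 10)
print(max(corr, key=corr.get))  # lag of maximum correlation: 5
```

A sharp peak at a nonzero lag is only a starting point; as the abstract stresses, a claimed precursor also needs an explained physical mechanism and prospective validation.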
To this end, a new result reported in the paper is an original and specific space mission project (ESPERIA) and two instruments of its payload. The ESPERIA space project was carried out for the Italian Space Agency, and three ESPERIA instruments (the ARINA and LAZIO particle detectors and the EGLE search-coil magnetometer) have been built and tested in space. The EGLE experiment began on 15 April 2005 on board the ISS, within the ENEIDE mission. ARINA was launched on 15 June 2006 on board the RESURS DK-1 Russian LEO satellite. As an introduction and justification for these experiments, the paper clarifies some basic concepts and critical methodological aspects concerning deterministic and statistical approaches and their use in earthquake prediction. We also take the liberty of giving the scientific community a few critical hints based on our personal experience in the field and propose a joint study devoted to earthquake prediction and warning. 11. Continuous borehole strain and pore pressure in the near field of the 28 September 2004 M 6.0 Parkfield, California, earthquake: Implications for nucleation, fault response, earthquake prediction and tremor USGS Publications Warehouse Johnston, M.J.S.; Borcherdt, R.D.; Linde, A.T.; Gladwin, M.T. 2006-01-01 Near-field observations of high-precision borehole strain and pore pressure show no indication of coherent accelerating strain or pore pressure during the weeks to seconds before the 28 September 2004 M 6.0 Parkfield earthquake. Minor changes in strain rate did occur at a few sites during the last 24 hr before the earthquake, but these changes are not significant, nor do they have the form expected for strain during slip coalescence initiating fault failure. Seconds before the event, strain is stable at the 10^-11 level. Final prerupture nucleation slip in the hypocentral region is constrained to have a moment less than 2 × 10^12 N m (M 2.2) and a source size less than 30 m.
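The quoted moment bound converts to magnitude through the standard Hanks-Kanamori relation; a quick check:

```python
import math

def moment_magnitude(m0_newton_meters):
    """Hanks & Kanamori (1979) moment magnitude from seismic moment in N·m."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

# The nucleation bound quoted above: M0 < 2 x 10^12 N m
print(round(moment_magnitude(2.0e12), 1))  # 2.1, close to the quoted M 2.2
```

Small differences from the quoted value reflect the choice of constant (9.1 vs. 9.05) in different published forms of the relation.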
Ground displacement data indicate similar constraints. Localized rupture nucleation and runaway preclude useful prediction of damaging earthquakes. Coseismic dynamic strains of about 10 microstrain peak-to-peak were superimposed on volumetric strain offsets of about 0.5 microstrain to the northwest of the epicenter and about 0.2 microstrain to the southeast of the epicenter, consistent with right lateral slip. Observed strain and Global Positioning System (GPS) offsets can be fit simply with 20 cm of slip between 4 and 10 km depth on a 20-km segment of the fault north of Gold Hill (M0 = 7 × 10^17 N m). Variable slip inversion models using GPS data and seismic data indicate similar moments. Observed postseismic strain is 60% to 300% of the coseismic strain, indicating incomplete release of accumulated strain. No measurable change in fault zone compliance preceding or following the earthquake is indicated by stable earth tidal response. No indications of strain change accompany nonvolcanic tremor events reported prior to and following the earthquake. 12. ARMA models for earthquake ground motions. Seismic safety margins research program SciTech Connect Chang, M. K.; Kwiatkowski, J. W.; Nau, R. F. 1981-02-01 Four major California earthquake records were analyzed using a class of discrete linear time-domain processes commonly referred to as ARMA (Autoregressive/Moving-Average) models. It was possible to analyze these different earthquakes, identify the order of the appropriate ARMA model(s), estimate parameters, and test the residuals generated by these models. It was also possible to show the connections, similarities, and differences between the traditional continuous models (with parameter estimates based on spectral analyses) and the discrete models with parameters estimated by various maximum-likelihood techniques applied to digitized acceleration data in the time domain.
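The ARMA fitting described here can be illustrated in its simplest autoregressive special case; a sketch that simulates an AR(2) series and recovers the coefficients with the Yule-Walker equations (purely synthetic data, not the California records):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary AR(2) process: x[t] = a1*x[t-1] + a2*x[t-2] + noise
a1_true, a2_true = 0.6, -0.3
n = 20000
noise = rng.standard_normal(n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = a1_true * x[t - 1] + a2_true * x[t - 2] + noise[t]

def autocov(series, lag):
    """Biased sample autocovariance at the given lag."""
    d = series - series.mean()
    return np.dot(d[: len(d) - lag], d[lag:]) / len(d)

# Yule-Walker equations for AR(2): solve R a = r for the coefficients
r = np.array([autocov(x, k) for k in range(3)])
R = np.array([[r[0], r[1]],
              [r[1], r[0]]])
a_hat = np.linalg.solve(R, r[1:])
print(a_hat)  # close to [0.6, -0.3]
```

A full ARMA fit adds moving-average terms and requires the maximum-likelihood machinery the abstract mentions; the Yule-Walker step above is only the AR building block.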
The methodology proposed is suitable for simulating earthquake ground motions in the time domain and appears to be easily adapted to serve as input for nonlinear discrete-time models of structural motion. 60 references, 19 figures, 9 tables. 13. Probabilistic Tsunami Hazard Assessment along Nankai Trough (2) a comprehensive assessment including a variety of earthquake source areas other than those that the Earthquake Research Committee, Japanese government (2013) showed NASA Astrophysics Data System (ADS) Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N. 2016-12-01 For the forthcoming Nankai earthquake of M8 to M9 class, the Earthquake Research Committee (ERC)/Headquarters for Earthquake Research Promotion, Japanese government (2013) showed 15 examples of earthquake source areas (ESAs) as possible combinations of 18 sub-regions (6 segments along the trough and 3 segments normal to the trough) and assessed the occurrence probability within the next 30 years (from 1 January 2013) at 60% to 70%. Hirata et al. (2015, AGU) presented a Probabilistic Tsunami Hazard Assessment (PTHA) along the Nankai Trough for the case in which the diversity of the next event's ESA is modeled by only those 15 ESAs. In this study, we newly set 70 ESAs in addition to the previous 15, so that a total of 85 ESAs are considered. By producing tens of fault models with various slip distribution patterns for each of the 85 ESAs, we obtain 2500 fault models in addition to the previous 1400, so that a total of 3900 fault models are considered to model the diversity of the next Nankai earthquake rupture (Toyama et al., 2015, JpGU).
For the PTHA, the occurrence probability of the next Nankai earthquake is distributed among the 3900 possible fault models according to their similarity to the 15 ESAs' extents (Abe et al., 2015, JpGU). The major concept of the probability distribution is: (i) earthquakes rupturing any of the 15 ESAs that ERC (2013) showed are the most likely to occur; (ii) earthquakes rupturing ESAs whose along-trough extent matches one of the 15 ESAs but whose trough-normal extent differs are the second most likely; (iii) earthquakes rupturing ESAs whose along-trough and trough-normal extents both differ from the 15 ESAs rarely occur. Procedures for tsunami simulation and probabilistic tsunami hazard synthesis are the same as in Hirata et al. (2015). A tsunami hazard map, synthesized under the assumption that Nankai earthquakes can be modeled as a renewal process based on a BPT distribution with a mean recurrence interval of 88.2 years (ERC, 2013) and an 14. Measuring the effectiveness of earthquake forecasting in insurance strategies NASA Astrophysics Data System (ADS) Mignan, A.; Muir-Wood, R. 2009-04-01 Given the difficulty of judging whether the skill of a particular earthquake forecasting methodology is offset by its inevitable false alarms and missed predictions, it is important to find a means of weighing successes and failures in a common currency. Rather than judge the relative costs and benefits of predictions subjectively, we develop a simple method to determine whether the use of earthquake forecasts can increase the profitability of active financial risk management strategies employed in standard insurance procedures. Three types of risk management transactions are employed: (1) insurance underwriting, (2) reinsurance purchasing and (3) investment in CAT bonds. For each case, premiums are collected based on modelled technical risk costs and losses are modelled for the portfolio in force at the time of the earthquake.
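The BPT renewal assumption mentioned above (mean recurrence 88.2 years) yields a conditional probability of rupture in a coming window via the inverse-Gaussian CDF; a sketch in which the aperiodicity and the elapsed time are illustrative assumptions, not values from the study:

```python
import math

def phi(x):
    """Standard normal CDF, via erfc for stability at large |x|."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean recurrence mu and aperiodicity alpha."""
    lam = mu / alpha ** 2
    a = math.sqrt(lam / t)
    return phi(a * (t / mu - 1.0)) + math.exp(2.0 * lam / mu) * phi(-a * (t / mu + 1.0))

def conditional_prob(elapsed, window, mu, alpha):
    """P(event within `window` years | quiet for `elapsed` years)."""
    f0 = bpt_cdf(elapsed, mu, alpha)
    f1 = bpt_cdf(elapsed + window, mu, alpha)
    return (f1 - f0) / (1.0 - f0)

# Mean recurrence 88.2 yr is from the abstract; the aperiodicity (0.24)
# and elapsed time (68 yr) are illustrative assumptions for this sketch.
p30 = conditional_prob(elapsed=68.0, window=30.0, mu=88.2, alpha=0.24)
print(round(p30, 2))
```

With these assumed inputs the 30-year conditional probability lands in the 60-70% band the record quotes, which is the point of the renewal-model construction: probability grows as elapsed time approaches the mean recurrence.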
A set of predetermined actions follows from the announcement of any change in earthquake hazard, so that, for each earthquake forecaster, the financial performance of an active risk management strategy can be compared with the equivalent passive strategy in which no notice is taken of earthquake forecasts. Overall performance can be tracked through time to determine which strategy gives the best long-term financial performance. This will be determined by whether the skill in forecasting the location and timing of a significant earthquake (where loss is avoided) outweighs the cost of false predictions (when no premium is collected). This methodology is to be tested in California, where catastrophe modeling is reasonably mature and where a number of researchers issue earthquake forecasts. 15. Research to Operations: From Point Positions, Earthquake and Tsunami Modeling to GNSS-augmented Tsunami Early Warning NASA Astrophysics Data System (ADS) Stough, T.; Green, D. S. 2017-12-01 This collaborative research-to-operations demonstration brings together the data and algorithms from NASA research, technology, and applications-funded projects to deliver relevant data streams, algorithms, predictive models, and visualization tools to the NOAA National Tsunami Warning Center (NTWC) and Pacific Tsunami Warning Center (PTWC). Using real-time GNSS data and models in an operational environment, we will test and evaluate an augmented capability for tsunami early warning. Each of three research groups collects data from a selected network of real-time GNSS stations, exchanges data consisting of independently processed 1 Hz station displacements, and merges the output into a single, more accurate and reliable set. The resulting merged data stream is delivered from three redundant locations to the TWCs with a latency of 5-10 seconds.
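Warning-center algorithms of this kind commonly estimate Mw from peak ground displacement with an empirical scaling law of the form log10(PGD) = A + B*Mw + C*Mw*log10(R), inverted for Mw. The functional form follows published PGD scaling studies; the coefficients below are placeholders for illustration, not the operational values:

```python
import math

# Illustrative coefficients for log10(PGD_cm) = A + B*Mw + C*Mw*log10(R_km).
# The form follows published PGD scaling laws; these numbers are placeholders.
A, B, C = -5.0, 1.1, -0.15

def predicted_log_pgd(mw, r_km):
    """Forward model: log10 of peak ground displacement (cm) at distance R."""
    return A + B * mw + C * mw * math.log10(r_km)

def mw_from_pgd(pgd_cm, r_km):
    """Invert the scaling law for Mw given observed PGD and distance."""
    return (math.log10(pgd_cm) - A) / (B + C * math.log10(r_km))

# Round-trip check at Mw 8.0, R = 100 km
log_pgd = predicted_log_pgd(8.0, 100.0)
print(round(mw_from_pgd(10.0 ** log_pgd, 100.0), 2))  # 8.0
```

Because PGD saturates far less with magnitude than short-period seismic amplitudes, this kind of estimate is what lets GNSS distinguish an Mw 8 from an Mw 9 quickly, which is the motivation for the demonstration.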
Data from a number of seismogeodetic stations with collocated GPS and accelerometer instruments are processed for displacements and seismic velocities and also delivered. Algorithms for locating and determining the magnitude of earthquakes, as well as algorithms that compute the source function of a potential tsunami using this new data stream, are included in the demonstration. The delivered data, algorithms, models and tools are hosted on NOAA-operated machines at both warning centers, and, once tested, the results will be evaluated for utility in improving the speed and accuracy of tsunami warnings. This collaboration has the potential to dramatically improve the speed and accuracy of the TWCs' local tsunami information over the current seismometer-only based methods. In our first year of this work, we have established and deployed an architecture for data movement and algorithm installation at the TWCs. We are addressing data quality issues and porting algorithms into the TWCs' operating environment. Our initial module deliveries will focus on estimating moment magnitude (Mw) from Peak Ground Displacement (PGD), within 2 16. Economic consequences of earthquakes: bridging research and practice with HayWired NASA Astrophysics Data System (ADS) Wein, A. M.; Kroll, C. 2016-12-01 The U.S. Geological Survey partners with organizations and experts to develop multiple hazard scenarios. The HayWired earthquake scenario refers to a rupture of the Hayward fault in the Bay Area of California and addresses the potential chaos related to interconnectedness at many levels: the fault afterslip and aftershocks, interdependencies of lifelines, wired/wireless technology, communities at risk, and ripple effects throughout today's digital economy. The scenario is intended for diverse audiences.
HayWired analyses translate earthquake hazards (surface rupture, ground shaking, liquefaction, landslides) into physical engineering and environmental health impacts, and into societal consequences. Damages to life and property and lifeline service disruptions are direct causes of business interruption. Economic models are used to estimate the economic impacts and resilience in the regional economy. The objective of the economic analysis is to inform policy discourse about economic resilience at all three levels of the economy: macro, meso, and micro. Stakeholders include businesses, economic development, and community leaders. Previous scenario analyses indicate the size of an event: large earthquakes and large winter storms are both "big ones" for California. They motivate actions to reduce the losses from fire following earthquake and water supply outages. They show the effect that resilience can have on reducing economic losses. Evaluators find that stakeholders learned the most about the economic consequences. 17. Comparisons of ground motions from five aftershocks of the 1999 Chi-Chi, Taiwan, earthquake with empirical predictions largely based on data from California USGS Publications Warehouse Wang, G.-Q.; Boore, D.M.; Igel, H.; Zhou, X.-Y. 2004-01-01 The observed ground motions from five large aftershocks of the 1999 Chi-Chi, Taiwan, earthquake are compared with predictions from four equations based primarily on data from California. The four equations for active tectonic regions are those developed by Abrahamson and Silva (1997), Boore et al. (1997), Campbell (1997, 2001), and Sadigh et al. (1997). Comparisons are made for horizontal-component peak ground accelerations and 5%-damped pseudoacceleration response spectra at periods between 0.02 sec and 5 sec. The observed motions are in reasonable agreement with the predictions, particularly for distances from 10 to 30 km. 
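Comparisons of observed and predicted motions like the Chi-Chi study are usually expressed as natural-log residuals against a ground-motion prediction equation; a minimal sketch using a toy attenuation model (coefficients invented for illustration, not from any of the four cited equations):

```python
import math

def toy_gmpe_pga_g(mw, r_km):
    """Toy ground-motion prediction: ln PGA = c0 + c1*(Mw - 6) - c2*ln(R + c3).
    Coefficients are invented for illustration, not from the cited equations."""
    c0, c1, c2, c3 = 2.0, 1.2, 1.3, 10.0
    return math.exp(c0 + c1 * (mw - 6.0) - c2 * math.log(r_km + c3))

# Residual = ln(observed / predicted); negative means the model over-predicts.
observations = [  # (Mw, distance km, observed PGA in g) - made-up records
    (6.2, 15.0, 0.12),
    (6.2, 40.0, 0.05),
]
for mw, r, obs in observations:
    resid = math.log(obs / toy_gmpe_pga_g(mw, r))
    print(f"Mw {mw}, R={r:.0f} km: ln-residual = {resid:+.2f}")
```

Systematically negative residuals at large distances, as reported for the 30-60 km band, are the signature of faster-than-modeled regional attenuation.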
This is in marked contrast to the motions from the Chi-Chi mainshock, which are much lower than the predicted motions for periods less than about 1 sec. The results indicate that the low motions in the mainshock are not due to unusual, localized absorption of seismic energy, because waves from the mainshock and the aftershocks generally traverse the same section of the crust and are recorded at the same stations. The aftershock motions at distances of 30-60 km are somewhat lower than the predictions (though not by nearly as large a factor as those for the mainshock), suggesting that ground motion attenuates more rapidly in this region of Taiwan than in the regions from which the predictive equations were derived. We provide equations for the regional attenuation of response spectra, which show increasing decay of motion with distance for decreasing oscillator periods. This observational study also demonstrates that ground motions have large earthquake-location-dependent variability for a specific site. This variability reduces the accuracy with which site response can be predicted for a specific earthquake. Online Material: PGAs and PSAs from the 1999 Chi-Chi earthquake and five aftershocks. 18. Rational for Conducting PTSD Research and Challenges of Recruiting and Training Volunteers to Screen and Treat PTSD among the Nepal 2015 Earthquake Survivors. PubMed Jha, A; Shakya, S 2015-01-01 Post-traumatic Stress Disorder (PTSD) is a common psychiatric morbidity among earthquake survivors, and if untreated people suffer from it for years. The Government of Nepal and NGOs provided various short-term mental health services to the victims of the 2015 earthquake in Nepal, but there was no plan or provision for long-term mental health problems. The prevalence of PTSD following natural disasters depends on various local factors that require understanding and further investigation before affordable evidence-based interventions can be identified.
This paper discusses the need for PTSD research among the survivors of the 2015 earthquake in Nepal, and describes the challenges and difficulties of recruiting and training PTSD volunteers. 19. Predicted liquefaction in the greater Oakland area and northern Santa Clara Valley during a repeat of the 1868 Hayward Fault (M6.7-7.0) earthquake USGS Publications Warehouse Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J. 2010-01-01 Probabilities of surface manifestations of liquefaction due to a repeat of the 1868 (M6.7-7.0) earthquake on the southern segment of the Hayward Fault were calculated for two areas along the margin of San Francisco Bay, California: greater Oakland and the northern Santa Clara Valley. Liquefaction is predicted to be more common in the greater Oakland area than in the northern Santa Clara Valley owing to the presence of 57 km2 of susceptible sandy artificial fill. Most of the fills were placed into San Francisco Bay during the first half of the 20th century to build military bases, port facilities, and shoreline communities like Alameda and Bay Farm Island. Probabilities of liquefaction in the area underlain by this sandy artificial fill range from 0.2 to ~0.5 for a M7.0 earthquake, and decrease to 0.1 to ~0.4 for a M6.7 earthquake. In the greater Oakland area, liquefaction probabilities generally are less than 0.05 for Holocene alluvial fan deposits, which underlie most of the remaining flat-lying urban area. In the northern Santa Clara Valley for a M7.0 earthquake on the Hayward Fault and an assumed water-table depth of 1.5 m (the historically shallowest water level), liquefaction probabilities range from 0.1 to 0.2 along Coyote and Guadalupe Creeks, but are less than 0.05 elsewhere. For a M6.7 earthquake, probabilities are greater than 0.1 along Coyote Creek but decrease along Guadalupe Creek to less than 0.1. 
Areas with high probabilities in the Santa Clara Valley are underlain by young Holocene levee deposits along major drainages where liquefaction and lateral spreading occurred during large earthquakes in 1868 and 1906. 20. Earthquake and Tsunami Disaster Mitigation in The Marmara Region and Disaster Education in Turkey (SATREPS Project: Science and Technology Research Partnership for Sustainable Development by JICA-JST) NASA Astrophysics Data System (ADS) Kaneda, Yoshiyuki 2015-04-01 Yoshiyuki KANEDA, Disaster Mitigation Center, Nagoya University / Japan Agency for Marine-Earth Science and Technology (JAMSTEC); Mustafa ELDIK, Boğaziçi University, Kandilli Observatory and Earthquake Research Institute (KOERI); and members of the SATREPS Japan-Turkey project. The target of this project is a future Marmara Sea earthquake, following the 1999 Izmit (Kocaeli) earthquake along the North Anatolian Fault. Historical earthquake occurrences show that epicenters have migrated from east to west along the North Anatolian Fault, and there is a seismic gap in the Marmara Sea. The Marmara region includes Istanbul, a city with a very large population, like Tokyo. Japan and Turkey can therefore share their experiences of past damaging earthquakes and prepare for future large earthquakes and tsunamis in cooperation with each other in the SATREPS project. The project comprises multidisciplinary research, including observational, simulation, and educational research, with the following goals: ① To develop disaster mitigation policies and strategies based on multidisciplinary research activities. ② To provide decision makers with newly found knowledge for its implementation in current regulations.
③ To organize disaster education programs in order to increase disaster awareness in Turkey. ④ To contribute to the evaluation of active fault studies in Japan. In this SATREPS project, we will integrate multidisciplinary research results for disaster mitigation in the Marmara region and disaster education in Turkey. 1. NGA-West 2 Equations for predicting PGA, PGV, and 5%-Damped PSA for shallow crustal earthquakes USGS Publications Warehouse Boore, David M.; Stewart, Jon P.; Seyhan, Emel; Atkinson, Gail M. 2013-01-01 We provide ground-motion prediction equations for computing medians and standard deviations of average horizontal component intensity measures (IMs) for shallow crustal earthquakes in active tectonic regions. The equations were derived from a global database with M 3.0–7.9 events. We derived equations for the primary M- and distance-dependence of the IMs after fixing the VS30-based nonlinear site term from a parallel NGA-West 2 study. We then evaluated additional effects using mixed effects residuals analysis, which revealed no trends with source depth over the M range of interest, indistinct Class 1 and 2 event IMs, and basin depth effects that increase and decrease long-period IMs for depths larger and smaller, respectively, than means from regional VS30-depth relations. Our aleatory variability model captures decreasing between-event variability with M, as well as within-event variability that increases or decreases with M depending on period, increases with distance, and decreases for soft sites. 2. NGA-West2 equations for predicting vertical-component PGA, PGV, and 5%-damped PSA from shallow crustal earthquakes USGS Publications Warehouse Stewart, Jonathan P.; Boore, David M.; Seyhan, Emel; Atkinson, Gail M. 2016-01-01 We present ground motion prediction equations (GMPEs) for computing natural log means and standard deviations of vertical-component intensity measures (IMs) for shallow crustal earthquakes in active tectonic regions.
The equations were derived from a global database with M 3.0–7.9 events. The functions are similar to those for our horizontal GMPEs. We derive equations for the primary M- and distance-dependence of peak acceleration, peak velocity, and 5%-damped pseudo-spectral accelerations at oscillator periods between 0.01–10 s. We observe pronounced M-dependent geometric spreading and region-dependent anelastic attenuation for high-frequency IMs. We do not observe significant region-dependence in site amplification. Aleatory uncertainty is found to decrease with increasing magnitude; within-event variability is independent of distance. Compared to our horizontal-component GMPEs, attenuation rates are broadly comparable (somewhat slower geometric spreading, faster apparent anelastic attenuation), VS30-scaling is reduced, nonlinear site response is much weaker, within-event variability is comparable, and between-event variability is greater. 3. A comparison of observed and predicted ground motions from the 2015 MW7.8 Gorkha, Nepal, earthquake USGS Publications Warehouse Hough, Susan E.; Martin, Stacey S.; Gahalaut, V.; Joshi, A.; Landes, M.; Bossu, R. 2016-01-01 We use 21 strong motion recordings from Nepal and India for the 25 April 2015 moment magnitude (MW) 7.8 Gorkha, Nepal, earthquake together with the extensive macroseismic intensity data set presented by Martin et al. (Seism Res Lett 87:957–962, 2015) to analyse the distribution of ground motions at near-field and regional distances. We show that the data are consistent with the instrumental peak ground acceleration (PGA) versus macroseismic intensity relationship developed by Worden et al. (Bull Seism Soc Am 102:204–221, 2012), and use this relationship to estimate peak ground acceleration from intensities (PGAEMS). For nearest-fault distances (RRUP < 200 km), PGAEMS is consistent with the Atkinson and Boore (Bull Seism Soc Am 93:1703–1729, 2003) subduction zone ground motion prediction equation (GMPE). 
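Intensity-to-PGA conversions of the Worden et al. (2012) type used for the PGAEMS estimates are log-linear and invertible; a sketch with purely illustrative coefficients, not the published values:

```python
import math

# Log-linear intensity-ground motion conversion of the Worden et al. (2012)
# type: MMI = T1 + T2 * log10(PGA). Coefficients below are illustrative
# placeholders, not the published regression values (which are bilinear).
T1, T2 = 1.8, 3.0

def mmi_from_pga(pga_cm_s2):
    """Macroseismic intensity predicted from PGA (cm/s^2)."""
    return T1 + T2 * math.log10(pga_cm_s2)

def pga_from_mmi(mmi):
    """Invert the conversion: the 'PGA_EMS' implied by a reported intensity."""
    return 10.0 ** ((mmi - T1) / T2)

print(round(mmi_from_pga(pga_from_mmi(7.0)), 6))  # round-trip: 7.0
```

Because the relation is log-linear, one intensity unit corresponds to a fixed multiplicative factor in PGA, which is why intensity data can stand in for sparse instrumental recordings at regional scale.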
At greater distances (RRUP > 200 km), instrumental PGA values are consistent with this GMPE, while PGAEMS is systematically higher. We suggest the latter reflects a duration effect whereby effects of weak shaking are enhanced by long-duration and/or long-period ground motions from a large event at regional distances. We use PGAEMS values within 200 km to investigate the variability of high-frequency ground motions using the Atkinson and Boore (Bull Seism Soc Am 93:1703–1729, 2003) GMPE as a baseline. Across the near-field region, PGAEMS is higher by a factor of 2.0–2.5 towards the northern, down-dip edge of the rupture compared to the near-field region nearer to the southern, up-dip edge of the rupture. Inferred deamplification in the deepest part of the Kathmandu valley supports the conclusion that former lake-bed sediments experienced a pervasive nonlinear response during the mainshock (Dixit et al. in Seismol Res Lett 86(6):1533–1539, 2015; Rajaure et al. in Tectonophysics, 2016). Ground motions were significantly amplified in the southern Gangetic basin, but were relatively low in the northern basin. The overall distribution of ground motions 4. Sensitivity analysis of tall buildings in Semarang, Indonesia due to fault earthquakes with maximum 7 Mw NASA Astrophysics Data System (ADS) Partono, Windu; Pardoyo, Bambang; Atmanto, Indrastono Dwi; Azizah, Lisa; Chintami, Rouli Dian 2017-11-01 Faults are among the dangerous earthquake sources that can cause building failure. Many buildings collapsed in the Yogyakarta (2006) and Pidie (2016) fault-source earthquakes, which had maximum magnitude 6.4 Mw. Following the research conducted by the Team for Revision of Seismic Hazard Maps of Indonesia 2010 and 2016, the Lasem, Demak, and Semarang faults are the three closest earthquake sources surrounding Semarang. The ground motion from those three earthquake sources should be taken into account for structural design and evaluation.
Most tall buildings in Semarang, with a minimum height of 40 meters, were designed and constructed following the 2002 and 2012 Indonesian Seismic Codes. This paper presents the results of sensitivity analysis research, with emphasis on predicting the deformation and inter-story drift of existing tall buildings within the city under fault earthquakes. The analysis was performed by conducting dynamic structural analysis of 8 (eight) tall buildings using modified acceleration time histories. The modified acceleration time histories were calculated for three fault earthquakes with magnitudes from 6 Mw to 7 Mw; modified time histories were used because recorded time history data from those three fault sources are inadequate. The sensitivity of a building to earthquakes can be assessed by comparing the surface response spectra calculated using the seismic code with the surface response spectra calculated from the acceleration time histories of a specific earthquake event. If the surface response spectra calculated using the seismic code are greater than the surface response spectra calculated from the acceleration time histories, the structure should be stable enough to resist the earthquake force. 5. An interdisciplinary approach to study Pre-Earthquake processes NASA Astrophysics Data System (ADS) Ouzounov, D.; Pulinets, S. A.; Hattori, K.; Taylor, P. T. 2017-12-01 We will summarize a multi-year research effort on wide-ranging observations of pre-earthquake processes. Based on space and ground data, we present some new results relevant to the existence of pre-earthquake signals. Over the past 15-20 years there has been a major revival of interest in pre-earthquake studies in Japan, Russia, China, the EU, Taiwan, and elsewhere. Recent large-magnitude earthquakes in Asia and Europe have shown the importance of these various studies in the search for earthquake precursors, either for forecasting or prediction.
Some new results were obtained from modeling of the atmosphere-ionosphere connection and from analyses of seismic records (foreshocks/aftershocks) and geochemical, electromagnetic, and thermodynamic processes related to stress changes in the lithosphere, along with their statistical and physical validation. This cross-disciplinary approach could make an impact on our further understanding of the physics of earthquakes and the phenomena that precede their energy release. We also present the potential impact of these interdisciplinary studies on earthquake predictability. A detailed summary of our approach and that of several international researchers will be part of this session and will be subsequently published in a new AGU/Wiley volume. This book is part of the Geophysical Monograph series and is intended to show the variety of parameters (seismic, atmospheric, geochemical, and historical) involved in this important field of research, and to bring this knowledge and awareness to a broader geosciences community. 6. ElarmS Earthquake Early Warning System 2016 Performance and New Research NASA Astrophysics Data System (ADS) Chung, A. I.; Allen, R. M.; Hellweg, M.; Henson, I. H.; Neuhauser, D. S. 2016-12-01 The ElarmS earthquake early warning system has been detecting earthquakes throughout California since 2007. It is one of the algorithms that contributes to the West Coast ShakeAlert, a prototype earthquake early warning system being developed for the US West Coast. ElarmS is also running in the Pacific Northwest, and in Israel, Chile, Turkey, and Peru in test mode. We summarize the performance of the ElarmS system over the past year and review some of the more problematic events that the system has encountered. During the first half of 2016 (2016-01-01 through 2016-07-21), ElarmS successfully alerted on all events with ANSS catalog magnitudes M>3 in the Los Angeles area. The mean alert time for these 9 events was just 4.84 seconds.
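Alert times like the 4.84 s figure quoted above translate into warning time at a user's site roughly as S-wave travel time minus alert delay. A simplified sketch (our own illustration, assuming the quoted alert times are measured from event origin and a uniform crustal S-wave speed):

```python
import math

VS_KM_S = 3.5   # assumed average crustal S-wave speed (illustrative)

def warning_time_s(epi_dist_km, alert_delay_s, depth_km=8.0):
    """Seconds of warning at a site: S-wave travel time from the
    hypocenter minus the delay between origin time and alert issuance.
    Negative values mean strong shaking arrives before the alert."""
    hypo_dist = math.sqrt(epi_dist_km ** 2 + depth_km ** 2)
    return hypo_dist / VS_KM_S - alert_delay_s

# With a 4.84 s alert delay, a site 60 km from the epicenter gets
# roughly 12 s of warning; sites very near the epicenter get none.
wt = warning_time_s(60.0, 4.84)
```

This is why longer alert times in sparser networks (as reported for the Bay Area below 10 s) shrink the "blind zone" margin available to nearby users.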
In the San Francisco Bay Area, ElarmS detected 26 events with ANSS catalog magnitudes M>3. The mean alert time for these events was 9.12 seconds. The alert times are longer in the Bay Area than in the Los Angeles area due to the sparser network of stations in the Bay Area. Seven Bay Area events were not detected by ElarmS; these events occurred in areas with less dense station coverage. In addition, ElarmS sent alerts for 13 of the 16 moderately sized (ANSS catalog magnitudes M>4) events that occurred throughout the state of California. One of the missed events was a M4.5 that occurred far offshore in the northernmost part of the state; the other two occurred inland in regions with sparse station coverage. Over the past year, we have worked towards the implementation of a new filterbank teleseismic filter algorithm, which we will discuss. Aside from teleseismic events, a significant cause of false alerts and severely mislocated events is spurious triggers being associated with triggers from a real earthquake. Here, we address new approaches to filtering out problematic triggers. 7. Real-time forecasting and predictability of catastrophic failure events: from rock failure to volcanoes and earthquakes NASA Astrophysics Data System (ADS) Main, I. G.; Bell, A. F.; Naylor, M.; Atkinson, M.; Filguera, R.; Meredith, P. G.; Brantut, N. 2012-12-01 Accurate prediction of catastrophic brittle failure in rocks and in the Earth presents a significant challenge on theoretical and practical grounds. The governing equations are not known precisely, but are known to produce highly non-linear behavior similar to that of near-critical dynamical systems, with a large and irreducible stochastic component due to material heterogeneity.
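A classic baseline for this kind of real-time failure forecasting (the inverse-rate or "failure forecast" method, a standard tool in the literature rather than necessarily one of this project's models) extrapolates the inverse of an accelerating precursory rate linearly to zero to estimate failure time:

```python
def forecast_failure_time(times, rates):
    """Failure forecast method sketch: for a precursor rate accelerating
    as rate ~ 1/(tf - t), the inverse rate decays linearly in time.
    Fit 1/rate = a + b*t by ordinary least squares and return the time
    tf = -a/b where the fitted line crosses zero."""
    inv = [1.0 / r for r in rates]
    n = len(times)
    mt = sum(times) / n
    mi = sum(inv) / n
    b = sum((t - mt) * (y - mi) for t, y in zip(times, inv)) / \
        sum((t - mt) ** 2 for t in times)
    a = mi - b * mt
    return -a / b

# Synthetic run with true failure time tf = 100:
ts = list(range(0, 90, 5))
rs = [1.0 / (100.0 - t) for t in ts]
tf_est = forecast_failure_time(ts, rs)   # converges on 100
```

With noisy, non-Gaussian real data the estimate fluctuates and only converges as failure approaches, which is exactly the prospective-versus-retrospective bias the experiment described here is designed to expose.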
In a laboratory setting, mechanical, hydraulic, and rock physical properties are known to change in systematic ways prior to catastrophic failure, often with significant non-Gaussian fluctuations about the mean signal at a given time, for example in the rate of remotely-sensed acoustic emissions. The effectiveness of such signals in real-time forecasting has never been tested before in a controlled laboratory setting, and previous work has often been qualitative in nature and subject to retrospective selection bias, though it has often been invoked as a basis for forecasting natural hazard events such as volcanoes and earthquakes. Here we describe a collaborative experiment in real-time data assimilation to explore the limits of predictability of rock failure in a best-case scenario. Data are streamed from a remote rock deformation laboratory to a user-friendly portal, where several proposed physical/stochastic models can be analysed in parallel in real time, using a variety of statistical fitting techniques, including least squares regression, maximum likelihood fitting, Markov-chain Monte Carlo, and Bayesian analysis. The results are posted and regularly updated on the web site prior to catastrophic failure, to ensure a true and verifiable prospective test of forecasting power. Preliminary tests on synthetic data with known non-Gaussian statistics show how forecasting power is likely to evolve in the live experiments. In general the predicted failure time does converge on the real failure time, illustrating the bias associated with the 'benefit of hindsight' in retrospective analyses 8. Ground-motion parameters of the southwestern Indiana earthquake of 18 June 2002 and the disparity between the observed and predicted values USGS Publications Warehouse Street, R.; Wiegand, J.; Woolery, E.W.; Hart, P. 2005-01-01 The M 4.5 southwestern Indiana earthquake of 18 June 2002 triggered 46 blast monitors in Indiana, Illinois, and Kentucky.
The resulting free-field particle velocity records, along with similar data from previous earthquakes in the study area, provide a clear standard for judging the reliability of current maps for predicting ground motions at frequencies greater than 2 Hz in southwestern Indiana and southeastern Illinois. Peak horizontal accelerations and velocities, and 5%-damped pseudo-accelerations for the earthquake, generally exceeded ground motions predicted for the top of the bedrock by factors of 2 or more, even after soil amplifications were taken into consideration. It is suggested, but not proven, that the low shear-wave velocity and weathered bedrock in the area are also amplifying the higher-frequency ground motions that have been repeatedly recorded by the blast monitors in the study area. It is also shown that there is a good correlation between the peak ground motions and 5% pseudo-accelerations recorded for the event, and the Modified Mercalli intensities interpreted for the event by the U.S. Geological Survey. 9. Research on Optimal Observation Scale for Damaged Buildings after Earthquake Based on Optimal Feature Space NASA Astrophysics Data System (ADS) Chen, J.; Chen, W.; Dou, A.; Li, W.; Sun, Y. 2018-04-01 A new information extraction method for damaged buildings, rooted in an optimal feature space, is put forward on the basis of the traditional object-oriented method. In this new method, the ESP (estimate of scale parameter) tool is used to optimize the segmentation of the image. Then the distance matrix and minimum separation distance of all kinds of surface features are calculated through sample selection to find the optimal feature space, which is finally applied to extract the image of damaged buildings after an earthquake. The overall extraction accuracy reaches 83.1%, with a kappa coefficient of 0.813.
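Accuracy figures like the kappa coefficient quoted in this record correct raw classification agreement for chance agreement. A self-contained sketch of Cohen's kappa from a confusion matrix (the matrix values below are hypothetical, not the study's data):

```python
def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix (rows = reference
    classes, columns = predicted classes): observed agreement corrected
    for the agreement expected by chance from the marginal totals."""
    n = sum(sum(row) for row in confusion)
    observed = sum(confusion[i][i] for i in range(len(confusion))) / n
    expected = sum(
        sum(confusion[i]) * sum(row[i] for row in confusion)
        for i in range(len(confusion))
    ) / (n * n)
    return (observed - expected) / (1.0 - expected)

# Hypothetical 2-class result (damaged vs. intact building pixels):
k = cohens_kappa([[90, 10],
                  [12, 88]])   # ~0.78
```

Kappa near 0.8, as reported here, indicates strong agreement beyond what the class proportions alone would produce.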
Compared with the traditional object-oriented method, the new information extraction method greatly improves extraction accuracy and efficiency, and has good potential for wider use in the information extraction of damaged buildings. In addition, the new method can be applied to images of damaged buildings at different resolutions after an earthquake, in order to seek the optimal observation scale of damaged buildings through accuracy evaluation. The optimal observation scale of damaged buildings is estimated to be between 1 m and 1.2 m, which provides a reference for future information extraction of damaged buildings. 10. How can we transfer scientific knowledge to citizens? : Case studies from huge earthquake and tsunami researches NASA Astrophysics Data System (ADS) Kitazato, Hiroshi; Kijima, Akihiro; Kogure, Kazuhiro; Fujikura, Katsunori 2017-04-01 On March 11, 2011, a huge earthquake and tsunamis struck the coastal regions of Northeast Japan. Coastal infrastructure collapsed under the high tsunami waves, and marine ecosystems were strongly disturbed by the earthquake and tsunamis. TEAMS (Tohoku Ecosystem-Associated Marine Sciences) was started to monitor the recovery process of marine ecosystems. The project will continue for ten years: the first five years mainly monitored the recovery process; now we must transfer our knowledge to fishermen and citizens for the restoration of fisheries and social systems. But how can we actually transfer our knowledge from science to citizens? This is a new experience for us. Socio-technology provides a "high-quality risk communication" model of how scientific knowledge and technologies pass from scientific communities to citizens. The steps progress as follows: "observation, measurements and data" → "modeling and synthesis" → "information process" → "delivery to society" → "take action in society". These steps trace the detailed transition from inter-disciplinarity to trans-disciplinarity in science and technology.
In our presentation, we plan to show a couple of case studies that are moving forward from science to society. 11. Defeating Earthquakes NASA Astrophysics Data System (ADS) Stein, R. S. 2012-12-01 our actions. Using these global datasets will help to make the model as uniform as possible. The model must be built by scientists in the affected countries with GEM's support, augmented by their insights and data. The model will launch in 2014; to succeed it must be open, international, independent, and continuously tested. But the mission of GEM is not just the likelihood of ground shaking, but also gauging the economic and social consequences of earthquakes, which greatly amplify the losses. For example, should the municipality of Istanbul retrofit schools, or increase its insurance reserves and recovery capacity? Should a homeowner in a high-risk area move or strengthen her building? This is why GEM is a public-private partnership. GEM's fourteen public sponsors and eight non-governmental organization members are standing up for the developing world. To extend GEM into the financial world, we draw upon the expertise of companies. GEM's ten private sponsors have endorsed the acquisition of public knowledge over private gain. In a competitive world, this is a courageous act. GEM is but one link in a chain of preparedness: from earth science and engineering research, through groups like GEM, to mitigation, retrofit-or-relocate decisions, building codes and insurance, and finally to prepared hospitals, schools, and homes. But it is a link that our community can make strong. 12. Applications of research from the U.S. Geological Survey program, assessment of regional earthquake hazards and risk along the Wasatch Front, Utah USGS Publications Warehouse Gori, Paula L.
1993-01-01 INTERACTIVE WORKSHOPS: ESSENTIAL ELEMENTS OF THE EARTHQUAKE HAZARDS RESEARCH AND REDUCTION PROGRAM IN THE WASATCH FRONT, UTAH: Interactive workshops provided the forum and stimulus necessary to foster collaboration among the participants in the multidisciplinary, 5-yr program of earthquake hazards reduction in the Wasatch Front, Utah. The workshop process validated well-documented social science theories on the importance of interpersonal interaction, including interaction between researchers and users of research, to increase the probability that research will be relevant to the user's needs and, therefore, more readily used. REDUCING EARTHQUAKE HAZARDS IN UTAH: THE CRUCIAL CONNECTION BETWEEN RESEARCHERS AND PRACTITIONERS: Complex scientific and engineering studies must be translated for and transferred to nontechnical personnel for use in reducing earthquake hazards in Utah. The three elements needed for effective translation (likelihood of occurrence, location, and severity of potential hazards) and the three elements needed for effective transfer (delivery, assistance, and encouragement) are described and illustrated for Utah. The importance of evaluating and revising earthquake hazard reduction programs and their components is emphasized. More than 30 evaluations of various natural hazard reduction programs and techniques are introduced. This report was prepared for research managers, funding sources, and evaluators of the Utah earthquake hazard reduction program who are concerned about effectiveness. An overview of the Utah program is provided for those researchers, engineers, planners, and decisionmakers, both public and private, who are committed to reducing human casualties, property damage, and interruptions of socioeconomic systems.
PUBLIC PERCEPTIONS OF THE IMPLEMENTATION OF EARTHQUAKE MITIGATION POLICIES ALONG THE WASATCH FRONT IN UTAH: The earthquake hazard potential along the Wasatch Front in Utah has been well defined by a number of scientific and 13. Earthquake and Tsunami Disaster Mitigation in the Marmara Region and Disaster Education in Turkey Part3 NASA Astrophysics Data System (ADS) Kaneda, Yoshiyuki; Ozener, Haluk; Meral Ozel, Nurcan; Kalafat, Dogan; Ozgur Citak, Seckin; Takahashi, Narumi; Hori, Takane; Hori, Muneo; Sakamoto, Mayumi; Pinar, Ali; Oguz Ozel, Asim; Cevdet Yalciner, Ahmet; Tanircan, Gulum; Demirtas, Ahmet 2017-04-01 There have been many destructive earthquakes and tsunamis in the world. Recent events include the 2011 East Japan Earthquake and Tsunami, the 2015 Nepal Earthquake, and the 2016 Kumamoto Earthquake in Japan, and very recently a destructive earthquake occurred in central Italy. In Turkey, the destructive 1999 Izmit Earthquake occurred along the North Anatolian Fault (NAF). The NAF crosses the Sea of Marmara, and its only remaining "seismic gap" lies beneath the Sea of Marmara. Istanbul, a high-population city comparable to Tokyo, is located on the Sea of Marmara, where compound damage including tsunami and liquefaction is expected when the next destructive Marmara earthquake occurs. Istanbul's seismic risk appears similar to that of Tokyo facing a Nankai Trough or metropolitan earthquake. Japanese and Turkish researchers can therefore share their experiences from past damaging earthquakes and prepare for future large earthquakes in cooperation with each other. To that end, in 2013 Japan and Turkey agreed to start a multidisciplinary research project, MarDiM SATREPS.
The project aims to raise preparedness for possible large-scale earthquake and tsunami disasters in the Marmara region, and it has four research groups with the following goals. 1) The first is the Marmara earthquake source region observational research group. This group has four sub-groups: seismicity, geodesy, electromagnetics, and trench analyses. Preliminary results, such as seismicity and seafloor crustal deformation in the Sea of Marmara, have already been achieved. 2) The second group focuses on scenario research of earthquake occurrence along the North Anatolian Fault and precise tsunami simulation in the Marmara region. Research results from this group will form the model of earthquake occurrence scenarios in the Sea of Marmara and the 14. A consistent and uniform research earthquake catalog for the AlpArray region: preliminary results. NASA Astrophysics Data System (ADS) Molinari, I.; Bagagli, M.; Kissling, E. H.; Diehl, T.; Clinton, J. F.; Giardini, D.; Wiemer, S. 2017-12-01 The AlpArray initiative (www.alparray.ethz.ch) is a large-scale European collaboration (~50 institutes involved) to study the entire Alpine orogen at high resolution with a variety of geoscientific methods. AlpArray provides unprecedentedly uniform station coverage for the region, with more than 650 broadband seismic stations, 300 of which are temporary. The AlpArray Seismic Network (AASN) is a joint effort of 25 institutes from 10 nations; it has operated since January 2016 and is expected to continue until the end of 2018. In this study, we establish a uniform earthquake catalogue for the Greater Alpine region during the operation period of the AASN, with a target completeness of M2.5. The catalog has two main goals: 1) to calculate consistent and precise hypocenter locations, and 2) to provide preliminary but uniform magnitude calculations across the region.
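Catalog building of this kind starts with event detection; this record's workflow uses an STA/LTA trigger, which can be sketched in a few lines (window lengths and threshold below are illustrative, not the study's settings):

```python
def sta_lta(signal, nsta, nlta):
    """Classic STA/LTA detector: the ratio of a short-term average to a
    long-term average of signal energy (squared amplitude), computed
    sample by sample.  A trigger is declared where the ratio exceeds a
    chosen threshold."""
    ratios = []
    for i in range(nlta, len(signal)):
        sta = sum(x * x for x in signal[i - nsta:i]) / nsta
        lta = sum(x * x for x in signal[i - nlta:i]) / nlta
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# Synthetic trace: low-amplitude noise with an "event" burst.  The
# ratio jumps when the burst enters the short window.
trace = [0.1] * 200 + [1.0] * 20 + [0.1] * 80
r = sta_lta(trace, nsta=10, nlta=100)
triggered = max(r) > 3.0
```

Production detectors (e.g. those in ObsPy) use recursive averages for speed and add de-trigger logic, but the principle is the same: the short window responds to an arrival faster than the long window that tracks background noise.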
The procedure is based on automatic high-quality P- and S-wave pickers, providing consistent phase arrival times in combination with a picking quality assessment. First, we detect all events in the region in 2016/2017 using an STA/LTA based detector. Among the detected events, we select 50 geographically homogeneously distributed events with magnitudes ≥2.5, representative of the entire catalog. We manually pick the selected events to establish a consistent P- and S-phase reference data set, including arrival-time uncertainties. The reference data are used to adjust the automatic pickers and to assess their performance. In a first iteration, a simple P-picker algorithm is applied to the entire dataset, providing initial picks for the advanced MannekenPix (MPX) algorithm. In a second iteration, the MPX picker provides consistent and reliable automatic first-arrival P picks together with a pick-quality estimate. The derived automatic P picks are then used as initial values for a multi-component S-phase picking algorithm. Subsequently, automatic picks of all well-locatable earthquakes will be considered to calculate 15. Research for Stakeholders: Delivering the ShakeOut Earthquake Scenario to Golden Guardian Emergency Exercise Planners NASA Astrophysics Data System (ADS) Perry, S. C.; Holbrook, C. C. 2008-12-01 The ShakeOut Scenario of a magnitude 7.8 earthquake on the southern San Andreas Fault was developed to fit the needs of end users, particularly emergency managers at Federal, State, and local levels. Customization has continued after initial publication. The Scenario, a collaboration among some 300 experts in physical and social sciences, engineering, and industry, was released in May 2008 to a key planning conference for the November 2008 Golden Guardian Exercise series. According to long-standing observers, the 2008 exercise is the most ambitious of their experience.
The scientific foundation has attracted a large number of participants and there are already requests to continue use of the Scenario in 2009. Successful exercises cover a limited range of capabilities, in order to test performance in measurable ways, and to train staff without overwhelming them. Any one exercise would fail if it attempted to capture the complexity of impacts from a major earthquake. Instead, exercise planners have used the Scenario like a magnifying glass to identify risk and capabilities most critical to their own jurisdictions. Presentations by Scenario scientists and a 16-page narrative provided an initial overview. However, many planners were daunted in attempts to extract details from a 300-page report, 12 supplemental studies, and 10 appendices, or in attempts to cast the reality into straightforward events to drive successful exercises. Thus we developed an evolving collection of documents, presentations, and consultations that included impacts to specific jurisdictions; distillations of damages and consequences; and annotated lists of capabilities and situations to consider. Some exercise planners needed realistic extrapolations beyond posited damages; others sought reality checks; yet others needed new formats or perspectives. Through all this, it was essential to maintain flexibility, assisting planners to adjust findings where appropriate, while indicating why some results 16. Twitter predicts citation rates of ecological research USGS Publications Warehouse Peoples, Brandon K.; Midway, Stephen R.; Sackett, Dana K.; Lynch, Abigail; Cooney, Patrick B. 2016-01-01 The relationship between traditional metrics of research impact (e.g., number of citations) and alternative metrics (altmetrics) such as Twitter activity are of great interest, but remain imprecisely quantified. 
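The analysis this record goes on to describe regresses citation counts on Twitter activity and other predictors. As a toy stand-in for the authors' generalized linear mixed model, a simple log-log least-squares slope captures the qualitative question (the data below are fabricated for illustration only):

```python
import math

def loglog_slope(tweets, citations):
    """Back-of-envelope stand-in for the paper's mixed model: the
    ordinary least-squares slope of log(citations+1) on log(tweets+1).
    A positive slope means more-tweeted articles tend to accrue more
    citations; it says nothing about causation."""
    xs = [math.log(t + 1) for t in tweets]
    ys = [math.log(c + 1) for c in citations]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)

# Toy data with a positive tweet-citation association:
slope = loglog_slope([0, 2, 5, 10, 40], [1, 3, 4, 8, 20])
```

The actual study additionally controls for journal impact factor and time since publication as fixed effects with journal as a grouping factor, which is what allows it to rank the predictors' relative importance.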
We used generalized linear mixed modeling to estimate the relative effects of Twitter activity, journal impact factor, and time since publication on Web of Science citation rates of 1,599 primary research articles from 20 ecology journals published from 2012–2014. We found a strong positive relationship between Twitter activity (i.e., the number of unique tweets about an article) and number of citations. Twitter activity was a more important predictor of citation rates than 5-year journal impact factor. Moreover, Twitter activity was not driven by journal impact factor; the ‘highest-impact’ journals were not necessarily the most discussed online. The effect of Twitter activity was only about a fifth as strong as time since publication; accounting for this confounding factor was critical for estimating the true effects of Twitter use. Articles in impactful journals can become heavily cited, but articles in journals with lower impact factors can generate considerable Twitter activity and also become heavily cited. Authors may benefit from establishing a strong social media presence, but should not expect research to become highly cited solely through social media promotion. Our research demonstrates that altmetrics and traditional metrics can be closely related, but not identical. We suggest that both altmetrics and traditional citation rates can be useful metrics of research impact. 17. Twitter Predicts Citation Rates of Ecological Research. PubMed Peoples, Brandon K; Midway, Stephen R; Sackett, Dana; Lynch, Abigail; Cooney, Patrick B 2016-01-01 The relationship between traditional metrics of research impact (e.g., number of citations) and alternative metrics (altmetrics) such as Twitter activity are of great interest, but remain imprecisely quantified. 
We used generalized linear mixed modeling to estimate the relative effects of Twitter activity, journal impact factor, and time since publication on Web of Science citation rates of 1,599 primary research articles from 20 ecology journals published from 2012-2014. We found a strong positive relationship between Twitter activity (i.e., the number of unique tweets about an article) and number of citations. Twitter activity was a more important predictor of citation rates than 5-year journal impact factor. Moreover, Twitter activity was not driven by journal impact factor; the 'highest-impact' journals were not necessarily the most discussed online. The effect of Twitter activity was only about a fifth as strong as time since publication; accounting for this confounding factor was critical for estimating the true effects of Twitter use. Articles in impactful journals can become heavily cited, but articles in journals with lower impact factors can generate considerable Twitter activity and also become heavily cited. Authors may benefit from establishing a strong social media presence, but should not expect research to become highly cited solely through social media promotion. Our research demonstrates that altmetrics and traditional metrics can be closely related, but not identical. We suggest that both altmetrics and traditional citation rates can be useful metrics of research impact. 18. Demand surge following earthquakes USGS Publications Warehouse Olsen, Anna H. 2012-01-01 Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. 
The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes. 19. Bringing science from the top of the world to the rest of the world: using video to describe earthquake research in Nepal following the devastating 2015 M7.8 Gorkha earthquake NASA Astrophysics Data System (ADS) Karplus, M. S.; Barajas, A.; Garibay, L. 2016-12-01 In response to the April 25, 2015 M7.8 earthquake on the Main Himalayan Thrust in Nepal, NSF Geosciences funded a rapid seismological response project entitled NAMASTE (Nepal Array Measuring Aftershock Seismicity Trailing Earthquake). This project included the deployment, maintenance, and demobilization of a network of 45 temporary seismic stations from June 2015 to May 2016. During the demobilization of the seismic network, video footage was recorded to tell the story of the NAMASTE team's seismic research in Nepal using short movies. In this presentation, we will describe these movies and discuss our strategies for effectively communicating this research to both academic and general audiences, with the goals of promoting earthquake hazard and international awareness and inspiring enthusiasm for learning about and participating in science research. For example, an initial screening of these videos took place in an Introduction to Geology class at the University of Texas at El Paso to obtain feedback from approximately 100 first-year students with only a basic geology background.
The feedback was then used to inform final cuts of the video suitable for a range of audiences, as well as to help guide future videography of field work. The footage is also being cut into a short, three-minute video to be featured on the website of The University of Texas at El Paso, home to several of the NAMASTE team researchers. 20. NASA's Earth Science Research and Environmental Predictions NASA Technical Reports Server (NTRS) Hilsenrath, E. 2004-01-01 NASA's Earth Science program began in the 1960s with cloud imaging satellites used for weather observations. A fleet of satellites is now in orbit to investigate the Earth science system and uncover the connections between land, oceans, and the atmosphere. Satellite systems using an array of active and passive remote sensors are used to search for answers to two questions: how is the Earth changing, and what are the consequences for life on Earth? The answers to these questions can be used for applications that serve societal needs and contribute to decision support systems for weather, hazard, and air quality predictions and mitigation of adverse effects. Partnerships with operational agencies using NASA's observational capabilities are now being explored. The system of the future will require new technology and data assimilation systems, including data and models that will be used for forecasts that respond to user needs. 1. Research on regional numerical weather prediction NASA Technical Reports Server (NTRS) Kreitzberg, C. W. 1976-01-01 Extension of the predictive power of dynamic weather forecasting to scales below the conventional synoptic or cyclonic scales in the near future is assessed. Lower costs per computation, more powerful computers, and a 100 km mesh over the North American area (with a coarser mesh extending beyond it) are noted at present.
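Refining such a mesh is expensive because the cost compounds: one factor of the refinement ratio per refined spatial dimension, plus one more when the time step must shrink to match (a CFL-style constraint). A quick arithmetic check (the function is our own illustration):

```python
def refinement_cost_factor(dims_refined=3, refine=2, time_step_refined=True):
    """Cost multiplier for refining a grid-point model: `refine` per
    refined spatial dimension, times one more factor of `refine` if the
    time step shrinks by the same ratio."""
    factor = refine ** dims_refined
    if time_step_refined:
        factor *= refine
    return factor

# Halving the mesh in x, y, and z while halving the time step:
cost = refinement_cost_factor()   # 2**3 * 2 = 16
```

This is the source of the 16-fold figure for doubling resolution that limited-area modeling studies of this era cite when weighing domain size against forecast length.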
Doubling the resolution even locally (to a 50 km mesh) would entail a 16-fold increase in costs (including vertical resolution and halving the time interval), and constraints on domain size and length of forecast. Boundary conditions would be provided by the surrounding 100 km mesh, and time-varying lateral boundary conditions can be considered to handle moving phenomena. More physical processes to treat, more efficient numerical techniques, and faster computers (improved software and hardware) backing up satellite and radar data could produce further improvements in forecasting in the 1980s. Boundary layer modeling, initialization techniques, and quantitative precipitation forecasting are singled out among key tasks. 2. Market Research: Faster, Smarter and Predictive DTIC Science & Technology 2015-08-01 for acquisition workforce, and market research report generation. MRCOE Release 3 will include full transition of capability to strategic platform...2015 Wesley, deputy director for technology and innovation, is acting director of the Department of Defense Office of Small Business Programs (OSBP)...where Chowdhury provides senior management support. SPECIAL ISSUE BBP 3.0 Through implementation of the "Increasing Small Business 3. An Offshore Geophysical Network in the Pacific Northwest for Earthquake and Tsunami Early Warning and Hazard Research NASA Astrophysics Data System (ADS) Wilcock, W. S. D.; Schmidt, D. A.; Vidale, J. E.; Harrington, M.; Bodin, P.; Cram, G.; Delaney, J. R.; Gonzalez, F. I.; Kelley, D. S.; LeVeque, R. J.; Manalang, D.; McGuire, C.; Roland, E. C.; Tilley, J.; Vogl, C. J.; Stoermer, M. 2016-12-01 The Cascadia subduction zone hosts catastrophic earthquakes every few hundred years. On land, there are extensive geophysical networks available to monitor the subduction zone, but since the locked portion of the plate boundary lies mostly offshore, these networks are ideally complemented by seafloor observations.
Such considerations helped motivate the development of scientific cabled observatories that cross the subduction zone at two sites off Vancouver Island and one off central Oregon, but these have a limited spatial footprint along the strike of the subduction zone. The Pacific Northwest Seismic Network is leading a collaborative effort to implement an earthquake early warning system in Washington and Oregon using data streams from land networks as well as the few existing offshore instruments. For subduction zone earthquakes that initiate offshore, this system will provide a warning. However, real-time offshore instrumentation along the entire subduction zone would improve the system's reliability and accuracy, add up to 15 s to the warning time, and ensure an early warning for coastal communities near the epicenter. Furthermore, real-time networks of seafloor pressure sensors above the subduction zone would enable monitoring and contribute to accurate predictions of the incoming tsunami. There is also strong scientific motivation for offshore monitoring. We lack a complete knowledge of the plate convergence rate and direction. Measurements of steady deformation and observations of transient processes such as fluid pulsing, microseismic cycles, tremor, and slow slip are necessary for assessing the dimensions of the locked zone and its along-strike segmentation. Long-term monitoring will also provide baseline observations that can be used to detect and evaluate changes in the subduction environment. There are significant engineering challenges to be solved to ensure the system is sufficiently reliable and maintainable. It must provide

4. Earthquake Early Warning: User Education and Designing Effective Messages

NASA Astrophysics Data System (ADS)

Burkett, E. R.; Sellnow, D. D.; Jones, L.; Sellnow, T. L.

2014-12-01

The U.S.
Geological Survey (USGS) and partners are transitioning from test-user trials of a demonstration earthquake early warning system (ShakeAlert) to deciding and preparing how to release earthquake early warning information, alert messages, and products to the public and other stakeholders. An earthquake early warning system uses seismic station networks to rapidly gather information about an earthquake in progress and send notifications to user devices ahead of the arrival of potentially damaging ground shaking at their locations. Earthquake early warning alerts can thereby allow time for actions to protect lives and property before the damaging shaking arrives, if users are properly educated on how to use and react to such notifications. A collaborative team of risk-communication researchers and earth scientists is researching the effectiveness of a chosen subset of potential earthquake early warning interface designs and messages, which could be displayed on a device such as a smartphone. Preliminary results indicate, for instance, that users prefer alerts that include (1) a map to relate their location to the earthquake and (2) instructions for what to do in response to the expected level of shaking. A number of important factors must be considered to design a message that will promote appropriate self-protective behavior. While users prefer to see a map, how much information can be processed in limited time? Are graphical representations of wavefronts helpful or confusing? The most important factor in promoting a helpful response is the predicted earthquake intensity, or how strong the expected shaking will be at the user's location. Unlike Japanese users of early warning, few Californians are familiar with the earthquake intensity scale, so we are exploring how differentiating instructions between intensity levels (e.g., "Be aware" for lower shaking levels and "Drop, cover, hold on" at high levels) can be paired with self-directed supplemental
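The warning times at stake in systems like ShakeAlert come from the lag between fast P waves (used for detection) and the slower, damaging S waves. A back-of-the-envelope sketch of that lead time, using illustrative wave speeds, station distance, and processing latency rather than any actual ShakeAlert parameters:

```python
# Rough earthquake-early-warning lead-time estimate. All numbers here
# (wave speeds, station distance, processing latency) are illustrative
# assumptions, not parameters of the ShakeAlert system.

VP = 6.5  # P-wave speed, km/s (typical crustal value)
VS = 3.5  # S-wave speed, km/s

def warning_time(dist_km, station_dist_km=30.0, processing_s=3.0):
    """Seconds between alert issuance and S-wave arrival at a site
    dist_km from the epicenter. The alert goes out once P waves reach
    the nearest station (station_dist_km away) plus processing time."""
    detection = station_dist_km / VP + processing_s
    s_arrival = dist_km / VS
    return max(0.0, s_arrival - detection)

for d in (20, 50, 100):
    print(f"{d:>3} km from epicenter: ~{warning_time(d):.0f} s of warning")
```

Sites near the epicenter get little or no warning (the "blind zone"), which is why the offshore stations discussed in the previous record, by shortening the detection leg, can add seconds of lead time for coastal communities.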
5. The Role of Communication in Post-disaster Research Coordination: Communicating the research moratorium after the 22 February 2011 Mw 6 Christchurch Earthquake in New Zealand

NASA Astrophysics Data System (ADS)

Beaven, S.

2015-12-01

Disasters stimulate research activity by creating comparatively rare post-disaster data, while also increasing the urgency of agency demand for scientific evidence. In the wake of the 2011 Christchurch earthquake disaster in New Zealand, post-disaster research activity was coordinated by a national Natural Hazards Research Platform, in collaboration with response agencies. The focus was on research support for responding agencies, with an emphasis on creating high-quality scientific outcomes. This coordinated research effort did not include independent research activity, which escalated steeply in the weeks after the event. The risks this increased research pressure posed to response operations and impacted populations informed the declaration of a moratorium on research not deemed relevant to the needs of response agencies. This presentation summarizes the communication issues that made it difficult to disseminate the moratorium and to establish the relevance of this decision where it might have been most effective in diminishing these risks: within national and international natural hazard and disaster research communities, other national research communities, across responding agencies and organisations, and among impacted organisations and communities.

6. Earthquake forecasting test for Kanto district to reduce vulnerability of urban mega earthquake disasters

NASA Astrophysics Data System (ADS)

Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

2012-12-01

The Collaboratory for the Study of Earthquake Predictability (CSEP) is a global project on earthquake predictability research. The final goal of this project is to search for the intrinsic predictability of the earthquake rupture process through forecast testing experiments.
The Earthquake Research Institute of the University of Tokyo joined CSEP and started the Japanese testing center, called CSEP-Japan. This testing center provides open access to researchers contributing earthquake forecast models applied to Japan. More than 100 earthquake forecast models have now been submitted to the prospective experiment. The models are separated into 4 testing classes (1 day, 3 months, 1 year and 3 years) and 3 testing regions: an area covering all of Japan including the sea, the Japanese mainland, and the Kanto district. We evaluate the performance of the models with the official suite of tests defined by CSEP. Approximately 300 rounds of experiments have been implemented in total. These results provide new knowledge concerning statistical forecasting models. We have started a study to construct a 3-dimensional earthquake forecasting model for the Kanto district in Japan based on CSEP experiments under the Special Project for Reducing Vulnerability for Urban Mega Earthquake Disasters. Because seismicity of the area ranges from the shallow crust to a depth of 80 km due to the subducting Philippine Sea and Pacific plates, we need to study the effect of the depth distribution. We will develop forecasting models based on the results of 2-D modeling. We defined the 3-D forecasting area in the Kanto region with test classes of 1 day, 3 months, 1 year and 3 years, and magnitudes from 4.0 to 9.0, as in CSEP-Japan. In the first step of the study, we will install the RI10K model (Nanjo, 2011) and the HIST-ETAS models (Ogata, 2011) to determine whether those models perform as well as in the 3-month 2-D CSEP-Japan experiments in the Kanto region before the 2011 Tohoku event (Yokoi et al., in preparation). We use CSEP

7. Earthquake Scaling Relations

NASA Astrophysics Data System (ADS)

Jordan, T. H.; Boettcher, M.; Richardson, E.

2002-12-01

Using scaling relations to understand nonlinear geosystems has been an enduring theme of Don Turcotte's research.
In particular, his studies of scaling in active fault systems have led to a series of insights about the underlying physics of earthquakes. This presentation will review some recent progress in developing scaling relations for several key aspects of earthquake behavior, including the inner and outer scales of dynamic fault rupture and the energetics of the rupture process. The proximate observations of mining-induced, friction-controlled events obtained from in-mine seismic networks have revealed a lower seismicity cutoff at a seismic moment Mmin near 10^9 N m and a corresponding upper frequency cutoff near 200 Hz, which we interpret in terms of a critical slip distance for frictional drop of about 10^-4 m. Above this cutoff, the apparent stress scales as M^(1/6) up to magnitudes of 4-5, consistent with other near-source studies in this magnitude range (see special session S07, this meeting). Such a relationship suggests a damage model in which apparent fracture energy scales with the stress intensity factor at the crack tip. Under the assumption of constant stress drop, this model implies an increase in rupture velocity with seismic moment, which successfully predicts the observed variation in corner frequency and maximum particle velocity. Global observations of oceanic transform faults (OTFs) allow us to investigate a situation where the outer scale of earthquake size may be controlled by dynamics (as opposed to geologic heterogeneity). The seismicity data imply that the effective area for OTF moment release, AE, depends on the thermal state of the fault but is otherwise independent of the fault's average slip rate; i.e., AE ~ AT, where AT is the area above a reference isotherm. The data are consistent with β = 1/2 below an upper cutoff moment Mmax that increases with AT and yield the interesting scaling relation Amax ~ AT^(1/2). Taken together, the OTF
8. Rail-highway crossing accident prediction research results - FY80

DOT National Transportation Integrated Search

1981-01-01

This report presents the results of research performed at the Transportation Systems Center (TSC) dealing with mathematical methods of predicting accidents at rail-highway crossings. The work consists of three parts: Part I - Revised DOT Accid...

9. A prospective earthquake forecast experiment for Japan

NASA Astrophysics Data System (ADS)

Yokoi, Sayoko; Nanjo, Kazuyoshi; Tsuruoka, Hiroshi; Hirata, Naoshi

2013-04-01

One major focus of the current Japanese earthquake prediction research program (2009-2013) is to move toward creating testable earthquake forecast models. For this purpose we started an experiment of forecasting earthquake activity in Japan under the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP) through an international collaboration. We established the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan and to conduct verifiable prospective tests of their model performance. On 1 November 2009, we started the first earthquake forecast testing experiment for the Japan area. We use the unified catalogue compiled by the Japan Meteorological Agency (JMA) as the authorized catalogue. The experiment consists of 12 categories: 4 testing classes with different time spans (1 day, 3 months, 1 year, and 3 years) and 3 testing regions called All Japan, Mainland, and Kanto. A total of 91 models were submitted to CSEP-Japan and are evaluated with the official CSEP suite of forecast-performance tests. In this presentation, we show the results of 5 rounds of the 3-month testing class. The HIST-ETAS7pa, MARFS and RI10K models showed the best scores, based on total log-likelihood, for the All Japan, Mainland and Kanto regions, respectively.
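The "total log-likelihood" used to rank models in these CSEP experiments can be sketched as a joint Poisson log-likelihood over space-magnitude forecast bins, a standard ingredient of the CSEP likelihood tests. The four-bin forecasts below are toy numbers for illustration only:

```python
import math

def poisson_log_likelihood(forecast, observed):
    """Joint log-likelihood of observed bin counts under a gridded rate
    forecast, assuming independent Poisson counts in each bin."""
    ll = 0.0
    for lam, n in zip(forecast, observed):
        # log of the Poisson pmf: n*log(lam) - lam - log(n!)
        ll += n * math.log(lam) - lam - math.lgamma(n + 1)
    return ll

# Toy 4-bin experiment: model A concentrates rate where events occurred,
# model B spreads it uniformly, so A should score higher.
observed = [2, 0, 1, 0]
model_a = [1.8, 0.2, 0.9, 0.1]
model_b = [0.5, 0.5, 0.5, 0.5]
print(poisson_log_likelihood(model_a, observed) >
      poisson_log_likelihood(model_b, observed))  # True
```

A higher joint log-likelihood means the forecast assigned more probability to what actually happened, which is the sense in which HIST-ETAS7pa, MARFS and RI10K "showed the best scores" in their regions.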
It was also found that time dependency of the model parameters is not an effective factor for passing the CSEP consistency tests in the 3-month testing class in any region. In particular, the spatial distribution in the All Japan region was too difficult to pass the consistency test because of multiple events falling in a single bin. The number of target events per round in the Mainland region tended to be smaller than the models' expectations during all rounds, which resulted in rejections in the consistency tests because of overestimation. In the Kanto region, the pass ratio of the consistency tests for each model exceeded 80%, which was associated with well-balanced forecasting of event

10. Statistical aspects and risks of human-caused earthquakes

NASA Astrophysics Data System (ADS)

Klose, C. D.

2013-12-01

The seismological community invests ample human capital and financial resources to research and predict risks associated with earthquakes. Industries such as the insurance and re-insurance sector are equally interested in using probabilistic risk models developed by the scientific community to transfer risks. These models are used to predict expected losses due to naturally occurring earthquakes. But what about the risks associated with human-caused earthquakes? Such risk models are largely absent from both industry and academic discourse. In countries around the world, informed citizens are becoming increasingly aware and concerned that this economic bias is not sustainable for long-term economic growth and environmental and human security. Ultimately, citizens look to their government officials to hold industry accountable. In the Netherlands, for example, the hydrocarbon industry is held accountable for causing earthquakes near Groningen. In Switzerland, geothermal power plants were shut down or suspended because they caused earthquakes in the cantons of Basel and St. Gallen.
The public and the private non-extractive industry need access to information about earthquake risks connected with urban and suburban geoengineering activities, including natural gas production through fracking, geothermal energy production, carbon sequestration, mining and water irrigation. This presentation illuminates statistical aspects of human-caused earthquakes with respect to different geologic environments. Statistical findings are based on the first catalog of human-caused earthquakes (Klose 2013). Findings discussed include the odds of dying during a medium-size earthquake set off by geomechanical pollution. Any kind of geoengineering activity causes this type of pollution and increases the likelihood of triggering nearby faults to rupture.

11. Earthquakes in Action: Incorporating Multimedia, Internet Resources, Large-scale Seismic Data, and 3-D Visualizations into Innovative Activities and Research Projects for Today's High School Students

NASA Astrophysics Data System (ADS)

Smith-Konter, B.; Jacobs, A.; Lawrence, K.; Kilb, D.

2006-12-01

The most effective means of communicating science to today's "high-tech" students is through the use of visually attractive and animated lessons, hands-on activities, and interactive Internet-based exercises. To address these needs, we have developed Earthquakes in Action, a summer high school enrichment course offered through the California State Summer School for Mathematics and Science (COSMOS) Program at the University of California, San Diego. The summer course consists of classroom lectures, lab experiments, and a final research project designed to foster geophysical innovations, technological inquiries, and effective scientific communication (http://topex.ucsd.edu/cosmos/earthquakes).
Course content includes lessons on plate tectonics, seismic wave behavior, seismometer construction, fault characteristics, California seismicity, global seismic hazards, earthquake stress triggering, tsunami generation, and geodetic measurements of the Earth's crust. Students are introduced to these topics through lectures-made-fun using a range of multimedia, including computer animations, videos, and interactive 3-D visualizations. These lessons are further reinforced through both hands-on lab experiments and computer-based exercises. Lab experiments include building hand-held seismometers, simulating the frictional behavior of faults using bricks and sandpaper, simulating tsunami generation in a mini wave pool, and using the Internet to collect global earthquake data on a daily basis and map earthquake locations on a large classroom map. Students also use Internet resources like Google Earth and UNAVCO/EarthScope's Jules Verne Voyager Jr. interactive mapping tool to study Earth science on a global scale. All computer-based exercises and experiments developed for Earthquakes in Action have been distributed to teachers participating in the 2006 Earthquake Education Workshop, hosted by the Visualization Center at Scripps Institution of Oceanography (http

12. Resource loss, self-efficacy, and family support predict posttraumatic stress symptoms: a 3-year study of earthquake survivors

PubMed

Warner, Lisa Marie; Gutiérrez-Doña, Benicio; Villegas Angulo, Maricela; Schwarzer, Ralf

2015-01-01

Social support and self-efficacy are regarded as coping resources that may facilitate readjustment after traumatic events. The 2009 Cinchona earthquake in Costa Rica serves as an example of such an event for studying resources that prevent subsequent severity of posttraumatic stress symptoms. At Time 1 (1-6 months after the earthquake in 2009), N=200 survivors were interviewed, assessing resource loss, received family support, and posttraumatic stress response.
At Time 2 in 2012, severity of posttraumatic stress symptoms and general self-efficacy beliefs were assessed. Regression analyses estimated the severity of posttraumatic stress symptoms accounted for by all variables. Moderator and mediator models were examined to understand the interplay of received family support and self-efficacy with posttraumatic stress symptoms. Baseline posttraumatic stress symptoms and resource loss (T1) accounted for significant but small amounts of the variance in the severity of posttraumatic stress symptoms (T2). The main effects of self-efficacy (T2) and social support (T1) were negligible, but social support buffered resource loss, indicating that only less-supported survivors were affected by resource loss. Self-efficacy at T2 moderated the support-stress relationship, indicating that low levels of self-efficacy could be compensated for by higher levels of family support. Receiving family support at T1 enabled survivors to feel self-efficacious, underlining the enabling hypothesis. Receiving social support from relatives shortly after an earthquake was found to be an important coping resource, as it alleviated the association between resource loss and the severity of posttraumatic stress response, compensated for deficits in self-efficacy, and enabled self-efficacy, which was in turn associated with more adaptive adjustment 3 years after the earthquake.

13. Earthquake Forecasting System in Italy

NASA Astrophysics Data System (ADS)

Falcone, G.; Marzocchi, W.; Murru, M.; Taroni, M.; Faenza, L.

2017-12-01

In Italy, after the 2009 L'Aquila earthquake, a procedure was developed for gathering and disseminating authoritative information about the time dependence of seismic hazard to help communities prepare for a potentially destructive earthquake. The most striking time dependency of the earthquake occurrence process is time clustering, which is particularly pronounced in time windows of days and weeks.
The Operational Earthquake Forecasting (OEF) system developed at the Seismic Hazard Center (Centro di Pericolosità Sismica, CPS) of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) is the authoritative source of seismic hazard information for Italian Civil Protection. The philosophy of the system rests on a few basic concepts: transparency, reproducibility, and testability. In particular, the transparent, reproducible, and testable earthquake forecasting system developed at CPS is based on ensemble modeling and on a rigorous testing phase. This phase is carried out according to the guidance proposed by the Collaboratory for the Study of Earthquake Predictability (CSEP), an international infrastructure aimed at quantitatively evaluating earthquake prediction and forecast models through purely prospective and reproducible experiments. The OEF system uses the two most popular short-term models: the Epidemic-Type Aftershock Sequence (ETAS) model and the Short-Term Earthquake Probabilities (STEP) model. Here, we report the results of OEF's 24-hour earthquake forecasting during the main phases of the 2016-2017 sequence in the Central Apennines (Italy).

14. Earthquake Risk Mitigation in the Tokyo Metropolitan area

NASA Astrophysics Data System (ADS)

Hirata, N.; Sakai, S.; Kasahara, K.; Nakagawa, S.; Nanjo, K.; Panayotopoulos, Y.; Tsuruoka, H.

2010-12-01

Seismic disaster risk mitigation in urban areas constitutes a challenge requiring collaboration across scientific, engineering, and social-science fields. Examples of collaborative efforts include research on detailed plate structure with identification of all significant faults; development of dense seismic networks; strong ground motion prediction, which uses information on near-surface seismic site effects and fault models; earthquake-resistant and earthquake-proof structures; and cross-discipline infrastructure for effective risk mitigation just after catastrophic events.
The risk mitigation strategy for the next great earthquake caused by the Philippine Sea plate (PSP) subducting beneath the Tokyo metropolitan area is of major concern, because this plate boundary produced past mega-thrust earthquakes such as the 1703 Genroku earthquake (magnitude M8.0) and the 1923 Kanto earthquake (M7.9), which caused 105,000 fatalities. An M7 or greater (M7+) earthquake in this area at present has high potential to produce devastating loss of life and property, with even greater global economic repercussions. The Central Disaster Management Council of Japan estimates that such an M7+ earthquake would cause 11,000 fatalities and 112 trillion yen (about 1 trillion US$) in economic loss. This earthquake is estimated by the Earthquake Research Committee of Japan to have a 70% probability of occurring within 30 years. In order to mitigate the disaster risk for greater Tokyo, the Special Project for Earthquake Disaster Mitigation in the Tokyo Metropolitan Area (2007-2011) was launched in collaboration with scientists, engineers, and social scientists at institutions nationwide. The results obtained in the respective fields will be integrated before project termination to improve information on strategy assessment for seismic risk mitigation in the Tokyo metropolitan area. In this talk, we give an outline of our project as an example of collaborative research on earthquake risk mitigation. Discussion is extended to our effort in progress and
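Under a time-independent (Poisson) occurrence assumption, a "70% probability in 30 years" figure maps to an equivalent annual rate and annual probability. This is purely an arithmetic illustration, not the renewal-model methodology actually used by the Earthquake Research Committee:

```python
import math

# Convert "70% probability in 30 years" to an annual rate, assuming a
# time-independent (Poisson) occurrence model: P(T years) = 1 - exp(-lam*T).
p_30yr = 0.70
annual_rate = -math.log(1 - p_30yr) / 30     # events per year (lambda)
p_annual = 1 - math.exp(-annual_rate)        # probability in any single year

print(f"equivalent annual rate ~{annual_rate:.3f}/yr, "
      f"annual probability ~{p_annual:.1%}")
```

The point of the exercise is that a headline multi-decade probability corresponds to only a few percent in any given year, which is one reason communicating such forecasts to the public is difficult.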

15. Analog earthquakes

SciTech Connect

Hofmann, R.B.

1995-09-01

Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

16. Methodology to determine the parameters of historical earthquakes in China

Wang, Jian; Lin, Guoliang; Zhang, Zhe

2017-12-01

China is one of the countries with the longest cultural tradition. At the same time, China has suffered very heavy earthquake disasters, so there are abundant earthquake recordings. In this paper, we sketch out historical earthquake sources and research achievements in China. We introduce some basic information about the collections of historical earthquake sources, the establishment of an intensity scale, and the editions of the historical earthquake catalogues. Spatial-temporal and magnitude distributions of historical earthquakes are analyzed briefly. Besides traditional methods, we also illustrate a new approach to amend the parameters of historical earthquakes or even identify candidate zones for large historical or palaeo-earthquakes. In the new method, a relationship is built between instrumentally recorded small earthquakes and strong historical earthquakes. The abundant historical earthquake sources and the achievements of historical earthquake research in China are a valuable cultural heritage for the world.
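One traditional route to the parameters of a historical earthquake is to invert documented felt intensities through an intensity-attenuation relation. A minimal sketch of that inversion, using a hypothetical relation and invented felt reports (the coefficients are placeholders, not the calibration used for the Chinese catalogues):

```python
import math
import statistics

# Hypothetical intensity-attenuation relation:
#   I = C0 + C1*M - C2*log10(R)
# where I is felt intensity, M magnitude, R epicentral distance (km).
# Coefficients are illustrative only.
C0, C1, C2 = 1.5, 1.5, 3.0

def magnitude_from_intensity(intensity, epicentral_distance_km):
    """Invert the assumed attenuation relation for magnitude."""
    return (intensity - C0 + C2 * math.log10(epicentral_distance_km)) / C1

# Invented felt reports: (intensity, distance in km) from historical documents.
reports = [(7.0, 20.0), (6.0, 60.0), (5.0, 150.0)]
estimates = [magnitude_from_intensity(i, r) for i, r in reports]
print(f"magnitude estimate: {statistics.fmean(estimates):.1f}")
```

Averaging over many independent felt reports is what makes the method robust against the unevenness of individual historical accounts.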

17. Academia Sinica, TW E-science to Assist Seismic Observations for Earthquake Research, Monitoring and Hazard Reduction Surrounding the South China Sea

Huang, Bor-Shouh; Liu, Chun-Chi; Yen, Eric; Liang, Wen-Tzong; Lin, Simon C.; Huang, Win-Gee; Lee, Shiann-Jong; Chen, Hsin-Yen

Following the experience of the 2004 giant Sumatra earthquake, seismic and tsunami hazards have been considered important issues in the South China Sea and its surrounding region, and have attracted many seismologists' interest. Currently, more than 25 broadband seismic instruments are operated by the Institute of Earth Sciences, Academia Sinica in northern Vietnam to study the geodynamic evolution of the Red River fracture zone; these were recently redistributed to southern Vietnam to study the geodynamic evolution and deep structure of the South China Sea. Similar stations are planned for deployment in the Philippines in the near future. In this plan, some high-quality stations may become permanent stations with added continuous GPS observations, with instruments maintained and operated by several cooperating institutes, for instance, the Institute of Geophysics of the Vietnam Academy of Science and Technology in Vietnam and the Philippine Institute of Volcanology and Seismology in the Philippines. Finally, those stations are planned to be upgraded to real-time transmission stations for earthquake monitoring and tsunami warning. However, high-speed data transfer among different agencies is always a critical issue for successful network operation. By taking advantage of both the EGEE and EUAsiaGrid e-Infrastructures, the Academia Sinica Grid Computing Centre coordinates researchers from various Asian countries to construct a platform for high-performance data transfer and large parallel computations. Efforts from this data service and a newly built earthquake data centre for data management may greatly improve seismic network performance. Implementation of Grid infrastructure and e-science in this region may assist the development of earthquake research, monitoring, and natural hazard reduction. In the near future, we will continually seek new cooperation from the countries surrounding the South China Sea to install new seismic stations and construct a complete seismic network of the

18. A precast concrete bridge bent designed to re-center after an earthquake : draft research report, August 2008.

DOT National Transportation Integrated Search

2008-08-01

In this study the post-earthquake residual displacements of reinforced concrete bridge bents were investigated. The system had mild steel that was intended to dissipate energy and an unbonded, post-tensioned tendon that was supposed to remain elastic...

19. A precast concrete bridge bent designed to re-center after an earthquake : research report, October 2008.

DOT National Transportation Integrated Search

2008-10-01

In this study the post-earthquake residual displacements of reinforced concrete bridge bents were investigated. The system had mild steel that was intended to dissipate energy and an unbonded, post-tensioned tendon that was supposed to remain elastic...

20. Earthquake precursory events around epicenters and local active faults

2013-05-01

The chain of underground events which are triggered by seismic activity and physical/chemical interactions prior to a shake in the earth's crust may produce surface and above-surface phenomena. During the past decades many researchers have sought the possibility of short-term earthquake prediction using remote sensing data. Currently, there are several theories about the preparation stages of earthquakes, most of which stress rises in heat and seismic waves as the main signs of an impending earthquake. Their differences lie only in the secondary phenomena which are triggered by these events. In any case, with the recent advances in remote sensing sensors and techniques, we are now able to provide wider, more accurate monitoring of land, ocean and atmosphere. Among all theoretical factors, changes in Surface Latent Heat Flux (SLHF), Sea and Land Surface Temperature (SST and LST) and surface chlorophyll-a are the easiest to record from earth-observing satellites. SLHF is the amount of energy exchanged in the form of water vapor between the earth's surface and the atmosphere. Abnormal variations in this factor have been frequently reported as an earthquake precursor during the past years. The accumulated stress in the earth's crust during the preparation phase of earthquakes is said to be the main cause of temperature anomalies weeks to days before the main event and subsequent shakes. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. In the case of oceanic earthquakes, higher temperature at the ocean bed may lead to a higher amount of Chl-a on the sea surface. On the other hand, it has also been suggested that the leak of radon gas which occurs as rocks break during earthquake preparation causes the formation of airborne ions and higher Air Temperature (AT). We have chosen to perform a statistical, long-term, and short-term approach by considering the reoccurrence intervals of past
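A common first-pass way to flag the SLHF or temperature anomalies described above is to compare each observation against a trailing baseline and flag large deviations. A minimal sketch with synthetic data; the window length and threshold are arbitrary choices, not values from any published precursor study:

```python
import statistics

def flag_anomalies(series, window=30, k=2.0):
    """Flag indices where the value exceeds the mean of the preceding
    `window` samples by more than k sample standard deviations."""
    flags = []
    for i in range(window, len(series)):
        past = series[i - window:i]
        mu = statistics.fmean(past)
        sd = statistics.stdev(past)
        if sd > 0 and series[i] > mu + k * sd:
            flags.append(i)
    return flags

# Synthetic daily SLHF-like series: a repeating background pattern
# with one injected spike at day 50.
background = [100.0 + 0.5 * (i % 3) for i in range(60)]
background[50] = 130.0  # injected anomaly
print(flag_anomalies(background))  # -> [50]
```

Note that this only detects anomalies; as the Jackson abstract in this collection stresses, showing that such anomalies are earthquake precursors, rather than weather or instrumental effects, requires prospective testing against complete data.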

1. Spatial Distribution of the Coefficient of Variation for the Paleo-Earthquakes in Japan

Nomura, S.; Ogata, Y.

2015-12-01

Renewal processes, point processes in which intervals between consecutive events are independently and identically distributed, are frequently used to describe the repeating earthquake mechanism and to forecast the next earthquakes. However, one of the difficulties in applying recurrent earthquake models is the scarcity of historical data. Most studied fault segments have few observed earthquakes, or only one, often with poorly constrained historic and/or radiocarbon ages. The maximum likelihood estimate from such a small data set can have a large bias and error, which tends to yield a high probability for the next event within a very short time span when the recurrence intervals have similar lengths. On the other hand, recurrence intervals at a fault depend on average on the long-term slip rate driven by tectonic motion. In addition, recurrence times also fluctuate because of nearby earthquakes or fault activities which encourage or discourage surrounding seismicity. These factors have spatial trends due to the heterogeneity of tectonic motion and seismicity. Thus, this paper introduces a spatial structure on the key parameters of renewal processes for recurrent earthquakes and estimates it using spatial statistics. Spatial variations of the mean and variance parameters of recurrence times are estimated in a Bayesian framework, and the next earthquakes are forecasted by Bayesian predictive distributions. The proposed model is applied to a catalog of recurrent earthquakes in Japan, and its result is compared with the current forecast adopted by the Earthquake Research Committee of Japan.
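The renewal-process forecasting described here can be sketched with a lognormal recurrence model fit by moments. The interval data below are invented for illustration, not drawn from the Japanese paleo-earthquake catalog, and the moment-matching fit stands in for the paper's Bayesian estimation:

```python
import math
import statistics

# Hypothetical paleo-earthquake recurrence intervals (years) on one fault.
intervals = [210.0, 180.0, 250.0, 195.0, 230.0]

mu = statistics.fmean(intervals)
sigma = statistics.stdev(intervals)
cv = sigma / mu  # coefficient of variation: aperiodicity of the cycle
print(f"mean recurrence {mu:.0f} yr, CV {cv:.2f}")

# Lognormal renewal model fit by moments: if T ~ lognormal(m, s2),
# then E[T] = exp(m + s2/2) and CV^2 = exp(s2) - 1.
s2 = math.log(1 + cv**2)
m = math.log(mu) - s2 / 2

def lognorm_cdf(t):
    """P(recurrence interval <= t) under the fitted lognormal model."""
    return 0.5 * (1 + math.erf((math.log(t) - m) / math.sqrt(2 * s2)))

print(f"P(next event within 200 yr of the last) = {lognorm_cdf(200):.2f}")
```

With only five intervals the fitted CV is exactly the small-sample problem the abstract warns about, which motivates borrowing strength from neighboring faults through a spatial prior.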

2. NGO collaboration in community post-disaster reconstruction: field research following the 2008 Wenchuan earthquake in China.

PubMed

Lu, Yi; Xu, Jiuping

2015-04-01

The number of communities affected by disasters has been rising. As a result, non-governmental organisations (NGOs) that attend to community post-disaster reconstruction are often unable to deliver all requirements and have to develop cooperative approaches. However, this collaboration can cause problems because of complex environments, competition for limited resources, and uncoordinated management, all of which result in poor service delivery to the communities, adding to their woes. From extensive field research and case studies conducted in the communities stricken by the Wenchuan earthquake, this paper introduces an integrated collaboration framework for community post-disaster reconstruction with a focus on three types of NGOs: international, government-organised, and civil. The proposed collaboration framework examines the three interrelated components of organisational structure, operational processes, and reconstruction goals/implementation areas. Of great significance in better promoting collaborative participation between NGOs are the crucial concepts of participatory reconstruction, double-layer collaborative networks, and circular review and revision. © 2015 The Author(s). Disasters © Overseas Development Institute, 2015.

3. Real data assimilation for optimization of frictional parameters and prediction of afterslip in the 2003 Tokachi-oki earthquake inferred from slip velocity by an adjoint method

Kano, Masayuki; Miyazaki, Shin'ichi; Ishikawa, Yoichi; Hiyoshi, Yoshihisa; Ito, Kosuke; Hirahara, Kazuro

2015-10-01

Data assimilation is a technique that optimizes the parameters used in a numerical model, subject to the constraint of the model dynamics, to achieve a better fit to observations. The optimized parameters can then be used for subsequent prediction with the numerical model, and the predicted physical variables are presumably closer to future observations, at least compared with those obtained without optimization through data assimilation. In this work, an adjoint data assimilation system is developed for optimizing a relatively large number of spatially inhomogeneous frictional parameters during the afterslip period, in which the physical constraints are a quasi-dynamic equation of motion and a laboratory-derived rate- and state-dependent friction law that describe the temporal evolution of slip velocity at subduction zones. The observed variable is the estimated slip velocity on the plate interface. Before applying this method to real data assimilation for the afterslip of the 2003 Tokachi-oki earthquake, a synthetic data assimilation experiment is conducted to examine the feasibility of optimizing the frictional parameters in the afterslip area. It is confirmed that the current system is capable of optimizing the frictional parameters A-B, A and L by adopting the physical constraint based on a numerical model, provided the observations capture the acceleration and decay phases of slip on the plate interface. On the other hand, the frictional parameters can hardly be constrained in regions where the amplitude of afterslip is less than 1.0 cm d-1. Next, real data assimilation for the 2003 Tokachi-oki earthquake is conducted, incorporating slip velocity data inferred from time-dependent inversion of Global Navigation Satellite System time series. The optimized values of A-B, A and L are O(10 kPa), O(102 kPa) and O(10 mm), respectively. The optimized frictional parameters yield a better fit to the observations and better prediction skill of slip
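The adjoint system itself cannot be reproduced from an abstract, but the underlying idea, tuning a frictional parameter against slip-velocity observations, can be sketched with a toy rate-strengthening afterslip decay V(t) = V0 / (1 + t/tr), whose relaxation time tr stands in for the frictional parameters; the values and the brute-force optimizer (in place of an adjoint gradient) are illustrative assumptions:

```python
import random

def afterslip_velocity(t, v0, tr):
    """Toy rate-strengthening afterslip decay: V(t) = V0 / (1 + t/tr)."""
    return v0 / (1.0 + t / tr)

# Synthetic "observed" slip velocities with a known relaxation time
random.seed(0)
true_tr, v0 = 50.0, 10.0        # days, cm/day (illustrative values)
times = [float(t) for t in range(1, 200, 5)]
obs = [afterslip_velocity(t, v0, true_tr) * (1.0 + 0.05 * random.gauss(0, 1))
       for t in times]

def misfit(tr):
    """Sum-of-squares misfit between the model and the observations."""
    return sum((afterslip_velocity(t, v0, tr) - o) ** 2
               for t, o in zip(times, obs))

# Brute-force search over tr = 10.0 .. 199.9 days, step 0.1
best_tr = min((tr / 10.0 for tr in range(100, 2000)), key=misfit)
```

An adjoint method computes the gradient of the misfit efficiently for many parameters at once; the grid search here is only feasible because a single parameter is being fit.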

4. Earthquake Simulator Finds Tremor Triggers

ScienceCinema

Johnson, Paul

2018-01-16

Using a novel device that simulates earthquakes in a laboratory setting, a Los Alamos researcher has found that seismic waves, the sounds radiated from earthquakes, can induce earthquake aftershocks, often long after a quake has subsided. The research provides insight into how earthquakes may be triggered and how they recur. Los Alamos researcher Paul Johnson and colleague Chris Marone at Penn State have discovered how wave energy can be stored in certain types of granular materials, like those found along certain fault lines across the globe, and how this stored energy can suddenly be released as an earthquake when hit by relatively small seismic waves far beyond the traditional "aftershock zone" of a main quake. Perhaps most surprising, the researchers have found that the release of energy can occur minutes, hours, or even days after the sound waves pass; the cause of the delay remains a tantalizing mystery.

5. Earthquake precursory events around epicenters and local active faults; the cases of two inland earthquakes in Iran

2012-12-01

The possibility of earthquake prediction on time scales of several days to a few minutes before occurrence has recently stirred interest among researchers. Scientists believe that new theories and explanations of the mechanism of this natural phenomenon are trustworthy and can form the basis of future prediction efforts. During the last thirty years, experimental research has identified pre-earthquake events that are now recognized as confirmed warning signs (precursors) of past known earthquakes. With advances in in-situ measurement devices and data analysis capabilities, and the emergence of satellite-based data collectors, monitoring the earth's surface is now routine. Data providers are supplying researchers all over the world with high-quality, validated imagery and non-imagery data. Surface Latent Heat Flux (SLHF), the amount of energy exchanged in the form of water vapor between the earth's surface and the atmosphere, has frequently been reported as an earthquake precursor in past years. The stress accumulated in the earth's crust during the preparation phase of earthquakes is said to be the main cause of temperature anomalies weeks to days before the main event and subsequent shocks. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. In addition, the leakage of radon gas as rocks break during earthquake preparation causes the formation of airborne ions and higher Air Temperature (AT) prior to the main event. Although co-analysis of direct and indirect observations of precursory events is considered a promising method for successful earthquake prediction in the future, without proper and thorough knowledge of the geological setting, atmospheric factors and geodynamics of earthquake-prone regions we will not be able to identify anomalies due to seismic activity in the earth's crust. Active faulting is a key factor in identification of the

6. Earthquake number forecasts testing

Kagan, Yan Y.

2017-10-01

We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are needed, in particular, to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of the parameters of both the Poisson and NBD distributions on the catalogue magnitude threshold and on the temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest the Poisson distribution can be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values of the skewness and kurtosis increase for smaller magnitude thresholds, and increase even more strongly for small temporal subdivisions of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
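The overdispersion argument can be illustrated with synthetic counts (not the GCMT or PDE catalogues): a negative-binomial variable is a Poisson variable whose rate is gamma-distributed, so its variance exceeds its mean, unlike the pure Poisson case.

```python
import math
import random
import statistics

random.seed(1)

def poisson_sample(rate):
    """Draw a Poisson count by inversion of the CDF."""
    k, u = 0, random.random()
    prob = math.exp(-rate)
    cum = prob
    while u > cum:
        k += 1
        prob *= rate / k
        cum += prob
    return k

def nbd_sample(r, p):
    """Negative-binomial count as a gamma-mixed Poisson (models clustered seismicity)."""
    return poisson_sample(random.gammavariate(r, (1.0 - p) / p))

n = 5000
poisson_counts = [poisson_sample(4.67) for _ in range(n)]
nbd_counts = [nbd_sample(2.0, 0.3) for _ in range(n)]

# Dispersion index variance/mean: ~1 for Poisson, >1 for the overdispersed NBD
disp_poisson = statistics.variance(poisson_counts) / statistics.mean(poisson_counts)
disp_nbd = statistics.variance(nbd_counts) / statistics.mean(nbd_counts)
```

The two samples share the same mean rate by construction, so the contrast in dispersion index isolates the clustering effect that the second NBD parameter captures.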

7. Identifying Future Scientists: Predicting Persistence into Research Training

PubMed Central

2007-01-01

This study used semistructured interviews and grounded theory to look for characteristics among college undergraduates that predicted persistence into Ph.D. and M.D./Ph.D. training. Participants in the summer undergraduate and postbaccalaureate research programs at the Mayo Clinic College of Medicine were interviewed at the start, near the end, and 8–12 months after their research experience. Of more than 200 themes considered, five characteristics predicted those students who went on to Ph.D. and M.D./Ph.D. training or to M.D. training intending to do research: 1) Curiosity to discover the unknown, 2) Enjoyment of problem solving, 3) A high level of independence, 4) The desire to help others indirectly through research, and 5) A flexible, minimally structured approach to the future. Web-based surveys with different students confirmed the high frequency of curiosity and/or problem solving as the primary reason students planned research careers. No evidence was found for differences among men, women, and minority and nonminority students. Although these results seem logical compared with successful scientists, their constancy, predictive capabilities, and sharp contrast to students who chose clinical medicine were striking. These results provide important insights into selection and motivation of potential biomedical scientists and the early experiences that will motivate them toward research careers. PMID:18056303

8. Identifying future scientists: predicting persistence into research training.

PubMed

McGee, Richard; Keller, Jill L

2007-01-01

This study used semistructured interviews and grounded theory to look for characteristics among college undergraduates that predicted persistence into Ph.D. and M.D./Ph.D. training. Participants in the summer undergraduate and postbaccalaureate research programs at the Mayo Clinic College of Medicine were interviewed at the start, near the end, and 8-12 months after their research experience. Of more than 200 themes considered, five characteristics predicted those students who went on to Ph.D. and M.D./Ph.D. training or to M.D. training intending to do research: 1) Curiosity to discover the unknown, 2) Enjoyment of problem solving, 3) A high level of independence, 4) The desire to help others indirectly through research, and 5) A flexible, minimally structured approach to the future. Web-based surveys with different students confirmed the high frequency of curiosity and/or problem solving as the primary reason students planned research careers. No evidence was found for differences among men, women, and minority and nonminority students. Although these results seem logical compared with successful scientists, their constancy, predictive capabilities, and sharp contrast to students who chose clinical medicine were striking. These results provide important insights into selection and motivation of potential biomedical scientists and the early experiences that will motivate them toward research careers.

9. Identifying Future Scientists: Predicting Persistence into Research Training

ERIC Educational Resources Information Center

McGee, Richard; Keller, Jill L.

2007-01-01

This study used semistructured interviews and grounded theory to look for characteristics among college undergraduates that predicted persistence into Ph.D. and M.D./Ph.D. training. Participants in the summer undergraduate and postbaccalaureate research programs at the Mayo Clinic College of Medicine were interviewed at the start, near the end,…

10. Earthquake Facts

MedlinePlus

... recordings of large earthquakes, scientists built large spring-pendulum seismometers in an attempt to record the long- ... are moving away from one another. The first “pendulum seismoscope” to measure the shaking of the ground ...

11. Hydroclimatic variability and predictability: a survey of recent research

SciTech Connect

Koster, Randal D.; Betts, Alan K.; Dirmeyer, Paul A.

Recent research in large-scale hydroclimatic variability is surveyed, focusing on five topics: (i) variability in general, (ii) droughts, (iii) floods, (iv) land–atmosphere coupling, and (v) hydroclimatic prediction. Moreover, each surveyed topic is supplemented by illustrative examples of recent research, as presented at a 2016 symposium honoring the career of Professor Eric Wood. Altogether, the recent literature and the illustrative examples clearly show that current research into hydroclimatic variability is strong, vibrant, and multifaceted.

12. Hydroclimatic variability and predictability: a survey of recent research

DOE PAGES

Koster, Randal D.; Betts, Alan K.; Dirmeyer, Paul A.; ...

2017-07-25

Recent research in large-scale hydroclimatic variability is surveyed, focusing on five topics: (i) variability in general, (ii) droughts, (iii) floods, (iv) land–atmosphere coupling, and (v) hydroclimatic prediction. Moreover, each surveyed topic is supplemented by illustrative examples of recent research, as presented at a 2016 symposium honoring the career of Professor Eric Wood. Altogether, the recent literature and the illustrative examples clearly show that current research into hydroclimatic variability is strong, vibrant, and multifaceted.

13. Charles Darwin's earthquake reports

Galiev, Shamil

2010-05-01

As 2009 was the 200th anniversary of Darwin's birth, it also marked 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked; volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after it. These effects have sometimes been repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of '…the rending of strata, at a point not very deep below the surface of the earth…' and '…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and its dynamics. These ideas set the tone for the theory of plate tectonics to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: '...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

14. Observation and prediction of dynamic ground strains, tilts, and torsions caused by the Mw 6.0 2004 Parkfield, California, earthquake and aftershocks, derived from UPSAR array observations

USGS Publications Warehouse

Spudich, P.; Fletcher, Joe B.

2008-01-01

The 28 September 2004 Parkfield, California, earthquake (Mw 6.0) and four aftershocks (Mw 4.7-5.1) were recorded on 12 accelerograph stations of the U.S. Geological Survey Parkfield seismic array (UPSAR), an array of three-component accelerographs occupying an area of about 1 km2 located 8.8 km from the San Andreas fault. Peak horizontal acceleration and velocity at UPSAR during the mainshock were 0.45g and 27 cm/sec, respectively. We determined both time-varying and peak values of ground dilatations, shear strains, torsions, tilts, torsion rates, and tilt rates by applying a time-dependent geodetic analysis to the observed array displacement time series. Array-derived dilatations agree fairly well with point measurements made on high sample rate recordings of the Parkfield-area dilatometers (Johnston et al., 2006). Torsion Fourier amplitude spectra agree well with ground velocity spectra, as expected for propagating plane waves. A simple predictive relation, using the predicted peak velocity from the Boore-Atkinson ground-motion prediction relation (Boore and Atkinson, 2007) scaled by a phase velocity of 1 km/sec, predicts observed peak Parkfield and Chi-Chi rotations (Huang, 2003) well. However, rotation rates measured during Mw 5 Ito, Japan, events observed on a gyro sensor (Takeo, 1998) are factors of 5-60 greater than those predicted by our predictive relation. This discrepancy might be caused by a scale dependence in rotation, with rotations measured over a short baseline exceeding those measured over long baselines. An alternative hypothesis is that events having significant non-double-couple mechanisms, like the Ito events, radiate much stronger rotations than double-couple events. If this is true, then rotational observations might provide an important source of new information for monitoring seismicity in volcanic areas.
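For a propagating plane wave, rotation amplitude scales roughly as ground velocity divided by phase velocity, which is the form of the predictive relation described above. A sketch of that scaling with the abstract's phase velocity of 1 km/sec (the exact published coefficients are not reproduced here):

```python
def peak_rotation(pgv_cm_per_s, phase_velocity_km_per_s=1.0):
    """Plane-wave scaling: peak rotation ~ peak ground velocity / phase velocity.

    Takes PGV in cm/s and phase velocity in km/s; returns rotation in radians.
    """
    pgv_m = pgv_cm_per_s / 100.0               # cm/s -> m/s
    c_m = phase_velocity_km_per_s * 1000.0     # km/s -> m/s
    return pgv_m / c_m

# The mainshock peak horizontal velocity at UPSAR was 27 cm/s (from the abstract)
rot = peak_rotation(27.0)
```

The scale dependence discussed in the abstract (short-baseline gyro measurements exceeding long-baseline array estimates) would appear here as a baseline-dependent effective phase velocity.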

15. Evaluation of predictive capacities of biomarkers based on research synthesis.

PubMed

Hattori, Satoshi; Zhou, Xiao-Hua

2016-11-10

The objective of diagnostic studies or prognostic studies is to evaluate and compare predictive capacities of biomarkers. Suppose we are interested in evaluation and comparison of predictive capacities of continuous biomarkers for a binary outcome based on research synthesis. In analysis of each study, subjects are often classified into two groups of the high-expression and low-expression groups according to a cut-off value, and statistical analysis is based on a 2 × 2 table defined by the response and the high expression or low expression of the biomarker. Because the cut-off is study specific, it is difficult to interpret a combined summary measure such as an odds ratio based on the standard meta-analysis techniques. The summary receiver operating characteristic curve is a useful method for meta-analysis of diagnostic studies in the presence of heterogeneity of cut-off values to examine discriminative capacities of biomarkers. We develop a method to estimate positive or negative predictive curves, which are alternative to the receiver operating characteristic curve based on information reported in published papers of each study. These predictive curves provide a useful graphical presentation of pairs of positive and negative predictive values and allow us to compare predictive capacities of biomarkers of different scales in the presence of heterogeneity in cut-off values among studies. Copyright © 2016 John Wiley & Sons, Ltd.
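The positive and negative predictive values underlying such curves follow from sensitivity, specificity, and prevalence via Bayes' rule; a minimal sketch with hypothetical inputs:

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive and negative predictive values via Bayes' rule."""
    tp = sensitivity * prevalence                  # true positives
    fp = (1.0 - specificity) * (1.0 - prevalence)  # false positives
    fn = (1.0 - sensitivity) * prevalence          # false negatives
    tn = specificity * (1.0 - prevalence)          # true negatives
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Hypothetical cut-off giving 80% sensitivity, 90% specificity at 30% prevalence
ppv, npv = predictive_values(0.80, 0.90, 0.30)
```

Sweeping the cut-off (and hence the sensitivity/specificity pair) traces out the pairs of predictive values that the proposed curves summarize across studies.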

16. What kind of disturbances did March 11, 2011 Tohoku Earthquake and Tsunamis leave continental margin ecosystems? : Lessons from five years monitoring research

Kitazato, Hiroshi; Kijima, Akihiro; Kogure, Kazuhiro; Hara, Motoyuki; Nagata, Toshi; Fujikura, Kasunori; Sonoda, Akira

2016-04-01

On March 11, 2011, a huge M9.0 earthquake struck the Japan Trench area off northeast Japan. Vigorous disturbances of marine environments and ecosystems took place in coastal areas, where huge tsunamis swept sediments and organisms away from the coast into the deeper ocean. The distributional patterns of sediments and organisms in coves and bays changed strongly after the tsunamis. Marine ecosystems in northeast Japan were severely disturbed and damaged. Scientists from Tohoku University, the University of Tokyo and JAMSTEC began to monitor how much the marine ecosystem was disturbed and how it may recover. A research team named Tohoku Ecosystem-Associated Marine Sciences has carried out continuous research on marine ecosystems as a ten-year monitoring project funded by MEXT, Japan, since 2011. By 2016, five years had passed since the earthquake and tsunami. What has happened to the marine ecosystems of the Tohoku area during these years? Water-column ecosystems recover from disturbance rather easily. Seaweed communities were strongly damaged, but they are gradually recovering. Sediment communities have not yet recovered, as the sediment distribution differs from that before the earthquake and tsunamis. The greatest difficulties are the scars in people's minds. We scientists try to share scientific activities and results with local people, including fishermen and local governments, for a better understanding of both oceanic conditions and fishery resources. Disaster risk reduction should accelerate with the resilience of community structure, but mental resilience is the most effective way to recover human activities in the damaged areas.

17. USEMS & GLASS: investigator-driven frontier research in earthquake physics. Ground-breaking research in Europe enhances outreach to the general public

Mariano, S.; di Toro, G.; Collettini, C.; Usems Team; Glass Team

2011-12-01

USEMS and GLASS are two projects financed by the European Research Council (ERC) as part of the ERC Starting Grants scheme within the FP7 framework. The rationale behind the funding scheme is to support some of the most promising scientific endeavours in Europe that are being led by young researchers, and to emphasize the excellence of individual ideas rather than specific research areas; in other words, to promote bottom-up frontier research. The general benefits of this rationale are evident in the two ongoing projects that deal with earthquake physics, as these projects are increasingly recognized in their scientific community. We can say that putting excellence at the heart of European research strongly contributes to the construction of a European knowledge-based society. From a researcher's point of view, one of the most challenging aspects of these projects is to approach the general public and convey the projects' results to them, contributing to the construction of a knowledge-based society. Luckily, media interest and the availability of a number of new communication tools facilitate the outreach of scientific achievements. The largest earthquakes of the last ten years (e.g. Sumatra 2004 and Japan 2011) have received widespread attention in the media (TV, the web, newspapers and so on) for months, and successful research projects such as those above also become media protagonists, gaining their space in the media bullring. The USEMS principal investigator and his team have participated in several dissemination events in the mass media, such as interviews with Italian and French national TV broadcasters (RAI Due TG2, RAI Uno Unomattina, Rai Tre Geo & Geo, FRANCE 2); interviews in scientific journals: SCIENCE (Sept. 2010), newspapers and the web (Corriere della Sera, Il Gazzettino, Il Messagero, La Stampa, Libero, Il Mattino, Yahoo, ANSA, AdnKronos and AGI); radio (RadioRai Uno, RadioRai Tre Scienza); and the documentary "Die Eroberung der Alpen" produced by Tangram

18. Earthquake impact scale

USGS Publications Warehouse

Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

2011-01-01

With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and the response needed after an event, through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lead to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake-resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures.
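The dual alerting logic can be sketched as a small function applying the fatality and economic-loss thresholds quoted above; the function name and the rule of taking the more severe of the two alerts are illustrative assumptions, not the published EIS procedure:

```python
def eis_alert(fatalities=None, losses_usd=None):
    """Map impact estimates to an EIS-style alert colour.

    Thresholds follow the abstract: fatality levels 1/100/1,000 and
    economic-loss levels $1M/$100M/$1B for yellow/orange/red.
    """
    order = ["green", "yellow", "orange", "red"]

    def level(value, thresholds):
        # Count how many thresholds the estimate reaches
        return order[sum(1 for t in thresholds if value >= t)]

    alerts = ["green"]
    if fatalities is not None:
        alerts.append(level(fatalities, [1, 100, 1000]))
    if losses_usd is not None:
        alerts.append(level(losses_usd, [1e6, 1e8, 1e9]))
    # Report the more severe of the casualty- and economics-based alerts
    return max(alerts, key=order.index)

# A 150-fatality, $20M-loss event: orange by casualties, yellow by losses
alert = eis_alert(fatalities=150, losses_usd=2e7)
```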
Useful alerts should 19. Earthquakes for Kids MedlinePlus ... across a fault to learn about past earthquakes. Science Fair Projects A GPS instrument measures slow movements of the ground. Become an Earthquake Scientist Cool Earthquake Facts Today in Earthquake History A scientist stands in ... 20. Modified Mercalli Intensity for scenario earthquakes in Evansville, Indiana USGS Publications Warehouse Cramer, Chris; Haase, Jennifer; Boyd, Oliver 2012-01-01 Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the fact that Evansville is close to the Wabash Valley and New Madrid seismic zones, there is concern about the hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake. Earthquake-hazard maps provide one way of conveying such estimates of strong ground shaking and will help the region prepare for future earthquakes and reduce earthquake-caused losses. 1. Earthquake Safety Tips in the Classroom NASA Astrophysics Data System (ADS) Melo, M. O.; Maciel, B. A. P. C.; Neto, R. P.; Hartmann, R. P.; Marques, G.; Gonçalves, M.; Rocha, F. L.; Silveira, G. M. 2014-12-01 The catastrophes induced by earthquakes are among the most devastating ones, causing an elevated number of human losses and economic damages. But, we have to keep in mind that earthquakes don't kill people, buildings do. Earthquakes can't be predicted and the only way of dealing with their effects is to teach the society how to be prepared for them, and how to deal with their consequences. In spite of being exposed to moderate and large earthquakes, most of the Portuguese are little aware of seismic risk, mainly due to the long recurrence intervals between strong events. The acquisition of safe and correct attitudes before, during and after an earthquake is relevant for human security. 
Children play a determinant role in the establishment of a real and long-lasting "culture of prevention", both through action and new attitudes. On the other hand, when children assume correct behaviors, their relatives often change their incorrect behaviors to mimic the correct behaviors of their kids. In the framework of a Parents-in-Science initiative, we started with bi-monthly sessions for children aged 5 - 6 years old and 9 - 10 years old. These sessions, in which parents, teachers and high-school students participate, became part of the school's permanent activities. We start by a short introduction to the Earth and to earthquakes by story telling and by using simple science activities to trigger children curiosity. With safety purposes, we focus on how crucial it is to know basic information about themselves and to define, with their families, an emergency communications plan, in case family members are separated. Using a shaking table we teach them how to protect themselves during an earthquake. We then finish with the preparation on an individual emergency kit. This presentation will highlight the importance of encouraging preventive actions in order to reduce the impact of earthquakes on society. This project is developed by science high-school students and teachers, in 2. Predicting research use in nursing organizations: a multilevel analysis. PubMed Estabrooks, Carole A; Midodzi, William K; Cummings, Greta G; Wallin, Lars 2007-01-01 No empirical literature was found that explained how organizational context (operationalized as a composite of leadership, culture, and evaluation) influences research utilization. Similarly, no work was found on the interaction of individuals and contextual factors, or the relative importance or contribution of forces at different organizational levels to either such proposed interactions or, ultimately, to research utilization. 
To determine independent factors that predict research utilization among nurses, taking into account influences at individual nurse, specialty, and hospital levels. Cross-sectional survey data for 4,421 registered nurses in Alberta, Canada were used in a series of multilevel (three levels) modeling analyses to predict research utilization. A multilevel model was developed in MLwiN version 2.0 and used to: (a) estimate simultaneous effects of several predictors and (b) quantify the amount of explained variance in research utilization that could be apportioned to individual, specialty, and hospital levels. There was significant variation in research utilization (p <.05). Factors (remaining in the final model at statistically significant levels) found to predict more research utilization at the three levels of analysis were as follows. At the individual nurse level (Level 1): time spent on the Internet and lower levels of emotional exhaustion. At the specialty level (Level 2): facilitation, nurse-to-nurse collaboration, a higher context (i.e., of nursing culture, leadership, and evaluation), and perceived ability to control policy. At the hospital level (Level 3): only hospital size was significant in the final model. The total variance in research utilization was 1.04, and the intraclass correlations (the percent contribution by contextual factors) were 4% (variance = 0.04, p <.01) at the hospital level and 8% (variance = 0.09, p <.05) at the specialty level. The contribution attributable to individual factors alone was 87% (variance = 0 3. 
The Scintillation Prediction Observations Research Task (SPORT) Mission NASA Technical Reports Server (NTRS) Spann, James; Swenson, Charles; Durao, Otavio; Loures, Luis; Heelis, Rod; Bishop, Rebecca; Le, Guan; Abdu, Mangalathayil; Krause, Linda; Denardin, Clezio; 2017-01-01 SPORT is a science mission using a 6U CubeSat and integrated ground network that will (1) advance understanding and (2) enable improved predictions of scintillation occurrence that impact GPS signals and radio communications. This is the science of Space Weather. SPORT is an international partnership with NASA, U.S. institutions, the Brazilian National Institute for Space Research (INPE), and the Technical Aeronautics Institute under the Brazilian Air Force Command Department (DCTA/ITA). 4. PubMed Uenishi, Koji 2017-01-01 Normally, an earthquake is considered as a phenomenon of wave energy radiation by rupture (fracture) of solid Earth. However, the physics of dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and generation of strong waves, has not been fully understood yet. Instead, much of former investigation in seismology evaluated earthquake characteristics in terms of kinematics that does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to be explained by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable. 5. 
Rupture, waves and earthquakes PubMed Central UENISHI, Koji 2017-01-01 Normally, an earthquake is considered as a phenomenon of wave energy radiation by rupture (fracture) of solid Earth. However, the physics of dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and generation of strong waves, has not been fully understood yet. Instead, much of former investigation in seismology evaluated earthquake characteristics in terms of kinematics that does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but “extraordinary” phenomena that are difficult to be explained by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable. PMID:28077808 6. Purposes and methods of scoring earthquake forecasts NASA Astrophysics Data System (ADS) Zhuang, J. 2010-12-01 Studies of earthquake prediction or forecasting serve two purposes: one is to give a systematic estimate of earthquake risk in a particular region and period, in order to advise governments and enterprises on disaster reduction; the other is to search for reliable precursors that can be used to improve earthquake predictions or forecasts. The first case requires a complete score, while the latter requires a partial score, which can be used to evaluate whether the forecasts or predictions have some advantage over a well-known model.
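The complete-versus-partial scoring idea described above can be sketched with a simple log-likelihood comparison against a reference model. This is a minimal illustration of scoring a forecast relative to a reference, not Zhuang's gambling score itself; all probabilities and outcomes below are hypothetical.

```python
import math

# Minimal sketch of scoring a candidate earthquake forecast against a
# reference model with a Bernoulli log-likelihood. Illustrative only;
# this is not the gambling scoring method described in the abstract.

def log_likelihood(probs, outcomes):
    """Log-likelihood of binary occurrences (1 = an event occurred in
    the space-time bin, 0 = none) under the forecast probabilities."""
    return sum(
        math.log(p) if occurred else math.log(1.0 - p)
        for p, occurred in zip(probs, outcomes)
    )

# Hypothetical per-bin occurrence probabilities and observed outcomes.
reference = [0.01, 0.02, 0.01, 0.05]
candidate = [0.02, 0.01, 0.01, 0.20]
observed  = [0,    0,    0,    1]

# A positive difference means the candidate outperformed the reference.
gain = log_likelihood(candidate, observed) - log_likelihood(reference, observed)
print(round(gain, 3))
```

A partial score of this kind only ranks the candidate against the chosen reference; a complete score would instead evaluate the forecast's absolute consistency with the observed catalog.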
This study reviews different scoring methods for evaluating the performance of earthquake predictions and forecasts. In particular, the recently developed gambling scoring method can identify strengths of an earthquake prediction algorithm or model that are absent from a reference model, even if its overall performance is no better than that of the reference model. 7. Logging utilization research in the Pacific Northwest: residue prediction and unique research challenges Treesearch Erik C. Berg; Todd A. Morgan; Eric A. Simmons; Stanley J. Zarnoch 2015-01-01 Logging utilization research results have informed land managers of changes in utilization of forest growing stock for more than 40 years. The logging utilization residue ratio (growing stock residue volume divided by mill-delivered volume) can be applied to historic or projected timber harvest volumes to predict woody residue volumes at varied spatial scales. Researchers at the... 8. Global earthquake casualties due to secondary effects: A quantitative analysis for improving PAGER losses USGS Publications Warehouse Wald, David J. 2010-01-01 This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey’s (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER’s overall assessment of earthquake losses and alerting for the likelihood of secondary impacts?
We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra–Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. 9. Global earthquake casualties due to secondary effects: A quantitative analysis for improving rapid loss analyses USGS Publications Warehouse Marano, K.D.; Wald, D.J.; Allen, T.I. 2010-01-01 This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities.
The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009. 10. Predicting Space Weather: Challenges for Research and Operations NASA Astrophysics Data System (ADS) Singer, H. J.; Onsager, T. G.; Rutledge, R.; Viereck, R. A.; Kunches, J. 2013-12-01 Society's growing dependence on technologies and infrastructure susceptible to the consequences of space weather has given rise to increased attention at the highest levels of government as well as inspired the need for both research and improved space weather services. In part, for these reasons, the number one goal of the recent National Research Council report on a Decadal Strategy for Solar and Space Physics is to 'Determine the origins of the Sun's activity and predict the variations in the space environment.' Prediction of conditions in our space environment is clearly a challenge for both research and operations, and we require the near-term development and validation of models that have sufficient accuracy and lead time to be useful to those impacted by space weather. In this presentation, we will provide new scientific results of space weather conditions that have challenged space weather forecasters, and identify specific areas of research that can lead to improved capabilities.
In addition, we will examine examples of customer impacts and requirements as well as the challenges to the operations community to establish metrics that enable the selection and transition of models and observations that can provide the greatest economic and societal benefit. 11. Ionospheric precursors to large earthquakes: A case study of the 2011 Japanese Tohoku Earthquake NASA Astrophysics Data System (ADS) Carter, B. A.; Kellerman, A. C.; Kane, T. A.; Dyson, P. L.; Norman, R.; Zhang, K. 2013-09-01 Researchers have reported ionospheric electron distribution abnormalities, such as electron density enhancements and/or depletions, that they claimed were related to forthcoming earthquakes. In this study, the Tohoku earthquake is examined using ionosonde data to establish whether any otherwise unexplained ionospheric anomalies were detected in the days and hours prior to the event. As the choices for the ionospheric baseline are generally different between previous works, three separate baselines for the peak plasma frequency of the F2 layer, foF2, are employed here; the running 30-day median (commonly used in other works), the International Reference Ionosphere (IRI) model and the Thermosphere Ionosphere Electrodynamic General Circulation Model (TIE-GCM). It is demonstrated that the classification of an ionospheric perturbation is heavily reliant on the baseline used, with the 30-day median, the IRI and the TIE-GCM generally underestimating, approximately describing and overestimating the measured foF2, respectively, in the 1-month period leading up to the earthquake. A detailed analysis of the ionospheric variability in the 3 days before the earthquake is then undertaken, where a simultaneous increase in foF2 and the Es layer peak plasma frequency, foEs, relative to the 30-day median was observed within 1 h before the earthquake. 
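The running 30-day median baseline described above can be sketched as a simple anomaly test: flag a foF2 observation as perturbed when it deviates from the median of the preceding days by more than some threshold. The threshold and data below are illustrative placeholders, not values from the study, which is precisely why the paper stresses that the classification depends heavily on the baseline chosen.

```python
import statistics

# Sketch of a running-median baseline test for foF2, as described
# above. The 25% threshold and the sample values are illustrative
# assumptions, not taken from the study.

def is_anomalous(history_30d, value, threshold=0.25):
    """history_30d: foF2 values (MHz) for the previous 30 days at the
    same local time; value: today's observation. Returns True when the
    relative deviation from the running median exceeds the threshold."""
    baseline = statistics.median(history_30d)
    return abs(value - baseline) / baseline > threshold

history = [7.1, 6.9, 7.3, 7.0, 7.2] * 6  # stand-in for 30 daily values
print(is_anomalous(history, 7.2))        # within 25% of the median
print(is_anomalous(history, 9.5))        # more than 25% above the median
```

Swapping the median baseline for a model baseline such as IRI or TIE-GCM can change which observations count as anomalies, which is the paper's central caution.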
A statistical search for similar simultaneous foF2 and foEs increases in 6 years of data revealed that this feature has been observed on many other occasions without related seismic activity. Therefore, it is concluded that one cannot confidently use this type of ionospheric perturbation to predict an impending earthquake. It is suggested that in order to achieve significant progress in our understanding of seismo-ionospheric coupling, better account must be taken of other known sources of ionospheric variability in addition to solar and geomagnetic activity, such as the thermospheric coupling. 12. A new macroseismic intensity prediction equation and magnitude estimates of the 1811-1812 New Madrid and 1886 Charleston, South Carolina, earthquakes NASA Astrophysics Data System (ADS) Boyd, O. S.; Cramer, C. H. 2013-12-01 We develop an intensity prediction equation (IPE) for the Central and Eastern United States, explore differences between modified Mercalli intensities (MMI) and community internet intensities (CII) and the propensity for reporting, and estimate the moment magnitudes of the 1811-1812 New Madrid, MO, and 1886 Charleston, SC, earthquakes. We constrain the study with North American census data, the National Oceanic and Atmospheric Administration MMI dataset (responses between 1924 and 1985), and the USGS 'Did You Feel It?' CII dataset (responses between June, 2000 and August, 2012). The combined intensity dataset has more than 500,000 felt reports for 517 earthquakes with magnitudes between 2.5 and 7.2. The IPE has the basic form MMI = c1 + c2M + c3exp(λ) + c4λ, where M is moment magnitude and λ is the mean log hypocentral distance. Previous IPEs use a limited dataset of MMI, do not differentiate between MMI and CII data in the CEUS, and do not account for spatial variations in population.
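The IPE form quoted above can be evaluated directly once coefficients are fitted. The sketch below uses placeholder coefficients chosen only so that intensity increases with magnitude and decays with distance; they are not the fitted values from the study.

```python
import math

# Sketch evaluating the IPE form quoted above,
#   MMI = c1 + c2*M + c3*exp(lam) + c4*lam,
# where lam is the mean log hypocentral distance. The coefficients
# below are illustrative placeholders, not the study's fitted values.

def predicted_mmi(magnitude, lam, c1=1.0, c2=1.5, c3=-0.002, c4=-1.0):
    return c1 + c2 * magnitude + c3 * math.exp(lam) + c4 * lam

# A larger event at the same distance yields a higher intensity.
lam = math.log(50.0)  # hypocentral distance of ~50 km
print(predicted_mmi(7.0, lam))
print(predicted_mmi(5.0, lam))
```

The exp(λ) term lets the equation fall off faster at large distances than the linear λ term alone, which is one way an IPE can mimic anelastic attenuation.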
These factors can have an impact at all magnitudes, especially the last factor at large magnitudes and small intensities where the population drops to zero in the Atlantic Ocean and Gulf of Mexico. We assume that reports of a given intensity have hypocentral distances that are log-normally distributed, with the distribution modulated by population and the propensity of individuals to report their experience. We do not account for variations in stress drop, regional variations in Q, or distance-dependent geometrical spreading. We simulate the distribution of reports of a given intensity accounting for population and use a grid search method to solve for the fraction of the population that reports the intensity, the standard deviation of the log-normal distribution, and the mean log hypocentral distance, which appears in the above equation. We find that lower intensities, both CII and MMI, are less likely to be reported than greater intensities. Further, there are strong spatial 13. From documentation to prediction: Raising the bar for thermokarst research DOE PAGES Rowland, Joel C.; Coon, Ethan T. 2015-11-12 Here we report that to date the majority of published research on thermokarst has been directed at documenting its form, occurrence, and rates of occurrence. The fundamental processes driving thermokarst have long been largely understood. However, the detailed physical couplings among water, air, soil, and the thermal dynamics governing freeze-thaw and soil mechanics are less understood and not captured in models aimed at predicting the response of frozen soils to warming and thaw. As computational resources increase, more sophisticated mechanistic models can be applied; these show great promise as predictive tools. These models will be capable of simulating the response of soil deformation to thawing/freezing cycles and the long-term, non-recoverable response of the land surface to the loss of ice.
At the same time, advances in remote sensing of permafrost environments also show promise in providing detailed and spatially extensive estimates of the rates and patterns of subsidence. These datasets provide key constraints to calibrate and evaluate the predictive power of mechanistic models. In conclusion, in the coming decade, these emerging technologies will greatly increase our ability to predict when, where, and how thermokarst will occur in a changing climate. 14. Researches on High Accuracy Prediction Methods of Earth Orientation Parameters NASA Astrophysics Data System (ADS) Xu, X. Q. 2015-09-01 respectively, which are used to improve/re-evaluate the AR model. Compared with the single AR model, the AR+Kalman method performs better in the prediction of UT1-UTC and ΔLOD, and the improvement in the prediction of the polar motion is significant. (3) Following the successful Earth Orientation Parameter Prediction Comparison Campaign (EOP PCC), the Earth Orientation Parameter Combination of Prediction Pilot Project (EOPC PPP) was sponsored in 2010. As one of the participants from China, we update and submit the short- and medium-term (1 to 90 days) EOP predictions every day. From the current comparative statistics, our prediction accuracy is at a medium international level. We will carry out more innovative research to improve EOP forecast accuracy and enhance our standing in EOP forecasting. 15. Predicting phonetic transcription agreement: Insights from research in infant vocalizations PubMed Central RAMSDELL, HEATHER L.; OLLER, D. KIMBROUGH; ETHINGTON, CORINNA A. 2010-01-01 The purpose of this study is to provide new perspectives on correlates of phonetic transcription agreement. Our research focuses on phonetic transcription and coding of infant vocalizations.
The findings are presumed to be broadly applicable to other difficult cases of transcription, such as found in severe disorders of speech, which similarly result in low reliability for a variety of reasons. We evaluated the predictiveness of two factors not previously documented in the literature as influencing transcription agreement: canonicity and coder confidence. Transcribers coded samples of infant vocalizations, judging both canonicity and confidence. Correlation results showed that canonicity and confidence were strongly related to agreement levels, and regression results showed that canonicity and confidence both contributed significantly to explanation of variance. Specifically, the results suggest that canonicity plays a major role in transcription agreement when utterances involve supraglottal articulation, with coder confidence offering additional power in predicting transcription agreement. PMID:17882695 16. Understanding Earthquakes ERIC Educational Resources Information Center Davis, Amanda; Gray, Ron 2018-01-01 December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated… 17. Earthquake Testing NASA Technical Reports Server (NTRS) 1979-01-01 During NASA's Apollo program, it was necessary to subject the mammoth Saturn V launch vehicle to extremely forceful vibrations to assure the moonbooster's structural integrity in flight. Marshall Space Flight Center assigned vibration testing to a contractor, the Scientific Services and Systems Group of Wyle Laboratories, Norco, California. 
Wyle-3S, as the group is known, built a large facility at Huntsville, Alabama, and equipped it with an enormously forceful shock and vibration system to simulate the liftoff stresses the Saturn V would encounter. Saturn V is no longer in service, but Wyle-3S has found spinoff utility for its vibration facility. It is now being used to simulate earthquake effects on various kinds of equipment, principally equipment intended for use in nuclear power generation. Government regulations require that such equipment demonstrate its ability to survive earthquake conditions. In the upper left photo, Wyle-3S is preparing to conduct an earthquake test on a 25-ton diesel generator built by Atlas Polar Company, Ltd., Toronto, Canada, for emergency use in a Canadian nuclear power plant. Being readied for test in the lower left photo is a large circuit breaker to be used by Duke Power Company, Charlotte, North Carolina. Electro-hydraulic and electro-dynamic shakers in and around the pit simulate earthquake forces. 18. Summary of the GK15 ground-motion prediction equation for horizontal PGA and 5% damped PSA from shallow crustal continental earthquakes USGS Publications Warehouse Graizer, Vladimir; Kalkan, Erol 2016-01-01 We present a revised ground-motion prediction equation (GMPE) for computing medians and standard deviations of peak ground acceleration (PGA) and 5% damped pseudospectral acceleration (PSA) response ordinates of the horizontal component of randomly oriented ground motions to be used for seismic-hazard analyses and engineering applications. This GMPE is derived from the expanded Next Generation Attenuation (NGA)-West 1 database (see Data and Resources; Chiou et al., 2008).
The revised model includes an anelastic attenuation term as a function of quality factor (Q0) to capture regional differences in far-source (beyond 150 km) attenuation, and a new frequency-dependent sedimentary-basin scaling term as a function of depth to the 1.5 km/s shear-wave velocity isosurface to improve ground-motion predictions at sites located on deep sedimentary basins. The new Graizer–Kalkan 2015 (GK15) model, developed to be simple, is applicable to the western United States and other similar shallow crustal continental regions in active tectonic environments for earthquakes with moment magnitudes (M) 5.0–8.0, distances 0–250 km, average shear-wave velocities in the upper 30 m (VS30) 200–1300 m/s, and spectral periods (T) 0.01–5 s. Our aleatory variability model captures interevent (between-event) variability, which decreases with magnitude and increases with distance. The mixed-effect residuals analysis reveals that the GK15 has no trend with respect to the independent predictor parameters. Compared to our 2007–2009 GMPE, the PGA values are very similar, whereas predicted spectral ordinates are larger at T<0.2 s and smaller at longer periods. 19. The physics of an earthquake NASA Astrophysics Data System (ADS) McCloskey, John 2008-03-01 The Sumatra-Andaman earthquake of 26 December 2004 (Boxing Day 2004) and its tsunami will endure in our memories as one of the worst natural disasters of our time. For geophysicists, the scale of the devastation and the likelihood of another equally destructive earthquake set out a series of challenges of how we might use science not only to understand the earthquake and its aftermath but also to help in planning for future earthquakes in the region. In this article a brief account of these efforts is presented.
Earthquake prediction is probably impossible, but earth scientists are now able to identify particularly dangerous places for future events by developing an understanding of the physics of stress interaction. Having identified such a dangerous area, a series of numerical Monte Carlo simulations is described that allows us to get an idea of the most likely consequences of a future earthquake by modelling the tsunami generated by many possible, individually unpredictable, future events. As this article was being written, another earthquake occurred in the region, which had many expected characteristics but was enigmatic in other ways. This has spawned a series of further theories which will contribute to our understanding of this extremely complex problem. 20. Characteristics of strong motions and damage implications of the Ms 6.5 Ludian earthquake on August 3, 2014 NASA Astrophysics Data System (ADS) Xu, Peibin; Wen, Ruizhi; Wang, Hongwei; Ji, Kun; Ren, Yefei 2015-02-01 Ludian County of Yunnan Province in southwestern China was struck by an Ms 6.5 earthquake on August 3, 2014, which was another destructive event following the Ms 8.0 Wenchuan earthquake in 2008, the Ms 7.1 Yushu earthquake in 2010, and the Ms 7.0 Lushan earthquake in 2013. The National Strong-Motion Observation Network System of China collected 74 strong-motion recordings, of which the maximum peak ground acceleration, recorded by station 053LLT in Longtoushan Town, was 949 cm/s2 in the E-W component. The observed PGAs and spectral ordinates were compared with a ground-motion prediction equation for China and the NGA-West2 models developed by the Pacific Earthquake Engineering Research Center. This earthquake is considered the first case for testing the applicability of NGA-West2 in China. Results indicate that the observed PGAs and the 5% damped pseudo-response spectral accelerations are significantly lower than the predicted ones.
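An observed-versus-predicted comparison like the one described above is commonly summarized with natural-log residuals of recorded PGA against a GMPE median. The sketch below is a generic illustration with hypothetical values; it does not use the Ludian recordings or the NGA-West2 models.

```python
import math

# Sketch of comparing observed PGAs with GMPE-predicted medians via
# natural-log residuals. All values are hypothetical placeholders,
# not the Ludian data or any published GMPE.

def ln_residual(observed_pga, predicted_pga):
    """ln(observed/predicted); negative means the model overpredicts."""
    return math.log(observed_pga / predicted_pga)

observed  = [120.0, 80.0, 45.0]   # cm/s^2, hypothetical recordings
predicted = [200.0, 150.0, 90.0]  # cm/s^2, hypothetical GMPE medians

residuals = [ln_residual(o, p) for o, p in zip(observed, predicted)]
bias = sum(residuals) / len(residuals)
print(round(bias, 3))  # negative bias: observations below predictions
```

A systematically negative mean residual, as in the Ludian case, indicates that the prediction equations overestimate the recorded motions.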
The field survey around some typical strong motion stations verified that the earthquake damage was consistent with the official isoseismal issued by the China Earthquake Administration. 1. If pandas scream, an earthquake is coming SciTech Connect Magida, P. Feature article: Use of the behavior of animals to predict weather has spanned several ages and dozens of countries. While animals may behave in diverse ways to indicate weather changes, they all tend to behave in more or less the same way before earthquakes. The geophysical community in the U.S. has begun testing animal behavior before earthquakes. It has been determined that animals have the potential to act as accurate geosensors to detect earthquakes before they occur. (5 drawings) 2. Reply to “Earthquake prediction evaluation standards applied to the VAN Method,” by D. D. Jackson NASA Astrophysics Data System (ADS) Varotsos, P.; Lazaridou, M.; Hadjicontis, V. Our earlier publications show that the VAN method does not fail requirements (1) and (2) suggested by Jackson [1996]. No subjective ex-post-facto decision was necessary for the evaluation of the success because, for the large majority of VAN predictions, the values of ΔM, Δr and Δt were published before the period 1987-1989 under discussion; in only a few cases (three out of 29), related to the observation of the new phenomenon of SES electrical activity, the value of Δt was determined in 1988. Furthermore, a careful inspection from a physical point of view shows that the three plausibility criteria suggested by Jackson (to be obeyed by a candidate prediction technique) are actually met by the VAN method. 3. Comparisons of ground motions from the 1999 Chi-Chi earthquake with empirical predictions largely based on data from California USGS Publications Warehouse Boore, D.M.
2001-01-01 This article has the modest goal of comparing the ground motions recorded during the 1999 Chi-Chi, Taiwan, mainshock with predictions from four empirically based equations commonly used for western North America; these empirical predictions are largely based on data from California. Comparisons are made for peak acceleration and 5%-damped response spectra at periods between 0.1 and 4 sec. The general finding is that the Chi-Chi ground motions are smaller than those predicted from the empirically based equations for periods less than about 1 sec by factors averaging about 0.4 but as small as 0.26 (depending on period, on which equation is used, and on whether the sites are assumed to be rock or soil). There is a trend for the observed motions to approach or even exceed the predicted motions for longer periods. Motions at similar distances (30-60 km) to the east and to the west of the fault differ dramatically at periods between about 2 and 20 sec: Long-duration wave trains are present on the motions to the west, and when normalized to similar amplitudes at short periods, the response spectra of the motions at the western stations are as much as five times larger than those of motions from eastern stations. The explanation for the difference is probably related to site and propagation effects; the western stations are on the Coastal Plain, whereas the eastern stations are at the foot of young and steep mountains, either in the relatively narrow Longitudinal Valley or along the eastern coast; the sediments underlying the eastern stations are probably shallower and have higher velocity than those under the western stations. 4. The next New Madrid earthquake SciTech Connect Atkinson, W. 1988-01-01 Scientists who specialize in the study of Mississippi Valley earthquakes say that the region is overdue for a powerful tremor that will cause major damage and undoubtedly some casualties.
The inevitability of a future quake and the lack of preparation by both individuals and communities provided the impetus for this book. It brings together applicable information from many disciplines: history, geology and seismology, engineering, zoology, politics and community planning, economics, environmental science, sociology, and psychology and mental health to provide a perspective of the myriad impacts of a major earthquake on the Mississippi Valley. The author addresses such basic questions as: What, actually, are earthquakes? How do they occur? Can they be predicted, perhaps even prevented? He also addresses those steps that individuals can take to improve their chances for survival both during and after an earthquake. 5. Comparison of Human Response against Earthquake and Tsunami NASA Astrophysics Data System (ADS) Arikawa, T.; Güler, H. G.; Yalciner, A. C. 2017-12-01 The evacuation response to earthquakes and tsunamis is very important for reducing human losses from tsunamis, but it is very difficult to predict human behavior after earthquake shaking. The purpose of this research is to clarify differences in human response after an earthquake shock in different countries and to consider the relation between that response and feelings of safety, knowledge, and education. To this end, a questionnaire survey was conducted after the 21 July 2017 Gokova earthquake and tsunami, and differences in human behavior were then considered by comparison with the 2015 Chilean earthquake and tsunami and the 2011 Japan earthquake and tsunami. The seismic intensity at the survey points was approximately 6 to 7. The questions covered the feeling of shaking, recollection of the tsunami, behavior after the shock, and so on. The questionnaire was administered to more than 20 people in 10 areas.
The results are the following: 1) Most people felt shaking so strong that they could not stand, 2) None of the respondents recalled the tsunami, 3) Depending on the area, people felt that after the earthquake the beach was safer than being at home, 4) After they saw the sea withdrawing, they thought that a tsunami would come and ran away. Fig. 1 shows the comparison of the evacuation rate within 10 minutes in 2011 Japan, 2015 Chile and 2017 Turkey. From the education point of view, tsunami education is limited in Turkey. From the protection facilities point of view, high sea walls have been constructed only in Japan. From the warning alert point of view, there is no warning system against tsunamis in the Mediterranean Sea. This survey shows the importance of tsunami education, and that evacuation tends to be delayed if dependency on facilities and alarms is too high. 6. Response to the great East Japan earthquake of 2011 and the Fukushima nuclear crisis: the case of the Laboratory Animal Research Center at Fukushima Medical University. PubMed Katahira, Kiyoaki; Sekiguchi, Miho 2013-01-01 A magnitude 9.0 great earthquake, the 2011 off the Pacific coast of Tohoku Earthquake, occurred on March 11, 2011, and the subsequent Fukushima Daiichi Nuclear Power Station (Fukushima NPS) accidents raised radiation levels around the campus of Fukushima Medical University (FMU). FMU is located in Fukushima City, and is 57 km to the northwest of Fukushima NPS. Due to temporary failure of the steam boilers, the air conditioning system for the animal rooms, all autoclaves, and a cage washer could not be used at the Laboratory Animal Research Center (LARC) of FMU. The outside air temperature dropped to zero overnight, and the temperature inside the animal rooms fell to 10°C for several hours. We placed sterilized nesting materials inside all cages to encourage rodents to create nests.
The main water supply was cut off for 8 days in all, while supply of steam and hot water remained unavailable for 12 days. It took 20 days to restore the air conditioning system to normal operation at the facility. We measured radiation levels in the animal rooms to confirm the safety of care staff and researchers. On April 21, May 9, and June 17, the average radiation levels at a central work table in the animal rooms with HEPA filters were 46.5, 44.4, and 43.4 cpm, respectively, which is equal to the background level of the equipment. We sincerely hope our experiences will be a useful reference regarding crisis management for institutes that keep laboratory animals. 7. A 30-year history of earthquake crisis communication in California and lessons for the future NASA Astrophysics Data System (ADS) Jones, L. 2015-12-01 The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985 about the probability (approximately 5%) that an M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the next 30 years, publication of aftershock advisories has become routine and formal statements about the probability of a larger event have been developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes.
All of the previous formal advisories were written during the earthquake crisis. The time to create and release a statement became shorter with experience after the first public advisory (for the 1988 Lake Elsman earthquake), which was released 18 hours after the triggering event, but a statement was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (National Earthquake Prediction Evaluation Council) so the statements can be sent to the public automatically. This talk will review the advisories, the variations in wording and the public response, and compare this with social science research about successful crisis communication, to create recommendations for future advisories. 8. Connecting slow earthquakes to huge earthquakes. PubMed Obara, Kazushige; Kato, Aitaro 2016-07-15 Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science. 9.
Physics of Earthquake Rupture Propagation NASA Astrophysics Data System (ADS) Xu, Shiqing; Fukuyama, Eiichi; Sagy, Amir; Doan, Mai-Linh 2018-05-01 A comprehensive understanding of earthquake rupture propagation requires the study of not only the sudden release of elastic strain energy during co-seismic slip, but also of other processes that operate at a variety of spatiotemporal scales. For example, the accumulation of the elastic strain energy usually takes decades to hundreds of years, and rupture propagation and termination modify the bulk properties of the surrounding medium that can influence the behavior of future earthquakes. To share recent findings in the multiscale investigation of earthquake rupture propagation, we held a session entitled "Physics of Earthquake Rupture Propagation" during the 2016 American Geophysical Union (AGU) Fall Meeting in San Francisco. The session included 46 poster and 32 oral presentations, reporting observations of natural earthquakes, numerical and experimental simulations of earthquake ruptures, and studies of earthquake fault friction. These presentations and discussions during and after the session suggested a need to document more formally the research findings, particularly new observations and views different from conventional ones, complexities in fault zone properties and loading conditions, the diversity of fault slip modes and their interactions, the evaluation of observational and model uncertainties, and comparison between empirical and physics-based models. Therefore, we organize this Special Issue (SI) of Tectonophysics under the same title as our AGU session, hoping to inspire future investigations. Eighteen articles (marked with "this issue") are included in this SI and grouped into the following six categories. 10. Earthquakes triggered by fluid extraction USGS Publications Warehouse Segall, P. 
1989-01-01

Seismicity is correlated in space and time with production from some oil and gas fields where pore pressures have declined by several tens of megapascals. Reverse faulting has occurred both above and below petroleum reservoirs, and normal faulting has occurred on the flanks of at least one reservoir. The theory of poroelasticity requires that fluid extraction locally alter the state of stress. Calculations with simple geometries predict stress perturbations that are consistent with observed earthquake locations and focal mechanisms. Measurements of surface displacement and strain, pore pressure, stress, and poroelastic rock properties in such areas could be used to test theoretical predictions and improve our understanding of earthquake mechanics.

11. Directivity in NGA earthquake ground motions: Analysis using isochrone theory

USGS Publications Warehouse

Spudich, P.; Chiou, B.S.J.

2008-01-01

We present correction factors that may be applied to the ground motion prediction relations of Abrahamson and Silva, Boore and Atkinson, Campbell and Bozorgnia, and Chiou and Youngs (all in this volume) to model the azimuthally varying distribution of the GMRotI50 component of ground motion (commonly called 'directivity') around earthquakes. Our correction factors may be used for planar or nonplanar faults having any dip or slip rake (faulting mechanism). They predict directivity-induced variations of spectral acceleration that are roughly half of the strike-slip variations predicted by Somerville et al. (1997), and their use reduces record-to-record sigma by about 2-20% at periods of 5 s or greater. © 2008, Earthquake Engineering Research Institute.

12. Earthquake Shaking - Finding the "Hot Spots"

USGS Publications Warehouse

Field, Edward; Jones, Lucile; Jordan, Tom; Benthien, Mark; Wald, Lisa

2001-01-01

A new Southern California Earthquake Center study has quantified how local geologic conditions affect the shaking experienced in an earthquake. The important geologic factors at a site are the softness of the rock or soil near the surface and the thickness of the sediments above hard bedrock. Even when these 'site effects' are taken into account, however, each earthquake exhibits unique 'hotspots' of anomalously strong shaking. Better predictions of strong ground shaking will therefore require additional geologic data and more comprehensive computer simulations of individual earthquakes.

13. Strong ground motion of the 2016 Kumamoto earthquake

NASA Astrophysics Data System (ADS)

Aoi, S.; Kunugi, T.; Suzuki, W.; Kubo, H.; Morikawa, N.; Fujiwara, H.

2016-12-01

The 2016 Kumamoto earthquake is composed of an Mw 6.1 event and an Mw 7.1 event that occurred in the Kumamoto region at 21:26 on April 14 and, 28 hours later, at 1:25 on April 16, 2016 (JST). These earthquakes are considered to have ruptured mainly the Hinagu fault zone (the Mw 6.1 event) and the Futagawa fault zone (the Mw 7.1 event), for which the Headquarters for Earthquake Research Promotion had performed long-term evaluation as well as seismic hazard assessment prior to the 2016 Kumamoto earthquake. Strong shaking of seismic intensity 7 on the JMA scale was observed four times in total: at Mashiki town for both the Mw 6.1 and Mw 7.1 events, at Nishihara village for the Mw 7.1 event, and at NIED/KiK-net Mashiki (KMMH16) for the Mw 7.1 event. KiK-net Mashiki (KMMH16) recorded peak ground acceleration of more than 1000 cm/s/s, and Nishihara village recorded peak ground velocity of more than 250 cm/s. Ground motions were observed over a wider area for the Mw 7.1 event than for the Mw 6.1 event.
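Peak ground acceleration and peak ground velocity of the kind quoted above are read directly from strong-motion records, with velocity obtained by integrating the accelerogram. A minimal sketch on a synthetic record (the waveform and its numbers are purely illustrative):

```python
import numpy as np

def pga_pgv(acc_cm_s2, dt):
    """Peak ground acceleration (cm/s/s) and peak ground velocity (cm/s)
    from an acceleration record, integrating by the trapezoidal rule."""
    vel = np.concatenate(
        ([0.0], np.cumsum((acc_cm_s2[1:] + acc_cm_s2[:-1]) / 2.0 * dt)))
    return np.max(np.abs(acc_cm_s2)), np.max(np.abs(vel))

# Synthetic decaying 1 Hz sinusoid sampled at 100 Hz (illustrative only)
t = np.arange(0.0, 20.0, 0.01)
acc = 800.0 * np.exp(-0.2 * t) * np.sin(2.0 * np.pi * 1.0 * t)
pga, pgv = pga_pgv(acc, 0.01)
print(f"PGA = {pga:.0f} cm/s/s, PGV = {pgv:.1f} cm/s")
```

Real processing additionally involves baseline correction and filtering before integration, which this sketch omits.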
Peak ground accelerations and peak ground velocities at K-NET/KiK-net stations are consistent with the ground motion prediction equations of Si and Midorikawa (1999). Peak ground velocities at distances greater than 200 km attenuate slowly, which can be attributed to large Love waves with a dominant period of around 10 seconds. The 5%-damped pseudo-spectral velocity at Mashiki town shows a peak at periods of 1-2 s that exceeds the ground motion responses at JR Takatori in the 1995 Kobe earthquake and at Kawaguchi town in the 2004 Chuetsu earthquake. The 5%-damped pseudo-spectral velocity at Nishihara village shows a 350 cm/s peak at periods of 3-4 s, similar to several stations in the Kathmandu basin during the 2015 Gorkha earthquake in Nepal, reported by Takai et al. (2016). Ground motions at several stations in Oita exceed the ground motion prediction equations due to an earthquake induced by the Mw 7.1 event; peak ground acceleration at K-NET Yufuin (OIT009) reached 90 cm/s/s for the Mw 7.1 event.

14. Urban Earthquake Shaking and Loss Assessment

NASA Astrophysics Data System (ADS)

Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.

2009-04-01

This study, conducted under the JRA-3 component of the EU NERIES Project, develops a methodology and software (ELER) for the rapid estimation of earthquake shaking and losses in the Euro-Mediterranean region. This multi-level methodology, developed together with researchers from Imperial College, NORSAR, and ETH-Zurich, is capable of incorporating regional variability and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships. GRM Risk Management, Inc. of Istanbul serves as sub-contractor for the coding of the ELER software. The methodology encompasses the following general steps: 1. Finding the most likely location of the source of the earthquake using a regional seismotectonic database and basic source parameters and, if and when possible, estimating fault rupture parameters by rapid inversion of data from on-line stations. 2. Estimating the spatial distribution of selected ground motion parameters through region-specific ground motion attenuation relationships and shear wave velocity distributions (Shake Mapping). 4. Incorporating strong ground motion and other empirical macroseismic data to improve the Shake Map. 5. Estimating the losses (damage, casualty, and economic) at levels of sophistication (0, 1, and 2) commensurate with the available inventory of the built environment and population (Loss Mapping). Level 2 analysis of the ELER software (similar to HAZUS and SELENA) is essentially intended for earthquake risk assessment (building damage, consequential human casualties, and macro-economic loss quantifiers) in urban areas. The basic Shake Mapping is similar to the Level 0 and Level 1 analysis; however, options are available for more sophisticated treatment of site response through externally entered data and improvement of the shake map through incorporation of empirical data.

15. Statistical tests of simple earthquake cycle models

NASA Astrophysics Data System (ADS)

DeVries, Phoebe M. R.; Evans, Eileen L.

2016-12-01

A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of the relevant physics and phenomenology.
Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, which viscoelastic earthquake cycle models are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM < 4.0 × 10^19 Pa s and ηM > 4.6 × 10^20 Pa s) but cannot reject models on the basis of the transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.

16. International Collaboration for Strengthening Capacity to Assess Earthquake Hazard in Indonesia

NASA Astrophysics Data System (ADS)

Cummins, P. R.; Hidayati, S.; Suhardjono, S.; Meilano, I.; Natawidjaja, D.

2012-12-01

Indonesia has experienced a dramatic increase in earthquake risk due to rapid population growth in the 20th century, much of it occurring in areas near the subduction zone plate boundaries that are prone to earthquake occurrence. While recent seismic hazard assessments have resulted in better building codes that can inform safer building practices, many of the fundamental parameters controlling earthquake occurrence and ground shaking - e.g., fault slip rates, earthquake scaling relations, ground motion prediction equations, and site response - could still be better constrained.
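The two-sample Kolmogorov–Smirnov test used in the earthquake-cycle study above measures the maximum distance between two empirical distribution functions. A minimal self-contained sketch, with toy data in place of real slip-rate observations:

```python
def ks_statistic(sample_a, sample_b):
    """Two-sample KS statistic: the maximum vertical distance between the
    empirical CDFs of the two samples. To test at significance level
    alpha ~ 0.05, D is compared with ~1.36 * sqrt((n + m) / (n * m))."""
    a, b = sorted(sample_a), sorted(sample_b)
    d = 0.0
    for x in a + b:
        cdf_a = sum(v <= x for v in a) / len(a)
        cdf_b = sum(v <= x for v in b) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d

# Identical samples give D = 0; fully disjoint samples give D = 1
print(ks_statistic([1, 2, 3], [1, 2, 3]))     # 0.0
print(ks_statistic([1, 2, 3], [10, 11, 12]))  # 1.0
```

In practice one would use `scipy.stats.ks_2samp`, which also returns a p-value; the hand-rolled version above just makes the statistic explicit.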
In recognition of the need to improve the level of information on which seismic hazard assessments are based, the Australian Agency for International Development (AusAID) and Indonesia's National Agency for Disaster Management (BNPB), through the Australia-Indonesia Facility for Disaster Reduction, have initiated a 4-year project designed to strengthen the Government of Indonesia's capacity to reliably assess earthquake hazard. The project is a collaboration of Australian institutions, including Geoscience Australia and the Australian National University, with Indonesian government agencies and universities, including the Agency for Meteorology, Climatology and Geophysics, the Geological Agency, the Indonesian Institute of Sciences, and the Bandung Institute of Technology. Effective earthquake hazard assessment requires input from many different types of research, ranging from geological studies of active faults, through seismological studies of crustal structure, earthquake sources, and ground motion, to PSHA methodology and geodetic studies of crustal strain rates. The project is large and diverse, spanning all of these components, which will be briefly reviewed in this presentation.

17. Global earthquake fatalities and population

USGS Publications Warehouse

Holzer, Thomas L.; Savage, James C.

2013-01-01

Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth, and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population.
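A nonstationary Poisson model with rate proportional to population can be sketched in a few lines. The rate constant and the linear population trajectory below are illustrative assumptions, not the paper's fitted values:

```python
import math

def expected_count(rate_per_billion_yr, population_billions):
    """Expected number of events over a sequence of years, for a Poisson
    process whose annual rate is proportional to that year's population."""
    return sum(rate_per_billion_yr * p for p in population_billions)

# Assumed linear growth from 6.1 to 10.1 billion over 2000-2100 (101 years)
pop = [6.1 + (10.1 - 6.1) * i / 100 for i in range(101)]
lam = expected_count(0.01, pop)   # assumed 0.01 events per billion-person-year
sigma = math.sqrt(lam)            # Poisson standard deviation
print(f"expected {lam:.1f} +/- {sigma:.1f} catastrophic earthquakes")
```

The published 8.7 ± 3.3 and 20.5 ± 4.3 forecasts come from the authors' fitted rates; the sketch only shows the structure of such a calculation.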
We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3), from 4 (7) observed in the 20th century, if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates that global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.

18. Analysis of post-earthquake landslide activity and geo-environmental effects

NASA Astrophysics Data System (ADS)

Tang, Chenxiao; van Westen, Cees; Jetten, Victor

2014-05-01

Large earthquakes can cause huge losses to human society due to ground shaking, fault rupture, and the high density of co-seismic landslides that can be triggered in mountainous areas. In areas affected by such large earthquakes, the threat of landslides continues after the earthquake, as the co-seismic landslides may be reactivated by high-intensity rainfall events. Huge amounts of landslide material remain on the slopes, leading to a high frequency of landslides and debris flows after earthquakes, which threaten lives and create great difficulties for post-seismic reconstruction in the earthquake-hit regions. Without critical information such as the frequency and magnitude of landslides after a major earthquake, reconstruction planning and hazard mitigation are difficult. The area hit by the Mw 7.9 Wenchuan earthquake of 2008 in Sichuan province, China, shows some typical examples of poor reconstruction planning due to lack of information: huge debris flows destroyed several reconstructed settlements. This research aims to analyze the decay in post-seismic landslide activity in areas that have been hit by a major earthquake, taking the area affected by the 2008 Wenchuan earthquake as the study area.
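Rainfall-triggered debris flows of the kind described above are commonly screened with an empirical intensity-duration threshold of the form I = a·D^(-b). A minimal sketch, using Caine (1980)-style global coefficients as an assumption rather than values fitted for the Wenchuan area:

```python
def exceeds_id_threshold(intensity_mm_h, duration_h, a=14.8, b=0.39):
    """True if a rainfall event of the given mean intensity (mm/h) and
    duration (h) lies above the I = a * D**(-b) triggering threshold."""
    return intensity_mm_h > a * duration_h ** (-b)

# A 10 mm/h rain for 1 h stays below this threshold (14.8 mm/h);
# sustained for 12 h it exceeds it (threshold ~5.6 mm/h)
print(exceeds_id_threshold(10.0, 1.0))   # False
print(exceeds_id_threshold(10.0, 12.0))  # True
```

Post-earthquake thresholds are typically much lower than pre-earthquake ones and decay back toward them over years, which is exactly the kind of evolution the study sets out to quantify.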
The study will analyze the factors that control post-earthquake landslide activity through quantification of landslide volume changes as well as through numerical simulation of the initiation process, to obtain a better understanding of the potential threat of post-earthquake landslides as a basis for mitigation planning. The research will make use of high-resolution stereo satellite images, UAV surveys, and Terrestrial Laser Scanning (TLS) to obtain multi-temporal DEMs to monitor changes in loose sediments and post-seismic landslide activity. A debris flow initiation model will be developed that incorporates the volume of source materials, vegetation re-growth, and the intensity-duration of the triggering precipitation.

19. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

PubMed Central

Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

2014-01-01

The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr-Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common).
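Triggering assessments like the Mohr–Coulomb argument above evaluate the change in Coulomb failure stress on the receiver fault. A minimal sketch (compression taken positive; the stress values are hypothetical, not those computed for the Costa Rica SSE):

```python
def coulomb_stress_change(d_shear, d_normal, d_pore, friction=0.6):
    """Change in Coulomb failure stress (MPa) on a receiver fault:
    dCFS = d_tau + mu * (d_pore - d_sigma_n), with normal stress
    (compression) positive. Positive dCFS moves the fault toward failure."""
    return d_shear + friction * (d_pore - d_normal)

# Hypothetical SSE loading: +0.01 MPa shear, +0.005 MPa added clamping,
# no pore-pressure change -> small positive (encouraging) stress change
print(round(coulomb_stress_change(0.01, 0.005, 0.0), 4))  # 0.007
```

Whether such a change "triggers" an event depends on how close the fault already is to failure, which is why the paper's small computed changes were judged insufficient.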
Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327

1. The Scintillation Prediction Observations Research Task (SPORT) Mission

NASA Astrophysics Data System (ADS)

Spann, J. F.; Swenson, C.; Durão, O.; Loures, L.; Heelis, R. A.; Bishop, R. L.; Le, G.; Abdu, M. A.; Habash Krause, L.; De Nardin, C. M.; Fonseca, E.

2015-12-01

Structure in the charged particle number density in the equatorial ionosphere can have a profound impact on the fidelity of HF, VHF, and UHF radio signals that are used for ground-to-ground and space-to-ground communication and navigation. The degree to which such systems can be compromised depends in large part on the spatial distribution of the structured regions in the ionosphere and the background plasma density in which they are embedded. To address these challenges, it is necessary to accurately distinguish the background ionospheric conditions that favor the generation of irregularities from those that do not, and to relate the evolution of those conditions to the subsequent evolution of the irregular plasma regions themselves. The background ionospheric conditions are conveniently described by latitudinal profiles of the plasma density at nearly constant altitude, which capture the effects of ExB drifts and neutral winds, while observing the appearance and growth of plasma structure requires committed ground-based observations from at least one fixed longitude. This talk will present an international collaborative CubeSat mission called SPORT, which stands for the Scintillation Prediction Observations Research Task.
This mission will advance our understanding of the nature and evolution of ionospheric structures around sunset to improve predictions of disturbances that affect radio propagation and telecommunication signals. The science goals will be accomplished by a unique combination of satellite observations from a nearly circular middle-inclination orbit and the extensive operation of ground-based observations from South America near the magnetic equator. This approach promises Explorer-class science at a CubeSat price.

4. Earthquake Potential in Myanmar

NASA Astrophysics Data System (ADS)

Aung, Hla Hla

The Myanmar region is generally believed to be an area of high earthquake potential, even though its seismic activity has been low compared with surrounding regions such as Indonesia, China, and Pakistan. Geoscientists and seismologists have predicted earthquakes to occur in the area north of the Sumatra-Andaman Islands, i.e. the southwest and west part of Myanmar. Myanmar's tectonic setting relative to East and Southeast Asia is rather peculiar and unique, described by differing plate tectonic models, but it is similar to the setting of the western part of North America. The Myanmar crustal blocks are caught between the two lithospheric plates of India and Indochina, experiencing oblique subduction with major dextral strike-slip faulting on the Sagaing fault. Seismic tomography and the thermal structure of the India plate along the Sunda subduction zone vary from south to north.
Strong partitioning occurs in the central Andaman basin, where crustal fragmentation and northward dispersion of the Burma plate by a back-arc spreading mechanism have been operating since the Neogene. The northward motion of the Burma plate relative to SE Asia would dock against the major continent further north, and might have caused the accumulation of strain that will in turn be released as earthquakes in the future.

5. Earthquake Source Inversion Blindtest: Initial Results and Further Developments

NASA Astrophysics Data System (ADS)

Mai, P.; Burjanek, J.; Delouis, B.; Festa, G.; Francois-Holden, C.; Monelli, D.; Uchide, T.; Zahradnik, J.

2007-12-01

Images of earthquake ruptures, obtained from modelling/inverting seismic and/or geodetic data, exhibit a high degree of spatial complexity. This earthquake source heterogeneity controls seismic radiation and is determined by the details of the dynamic rupture process. In turn, such rupture models are used for studying source dynamics and for ground-motion prediction. But how reliable and trustworthy are these earthquake source inversions? Rupture models for a given earthquake, obtained by different research teams, often display striking disparities (see http://www.seismo.ethz.ch/srcmod). However, well-resolved, robust, and hence reliable source-rupture models are an integral part of efforts to better understand earthquake source physics and to improve seismic hazard assessment. It is therefore timely to conduct a large-scale validation exercise comparing the methods, parameterizations, and data-handling in earthquake source inversions. We recently started a blind test in which several research groups derive a kinematic rupture model from synthetic seismograms calculated for an input model unknown to the source modelers.
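One simple way to score such a blind test is the correlation between the input and inverted slip distributions over the fault's subfaults. A minimal sketch with hypothetical 2×3-subfault slip maps (the numbers are invented for illustration):

```python
import math

def slip_correlation(model_a, model_b):
    """Pearson correlation between two slip distributions, flattened over
    the subfault grid; 1.0 means identical spatial slip patterns."""
    a = [s for row in model_a for s in row]
    b = [s for row in model_b for s in row]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(var_a * var_b)

input_model = [[0.0, 1.0, 2.0], [1.0, 3.0, 1.0]]   # "true" slip (m)
inverted = [[0.2, 1.1, 1.6], [0.8, 2.7, 1.2]]      # one team's inversion
print(round(slip_correlation(input_model, inverted), 2))  # 0.98
```

Correlation alone ignores amplitude bias and spatial smoothing scale, so published comparisons usually supplement it with misfit norms and resolution analyses.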
The first results, for an input rupture model with heterogeneous slip but constant rise time and rupture velocity, reveal large differences between the input and inverted models in some cases, while a few studies achieve high correlation between the input and inferred models. Here we report on a statistical assessment of the set of inverted rupture models to quantitatively investigate their degree of (dis-)similarity. We briefly discuss the different inversion approaches, their possible strengths and weaknesses, and the use of appropriate misfit criteria. Finally, we present new blind-test models with increasing source complexity and with ambient noise added to the synthetics. The goal is to attract a large group of source modelers to join this source-inversion blind test in order to conduct a large-scale validation exercise that rigorously assesses performance.

6. Earthquakes in Arkansas and vicinity 1699-2010

USGS Publications Warehouse

Dart, Richard L.; Ausbrooks, Scott M.

2011-01-01

This map summarizes approximately 300 years of earthquake activity in Arkansas. It is one in a series of similar state earthquake history maps. Work on the Arkansas map was done in collaboration with the Arkansas Geological Survey. The earthquake data plotted on the map are from several sources: the Arkansas Geological Survey, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Mississippi Department of Environmental Quality. In addition to earthquake locations, other materials presented include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Arkansas and parts of adjacent states. Arkansas has undergone a number of significant felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931 and a magnitude 4.3 earthquake in 1967.
The map shows all historical and instrumentally located earthquakes in Arkansas and vicinity between 1811 and 2010. The largest historical earthquake in the vicinity of the state was an intensity XI event on December 16, 1811, the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

7. The wind power prediction research based on mind evolutionary algorithm

NASA Astrophysics Data System (ADS)

Zhuang, Ling; Zhao, Xinjian; Ji, Tianming; Miao, Jingwen; Cui, Haina

2018-04-01

When wind power is connected to the power grid, its fluctuating, intermittent, and random character affects the stability of the power system. Wind power prediction can safeguard power quality and reduce the operating cost of the power system. Several traditional wind power prediction methods have limitations. On this basis, a wind power prediction method based on the Mind Evolutionary Algorithm (MEA) is put forward and a prediction model is provided. The experimental results demonstrate that MEA performs efficiently in terms of wind power prediction. The MEA method has broad prospects for engineering application.

8. Focal mechanisms of earthquakes in Mongolia

NASA Astrophysics Data System (ADS)

Sodnomsambuu, D.; Natalia, R.; Gangaadorj, B.; Munkhuu, U.; Davaasuren, G.; Danzansan, E.; Yan, R.; Valentina, M.; Battsetseg, B.

2011-12-01

Focal mechanism data provide information on the relative magnitudes of the principal stresses, so that a tectonic regime can be assigned. Such information is especially useful for the study of intraplate seismically active regions. We conducted a study of earthquake focal mechanisms in the territory of Mongolia, a landlocked, intraplate region, and present a map of focal mechanisms of earthquakes with M ≥ 4.5 that occurred in Mongolia and neighboring regions.
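Faulting style on such maps is commonly classified from the rake angle of the focal mechanism. A minimal sketch, using the standard Aki & Richards rake convention and simple ±45° class boundaries (the boundaries are a common but assumed choice):

```python
def faulting_style(rake_deg):
    """Classify faulting style from rake (degrees, Aki & Richards
    convention: 0 = left-lateral strike-slip, 90 = pure reverse,
    -90 = pure normal). Rake is first wrapped into [-180, 180)."""
    rake = (rake_deg + 180.0) % 360.0 - 180.0
    if 45.0 <= rake <= 135.0:
        return "reverse"
    if -135.0 <= rake <= -45.0:
        return "normal"
    return "strike-slip"

print(faulting_style(90))    # reverse
print(faulting_style(-85))   # normal
print(faulting_style(175))   # strike-slip
```

Oblique mechanisms near the class boundaries are often given mixed labels (e.g. "reverse-oblique"); the three-way split above is the coarsest usable scheme.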
The focal mechanism solutions were constrained by first-motion solutions as well as by waveform modeling, particularly CMT solutions. Four earthquakes with magnitudes near or above 8 were recorded in Mongolia in the 20th century: the 1905 M7.9 Tsetserleg and M8.4 Bolnai earthquakes, the 1931 M8.0 Fu Yun earthquake, and the 1957 M8.1 Gobi-Altai earthquake. The map of focal mechanisms of earthquakes in Mongolia nevertheless reveals all of the seismically active structures: Gobi Altay, Mongolian Altay, the active fringe of the Hangay dome, the Hentii range, etc. Earthquakes in most of the Mongolian territory and the neighboring regions of China are characterized by strike-slip and reverse movements. Strike-slip movements are also typical of earthquakes in the Altay Range in Russia. The north of Mongolia and the southern part of the Baikal area form a region where earthquakes with different focal mechanisms have occurred. This region is a zone of transition between the compressive regime associated with the India-Eurasia collision and the extensional structures localized in the north of the country, such as the Huvsgul area and the Baikal rift. Earthquakes in the Baikal basin itself are characterized by normal movements, while earthquakes in the Trans-Baikal zone and NW Mongolia are characterized dominantly by strike-slip movements. From an analysis of stress-axis orientations, the tectonic stress tensor is presented. The map of focal mechanisms of earthquakes in Mongolia could be a useful tool for researchers studying the geodynamics of Central Asia, particularly of the Mongolian and Baikal regions.

9. Earthquake Early Warning and Public Policy: Opportunities and Challenges

NASA Astrophysics Data System (ADS)

Goltz, J. D.; Bourque, L.; Tierney, K.; Riopelle, D.; Shoaf, K.; Seligson, H.; Flores, P.

2003-12-01

Development of an earthquake early warning capability and a pilot project were objectives of TriNet, a 5-year (1997-2001) FEMA-funded project to develop a state-of-the-art digital seismic network in southern California.
In parallel with research to assemble a protocol for rapid analysis of earthquake data and transmission of a signal by TriNet scientists and engineers, the public policy, communication and educational issues inherent in implementation of an earthquake early warning system were addressed by TriNet's outreach component. These studies included: 1) a survey that identified potential users of an earthquake early warning system and how an earthquake early warning might be used in responding to an event, 2) a review of warning systems and communication issues associated with other natural hazards and how lessons learned might be applied to an alerting system for earthquakes, 3) an analysis of organization, management and public policy issues that must be addressed if a broad-based warning system is to be developed and 4) a plan to provide earthquake early warnings to a small number of organizations in southern California as an experimental prototype. These studies provided needed insights into the social and cultural environment in which this new technology will be introduced, an environment that offers opportunities to enhance our response capabilities but also presents significant barriers to achieving a system that can be sustained and supported. In this presentation we will address the main public policy issues that were subjects of analysis in these studies. They include a discussion of the possible division of functions among organizations likely to be the principal partners in the management of an earthquake early warning system. Drawing on lessons learned from warning systems for other hazards, we will review the potential impacts of false alarms and missed events on warning system credibility, the acceptability of fully automated 10.
Identification of Deep Earthquakes DTIC Science & Technology 2010-09-01 discriminants that will reliably separate small, crustal earthquakes (magnitudes less than about 4 and depths less than about 40 to 50 km) from small...characteristics on discrimination plots designed to separate nuclear explosions from crustal earthquakes. Thus, reliably flagging these small, deep events is...Further, reliably identifying subcrustal earthquakes will allow us to eliminate deep events (previously misidentified as crustal earthquakes) from 11. Modeling of earthquake ground motion in the frequency domain NASA Astrophysics Data System (ADS) Thrainsson, Hjortur In recent years, the utilization of time histories of earthquake ground motion has grown considerably in the design and analysis of civil structures. It is very unlikely, however, that recordings of earthquake ground motion will be available for all sites and conditions of interest. Hence, there is a need for efficient methods for the simulation and spatial interpolation of earthquake ground motion. In addition to providing estimates of the ground motion at a site using data from adjacent recording stations, spatially interpolated ground motions can also be used in design and analysis of long-span structures, such as bridges and pipelines, where differential movement is important. The objective of this research is to develop a methodology for rapid generation of horizontal earthquake ground motion at any site for a given region, based on readily available source, path and site characteristics, or (sparse) recordings. The research includes two main topics: (i) the simulation of earthquake ground motion at a given site, and (ii) the spatial interpolation of earthquake ground motion. In topic (i), models are developed to simulate acceleration time histories using the inverse discrete Fourier transform. 
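The inverse-DFT simulation idea just described can be sketched in a few lines. This is a toy illustration, not the dissertation's model: the target amplitude spectrum is an arbitrary assumed shape, and the phase angles are accumulated from uniformly drawn phase differences rather than being conditioned on the Fourier amplitudes as in the actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

n, dt = 1024, 0.02                      # samples and sampling interval (s)
freqs = np.fft.rfftfreq(n, dt)          # one-sided frequency axis

# Assumed target Fourier amplitude spectrum (band-limited bump; illustrative)
amp = np.exp(-((freqs - 2.0) / 1.5) ** 2)

# Phase angles built by accumulating random phase differences between
# adjacent frequency components (the real models draw these differences
# conditional on the Fourier amplitude).
dphi = rng.uniform(-np.pi, np.pi, size=freqs.size)
phase = np.cumsum(dphi)

# Inverse discrete Fourier transform yields the acceleration time history
accel = np.fft.irfft(amp * np.exp(1j * phase), n=n)
print(accel.shape)
```

Different random seeds produce different realizations with the same target amplitude spectrum, which is the point of the stochastic formulation.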
The Fourier phase differences, defined as the difference in phase angle between adjacent frequency components, are simulated conditional on the Fourier amplitude. Uniformly processed recordings from recent California earthquakes are used to validate the simulation models, as well as to develop prediction formulas for the model parameters. The models developed in this research provide rapid simulation of earthquake ground motion over a wide range of magnitudes and distances, but they are not intended to replace more robust geophysical models. In topic (ii), a model is developed in which Fourier amplitudes and Fourier phase angles are interpolated separately. A simple dispersion relationship is included in the phase angle interpolation. The accuracy of the interpolation 12. Earthquake likelihood model testing USGS Publications Warehouse Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A. 2007-01-01 INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude.
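For a rate forecast, the likelihood-based evaluation described in the RELM record above reduces to a joint Poisson log-likelihood over space-magnitude bins. A minimal sketch with illustrative forecast rates and observed counts (not RELM data):

```python
import math

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint log-likelihood of observed earthquake counts given forecast
    rates per bin, assuming independent Poisson bins (the RELM convention)."""
    ll = 0.0
    for lam, n in zip(forecast_rates, observed_counts):
        # log of the Poisson pmf: n*log(lam) - lam - log(n!)
        ll += n * math.log(lam) - lam - math.lgamma(n + 1)
    return ll

# Illustrative forecast (expected events per bin per test period) and counts
rates = [0.1, 0.5, 1.2, 0.05]
obs = [0, 1, 2, 0]
print(round(poisson_log_likelihood(rates, obs), 3))  # → -2.872
```

A consistency (L) test compares this observed log-likelihood against the distribution of log-likelihoods from catalogs simulated under the forecast itself; the pairwise model comparisons use log-likelihood ratios in the same way.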
For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a 13. Tiltmeter studies in earthquake prediction USGS Publications Warehouse Johnston, M. 1978-01-01 Tilt measurements give us a means of monitoring vertical displacements or local uplift of the crust. The simplest type of tiltmeter is a stationary pendulum (fig. 1). As the Earth's surface distorts locally, the pendulum housing is tilted while, of course, the pendulum continues to hang vertically (that is, in the direction of the gravity vector). The tilt angle is the angle through which the pendulum housing is tilted. The pendulum is the inertial reference (the force of gravity remains unchanged at the site), and tilting of the instrument housing represents the moving reference frame. We note in passing that the tiltmeter could also be used to measure the force of gravity by using the pendulum in the same way as Henry Kater did in his celebrated measurement of g in 1817. 14.
Experimental research on the dam-break mechanisms of the Jiadanwan landslide dam triggered by the Wenchuan earthquake in China. PubMed Xu, Fu-gang; Yang, Xing-guo; Zhou, Jia-wen; Hao, Ming-hui 2013-01-01 Dam breaks of landslide dams are always accompanied by large numbers of casualties, a large loss of property, and negative influences on the downstream ecology and environment. This study uses the Jiadanwan landslide dam, created by the Wenchuan earthquake, as a case study example. Several laboratory experiments are carried out to analyse the dam-break mechanism of the landslide dam. The different factors that impact the dam-break process include upstream flow, the boulder effect, dam size, and channel discharge. The development of the discharge channel and the failure of the landslide dam are monitored by digital video and still cameras. Experimental results show that the upstream inflow and the dam size are the main factors that impact the dam-break process. An excavated discharge channel, especially a trapezoidal discharge channel, has a positive effect on reducing peak flow. The depth of the discharge channel also has a significant impact on the dam-break process. The experimental results are significant for landslide dam management and flood disaster prevention and mitigation. 15. Experimental Research on the Dam-Break Mechanisms of the Jiadanwan Landslide Dam Triggered by the Wenchuan Earthquake in China PubMed Central Xu, Fu-gang; Yang, Xing-guo; Hao, Ming-hui 2013-01-01 PMID:23844387 16. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project USGS Publications Warehouse Boyd, Oliver S. 2012-01-01 The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss.
The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking. 17. The Electronic Encyclopedia of Earthquakes NASA Astrophysics Data System (ADS) Benthien, M.; Marquis, J.; Jordan, T. 2003-12-01 The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. 
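As a note on the probabilistic maps in the EAEHMP record above: "ground motion expected to be exceeded with a given probability within a given period of time" converts to a return period under the standard Poisson occurrence assumption. For example, the widely used 10%-in-50-years design level corresponds to a return period of roughly 475 years:

```python
import math

def return_period(p_exceed, t_years):
    """Return period implied by a probability p_exceed of at least one
    exceedance in t_years, assuming Poisson (memoryless) occurrence:
    p = 1 - exp(-t/T)  =>  T = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10, 50.0)))  # → 475
```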
It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will 18. Interpreting the strongest deep earthquake ever observed NASA Astrophysics Data System (ADS) Schultz, Colin 2013-12-01 Massive earthquakes that strike deep within the Earth may be more efficient at dissipating pent-up energy than similar quakes near the surface, according to new research by Wei et al. The authors analyzed the rupture of the most powerful deep earthquake ever recorded. 19. ELER software - a new tool for urban earthquake loss assessment NASA Astrophysics Data System (ADS) Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M. 2010-12-01 Rapid loss estimation after potentially damaging earthquakes is critical for effective emergency response and public information. A methodology and software package, ELER-Earthquake Loss Estimation Routine, for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region was developed under the Joint Research Activity-3 (JRA3) of the EC FP6 Project entitled "Network of Research Infrastructures for European Seismology-NERIES". Recently, a new version (v2.0) of ELER software has been released. 
The multi-level methodology developed is capable of incorporating regional variability and uncertainty originating from ground motion predictions, fault finiteness, site modifications, inventory of physical and social elements subjected to earthquake hazard and the associated vulnerability relationships. Although primarily intended for quasi real-time estimation of earthquake shaking and losses, the routine is also equally capable of incorporating scenario-based earthquake loss assessments. This paper introduces the urban earthquake loss assessment module (Level 2) of the ELER software which makes use of the most detailed inventory databases of physical and social elements at risk in combination with the analytical vulnerability relationships and building damage-related casualty vulnerability models for the estimation of building damage and casualty distributions, respectively. Spectral capacity-based loss assessment methodology and its vital components are presented. The analysis methods of the Level 2 module, i.e. Capacity Spectrum Method (ATC-40, 1996), Modified Acceleration-Displacement Response Spectrum Method (FEMA 440, 2005), Reduction Factor Method (Fajfar, 2000) and Coefficient Method (ASCE 41-06, 2006), are applied to the selected building types for validation and verification purposes. The damage estimates are compared to the results obtained from the other studies available in the literature, i.e. SELENA v4.0 (Molina et al., 2008) and 20. Investigating Lushan Earthquake Victims' Individual Behavior Response and Rescue Organization. PubMed Kang, Peng; Lv, Yipeng; Deng, Qiangyu; Liu, Yuan; Zhang, Yi; Liu, Xu; Zhang, Lulu 2017-12-11 Research concerning the impact of earthquake victims' individual behavior and its association with earthquake-related injuries is lacking. This study examined this relationship along with effectiveness of earthquake rescue measures. 
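For context on the odds ratios this kind of study reports: an OR and its 95% Wald confidence interval are computed from a 2×2 exposure-by-injury table. A minimal sketch with hypothetical counts, not the study's data:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table, where a/b are injured/uninjured counts
    in the exposed group and c/d in the reference group, with a 95% Wald
    confidence interval computed on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: 10/90 injured/uninjured among one occupation,
# 100/3000 among the reference occupation
or_, lo, hi = odds_ratio(10, 90, 100, 3000)
print(round(or_, 2))  # → 3.33
```

An OR above 1 with a confidence interval excluding 1 indicates the exposed group had higher odds of injury than the reference group.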
The six most severely destroyed townships during the Lushan earthquake were examined; 28 villages and three earthquake victims' settlement camp areas were selected as research areas. Inclusion criteria comprised living in Lushan county for a long time, living in Lushan county during the 2013 Lushan earthquake, and having one's home destroyed. Earthquake victims with an intellectual disability or communication problems were excluded. The earthquake victims (N = 5,165; 2,396 male) completed a questionnaire (response rate: 94.7%). Among them, 209 were injured (5.61%). Teachers (p < 0.0001, odds ratio (OR) = 3.33) and medical staff (p = 0.001, OR = 4.35) were more vulnerable to the earthquake than were farmers. Individual behavior, such as the first reaction after the earthquake and fear, was directly related to injuries. There is an obvious connection between earthquake-related injury and individual behavior characteristics. It is strongly suggested that victims receive mental health support from medical practitioners and the government to minimize negative effects. The initial reaction after an earthquake also played a vital role in victims' trauma; therefore, earthquake-related experience and education may prevent injuries. Self-aid and mutual help played key roles in emergency medical rescue efforts. 1. Earthquakes in Mississippi and vicinity 1811-2010 USGS Publications Warehouse Dart, Richard L.; Bograd, Michael B.E. 2011-01-01 This map summarizes two centuries of earthquake activity in Mississippi. Work on the Mississippi map was done in collaboration with the Mississippi Department of Environmental Quality, Office of Geology. The earthquake data plotted on the map are from several sources: the Mississippi Department of Environmental Quality, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Arkansas Geological Survey.
In addition to earthquake locations, other materials include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Mississippi and parts of adjacent States. Mississippi has undergone a number of felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Mississippi and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event on December 16, 1811, the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region. 2. The Research-to-Operations-to-Research Cycle at NOAA's Space Weather Prediction Center NASA Astrophysics Data System (ADS) Singer, H. J. 2017-12-01 The provision of actionable space weather products and services by NOAA's Space Weather Prediction Center relies on observations, models and scientific understanding of our dynamic space environment. It also depends on a deep understanding of the systems and capabilities that are vulnerable to space weather, as well as national and international partnerships that bring together resources, skills and applications to support space weather forecasters and customers. While these activities have been evolving over many years, in October 2015, with the release of the National Space Weather Strategy and National Space Weather Action Plan (NSWAP) by the National Science and Technology Council in the Executive Office of the President, there is a new coordinated focus on ensuring the Nation is prepared to respond to and recover from severe space weather storms. One activity highlighted in the NSWAP is the Operations to Research (O2R) and Research to Operations (R2O) process.
In this presentation we will focus on current R2O and O2R activities that advance our ability to serve those affected by space weather and give a vision for future programs. We will also provide examples of recent research results that lead to improved operational capabilities, lessons learned in the transition of research to operations, and challenges for both the science and operations communities. 3. The California Earthquake Advisory Plan: A history USGS Publications Warehouse Roeloffs, Evelyn A.; Goltz, James D. 2017-01-01 Since 1985, the California Office of Emergency Services (Cal OES) has issued advisory statements to local jurisdictions and the public following seismic activity that scientists on the California Earthquake Prediction Evaluation Council view as indicating elevated probability of a larger earthquake in the same area during the next several days. These advisory statements are motivated by statistical studies showing that about 5% of moderate earthquakes in California are followed by larger events within a 10-km, five-day space-time window (Jones, 1985; Agnew and Jones, 1991; Reasenberg and Jones, 1994). Cal OES issued four earthquake advisories from 1985 to 1989. In October, 1990, the California Earthquake Advisory Plan formalized this practice, and six Cal OES Advisories have been issued since then. This article describes that protocol’s scientific basis and evolution. 4. Parallelization of the Coupled Earthquake Model NASA Technical Reports Server (NTRS) Block, Gary; Li, P. Peggy; Song, Yuhe T. 2007-01-01 This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, tsunami prediction over the Internet had never been done before. This new code directly couples the earthquake model and the ocean model on parallel computers and improves simulation speed.
Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users. 5. Using prediction markets to estimate the reproducibility of scientific research. PubMed Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus 2015-12-15 Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true.
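Prediction-market prices of the kind described above are directly interpretable as probabilities under common automated market makers. As a hedged illustration (the study's markets need not have used exactly this mechanism), Hanson's logarithmic market scoring rule prices a two-outcome "will it replicate?" market as follows:

```python
import math

def lmsr_prices(quantities, b=100.0):
    """Prices (implied probabilities) under a logarithmic market scoring
    rule with liquidity parameter b; quantities are outstanding shares
    per outcome. Prices are a softmax of quantities/b and sum to 1."""
    m = max(q / b for q in quantities)            # subtract max for stability
    exps = [math.exp(q / b - m) for q in quantities]
    total = sum(exps)
    return [e / total for e in exps]

# Two-outcome market: "replicates" vs. "does not replicate"
p = lmsr_prices([30.0, 0.0])
print([round(x, 3) for x in p])  # → [0.574, 0.426]
```

As traders buy shares of the outcome they believe in, its quantity rises and so does its price, which is read off as the market's probability estimate.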
We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. 6. Using prediction markets to estimate the reproducibility of scientific research PubMed Central Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus 2015-01-01 PMID:26553988 7. Development of a Low Cost Earthquake Early Warning System in Taiwan NASA Astrophysics Data System (ADS) Wu, Y. M.
2017-12-01 The National Taiwan University (NTU) has been developing an earthquake early warning (EEW) system for research purposes using low-cost accelerometers (P-Alert) since 2010. As of 2017, a total of 650 stations have been deployed and configured. The NTU system can provide earthquake information within 15 s of an earthquake occurrence. Thus, this system may provide early warnings for cities located more than 50 km from the epicenter. Additionally, the NTU system also has an onsite alert function that triggers a warning for incoming P-waves greater than a certain magnitude threshold, thus providing a 2-3 s lead time before peak ground acceleration (PGA) for regions close to an epicenter. Detailed shaking maps are produced by the NTU system within one or two minutes after an earthquake. Recently, a new module named ShakingAlarm has been developed. Equipped with real-time acceleration signals and the time-dependent anisotropic attenuation relationship of the PGA, ShakingAlarm can provide an accurate PGA estimation immediately before the arrival of the observed PGA. This unique advantage produces sufficient lead time for hazard assessment and emergency response, which is unavailable from traditional shakemaps, which are based only on the PGA observed in real time. The performance of ShakingAlarm was tested with six M > 5.5 inland earthquakes from 2013 to 2016. Taking the 2016 M6.4 Meinong earthquake simulation as an example, the predicted PGA converges to a stable value and produces a predicted shake map and an isocontour map of the predicted PGA within 16 seconds of earthquake occurrence. Compared with a traditional regional EEW system, ShakingAlarm can effectively identify possible damage regions and provide valuable early warning information (magnitude and PGA) for risk mitigation. 8.
Seismic Hazard Assessment for a Characteristic Earthquake Scenario: Probabilistic-Deterministic Method NASA Astrophysics Data System (ADS) Mouloud, Hamidatou 2016-04-01 The objective of this paper is to analyze the seismic activity and the statistical treatment of the seismicity catalog of the Constantine region between 1357 and 2014, which contains 7007 seismic events. Our research is a contribution to improving seismic risk management by evaluating the seismic hazard in northeastern Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach by using a uniform earthquake distribution over the source area and a given magnitude range. This study aims at extending the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault. The approach integrates PSHA with a high-frequency deterministic technique for the prediction of peak and spectral ground motion parameters in a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We propose five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes, (ii) assessment of their geological, geophysical and geometric characteristics, (iii) identification of the attenuation pattern of seismic motion, (iv) calculation of the hazard at a site and finally (v) hazard mapping for a region. In this study, the procedure of the earthquake hazard evaluation recently developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria. 9. Application of Geostatistical Methods and Machine Learning for spatio-temporal Earthquake Cluster Analysis NASA Astrophysics Data System (ADS) Schaefer, A. M.; Daniell, J. E.; Wenzel, F.
2014-12-01 Earthquake clustering is an increasingly important part of general earthquake research, especially in terms of seismic hazard assessment and earthquake forecasting and prediction approaches. The distinct identification and definition of foreshocks, aftershocks, mainshocks and secondary mainshocks is accomplished using a point-based spatio-temporal clustering algorithm originating from the field of classic machine learning. This can be further applied for declustering purposes to separate background seismicity from triggered seismicity. The results are interpreted and processed to assemble 3D-(x,y,t) earthquake clustering maps which are based on smoothed seismicity records in space and time. In addition, multi-dimensional Gaussian functions are used to capture clustering parameters for spatial distribution and dominant orientations. Clusters are further processed using methodologies originating from geostatistics, which have mostly been applied and developed in mining projects during the last decades. A 2.5D variogram analysis is applied to identify spatio-temporal homogeneity in terms of earthquake density and energy output. The results are then interpolated using Kriging to provide an accurate mapping solution for clustering features. As a case study, seismic data of New Zealand and the United States are used, covering events since the 1950s, from which an earthquake cluster catalogue is assembled for most of the major events, including a detailed analysis of the Landers and Christchurch sequences. 10. Scaling Relations of Earthquakes on Inland Active Mega-Fault Systems NASA Astrophysics Data System (ADS) Murotani, S.; Matsushima, S.; Azuma, T.; Irikura, K.; Kitagawa, S.
2010-12-01 Since 2005, The Headquarters for Earthquake Research Promotion (HERP) has been publishing 'National Seismic Hazard Maps for Japan' to provide useful information for disaster prevention countermeasures for the country and local public agencies, as well as to promote public awareness of earthquake disaster prevention. In the course of making the 2009 version of the map, which commemorates the tenth anniversary of the establishment of the Comprehensive Basic Policy, the methods to evaluate the magnitude of earthquakes, to predict strong ground motion, and to construct underground structure were investigated in the Earthquake Research Committee and its subcommittees. In order to predict the magnitude of earthquakes occurring on mega-fault systems, we examined the scaling relations for mega-fault systems using 11 earthquakes whose source processes were analyzed by waveform inversion and whose surface rupture information was investigated. As a result, we found that the data fall between the scaling relations of seismic moment and rupture area by Somerville et al. (1999) and Irikura and Miyake (2001). We also found that the maximum displacement of surface rupture is two to three times larger than the average slip on the seismic fault, and that surface fault length is equal to the length of the source fault. Furthermore, compiled data of the source faults show that displacement saturates at 10 m when fault length (L) exceeds 100 km. By assuming an average fault width (W) of 18 km for inland earthquakes in Japan, and that displacement saturates at 10 m for lengths of more than 100 km, we derived a new scaling relation between source area and seismic moment, S[km^2] = 1.0 x 10^-17 M0 [Nm], for mega-fault systems whose seismic moment (M0) exceeds 1.8×10^20 Nm. 11.
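The derived scaling relation lends itself to a quick numerical check. A minimal sketch (the function name is ours; the constant and threshold are from the abstract):

```python
def rupture_area_km2(m0_nm: float) -> float:
    """Rupture area S [km^2] from seismic moment M0 [Nm] via the
    mega-fault scaling relation S = 1.0e-17 * M0, valid for
    M0 >= 1.8e20 Nm, where surface displacement saturates at 10 m."""
    if m0_nm < 1.8e20:
        raise ValueError("relation applies only for M0 >= 1.8e20 Nm")
    return 1.0e-17 * m0_nm

# At the threshold M0 = 1.8e20 Nm the relation gives S = 1800 km^2,
# consistent with the assumed W = 18 km times the L = 100 km saturation length.
area = rupture_area_km2(1.8e20)
```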
Drought Predictability and Prediction in a Changing Climate: Assessing Current Predictive Knowledge and Capabilities, User Requirements and Research Priorities NASA Technical Reports Server (NTRS) Schubert, Siegfried 2011-01-01 Drought is fundamentally the result of an extended period of reduced precipitation lasting anywhere from a few weeks to decades and even longer. As such, addressing drought predictability and prediction in a changing climate requires foremost that we make progress on the ability to predict precipitation anomalies on subseasonal and longer time scales. From the perspective of the users of drought forecasts and information, however, drought is most directly viewed through its impacts (e.g., on soil moisture, streamflow, crop yields). Consequently, the question of the predictability of drought must extend to those quantities as well. In order to make progress on these issues, the WCRP drought information group (DIG), with the support of WCRP, the Catalan Institute of Climate Sciences, the La Caixa Foundation, the National Aeronautics and Space Administration, the National Oceanic and Atmospheric Administration, and the National Science Foundation, has organized a workshop to focus on: 1. User requirements for drought prediction information on sub-seasonal to centennial time scales 2. Current understanding of the mechanisms and predictability of drought on sub-seasonal to centennial time scales 3. Current drought prediction/projection capabilities on sub-seasonal to centennial time scales 4. Advancing regional drought prediction capabilities for variables and scales most relevant to user needs on sub-seasonal to centennial time scales. This introductory talk provides an overview of these goals, and outlines the occurrence and mechanisms of drought world-wide. 12. Security Implications of Induced Earthquakes NASA Astrophysics Data System (ADS) Jha, B.; Rao, A.
2016-12-01 The increase in earthquakes induced or triggered by human activities motivates us to research how a malicious entity could weaponize earthquakes to cause damage. Specifically, we explore the feasibility of controlling the location, timing and magnitude of an earthquake by activating a fault via injection and production of fluids into the subsurface. Here, we investigate the relationship between the magnitude and trigger time of an induced earthquake and the well-to-fault distance. The relationship between magnitude and distance is important to determine the farthest striking distance from which one could intentionally activate a fault to cause a certain level of damage. We use our novel computational framework to model the coupled multi-physics processes of fluid flow and fault poromechanics. We use synthetic models representative of the New Madrid Seismic Zone and the San Andreas Fault Zone to assess the risk in the continental US. We fix the injection and production flow rates of the wells and vary their locations. We simulate injection-induced Coulomb destabilization of faults and the evolution of fault slip under quasi-static deformation. We find that the effect of distance on the magnitude and trigger time is monotonic, nonlinear, and time-dependent. Evolution of the maximum Coulomb stress on the fault provides insights into the effect of distance on rupture nucleation and propagation. The damage potential of induced earthquakes can be maintained even at longer distances because of the balance between pressure diffusion and poroelastic stress transfer mechanisms. We conclude that computational modeling of induced earthquakes allows us to assess the feasibility of weaponizing earthquakes and to develop effective defense mechanisms against such attacks. 13.
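The Coulomb destabilization screened for in such studies reduces, per fault patch, to a simple stress metric. A hedged sketch of the Coulomb failure stress change (the sign convention and friction value are illustrative assumptions, not the authors' coupled flow-geomechanics model):

```python
def coulomb_stress_change(d_tau: float, d_sigma_n: float, d_pore_p: float,
                          mu: float = 0.6) -> float:
    """Coulomb failure stress change [Pa] on a fault patch.
    d_tau: shear stress change in the slip direction.
    d_sigma_n: normal stress change, compression positive.
    d_pore_p: pore pressure change (fluid injection raises it).
    A positive result moves the patch toward failure."""
    return d_tau - mu * (d_sigma_n - d_pore_p)

# Injection raising pore pressure by 0.5 MPa, with no change in shear
# or total normal stress, moves the fault 0.3 MPa toward failure:
dcff = coulomb_stress_change(0.0, 0.0, 0.5e6)
```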
Seismicity map tools for earthquake studies NASA Astrophysics Data System (ADS) Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos 2014-05-01 We report on the development of a new online set of tools for use within Google Maps, for earthquake research. We demonstrate this server-based, online platform (developed with PHP, Javascript, MySQL) with the new tools using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analyses of earthquake data using Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw line segments on the map, multiple horizontal and vertical straight lines, as well as multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of earthquake-cluster shift within the segments in space. The platform offers many filters, such as for plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the b-value, etc. What is novel about the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools we have studied the spatial distribution trends of many earthquakes, and we show here for the first time the link between Fibonacci Numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform. 14.
[Resilience, social relations, and pedagogic intervention five years after the earthquake occurred in L'Aquila (Central Italy) in 2009: an action-research in the primary schools]. PubMed Vaccarelli, Alessandro; Ciccozzi, Chiara; Fiorenza, Arianna 2016-01-01 The action-research "Outdoor training and citizenship between children from L'Aquila", carried out from 2014 to 2015 in some schools in the municipality of L'Aquila, aimed to address the needs that emerged from the social and psychological problems among children in the period after the 2009 L'Aquila earthquake. In particular, the article documents the results of the parts related to the study of resilience (cognitive objective) and of social relations (objective tied to the educational intervention), five years after the earthquake. The pedagogical research team, in close cooperation with the Cartography Laboratory of the University of L'Aquila and with the Grupo de Innovación Educativa Areté of the Universidad Politécnica de Madrid, worked according to the action-research methodology, collecting secondary data and data useful to check the effectiveness of the educational actions put in place to promote resilient behaviours and to activate positive group dynamics. The study was carried out in 4 primary schools of L'Aquila and involved 83 children from 8 to 12 years of age. A control group composed of 55 subjects, homogeneous for sex and age, was identified in the primary schools of Borgorose, a small town near Rieti (Central Italy). Data on resilience abilities and stress responses were collected in the first phase of the study in order to outline the initial situation and develop an appropriate educational intervention.
The comparison with the control group of 55 subjects who were not from L'Aquila made it possible to verify that, 5 years after the disaster, the context of life produces a meaningful discrepancy in stress responses and resilience ability, to the disadvantage of the children from L'Aquila. On the other hand, data on social relations made it possible to verify how the educational intervention 15. Real Time Earthquake Information System in Japan NASA Astrophysics Data System (ADS) Doi, K.; Kato, T. 2003-12-01 monitors earthquake data and analyzes earthquake activities and tsunami occurrence round-the-clock on a real-time basis. In addition to the above, JMA has for a decade been developing a Nowcast Earthquake Information system, which can provide its users with notice of the occurrence of an earthquake prior to the arrival of strong ground motion. The Earthquake Research Institute, the University of Tokyo, is preparing a demonstrative experiment in collaboration with JMA for better utilization of Nowcast Earthquake Information, to apply actual measures to reduce earthquake disasters caused by strong ground motion. 16. Toward real-time regional earthquake simulation of Taiwan earthquakes NASA Astrophysics Data System (ADS) Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B. 2013-12-01 We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism, within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM).
We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time. 17. Predicting Student Success in College: What Does the Research Say? ERIC Educational Resources Information Center Merante, Joseph A. 1983-01-01 Reviews various methods for predicting college success: correlation of students' high school grades, achievement test scores, and class rank with characteristics of the institution to be attended; examination of demographic variables such as age, sex, birth order, income, parents' education, religious and ethnic background, and geographic factors;… 18. Next-Day Earthquake Forecasts for California NASA Astrophysics Data System (ADS) Werner, M. J.; Jackson, D. D.; Kagan, Y. Y. 2008-12-01 We implemented a daily forecast of m > 4 earthquakes for California in the format suitable for testing in community-based earthquake predictability experiments: Regional Earthquake Likelihood Models (RELM) and the Collaboratory for the Study of Earthquake Predictability (CSEP). 
The forecast is based on near-real-time earthquake reports from the ANSS catalog above magnitude 2 and will be available online. The model used to generate the forecasts is based on the Epidemic-Type Earthquake Sequence (ETES) model, a stochastic model of clustered and triggered seismicity. Our particular implementation is based on the earlier work of Helmstetter et al. (2006, 2007), but we extended the forecast to all of California, use more data to calibrate the model and its parameters, and made some modifications. Our forecasts will compete against the Short-Term Earthquake Probabilities (STEP) forecasts of Gerstenberger et al. (2005) and other models in the next-day testing class of the CSEP experiment in California. We illustrate our forecasts with examples and discuss preliminary results. 19. The 1985 central Chile earthquake: a repeat of previous great earthquakes in the region? PubMed Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G 1986-07-25 A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic.
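Epidemic-type models of the kind used in the next-day forecasts above rate seismicity as a background term plus Omori-Utsu decaying contributions from past events. A minimal sketch (all parameter values are illustrative placeholders, not the calibrated California ones):

```python
def etes_rate(t, past_events, mu=0.05, k=0.02, c=0.01, p=1.1,
              alpha=0.8, m_min=2.0):
    """Expected earthquake rate at time t (days), given past
    (time, magnitude) events: background rate mu plus a productivity-
    scaled Omori-Utsu term K*10^(alpha*(m - m_min)) / (t - ti + c)^p
    for each earlier event."""
    rate = mu
    for ti, mi in past_events:
        if ti < t:  # only earlier events contribute
            rate += k * 10 ** (alpha * (mi - m_min)) / (t - ti + c) ** p
    return rate

# The rate one day after an m=5 event greatly exceeds the background
# and decays toward it over the following days.
r_day1 = etes_rate(1.0, [(0.0, 5.0)])
```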
Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years. 20. Fault failure with moderate earthquakes USGS Publications Warehouse Johnston, M.J.S.; Linde, A.T.; Gladwin, M.T.; Borcherdt, R.D. 1987-01-01 High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10^-8), with borehole dilatometers (resolution 10^-10) and a 3-component borehole strainmeter (resolution 10^-9). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point.
These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure. © 1987. 1. Statistical earthquake focal mechanism forecasts NASA Astrophysics Data System (ADS) Kagan, Yan Y.; Jackson, David D. 2014-04-01 Forecasts of the focal mechanisms of future shallow (depth 0-70 km) earthquakes are important for seismic hazard estimates, Coulomb stress calculations, and other models of earthquake occurrence. Here we report on a high-resolution global forecast of earthquake rate density as a function of location, magnitude and focal mechanism. In previous publications we reported forecasts of 0.5° spatial resolution, covering the latitude range from -75° to +75°, based on the Global Central Moment Tensor earthquake catalogue. In the new forecasts we have improved the spatial resolution to 0.1° and extended the latitude range from pole to pole. Our focal mechanism estimates require distance-weighted combinations of observed focal mechanisms within 1000 km of each gridpoint. Simultaneously, we calculate an average rotation angle between the forecasted mechanism and all the surrounding mechanisms, using the method of Kagan & Jackson proposed in 1994. This average angle reveals the level of tectonic complexity of a region and indicates the accuracy of the prediction. The procedure becomes problematic where longitude lines are not approximately parallel, and where shallow earthquakes are so sparse that an adequate sample spans very large distances. North or south of 75°, the azimuths of points 1000 km away may vary by about 35°.
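The quoted 35° can be reproduced with back-of-envelope spherical geometry (our sketch, not the authors' exact correction): an east-west arc of 1000 km at latitude 75° spans roughly 1000 / (111.2 · cos 75°) degrees of longitude.

```python
import math

def lon_span_deg(distance_km: float, lat_deg: float) -> float:
    """Longitude span (degrees) of an east-west arc of the given length
    at the given latitude, on a spherical Earth of radius 6371 km."""
    km_per_deg = 2 * math.pi * 6371.0 / 360.0 * math.cos(math.radians(lat_deg))
    return distance_km / km_per_deg

# lon_span_deg(1000.0, 75.0) is about 34.7 degrees, i.e. the ~35°
# azimuth spread quoted above; at the equator the same arc spans ~9°.
span = lon_span_deg(1000.0, 75.0)
```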
We solved this problem by calculating focal mechanisms on a plane tangent to the Earth's surface at each forecast point, correcting for the rotation of the longitude lines at the locations of earthquakes included in the averaging. The corrections are negligible between -30° and +30° latitude, but outside that band uncorrected rotations can be significantly off. Improved forecasts at 0.5° and 0.1° resolution are posted at http://eq.ess.ucla.edu/kagan/glob_gcmt_index.html. 2. Izmit, Turkey 1999 Earthquake Interferogram NASA Image and Video Library 2001-03-30 This image is an interferogram that was created using pairs of images taken by Synthetic Aperture Radar (SAR). The images, acquired at two different times, have been combined to measure surface deformation or changes that may have occurred during the time between data acquisition. The images were collected by the European Space Agency's Remote Sensing satellite (ERS-2) on 13 August 1999 and 17 September 1999 and were combined to produce these image maps of the apparent surface deformation, or changes, during and after the 17 August 1999 Izmit, Turkey earthquake. This magnitude 7.6 earthquake was the largest in 60 years in Turkey and caused extensive damage and loss of life. Each of the color contours of the interferogram represents 28 mm (1.1 inches) of motion towards the satellite, or about 70 mm (2.8 inches) of horizontal motion. White areas are outside the SAR image or water of seas and lakes. The North Anatolian Fault that broke during the Izmit earthquake moved more than 2.5 meters (8.1 feet) to produce the pattern measured by the interferogram. Thin red lines show the locations of fault breaks mapped on the surface. The SAR interferogram shows that the deformation and fault slip extended west of the surface faults, underneath the Gulf of Izmit. Thick black lines mark the fault rupture inferred from the SAR data. 
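The 28 mm per color contour is set by the radar itself: one fringe of wrapped phase corresponds to half the wavelength of line-of-sight motion, and ERS operates at C band (≈56.6 mm wavelength; that value is our assumption here, the conversion itself is standard):

```python
def los_displacement_mm(n_fringes: float, wavelength_mm: float = 56.6) -> float:
    """Line-of-sight displacement from a count of interferogram fringes.
    Each fringe (one full 2*pi phase cycle) corresponds to half the
    radar wavelength of motion toward or away from the satellite."""
    return n_fringes * wavelength_mm / 2.0

# One ERS C-band fringe is ~28.3 mm of line-of-sight motion, matching
# the ~28 mm per contour quoted for the Izmit interferogram.
one_fringe = los_displacement_mm(1.0)
```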
Scientists are using the SAR interferometry along with other data collected on the ground to estimate the pattern of slip that occurred during the Izmit earthquake. This is then used to improve computer models that predict how this deformation transferred stress to other faults and to the continuation of the North Anatolian Fault, which extends to the west past the large city of Istanbul. These models show that the Izmit earthquake further increased the already high probability of a major earthquake near Istanbul. http://photojournal.jpl.nasa.gov/catalog/PIA00557 3. Izmit, Turkey 1999 Earthquake Interferogram NASA Technical Reports Server (NTRS) 2001-01-01 4. Safety and survival in an earthquake USGS Publications Warehouse 1969-01-01 Many earth scientists in this country and abroad are focusing their studies on the search for means of predicting impending earthquakes, but, as yet, an accurate prediction of the time and place of such an event cannot be made. From past experience, however, one can assume that earthquakes will continue to harass mankind and that they will occur most frequently in the areas where they have been relatively common in the past. In the United States, earthquakes can be expected to occur most frequently in the western states, particularly in Alaska, California, Washington, Oregon, Nevada, Utah, and Montana. The danger, however, is not confined to any one part of the country; major earthquakes have occurred at widely scattered locations. 5. The relationship between earthquake exposure and posttraumatic stress disorder in 2013 Lushan earthquake NASA Astrophysics Data System (ADS) Wang, Yan; Lu, Yi 2018-01-01 The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random sample survey was conducted to collect data in the Longmenshan thrust fault area three years after the Lushan earthquake. We used the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale.
Subjects in this study included 3944 school student survivors in eleven local schools. The prevalence of probable PTSD was relatively higher among those who were trapped in the earthquake, were injured in the earthquake, or had relatives who died in the earthquake. The study concluded that researchers need to pay more attention to the children and adolescents. The government should pay more attention to these people and provide more economic support. 6. Seismic activity preceding the 2016 Kumamoto earthquakes: Multiple approaches to recognizing possible precursors NASA Astrophysics Data System (ADS) Nanjo, K.; Izutsu, J.; Orihara, Y.; Furuse, N.; Togo, S.; Nitta, H.; Okada, T.; Tanaka, R.; Kamogawa, M.; Nagao, T. 2016-12-01 We show the first results of recognizing seismic patterns as possible precursory episodes to the 2016 Kumamoto earthquakes, using four existing methods: the b-value method (e.g., Schorlemmer and Wiemer, 2005; Nanjo et al., 2012), two kinds of seismic quiescence evaluation methods (the RTM-algorithm, Nagao et al., 2011; the Z-value method, Wiemer and Wyss, 1994), and foreshock seismic density analysis based on Lippiello et al. (2012). We used the earthquake catalog maintained by the Japan Meteorological Agency (JMA). To ensure data quality, we performed a catalog completeness check as a pre-processing step of the individual analyses. Our findings indicate that the methods we adopted do not allow the Kumamoto earthquakes to be predicted exactly. However, we found that the spatial extent of possible precursory patterns differs from one method to the other and ranges from local scales (typically asperity size) to regional scales (e.g., 2° × 3° around the source zone). The earthquakes are preceded by periods of pronounced anomalies, which lasted from decadal scales (e.g., 20 years or longer) to yearly scales (e.g., 1-2 years).
Our results demonstrate that a combination of multiple methods detects different signals prior to the Kumamoto earthquakes with greater reliability than any single method. This strongly suggests great potential for narrowing down the possible sites of future earthquakes relative to long-term seismic hazard assessment. This study was partly supported by MEXT under its Earthquake and Volcano Hazards Observation and Research Program and Grant-in-Aid for Scientific Research (C), No. 26350483, 2014-2017, by Chubu University under the Collaboration Research Program of IDEAS, IDEAS201614, and by Tokai University under the Project Research of IORD. A part of this presentation is given in Nanjo et al. (2016, submitted). 7. Local Deformation Precursors of Large Earthquakes Derived from GNSS Observation Data NASA Astrophysics Data System (ADS) Kaftan, Vladimir; Melnikov, Andrey 2017-12-01 Research on deformation precursors of earthquakes was of immediate interest from the middle to the end of the previous century. Repeated conventional geodetic measurements, such as precise levelling and linear-angular networks, were used for such studies. Many examples of studies referring to strong seismic events using conventional geodetic techniques are presented in [T. Rikitake, 1976]. One of the first case studies of geodetic earthquake precursors was done by Yu.A. Meshcheryakov [1968]. Rare repetitions and insufficient densities and coverage of control geodetic networks made it difficult to predict the future places and times of earthquake occurrences. Intensive development of Global Navigation Satellite Systems (GNSS) during recent decades has made such research more effective. The results of GNSS observations in the areas of three large earthquakes (Napa M6.1, USA, 2014; El Mayor Cucapah M7.2, USA, 2010; and Parkfield M6.0, USA, 2004) are treated and presented in the paper. The characteristics of land surface deformation before, during, and after the earthquakes have been obtained.
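Deformation quantities of the kind tracked in such studies, dilatation and shear strain, follow from the horizontal displacement-gradient tensor estimated across a GNSS network. A minimal sketch for a uniform gradient (function and variable names are ours):

```python
def strain_invariants(dux_dx, dux_dy, duy_dx, duy_dy):
    """2D infinitesimal strain invariants from horizontal displacement
    gradients. Returns (dilatation, maximum tensorial shear strain,
    the latter being half the engineering shear value)."""
    exx, eyy = dux_dx, duy_dy
    exy = 0.5 * (dux_dy + duy_dx)          # symmetric shear component
    dilatation = exx + eyy                  # areal strain
    max_shear = ((0.5 * (exx - eyy)) ** 2 + exy ** 2) ** 0.5
    return dilatation, max_shear

# Pure east-west extension of 1 microstrain: dilatation 1e-6,
# maximum tensorial shear 5e-7.
d, g = strain_invariants(1e-6, 0.0, 0.0, 0.0)
```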
The results indicate the presence of anomalous deformations near the epicentres. The temporal character of dilatation and shear-strain changes shows spatial heterogeneity of Earth-surface deformation from months to years before the main shock, both close to it and at some distance from it. The revealed heterogeneities can be considered deformation precursors of strong earthquakes. Based on historical data and the authors' research, values of critical deformation are obtained with reference to the mentioned large earthquakes and are proposed for the creation of a seismic danger scale based on continuous GNSS observations. It is shown that the approach has limitations owing to uncertainty in when deformation accumulation begins and in where the next seismic event is to be expected. Verification and clarification of the derived conclusions are proposed. 8. Ethics and epistemology of accurate prediction in clinical research. PubMed Hey, Spencer Phillips 2015-07-01 All major research ethics policies assert that the ethical review of clinical trial protocols should include a systematic assessment of risks and benefits. But despite this policy, protocols do not typically contain explicit probability statements about the likely risks or benefits involved in the proposed research. In this essay, I articulate a range of ethical and epistemic advantages that explicit forecasting would offer to the health research enterprise. I then consider how some particular confidence levels may come into conflict with the principles of ethical research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions. 9. Remote monitoring of the earthquake cycle using satellite radar interferometry. PubMed Wright, Tim J 2002-12-15 The earthquake cycle is poorly understood. Earthquakes continue to occur on previously unrecognized faults.
Earthquake prediction seems impossible. These remain the facts despite nearly 100 years of intensive study since the earthquake cycle was first conceptualized. Using data acquired from satellites in orbit 800 km above the Earth, a new technique, radar interferometry (InSAR), has the potential to solve these problems. For the first time, detailed maps of the warping of the Earth's surface during the earthquake cycle can be obtained with a spatial resolution of a few tens of metres and a precision of a few millimetres. InSAR does not need equipment on the ground or expensive field campaigns, so it can gather crucial data on earthquakes and the seismic cycle from some of the remotest areas of the planet. In this article, I review some of the remarkable observations of the earthquake cycle already made using radar interferometry and speculate on breakthroughs that are tantalizingly close. 10. Possible seasonality in large deep-focus earthquakes NASA Astrophysics Data System (ADS) Zhan, Zhongwen; Shearer, Peter M. 2015-09-01 Large deep-focus earthquakes (magnitude > 7.0, depth > 500 km) have exhibited strong seasonality in their occurrence times since the beginning of global earthquake catalogs. Of 60 such events from 1900 to the present, 42 have occurred in the middle half of each year. The seasonality appears strongest in the northwest Pacific subduction zones and weakest in the Tonga region. Taken at face value, the surplus of northern hemisphere summer events is statistically significant, but due to the ex post facto hypothesis testing, the absence of seasonality in smaller deep earthquakes, and the lack of a known physical triggering mechanism, we cannot rule out that the observed seasonality is just random chance. However, we can make a testable prediction of seasonality in future large deep-focus earthquakes, which, given likely earthquake occurrence rates, should be verified or falsified within a few decades. 
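The face-value significance of 42 of 60 events falling in half the year can be checked with a one-sided binomial test. A sketch of that calculation (ours, not necessarily the authors' exact test):

```python
from math import comb

def binom_sf(k: int, n: int, p: float = 0.5) -> float:
    """P(X >= k) for X ~ Binomial(n, p): the one-sided p-value for
    observing k or more 'summer-half' events out of n under a null
    of uniform occurrence through the year."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# For 42 of 60 events, the tail probability is on the order of 1e-3,
# consistent with the significance claimed at face value above.
p_value = binom_sf(42, 60)
```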
If confirmed, deep earthquake seasonality would challenge our current understanding of deep earthquakes.

11. e-Science on Earthquake Disaster Mitigation by EUAsiaGrid

NASA Astrophysics Data System (ADS)

Yen, Eric; Lin, Simon; Chen, Hsin-Yen; Chao, Li; Huang, Bor-Shoh; Liang, Wen-Tzong

2010-05-01

Although earthquakes are not predictable at present, accurate seismic-wave propagation analysis lets us simulate the potential hazards at all distances from possible fault sources, given an understanding of the source rupture process during large earthquakes. By integrating a strong ground-motion sensor network, an earthquake data center, and seismic-wave propagation analysis over the gLite e-Science infrastructure, we can build much better knowledge of the impact and vulnerability associated with potential earthquake hazards. This application also demonstrates an e-Science approach to investigating poorly known Earth structure. Regional integration of earthquake sensor networks can aid fast event reporting and accurate event-data collection. Federation of earthquake data centers entails consolidating and sharing seismological and geological knowledge. Building capability in seismic-wave propagation analysis underpins the prediction of potential hazard impacts. Within the gLite infrastructure and the EUAsiaGrid collaboration framework, earth scientists from Taiwan, Vietnam, the Philippines, and Thailand are working together to mitigate potential seismic threats using Grid technologies and to support seismological research through e-Science. A cross-continental e-infrastructure, based on EGEE and EUAsiaGrid, has been established for seismic-wave forward simulation and risk estimation. Both the computing challenge of seismic-wave analysis among five European and Asian partners and the data challenge of data-center federation have been exercised and verified.
A Seismogram-on-Demand service has also been developed for automatic generation of the seismogram at any sensor point for a specified epicenter. To ease access to all of these services according to users' workflows while retaining maximal flexibility, a Seismology Science Gateway integrating data, computation, workflow, services, and user communities will be implemented based on typical use cases. In the future, extension of the

12. Redefining Earthquakes and the Earthquake Machine

ERIC Educational Resources Information Center

Hubenthal, Michael; Braile, Larry; Taber, John

2008-01-01

The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

13. 2016 National Earthquake Conference

Science.gov Websites

What's New? What's Next? What's Your Role in Building a National Strategy? The National Earthquake Conference (NEC) is a , state government leaders, social science practitioners, U.S. State and Territorial Earthquake Managers

14. Earthquake and Schools. [Videotape].

ERIC Educational Resources Information Center

Federal Emergency Management Agency, Washington, DC.

Designing schools to make them more earthquake resistant and protect children from the catastrophic collapse of the school building is discussed in this videotape. It reveals that 44 of the 50 U.S. states are vulnerable to earthquakes, but most schools are structurally unprepared to take on the stresses that earthquakes exert. The cost to the…

15. Children's Ideas about Earthquakes

ERIC Educational Resources Information Center

Simsek, Canan Lacin

2007-01-01

Earthquake, a natural disaster, is among the fundamental problems of many countries.
If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training about earthquakes, received in primary school, is considered…

16. Statistical tests of simple earthquake cycle models

USGS Publications Warehouse

Devries, Phoebe M. R.; Evans, Eileen

2016-01-01

A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of the relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip-rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM ≲ 4.0 × 10^19 Pa s and ηM ≳ 4.6 × 10^20 Pa s) but cannot reject models on the basis of the transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
17. Pore-fluid migration and the timing of the 2005 M8.7 Nias earthquake

USGS Publications Warehouse

Hughes, K.L.H.; Masterlark, Timothy; Mooney, W.D.

2011-01-01

Two great earthquakes have occurred recently along the Sunda Trench: the 2004 M9.2 Sumatra-Andaman earthquake and the 2005 M8.7 Nias earthquake. These earthquakes ruptured over 1600 km of adjacent crust within three months of each other. We present quantitative poroelastic deformation analyses suggesting that postseismic fluid flow and recovery induced by the Sumatra-Andaman earthquake advanced the timing of the Nias earthquake. Simple back-slip simulations indicate that the megapascal (MPa)-scale pore-pressure recovery is equivalent to 7 yr of interseismic Coulomb stress accumulation near the Nias earthquake hypocenter, implying that pore-pressure recovery from the Sumatra-Andaman earthquake advanced the timing of the Nias earthquake by ~7 yr. That is, in the absence of postseismic pore-pressure recovery, we predict that the Nias earthquake would have occurred in 2011 instead of 2005. © 2011 Geological Society of America.

18. Probing failure susceptibilities of earthquake faults using small-quake tidal correlations

PubMed

Brinkman, Braden A W; LeBlanc, Michael; Ben-Zion, Yehuda; Uhl, Jonathan T; Dahmen, Karin A

2015-01-27

Mitigating the devastating economic and humanitarian impact of large earthquakes requires signals for forecasting seismic events. Daily tidal stresses were previously thought to be insufficient for use as such a signal. Recently, however, they have been found to correlate significantly with small earthquakes just before large earthquakes occur. Here we present a simple earthquake model to investigate whether correlations between daily tidal stresses and small earthquakes provide information about the likelihood of impending large earthquakes.
The model predicts that intervals of significant correlation between small earthquakes and ongoing low-amplitude periodic stresses indicate increased fault susceptibility to large-earthquake generation. The results agree with recent observations of large earthquakes preceded by periods of significant correlation between smaller events and daily tidal stresses. We anticipate that incorporating experimentally determined parameters and fault-specific details into the model may provide new tools for extracting improved probabilities of impending large earthquakes.

19. Earthquake and Tsunami booklet based on two Indonesia earthquakes

NASA Astrophysics Data System (ADS)

Hayashi, Y.; Aci, M.

2014-12-01

Many destructive earthquakes occurred in Indonesia during the last decade. These experiences are important lessons for people around the world who live in earthquake and tsunami country. We are collecting the testimonies of tsunami survivors to clarify successful evacuation processes and the characteristic physical behaviour of tsunamis near the coast. We study two tsunami events: the 2004 Indian Ocean tsunami and the 2010 Mentawai slow-earthquake tsunami. Many videos and photographs were taken during the 2004 Indian Ocean tsunami disaster, but only at a few restricted locations, so tsunami behaviour elsewhere remained unknown. In this study, we tried to collect extensive information about tsunami behaviour covering not only many places but also a wide time range after the strong shaking. In the Mentawai case, the earthquake occurred at night, so there are no striking photographs. To collect detailed information about the evacuation process, we devised an interview method that includes making pictures of the tsunami experience from the scenes of survivors' stories. In the 2004 Aceh case, none of the survivors knew about tsunami phenomena.
Because there had been no large tsunamigenic earthquakes in the Sumatra region for a hundred years, the public had no knowledge of tsunamis. This situation had improved greatly by the 2010 Mentawai case: TV programs and NGO or governmental public-education programs about tsunami evacuation are now widespread in Indonesia, and many people know the fundamentals of earthquake and tsunami disasters. We made a drill book based on survivors' stories and painted striking scenes from the two events. We used the drill book in a disaster-education event with a school committee in West Java; about 80% of the students and teachers judged its contents useful for correct understanding.

20. Supercomputing meets seismology in earthquake exhibit

ScienceCinema

Blackwell, Matt; Rodger, Arthur; Kennedy, Tom

2018-02-14

When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.

1. Napa Earthquake impact on water systems

NASA Astrophysics Data System (ADS)

Wang, J.
2014-12-01

The South Napa earthquake occurred in Napa, California, on August 24 at 3 a.m. local time, with a magnitude of 6.0. It was the largest earthquake in the San Francisco Bay Area since the 1989 Loma Prieta earthquake. Economic loss topped $1 billion. Winemakers cleaned up and assessed the damage to tourism; around 15,000 cases of cabernet poured out onto the grounds at the Hess Collection. Earthquakes can raise water-pollution risks and could cause a water crisis. California has suffered water shortages in recent years, so understanding how to prevent groundwater and surface-water pollution from earthquakes would be valuable. This research gives a clear view of the drinking-water system in California and of pollution in river systems, as well as an estimate of earthquake impacts on water supply. The Sacramento-San Joaquin River Delta (close to Napa) is the center of the state's water-distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water exports, and salt-water intrusion has reduced fresh-water outflows. Strong shaking from a nearby earthquake can cause liquefaction of saturated, loose, sandy soils and could potentially damage the major delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: a comparable event could damage the freshwater supply system.

2. New ideas about the physics of earthquakes

Rundle, John B.; Klein, William

1995-07-01

It may be no exaggeration to claim that this most recent quadrennium has seen more controversy, and thus more progress, in understanding the physics of earthquakes than any in recent memory. The most interesting development has clearly been the emergence of a large community of condensed matter physicists around the world who have begun working on the problem of earthquake physics. These scientists bring to the study of earthquakes an entirely new viewpoint, grounded in the physics of nucleation and critical phenomena in thermal, magnetic, and other systems. Moreover, a surprising technology transfer from geophysics to other fields has been made possible by the realization that models originally proposed to explain self-organization in earthquakes can also be used to explain similar processes in problems as disparate as brain dynamics in neurobiology (Hopfield, 1994) and charge density waves in solids (Brown and Gruner, 1994). An entirely new sub-discipline is emerging that is focused around the development and analysis of large-scale numerical simulations of the dynamics of faults. At the same time, intriguing new laboratory and field data, together with insightful physical reasoning, have led to significant advances in our understanding of earthquake source physics. As a consequence, we can anticipate substantial improvement in our ability to understand the nature of earthquake occurrence. Moreover, while much research in the area of earthquake physics is fundamental in character, the results have many potential applications (Cornell et al., 1993) in the areas of earthquake risk and hazard analysis, and seismic zonation.
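The self-organizing fault models referred to above are typified by spring-block cellular automata such as the Olami-Feder-Christensen (OFC) model. The following is a minimal, illustrative sketch of that class of model, not code from any of the cited studies; the grid size, threshold, and coupling are arbitrary choices:

```python
import numpy as np

def ofc_step(stress, f_c=1.0, alpha=0.2):
    """Advance one loading/earthquake cycle of a simple OFC-style model.

    stress : 2D array of per-cell stress
    f_c    : failure threshold
    alpha  : fraction of a failing cell's stress passed to each of its
             4 neighbours (alpha < 0.25 makes the model dissipative)
    Returns the number of cells that toppled (the "event size").
    """
    # Uniform loading: raise all cells until the most loaded one reaches f_c
    stress += f_c - stress.max()
    size = 0
    while True:
        failing = np.argwhere(stress >= f_c)
        if len(failing) == 0:
            break
        for i, j in failing:
            s = stress[i, j]
            stress[i, j] = 0.0
            size += 1
            # Redistribute part of the released stress to the neighbours
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < stress.shape[0] and 0 <= nj < stress.shape[1]:
                    stress[ni, nj] += alpha * s
    return size

rng = np.random.default_rng(0)
stress = rng.uniform(0.0, 1.0, size=(32, 32))
sizes = [ofc_step(stress) for _ in range(2000)]
# Event sizes span a wide range, reminiscent of a Gutenberg-Richter tail
print("largest event:", max(sizes), "cells")
```

After a transient, the stress field self-organizes so that small topplings are frequent and large avalanches rare, which is the behaviour that motivated the analogy to seismicity.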

3. Why did local people not blame scientists after the 2016 Kumamoto earthquake, Japan, though scientists were accused after the 2009 L'Aquila earthquake?

Sugimoto, M.

2016-12-01

Risk communication has been a major issue among seismologists worldwide since the 2009 L'Aquila earthquake. Many people remember that seven researchers, the "L'Aquila Seven", were put on trial in Italy. Seismologists responded that predicting an earthquake is impossible with today's science and technology, and began joining more outreach activities. "In a subsequent inquiry of the handling of the disaster, seven members of the Italian National Commission for the Forecast and Prevention of Major Risks were accused of giving 'inexact, incomplete and contradictory' information about the danger of the tremors prior to the main quake. On 22 October 2012, six scientists and one ex-government official were convicted of multiple manslaughter for downplaying the likelihood of a major earthquake six days before it took place. They were each sentenced to six years' imprisonment" (Wikipedia). Ultimately, the six scientists were found not guilty. The 2016 Kumamoto earthquake struck Kyushu, Japan, in April. The seismological situations of the 2016 Kumamoto earthquake and the 2009 L'Aquila earthquake are very similar. The foreshock on 14 April 2016 was Mj 6.5 (Mw 6.2); the main shock was Mj 7.3 (Mw 7.0). The Japan Meteorological Agency (JMA) misidentified the foreshock as the main shock before the main shock occurred, and 41 people died in the main shock. Yet local people in Japan did not accuse the scientists. Kumamoto had experienced few large earthquakes for around 100 years, and people in Kyushu were not practised in handling earthquake information. What accounts for the differences between Japan and Italy? This case offers lessons about outreach activities for scientists.

4. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes at ever larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data-archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
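Strong scaling of the kind reported for the TS-AWP is conventionally summarized as speedup and parallel efficiency against a baseline run on a fixed-size problem. A small sketch with purely hypothetical timings (none of these numbers come from the chapter):

```python
def strong_scaling(base_cores, base_time, runs):
    """Speedup and parallel efficiency for a fixed-size (strong-scaling) study.

    base_cores, base_time : the reference run (core count, wall time)
    runs                  : list of (cores, wall_time) pairs for the same problem
    """
    results = []
    for cores, wall_time in runs:
        speedup = base_time / wall_time
        efficiency = speedup / (cores / base_cores)  # 1.0 = ideal scaling
        results.append((cores, speedup, efficiency))
    return results

# Hypothetical timings for one fixed-size wave-propagation problem
for cores, s, e in strong_scaling(1024, 400.0,
                                  [(2048, 210.0), (8192, 60.0), (32768, 18.0)]):
    print(f"{cores:6d} cores: speedup {s:5.1f}x, efficiency {e:5.1%}")
```

Efficiency below 1.0 reflects communication and load-imbalance overheads, which is why "highly efficient strong scaling on over 40K processors" is a notable engineering result.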

5. The effects of earthquake measurement concepts and magnitude anchoring on individuals' perceptions of earthquake risk

USGS Publications Warehouse

Celsi, R.; Wolfinbarger, M.; Wald, D.

2005-01-01

The purpose of this research is to explore earthquake risk perceptions in California. Specifically, we examine the risk beliefs, feelings, and experiences of lay, professional, and expert individuals to explore how risk is perceived and how risk perceptions are formed relative to earthquakes. Our results indicate that individuals tend to perceptually underestimate the degree to which earthquake (EQ) events may affect them. This occurs in large part because individuals' personally felt experience of EQ events is generally overestimated relative to the magnitudes actually experienced. An important finding is that individuals engage in a process of "cognitive anchoring" of their felt EQ experience toward the reported earthquake magnitude. The anchoring effect is moderated by the degree to which individuals comprehend EQ magnitude measurement and EQ attenuation. Overall, the results of this research provide us with a deeper understanding of EQ risk perceptions, especially as they relate to individuals' understanding of EQ measurement and attenuation concepts. © 2005, Earthquake Engineering Research Institute.

6. Forty Days after the Great East Japan Earthquake: Field Research Investigating Community Engagement and Traumatic Stress Screening in a Post-Disaster Community Mental Health Training

PubMed Central

Tuerk, Peter W.; Hall, Brian; Nagae, Nobukazu; McCauley, Jenna L.; Yoder, Matthew; Rauch, Sheila A.M.; Acierno, Ron; Dussich, John

2016-01-01

The current paper describes the results of posttraumatic stress educational outreach and screening offered to 141 citizens of Japan who attended a public-service mental health training regarding post-disaster coping 40 days after a 6.8 Richter Scale earthquake, local and regional deaths, and an ongoing nuclear radiation threat. Attendees were given access to anonymous questionnaires that were integrated into the training as a tool to help enhance mental health literacy and bridge communication gaps. Questionnaires were turned in by a third of those in attendance. Among respondents, multiple exposures to potentially-traumatic events were common. More than a quarter of respondents met criteria for probable PTSD. Physical health and loss of sense of community were related to PTSD symptoms. Associations and diagnosis rates represented in these data are not generalizable to the population as a whole or intended for epidemiological purposes; rather, they are evidence of a potentially useful approach to post-disaster clinical screening, education, and engagement. Results are presented in the context of previous findings in Japan and ecologically-supportive post-disaster field research is discussed. PMID:23977819

7. Source mechanism inversion and ground motion modeling of induced earthquakes in Kuwait - A Bayesian approach

Gu, C.; Toksoz, M. N.; Marzouk, Y.; Al-Enezi, A.; Al-Jeri, F.; Buyukozturk, O.

2016-12-01

The increasing seismic activity in regions of oil/gas fields due to fluid injection/extraction and hydraulic fracturing has drawn new attention in both academia and industry. The source mechanisms and triggering stresses of these induced earthquakes are of great importance for understanding the physics of seismic processes in reservoirs and for predicting ground motion in the vicinity of oil/gas fields. The induced-seismicity data in our study are from the Kuwait National Seismic Network (KNSN). Historically, Kuwait has low local seismicity; in recent years, however, the KNSN has monitored more and more local earthquakes. Since 1997, the KNSN has recorded more than 1000 earthquakes (Mw < 5). In 2015, two local earthquakes - Mw 4.5 on 03/21/2015 and Mw 4.1 on 08/18/2015 - were recorded by both the Incorporated Research Institutions for Seismology (IRIS) and the KNSN, and were widely felt by people in Kuwait. These earthquakes happen repeatedly in the same locations, close to the oil/gas fields in Kuwait (see the uploaded image). The earthquakes are generally small (Mw < 5) and shallow, with focal depths of about 2 to 4 km. Such events are very common in oil/gas reservoirs all over the world, including North America, Europe, and the Middle East. We determined the locations and source mechanisms of these local earthquakes, with their uncertainties, using a Bayesian inversion method. The triggering stresses of these earthquakes were calculated from the source-mechanism results. In addition, we modeled the ground motion in Kuwait due to these local earthquakes. Our results show that these local earthquakes most likely occurred on pre-existing faults and were triggered by oil-field activities. These events are generally smaller than Mw 5; however, occurring in the reservoirs, they are very shallow, with focal depths of less than about 4 km. As a result, in Kuwait, where oil fields are close to populated areas, these induced earthquakes could produce ground accelerations high
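The abstract does not spell out its inversion machinery, but the essence of a Bayesian location inversion can be sketched with a toy one-parameter example: estimating focal depth from noisy P-wave arrival times over a grid of candidate depths. All values below (velocity, station geometry, noise level) are illustrative assumptions, not data from the Kuwait study:

```python
import numpy as np

v_p = 6.0                                    # assumed P-wave speed, km/s
offsets = np.array([5.0, 12.0, 20.0, 31.0])  # epicentral distances, km

def travel_times(depth_km):
    # Straight-ray travel time from a point source to each station
    return np.sqrt(offsets**2 + depth_km**2) / v_p

true_depth, sigma = 3.0, 0.05                # km, seconds of picking noise
rng = np.random.default_rng(1)
observed = travel_times(true_depth) + rng.normal(0.0, sigma, offsets.size)

depths = np.linspace(0.5, 10.0, 500)         # grid = uniform prior support
log_like = np.array([-0.5 * np.sum((observed - travel_times(d)) ** 2) / sigma**2
                     for d in depths])
posterior = np.exp(log_like - log_like.max())
posterior /= posterior.sum() * (depths[1] - depths[0])  # normalize the density

map_depth = depths[np.argmax(posterior)]
print(f"MAP depth ≈ {map_depth:.2f} km (true value 3.0 km)")
```

The full posterior, not just the MAP point, is what carries the "with uncertainties" part of the abstract: its width quantifies how well the data constrain the depth.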

8. Regional Triggering of Volcanic Activity Following Large Magnitude Earthquakes

Hill-Butler, Charley; Blackett, Matthew; Wright, Robert

2015-04-01

conditions for response and gauge the effect of each variable on the relationship between earthquakes and volcanic activity. Finally, a volcanic forecast model will be assessed to evaluate the use of earthquakes as a precursory indicator to volcanic activity. If proven, the relationship between earthquakes and volcanic activity has the potential to aid our understanding of the conditions that influence triggering following an earthquake and provide vital clues for volcanic activity prediction and the identification of precursors. Hill-Butler, C.; Blackett, M.; Wright, R. and Trodd, N. (2014) Global Heat Flux Response to Large Earthquakes in the 21st Century. Geology in preparation. Kaufman, Y. J.; Justice, C.; Flynn, L.; Kendall, J.; Prins, E.; Ward, D. E.; Menzel, P. and Setzer, A. (1998) Monitoring Global Fires from EOS-MODIS. Journal of Geophysical Research 103, 32,215-32,238 Wright, R.; Blackett, M. and Hill-Butler, C. (2014) Some observations regarding the thermal flux from Earth's erupting volcanoes for the period 2000 to 2014. Geophysical Research Letters in review.

9. Predicting Phonetic Transcription Agreement: Insights from Research in Infant Vocalizations

ERIC Educational Resources Information Center

Ramsdell, Heather L.; Oller, D. Kimbrough; Ethington, Corinna A.

2007-01-01

The purpose of this study is to provide new perspectives on correlates of phonetic transcription agreement. Our research focuses on phonetic transcription and coding of infant vocalizations. The findings are presumed to be broadly applicable to other difficult cases of transcription, such as found in severe disorders of speech, which similarly…

10. Research on cardiovascular disease prediction based on distance metric learning

Ni, Zhuang; Liu, Kui; Kang, Guixia

2018-04-01

Distance metric learning algorithms have been widely applied to medical diagnosis and have shown their strength in classification problems. The k-nearest neighbour (KNN) classifier is an efficient method, but it treats each feature equally. Large-margin nearest neighbour classification (LMNN) improves the accuracy of KNN by learning a global distance metric, but it does not consider the locality of the data distribution. In this paper, we propose a new distance metric algorithm, COS-SUBLMNN, which adopts a cosine metric within LMNN and pays more attention to local features of the data, overcoming this shortcoming of LMNN and improving classification accuracy. The proposed methodology is verified on cardiovascular disease (CVD) patient vectors derived from real-world medical data. The experimental results show that our method provides higher accuracy than KNN and LMNN, demonstrating the effectiveness of the CVD risk-prediction model based on COS-SUBLMNN.
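Neither LMNN nor COS-SUBLMNN ships with scikit-learn, but the core point, that KNN accuracy hinges on the distance metric, can be shown with the simplest possible "metric change": feature standardization on synthetic data (illustrative only, no connection to the paper's CVD dataset):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

# Synthetic data: with shuffle=False the first 4 columns are informative
# and the trailing columns are pure noise.
X, y = make_classification(n_samples=600, n_features=10, n_informative=4,
                           shuffle=False, random_state=0)
X[:, -1] *= 100.0          # blow up the scale of one *noise* feature
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Plain Euclidean KNN: distances are dominated by the rescaled noise feature
plain = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)

# The same classifier under a linearly rescaled metric
scaler = StandardScaler().fit(X_tr)
scaled = KNeighborsClassifier(n_neighbors=5).fit(scaler.transform(X_tr), y_tr)

print("Euclidean KNN accuracy:", plain.score(X_te, y_te))
print("Rescaled  KNN accuracy:", scaled.score(scaler.transform(X_te), y_te))
```

Methods such as LMNN go further than per-feature rescaling by optimizing a full linear transform of the feature space against the training labels; the cosine/local variant proposed in the paper refines that idea.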

11. Regional Earthquake Shaking and Loss Estimation

Sesetyan, K.; Demircioglu, M. B.; Zulfikar, C.; Durukal, E.; Erdik, M.

2009-04-01

This study, conducted under the JRA-3 component of the EU NERIES Project, develops a methodology and software (ELER) for the rapid estimation of earthquake shaking and losses in the Euro-Mediterranean region. This multi-level methodology, developed together with researchers from Imperial College, NORSAR and ETH-Zurich, is capable of incorporating regional variability and the sources of uncertainty stemming from ground-motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships. GRM Risk Management, Inc. of Istanbul serves as subcontractor for the coding of the ELER software. The methodology encompasses the following general steps:

1. Finding the most likely location of the earthquake source using a regional seismotectonic database and basic source parameters and, if and when possible, estimating fault rupture parameters by rapid inversion of data from on-line stations.
2. Estimating the spatial distribution of selected ground-motion parameters through region-specific ground-motion attenuation relationships and shear-wave velocity distributions (shake mapping).
4. Incorporating strong ground-motion and other empirical macroseismic data to improve the shake map.
5. Estimating the losses (damage, casualty and economic) at different levels of sophistication (0, 1 and 2) commensurate with the available inventory of the human-built environment (loss mapping).

Both Level 0 (similar to the PAGER system of USGS) and Level 1 analyses of the ELER routine are based on obtaining intensity distributions analytically and estimating the total number of casualties and their geographic distribution, either using regionally adjusted intensity-casualty or magnitude-casualty correlations (Level 0) or using regional building inventory databases (Level 1). For given
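The attenuation relationships in step 2 typically take a simple functional form; the following generic sketch illustrates it (the coefficients are placeholders chosen only to give plausible numbers, not the relations used in ELER):

```python
import math

def pga_g(magnitude, r_km, a=0.5, b=-2.0, c=-1.8, h=6.0):
    """Generic ground-motion attenuation relationship of the common form
    log10(PGA) = a*M + c*log10(sqrt(R^2 + h^2)) + b,
    with PGA in g, epicentral distance R in km, and h a fictitious depth term.
    Coefficients here are illustrative, not a published model."""
    r_eff = math.sqrt(r_km ** 2 + h ** 2)
    return 10 ** (a * magnitude + c * math.log10(r_eff) + b)

# Shaking grows with magnitude and decays with distance
for r in (5, 20, 80):
    print(f"M6.0 at {r:3d} km: PGA ≈ {pga_g(6.0, r):.3f} g")
```

Evaluating such a relation on a grid of sites, with site-specific corrections, is what produces the shake map that the loss estimation of steps 4-5 consumes.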

12. Crowdsourced earthquake early warning

PubMed Central

Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

2015-01-01

Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

13. Crowdsourced earthquake early warning.

PubMed

Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

2015-04-01

Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

14. Crowdsourced earthquake early warning

USGS Publications Warehouse

Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

2015-01-01

Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.
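The crowdsourced approach described in the three records above ultimately reduces to detecting a seismic onset in noisy accelerometer streams. A classic short-term/long-term average (STA/LTA) trigger is one standard way to do this; the sketch below is a generic illustration on synthetic data, not the authors' algorithm, and the sampling rate, window lengths, and threshold are assumed values:

```python
import numpy as np

def sta_lta(signal, n_sta, n_lta):
    """Classic STA/LTA detector: ratio of short-term to long-term mean
    absolute amplitude, both windows ending at the same sample."""
    abs_sig = np.abs(signal)
    csum = np.concatenate(([0.0], np.cumsum(abs_sig)))
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta
    m = min(len(sta), len(lta))
    return sta[-m:] / (lta[-m:] + 1e-12)

fs = 100                                   # Hz, a plausible phone sampling rate
rng = np.random.default_rng(0)
acc = 0.01 * rng.standard_normal(10 * fs)  # 10 s of sensor noise
onset = 5 * fs                             # simulated P-wave arrival at t = 5 s
acc[onset:] += 0.2 * np.sin(2 * np.pi * 5.0 * np.arange(5 * fs) / fs)

ratio = sta_lta(acc, n_sta=50, n_lta=500)  # 0.5 s short / 5 s long windows
trigger = int(np.argmax(ratio > 4.0))      # first index above the threshold
print("detected onset at ratio index", trigger)
```

A real crowdsourced system must additionally discriminate shaking from everyday phone motion, which is where aggregating triggers across many nearby devices becomes essential.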

15. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

2014-05-01

The usable and realistic ground-motion maps for urban areas are generated either from the assumption of a "reference earthquake" or directly, showing values of macroseismic intensity generated by a damaging real earthquake. In this study, applying a deterministic approach, an earthquake scenario in macroseismic intensity (a "model" earthquake scenario) is generated for the city of Sofia. The deterministic "model" intensity scenario based on the assumption of a "reference earthquake" is compared with a scenario based on the observed macroseismic effects caused by the damaging 2012 earthquake (Mw 5.6). The difference between the observed (Io) and predicted (Ip) intensity values is analyzed.

16. Perception of earthquake risk in Taiwan: effects of gender and past earthquake experience.

PubMed

Kung, Yi-Wen; Chen, Sue-Huei

2012-09-01

This study explored how individuals in Taiwan perceive the risk of earthquake and the relationship of past earthquake experience and gender to risk perception. Participants (n = 1,405), including earthquake survivors and those in the general population without prior direct earthquake exposure, were selected and interviewed through a computer-assisted telephone interviewing procedure using a random sampling and stratification method covering all 24 regions of Taiwan. A factor analysis of the interview data yielded a two-factor structure of risk perception in regard to earthquake. The first factor, "personal impact," encompassed perception of threat and fear related to earthquakes. The second factor, "controllability," encompassed a sense of efficacy of self-protection in regard to earthquakes. The findings indicated that prior earthquake survivors and females reported higher scores on the personal impact factor than males and those with no prior direct earthquake experience, although there were no group differences on the controllability factor. The findings support the view that risk perception has multiple components, and suggest that past experience (survivor status) and gender (female) affect the perception of risk. Exploration of potential contributions of other demographic factors such as age, education, and marital status to personal impact, especially for females and survivors, is discussed. Future research on, and intervention programs addressing, risk perception are suggested accordingly. © 2012 Society for Risk Analysis.

17. High Attenuation Rate for Shallow, Small Earthquakes in Japan

Si, Hongjun; Koketsu, Kazuki; Miyake, Hiroe

2017-09-01

We compared the attenuation characteristics of peak ground accelerations (PGAs) and velocities (PGVs) of strong motion from shallow, small earthquakes that occurred in Japan with those predicted by the equations of Si and Midorikawa (J Struct Constr Eng 523:63-70, 1999). The observed PGAs and PGVs at stations far from the seismic source decayed more rapidly than the predicted ones. The same tendencies have been reported for deep, moderate, and large earthquakes, but not for shallow, moderate, and large earthquakes. This indicates that the peak values of ground motion from shallow, small earthquakes attenuate more steeply than those from shallow, moderate or large earthquakes. To investigate the reason for this difference, we numerically simulated strong ground motion for point sources of Mw 4 and Mw 6 earthquakes using a 2D finite difference method. The analyses of the synthetic waveforms suggested that the above differences are caused by surface waves, which are predominant at stations far from the seismic source for shallow, moderate earthquakes but not for shallow, small earthquakes. Thus, although loss due to reflection at the boundaries of the discontinuous Earth structure occurs in all shallow earthquakes, the apparent attenuation rate for a moderate or large earthquake is essentially the same as that of body waves propagating in a homogeneous medium, due to the dominance of surface waves.
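The comparison above hinges on an attenuation (ground-motion prediction) relation. A generic sketch of such a relation follows; the functional form is a common textbook shape and the coefficients are placeholder values, not the Si and Midorikawa (1999) calibration.

```python
import math

# Hedged sketch of a generic attenuation relation of the form
#   log10(PGA) = a*Mw - b*log10(X + c) - d*X + e,
# where X is source distance in km. Coefficients are ILLUSTRATIVE ONLY.
A, B, C, D, E = 0.5, 1.0, 10.0, 0.003, 0.5

def predicted_pga(mw, dist_km):
    """Predicted peak ground acceleration (arbitrary units) at dist_km."""
    log_pga = A * mw - B * math.log10(dist_km + C) - D * dist_km + E
    return 10 ** log_pga

# The abstract's observation: for shallow, small events the recorded PGA at
# large distances falls below such predictions, i.e. the observed decay with
# distance is steeper than the modeled geometric + anelastic decay.
near, far = predicted_pga(4.0, 10.0), predicted_pga(4.0, 100.0)
```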

18. 76 FR 11821 - Submission for OMB Review; Comment Request Survey of Principal Investigators on Earthquake...

Federal Register 2010, 2011, 2012, 2013, 2014

2011-03-03

...: Survey of Principal Investigators on Earthquake Engineering Research Awards Made by the National Science... survey of Principal Investigators on NSF earthquake engineering research awards, including but not... NATIONAL SCIENCE FOUNDATION Submission for OMB Review; Comment Request Survey of Principal...

19. Social, not physical, infrastructure: the critical role of civil society after the 1923 Tokyo earthquake.

PubMed

Aldrich, Daniel P

2012-07-01

Despite the tremendous destruction wrought by catastrophes, social science holds few quantitative assessments of explanations for the rate of recovery. This article illuminates four factors-damage, population density, human capital, and economic capital-that are thought to explain the variation in the pace of population recovery following disaster; it also explores the popular but relatively untested factor of social capital. Using time-series, cross-sectional models and propensity score matching, it tests these approaches using new data from the rebuilding of 39 neighbourhoods in Tokyo after its 1923 earthquake. Social capital, more than earthquake damage, population density, human capital, or economic capital, best predicts population recovery in post-earthquake Tokyo. These findings suggest new approaches for research on social capital and disasters as well as public policy avenues for handling catastrophes. © 2012 The Author(s). Journal compilation © Overseas Development Institute, 2012.

20. Research of performance prediction to energy on hydraulic turbine

Quan, H.; Li, R. N.; Li, Q. F.; Han, W.; Su, Q. M.

2012-11-01

Referring to low-specific-speed Francis turbine blade design principles and the structure of a double-suction pump, a horizontal double-channel Francis hydraulic turbine was designed. Turbine energy performance was predicted by adding guide vanes of different airfoils (and a configuration without guide vanes) to the flow-conducting components, and the operating conditions and points were numerically simulated with the Fluent software. The results show that with a standard positive-curvature guide vane or a modified positive-curvature guide vane installed, the pressure on both the blade pressure surface and the suction surface is low, so the efficiency of energy recovery is low. In contrast, the pressure with a negative-curvature guide vane or a symmetric guide vane installed is larger than in the former cases, which is conducive to the working of the runner. As the guide vane opening decreases and the inlet angle increases, the flow state gets significantly worse: backflow and transverse flow appear on the blade pressure surface, and vortices form in the blade channel, leading to energy loss. Analysis of the distributions of pressure, velocity, and streamlines of the through-flow in the flow-conducting components in the above schemes shows that the installation with guide vanes is more reasonable than that without, and is conducive to improving the efficiency of energy conversion.

1. Collaborative Research: Separating Forced and Unforced Decadal Predictability in Models and Observations

SciTech Connect

Tippett, Michael K.

2014-04-09

This report is a progress report of the accomplishments of the research grant "Collaborative Research: Separating Forced and Unforced Decadal Predictability in Models and Observations" during the period 1 May 2011 - 31 August 2013. This project is a collaborative one between Columbia University and George Mason University. George Mason University will submit a final technical report at the conclusion of their no-cost extension. The purpose of the proposed research is to identify unforced predictable components on decadal time scales, distinguish these components from forced predictable components, and to assess the reliability of model predictions of these components. Components of unforced decadal predictability will be isolated by maximizing the Average Predictability Time (APT) in long, multimodel control runs from state-of-the-art climate models. Components with decadal predictability have large APT, so maximizing APT ensures that components with decadal predictability will be detected. Optimal fingerprinting techniques, as used in detection and attribution analysis, will be used to separate variations due to natural and anthropogenic forcing from those due to unforced decadal predictability. This methodology will be applied to the decadal hindcasts generated by the CMIP5 project to assess the reliability of model projections. The question of whether anthropogenic forcing changes decadal predictability, or gives rise to new forms of decadal predictability, also will be investigated.

2. Defining "Acceptable Risk" for Earthquakes Worldwide

Tucker, B.

2001-05-01

The greatest and most rapidly growing earthquake risk for mortality is in developing countries. Further, earthquake risk management actions of the last 50 years have reduced the average lethality of earthquakes in earthquake-threatened industrialized countries. (This is separate from the trend of the increasing fiscal cost of earthquakes there.) Despite these clear trends, every new earthquake in developing countries is described in the media as a "wake-up" call announcing the risk these countries face. GeoHazards International (GHI) works at both the community and the policy levels to try to reduce earthquake risk. GHI reduces death and injury by helping vulnerable communities recognize their risk and the methods to manage it: raising awareness of that risk, building local institutions to manage it, and strengthening schools to protect and train the community's future generations. At the policy level, GHI, in collaboration with research partners, is examining whether "acceptance" of these large risks by people in these countries and by international aid and development organizations explains the lack of activity in reducing these risks. The goal of this pilot project - the Global Earthquake Safety Initiative (GESI) - is to develop and evaluate a means of measuring the risk and the effectiveness of risk mitigation actions in the world's largest, most vulnerable cities: in short, to develop an earthquake risk index. One application of this index is to compare the risk and the risk mitigation effort of "comparable" cities. By this means, Lima, for example, can compare the risk of its citizens dying due to earthquakes with the risk of citizens in Santiago and Guayaquil. The authorities of Delhi and Islamabad can compare the relative risk from earthquakes of their school children. This index can be used to measure the effectiveness of alternate mitigation projects, to set goals for mitigation projects, and to plot progress meeting those goals. The preliminary

3. Incubation of Chile's 1960 Earthquake

Atwater, B. F.; Cisternas, M.; Salgado, I.; Machuca, G.; Lagos, M.; Eipert, A.; Shishikura, M.

2003-12-01

trees. We sampled 45 such trees, some of them completely dead and the rest surviving only from shoots near the ground. One-third of these trees lived through the 1837 earthquake; they contain over 180 annual rings. Five of the trees also contain rings earlier than 1737. From this evidence, we tentatively infer that the islands underwent more subsidence in 1960 than they did in 1737 or 1837. Comparisons with old Chilean documents for the estuary further suggest that subsidence in 1837 did not approach that of 1960. In their depiction and description of the Misquihue islands in 1874, surveyor Francisco Vidal and botanist Carlos Juliet show nothing like the ghost forests seen today. Twice in the first 37 years after the 1837 earthquake, surveyors mapped as emergent several islands that the 1960 earthquake would lower into tidal water. Today, 43 years after they subsided in 1960, these islands remain submerged as barren intertidal flats. Research supported by Fondecyt 1020224.

4. Promise and problems in using stress triggering models for time-dependent earthquake hazard assessment

Cocco, M.

2001-12-01

Earthquake stress changes can promote failures on favorably oriented faults and modify the seismicity pattern over broad regions around the causative faults. Because the induced stress perturbations modify the rate of production of earthquakes, they alter the probability of seismic events in a specified time window. Comparing the Coulomb stress changes with the seismicity rate changes and aftershock patterns can statistically test the role of stress transfer in earthquake occurrence. The interaction probability may represent a further tool to test the stress trigger or shadow model. The probability model, which incorporates stress transfer, has the main advantage of including the contributions of the induced stress perturbation (a static step in its present formulation), the loading rate, and the fault constitutive properties. Because the mechanical conditions of the secondary faults at the time of application of the induced load are largely unknown, stress triggering can only be tested on fault populations and not on single earthquake pairs with a specified time delay. The interaction probability can represent the most suitable tool to test the interaction between large magnitude earthquakes. Despite these important implications and the stimulating perspectives, there exist problems in understanding earthquake interaction that should motivate future research but at the same time limit its immediate social applications. One major limitation is that we are unable to predict how and if the induced stress perturbations modify the ratio between small versus large magnitude earthquakes. In other words, we cannot distinguish between a change in this ratio in favor of small events or of large magnitude earthquakes, because the interaction probability is independent of magnitude. Another problem concerns the reconstruction of the stressing history. The interaction probability model is based on the response to a static step; however, we know that other processes contribute to
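The probability machinery the abstract refers to can be sketched, under stated assumptions, as a static Coulomb stress step feeding the standard Dieterich (1994) rate-and-state seismicity-rate response; all numerical values below are illustrative, not from any particular study.

```python
import math

def coulomb_stress_change(d_shear, d_normal, friction_eff=0.4):
    """Static Coulomb failure stress change: dCFS = d_tau + mu' * d_sigma_n.
    Positive d_normal means unclamping (tension-positive convention)."""
    return d_shear + friction_eff * d_normal

def dieterich_rate(dcfs, a_sigma, t, t_a, background_rate=1.0):
    """Dieterich (1994) seismicity rate at time t after a stress step dcfs.
    a_sigma: constitutive parameter A*sigma; t_a: aftershock duration."""
    gamma = (math.exp(-dcfs / a_sigma) - 1.0) * math.exp(-t / t_a)
    return background_rate / (1.0 + gamma)

dcfs = coulomb_stress_change(d_shear=0.1, d_normal=0.05)   # MPa, illustrative
rate_now = dieterich_rate(dcfs, a_sigma=0.04, t=0.0, t_a=10.0)
rate_late = dieterich_rate(dcfs, a_sigma=0.04, t=100.0, t_a=10.0)
# A positive stress step raises the rate immediately; the perturbation then
# decays back toward the background rate over the aftershock duration t_a.
```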

5. Building vulnerability and human loss assessment in different earthquake intensity and time: a case study of the University of the Philippines, Los Baños (UPLB) Campus

Rusydy, I.; Faustino-Eslava, D. V.; Muksin, U.; Gallardo-Zafra, R.; Aguirre, J. J. C.; Bantayan, N. C.; Alam, L.; Dakey, S.

2017-02-01

Studies of seismic hazard, building vulnerability, and human loss assessment are substantial for educational institutions, since their buildings are used by many students, lecturers, researchers, and guests. The University of the Philippines, Los Baños (UPLB) is located in an earthquake-prone area, and an earthquake could cause structural damage and injury to the UPLB community. We conducted earthquake assessments for different magnitudes and times to predict the possible ground shaking and building vulnerability and to estimate the number of casualties in the UPLB community. Data preparation in this study includes earthquake scenario modeling using Intensity Prediction Equations (IPEs) for shallow crustal shaking attenuation to produce intensity maps of bedrock and surface. Earthquake models were generated from segment IV and segment X of the Valley Fault System (VFS). Building vulnerability for different building types was calculated using fragility curves for Philippine buildings. Population data for each building at various occupancy times, damage ratios, and injury ratios were used to compute the number of casualties. The results reveal that earthquake models from segment IV and segment X of the VFS could generate earthquake intensities between 7.6 and 8.1 MMI on the UPLB campus. The 7.7 Mw earthquake (scenario I) on segment IV could cause 32% - 51% building damage, and the 6.5 Mw earthquake (scenario II) on segment X could cause 18% - 39% structural damage to UPLB buildings. If the earthquake occurs at 2 PM (daytime), it could injure 10.2% - 18.8% of the UPLB population in scenario I and 7.2% - 15.6% in scenario II. A 5 PM event is predicted to injure 5.1% - 9.4% in scenario I and 3.6% - 7.8% in scenario II. A nighttime event (2 AM) would cause injuries to students and guests who stay in dormitories; the earthquake is predicted to injure 13 - 66 students and guests in scenario I and 9 - 47 people in the
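The casualty bookkeeping described above (occupants at a given time, times damage ratio, times injury ratio) can be sketched as follows; all figures are illustrative placeholders, not the UPLB data.

```python
# Hedged sketch of occupancy-time-dependent injury estimation: expected
# injuries = occupants at the scenario time * structural damage ratio *
# injury ratio, summed over buildings. Values are illustrative only.

def estimate_injuries(buildings, hour_key):
    """Sum expected injuries over buildings for one occupancy time."""
    total = 0.0
    for b in buildings:
        total += b["occupants"][hour_key] * b["damage_ratio"] * b["injury_ratio"]
    return total

buildings = [
    {"occupants": {"2pm": 500, "2am": 20}, "damage_ratio": 0.40, "injury_ratio": 0.25},
    {"occupants": {"2pm": 300, "2am": 150}, "damage_ratio": 0.25, "injury_ratio": 0.20},
]

daytime = estimate_injuries(buildings, "2pm")    # classes in session
nighttime = estimate_injuries(buildings, "2am")  # mostly dormitories
# Occupancy time drives the outcome: the same shaking yields different
# casualty counts at 2 PM and 2 AM, as in the scenarios above.
```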

6. Earthquake warning system for Japan Railways’ bullet train; implications for disaster prevention in California

USGS Publications Warehouse

Nakamura, Y.; Tucker, B. E.

1988-01-01

Today, Japanese society is well aware of the prediction of the Tokai earthquake. It is estimated by the Tokyo municipal government that this predicted earthquake could kill 30,000 people. (This estimate is viewed by many as conservative; other Japanese government agencies have made estimates, but they have not been published.) The reduction in the number of deaths from 120,000 to 30,000 between the Kanto earthquake and the predicted Tokai earthquake is due in large part to the reduction in the proportion of wooden construction (houses).

7. Metaphors Developed by Secondary School Students towards "Earthquake" Concept

ERIC Educational Resources Information Center

Kaya, Huseyin

2010-01-01

This research was conducted to reveal secondary school students' metaphors for the "earthquake" concept. About 105 students in two schools in the Karabuk city centre participated in the research within the 2009-2010 academic year. The research data were obtained by students' completing the statement "Earthquake is like...,…

8. Adaptive vibration control of structures under earthquakes

Lew, Jiann-Shiun; Juang, Jer-Nan; Loh, Chin-Hsiung

2017-04-01

techniques, for structural vibration suppression under earthquakes. Various control strategies have been developed to protect structures from natural hazards and improve the comfort of occupants in buildings. However, there has been little development of adaptive building control with the integration of real-time system identification and control design. Generalized predictive control, which combines the process of real-time system identification and the process of predictive control design, has received widespread acceptance and has been successfully applied to various test-beds. This paper presents a formulation of the predictive control scheme for adaptive vibration control of structures under earthquakes. Comprehensive simulations are performed to demonstrate and validate the proposed adaptive control technique for earthquake-induced vibration of a building.

9. Missing great earthquakes

USGS Publications Warehouse

Hough, Susan E.

2013-01-01

The occurrence of three earthquakes with moment magnitude (Mw) greater than 8.8 and six earthquakes larger than Mw 8.5, since 2004, has raised interest in the long-term global rate of great earthquakes. Past studies have focused on the analysis of earthquakes since 1900, which roughly marks the start of the instrumental era in seismology. Before this time, the catalog is less complete and magnitude estimates are more uncertain. Yet substantial information is available for earthquakes before 1900, and the catalog of historical events is being used increasingly to improve hazard assessment. Here I consider the catalog of historical earthquakes and show that approximately half of all Mw ≥ 8.5 earthquakes are likely missing or underestimated in the 19th century. I further present a reconsideration of the felt effects of the 8 February 1843, Lesser Antilles earthquake, including a first thorough assessment of felt reports from the United States, and show it is an example of a known historical earthquake that was significantly larger than initially estimated. The results suggest that incorporation of best available catalogs of historical earthquakes will likely lead to a significant underestimation of seismic hazard and/or the maximum possible magnitude in many regions, including parts of the Caribbean.

10. The pathway to earthquake early warning in the US

Allen, R. M.; Given, D. D.; Heaton, T. H.; Vidale, J. E.; West Coast Earthquake Early Warning Development Team

2013-05-01

The development of earthquake early warning capabilities in the United States is now accelerating and expanding as the technical capability to provide warning is demonstrated and additional funding resources make it possible to expand the current testing region to the entire west coast (California, Oregon, and Washington). Over the course of the next two years we plan to build a prototype system that will provide a blueprint for a full public system in the US. California currently has a demonstration warning system, ShakeAlert, that provides alerts to a group of test users from the public and private sector. These include biotech companies, technology companies, the entertainment industry, the transportation sector, and the emergency planning and response community. Most groups are currently in an evaluation mode, receiving the alerts and developing protocols for future response. The Bay Area Rapid Transit (BART) system is the one group that has now implemented an automated response to the warning system: BART now stops trains when an earthquake of sufficient size is detected. Research and development also continues on improved early warning algorithms to better predict the distribution of shaking in large earthquakes when the finiteness of the source becomes important. The algorithms under development include the use of both seismic and GPS instrumentation and integration with existing point source algorithms. At the same time, initial testing and development of algorithms in and for the Pacific Northwest is underway. In this presentation we will review the current status of the systems, highlight the new research developments, and lay out a pathway to a full public system for the US west coast. The research and development described is ongoing at Caltech, UC Berkeley, University of Washington, ETH Zurich, the Southern California Earthquake Center, and the US Geological Survey, and is funded by the Gordon and Betty Moore Foundation and the US Geological

11. Predicting Strong Ground-Motion Seismograms for Magnitude 9 Cascadia Earthquakes Using 3D Simulations with High Stress Drop Sub-Events

Frankel, A. D.; Wirth, E. A.; Stephenson, W. J.; Moschetti, M. P.; Ramirez-Guzman, L.

2015-12-01

We have produced broadband (0-10 Hz) synthetic seismograms for magnitude 9.0 earthquakes on the Cascadia subduction zone by combining synthetics from simulations with a 3D velocity model at low frequencies (≤ 1 Hz) with stochastic synthetics at high frequencies (≥ 1 Hz). We use a compound rupture model consisting of a set of M8 high stress drop sub-events superimposed on a background slip distribution of up to 20m that builds relatively slowly. The 3D simulations were conducted using a finite difference program and the finite element program Hercules. The high-frequency (≥ 1 Hz) energy in this rupture model is primarily generated in the portion of the rupture with the M8 sub-events. In our initial runs, we included four M7.9-8.2 sub-events similar to those that we used to successfully model the strong ground motions recorded from the 2010 M8.8 Maule, Chile earthquake. At periods of 2-10 s, the 3D synthetics exhibit substantial amplification (about a factor of 2) for sites in the Puget Lowland and even more amplification (up to a factor of 5) for sites in the Seattle and Tacoma sedimentary basins, compared to rock sites outside of the Puget Lowland. This regional and more localized basin amplification found from the simulations is supported by observations from local earthquakes. There are substantial variations in the simulated M9 time histories and response spectra caused by differences in the hypocenter location, slip distribution, down-dip extent of rupture, coherence of the rupture front, and location of sub-events. We examined the sensitivity of the 3D synthetics to the velocity model of the Seattle basin. We found significant differences in S-wave focusing and surface wave conversions between a 3D model of the basin from a spatially-smoothed tomographic inversion of Rayleigh-wave phase velocities and a model that has an abrupt southern edge of the Seattle basin, as observed in seismic reflection profiles.
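The broadband hybrid described above (deterministic 3D synthetics below the 1 Hz crossover, stochastic synthetics above it) can be sketched as a matched pair of filters; the input signals here are random stand-ins for real synthetics, and the filter order is an arbitrary choice.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Hedged sketch of hybrid broadband synthesis: low-pass the deterministic
# 3D synthetic and high-pass the stochastic synthetic at the crossover
# frequency, then sum. Signals are random stand-ins, not real synthetics.

FS = 100.0        # sampling rate, Hz
CROSSOVER = 1.0   # matching frequency, Hz

def hybrid_broadband(det_lowfreq, stoch_highfreq, fs=FS, fc=CROSSOVER):
    b_lo, a_lo = butter(4, fc / (fs / 2), btype="low")
    b_hi, a_hi = butter(4, fc / (fs / 2), btype="high")
    low = filtfilt(b_lo, a_lo, det_lowfreq)      # keep the 3D part below fc
    high = filtfilt(b_hi, a_hi, stoch_highfreq)  # keep the stochastic part above fc
    return low + high

rng = np.random.default_rng(0)
n = 4096
broadband = hybrid_broadband(rng.standard_normal(n), rng.standard_normal(n))
```

Zero-phase filtering (`filtfilt`) is used here so the two halves stay time-aligned at the crossover; real implementations also match amplitudes at the crossover band.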

12. Stressors of Korean Disaster Relief Team Members during the Nepal Earthquake Dispatch: a Consensual Qualitative Research Analysis.

PubMed

Lee, Kangeui; Lee, So Hee; Park, Taejin; Lee, Ji Yeon

2017-03-01

We conducted in-depth interviews with 11 Korean Disaster Relief Team (KDRT) members about stress related to disaster relief work and analyzed the interview data using the Consensual Qualitative Research (CQR) method in order to evaluate difficulties in disaster relief work and to develop solutions to these problems in cooperation with related organizations. Results showed that members typically experienced stress related to untrained team members, ineffective cooperation, and the shock and aftermath of aftershock experiences. Stress tended to stem from several factors: difficulties related to cooperation with new team members, the frightening disaster experience, and the aftermath of the disaster. Other stressors included conflict with the control tower, diverse problems at the disaster relief work site, and environmental factors. The most common reason that members participated in KDRT work despite all the stressors and difficulties was pride about the kind of work it involved. Many subjects in this study suffered from various stresses after the relief work, but they had no other choice than to attempt to forget about their experiences over time. It is recommended that the mental health of disaster relief workers will improve through the further development of effective treatment and surveillance programs in the future.

13. Stressors of Korean Disaster Relief Team Members during the Nepal Earthquake Dispatch: a Consensual Qualitative Research Analysis

PubMed Central

2017-01-01

We conducted in-depth interviews with 11 Korean Disaster Relief Team (KDRT) members about stress related to disaster relief work and analyzed the interview data using the Consensual Qualitative Research (CQR) method in order to evaluate difficulties in disaster relief work and to develop solutions to these problems in cooperation with related organizations. Results showed that members typically experienced stress related to untrained team members, ineffective cooperation, and the shock and aftermath of aftershock experiences. Stress tended to stem from several factors: difficulties related to cooperation with new team members, the frightening disaster experience, and the aftermath of the disaster. Other stressors included conflict with the control tower, diverse problems at the disaster relief work site, and environmental factors. The most common reason that members participated in KDRT work despite all the stressors and difficulties was pride about the kind of work it involved. Many subjects in this study suffered from various stresses after the relief work, but they had no other choice than to attempt to forget about their experiences over time. It is recommended that the mental health of disaster relief workers will improve through the further development of effective treatment and surveillance programs in the future. PMID:28145656

14. Using Smartphones to Detect Earthquakes

Kong, Q.; Allen, R. M.

2012-12-01

We are using the accelerometers in smartphones to record earthquakes. In the future, these smartphones may work as a supplemental network to the current traditional networks for scientific research and real-time applications. Given the potential number of smartphones and the small separation of sensors, this new type of seismic dataset has significant potential, provided that the signal can be separated from the noise. We developed an application for Android phones to record the acceleration in real time. These records can be saved on the local phone or transmitted back to a server in real time. The accelerometers in the phones were evaluated by comparing their performance with a high-quality accelerometer while located on controlled shake tables for a variety of tests. The results show that the accelerometer in the smartphone can reproduce the characteristics of the shaking very well, even when the phone was left freely on the shake table. The nature of these datasets is also quite different from traditional networks due to the fact that smartphones move around with their owners. Therefore, we must distinguish earthquake signals from those of other daily activities. In addition to the shake table tests that accumulated earthquake records, we also recorded different human activities such as running, walking, and driving. An artificial neural network based approach was developed to distinguish these different records. It shows a 99.7% success rate in distinguishing earthquakes from the other typical human activities in our database. We are now at the stage of being ready to develop the basic infrastructure for a smartphone seismic network.
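The record-classification step can be sketched with a toy stand-in for the study's neural network: two simple features (peak amplitude and zero-crossing rate) and a nearest-centroid rule, trained on synthetic acceleration windows. The feature choice and all signals are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np

# Hedged sketch: separate "earthquake-like" broadband bursts from periodic
# human activity (walking) using two features and a nearest-centroid rule.
rng = np.random.default_rng(1)
FS = 50  # Hz, a typical phone accelerometer rate
t = np.arange(0, 4, 1 / FS)  # 4-second windows

def features(window):
    """Peak absolute amplitude and zero-crossing rate of one window."""
    zcr = np.mean(np.abs(np.diff(np.sign(window))) > 0)
    return np.array([np.max(np.abs(window)), zcr])

# Synthetic training data: walking = 2 Hz periodic + noise,
# earthquake = strong broadband burst with exponential decay.
walking = [np.sin(2 * np.pi * 2 * t) + 0.1 * rng.standard_normal(t.size)
           for _ in range(20)]
quakes = [3.0 * rng.standard_normal(t.size) * np.exp(-t) for _ in range(20)]

c_walk = np.mean([features(w) for w in walking], axis=0)
c_quake = np.mean([features(q) for q in quakes], axis=0)

def classify(window):
    f = features(window)
    near_quake = np.linalg.norm(f - c_quake) < np.linalg.norm(f - c_walk)
    return "earthquake" if near_quake else "human"
```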

15. Spatial and Temporal Stress Drop Variations of the 2011 Tohoku Earthquake Sequence

Miyake, H.

2013-12-01

The 2011 Tohoku earthquake sequence consists of foreshocks, the mainshock, aftershocks, and repeating earthquakes. Quantifying spatial and temporal stress drop variations is important for understanding M9-class megathrust earthquakes. The variability and the spatial and temporal pattern of stress drop are basic information for rupture dynamics, as well as useful for source modeling. As noted in the ground motion prediction equations of Campbell and Bozorgnia [2008, Earthquake Spectra], mainshock-aftershock pairs often show a significant decrease in stress drop. We here focus on strong motion records before and after the Tohoku earthquake, and analyze source spectral ratios considering azimuth and distance dependency [Miyake et al., 2001, GRL]. Due to the limitation of station locations on land, spatial and temporal stress drop variations are estimated by adjusting shifts from the omega-squared source spectral model. The adjustment is based on stochastic Green's function simulations of source spectra considering azimuth and distance dependency. Since we assumed the same Green's functions for the event pairs at each station, both the propagation path and site amplification effects are cancelled out. Precise studies of spatial and temporal stress drop variations have been performed [e.g., Allmann and Shearer, 2007, JGR]; this study targets the relations between stress drop and the progression of slow slip prior to the Tohoku earthquake [Kato et al., 2012, Science], and plate structures. Acknowledgement: This study is partly supported by ERI Joint Research (2013-B-05). We used the JMA unified earthquake catalogue and K-NET, KiK-net, and F-net data provided by NIED.
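The omega-squared source model underlying such spectral-ratio analyses can be sketched as follows; the Brune-type corner-frequency relation and all numerical values are standard textbook forms, not the study's calibration.

```python
# Hedged sketch of the omega-squared source spectrum and an event-pair
# spectral ratio. S(f) = M0 / (1 + (f/fc)^2), with corner frequency fc tied
# to stress drop via a Brune-type relation. Values are illustrative only.

BETA = 3500.0  # shear-wave speed, m/s

def corner_frequency(m0, dsigma):
    """Brune-type corner frequency (Hz); m0 in N*m, dsigma in Pa."""
    return 0.37 * BETA * (16.0 / 7.0 * dsigma / m0) ** (1.0 / 3.0)

def omega_squared(f, m0, dsigma):
    fc = corner_frequency(m0, dsigma)
    return m0 / (1.0 + (f / fc) ** 2)

def spectral_ratio(f, m0_big, ds_big, m0_small, ds_small):
    """Ratio of two event spectra; path and site effects are assumed to
    cancel for co-located event pairs recorded at the same station."""
    return omega_squared(f, m0_big, ds_big) / omega_squared(f, m0_small, ds_small)

# At low frequency the ratio approaches the moment ratio M0_big / M0_small.
r_low = spectral_ratio(0.001, 1e20, 3e6, 1e17, 3e6)
```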

16. Investigation of atmospheric anomalies associated with Kashmir and Awaran Earthquakes

2017-02-01

The anomalies of earthquake precursors at diverse elevation ranges over the seismogenic region, prior to seismic events, are detected using Satellite Remote Sensing (SRS) techniques and reanalysis datasets. In the current research, seismic precursors are obtained by analyzing anomalies in Outgoing Longwave Radiation (OLR), Air Temperature (AT), and Relative Humidity (RH) before the two strong Mw > 7 earthquakes in Pakistan: the 8 October 2005 Mw 7.6 event in Azad Jammu and Kashmir, and the 24 September 2013 Mw 7.7 event in Awaran, Balochistan. Multi-parameter anomalies were computed against multi-year background data. Results indicate significant transient variations in the observed parameters before the main events. Detailed analysis suggests the presence of pre-seismic activity one to three weeks prior to the main earthquake event that vanishes after the event. These anomalies are attributed to an increase in temperature following the release of gases and physical and chemical interactions at the earth's surface before the earthquake. The behavior of the parameter variations for both the Kashmir and Awaran earthquakes is similar to that of other earthquakes in different regions of the world. This study suggests that energy release is not concentrated at a single fault but instead is released along the fault zone. The influence of earthquake events on lightning was also investigated, and it was concluded that there is significant atmospheric lightning activity after an earthquake, suggesting a strong possibility of earthquake-induced thunderstorms. This study is valuable for identifying earthquake precursors, especially in earthquake-prone areas.
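The anomaly computation described above (departure of a daily value from its multi-year background) can be sketched as follows; the two-sigma threshold and all data values are illustrative assumptions.

```python
import statistics

# Hedged sketch: for each day, compare the observed value (e.g. OLR or air
# temperature) against the multi-year background mean for that calendar day,
# and flag departures beyond k standard deviations. Data are illustrative.

def flag_anomalies(observed, background_years, k=2.0):
    """observed: list of daily values; background_years: list of lists,
    one historical series per year, aligned by day index."""
    flags = []
    for day, value in enumerate(observed):
        history = [year[day] for year in background_years]
        mu = statistics.fmean(history)
        sigma = statistics.stdev(history)
        flags.append(abs(value - mu) > k * sigma)
    return flags

background = [[210, 212, 211, 213], [209, 211, 212, 214], [211, 210, 213, 212]]
observed = [210, 230, 212, 213]  # day 1 spikes well above the background
```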

17. Anomalies of rupture velocity in deep earthquakes

Suzuki, M.; Yagi, Y.

2010-12-01

Explaining deep seismicity is a long-standing challenge in earth science. Deeper than 300 km, the occurrence rate of earthquakes remains low until ~530 km depth, then rises until ~600 km, and finally terminates near 700 km. Given the difficulty of estimating fracture properties and observing the stress field in the mantle transition zone (410-660 km), the seismic source processes of deep earthquakes are the most important information for understanding the distribution of deep seismicity. However, in a compilation of seismic source models of deep earthquakes, the source parameters for individual deep earthquakes are quite varied [Frohlich, 2006]. Rupture velocities for deep earthquakes estimated using seismic waveforms range from 0.3 to 0.9Vs, where Vs is the shear wave velocity, a considerably wider range than the velocities for shallow earthquakes. The uncertainty of seismic source models prevents us from determining the main characteristics of the rupture process and understanding the physical mechanisms of deep earthquakes. Recently, the back projection method has been used to derive a detailed and stable seismic source image from dense seismic network observations [e.g., Ishii et al., 2005; Walker et al., 2005]. Using this method, we can obtain an image of the seismic source process from the observed data without a priori constraints or discarding parameters. We applied the back projection method to teleseismic P-waveforms of 24 large, deep earthquakes (moment magnitude Mw ≥ 7.0, depth ≥ 300 km) recorded since 1994 by the Data Management Center of the Incorporated Research Institutions for Seismology (IRIS-DMC) and reported in the U.S. Geological Survey (USGS) catalog, and constructed seismic source models of deep earthquakes. By imaging the seismic rupture process for a set of recent deep earthquakes, we found that the rupture velocities are less than about 0.6Vs except in the depth range of 530 to 600 km. This is consistent with the depth
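The core of back projection is shift-and-stack: for each candidate source point, station traces are aligned by their predicted travel times and summed, and the stack power peaks where the alignment is correct. A toy sketch (the grid, delays, and array shapes are illustrative, not the authors' processing chain):

```python
import numpy as np

def back_project(traces, dt, delays):
    """Stack station traces after removing predicted travel-time delays.

    traces: (n_sta, n_samp) waveforms
    dt:     sample interval in seconds
    delays: (n_src, n_sta) predicted travel times (s) for each
            candidate source point
    Returns the peak stacked power for each candidate source.
    """
    n_src, n_sta = delays.shape
    power = np.zeros(n_src)
    for i in range(n_src):
        stack = np.zeros(traces.shape[1])
        for j in range(n_sta):
            shift = int(round(delays[i, j] / dt))
            stack += np.roll(traces[j], -shift)  # align arrivals at t = 0
        power[i] = np.max(stack ** 2)
    return power
```

With impulsive arrivals, the candidate point whose delays match the true source stacks coherently and dominates the power, which is how the method images rupture without a parameterized source model.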

18. High Speed Research Noise Prediction Code (HSRNOISE) User's and Theoretical Manual

NASA Technical Reports Server (NTRS)

Golub, Robert (Technical Monitor); Rawls, John W., Jr.; Yeager, Jessie C.

2004-01-01

This report describes a computer program, HSRNOISE, that predicts noise levels for a supersonic aircraft powered by mixed flow turbofan engines with rectangular mixer-ejector nozzles. It fully documents the noise prediction algorithms, provides instructions for executing the HSRNOISE code, and provides predicted noise levels for the High Speed Research (HSR) program Technology Concept (TC) aircraft. The component source noise prediction algorithms were developed jointly by Boeing, General Electric Aircraft Engines (GEAE), NASA and Pratt & Whitney during the course of the NASA HSR program. Modern Technologies Corporation developed an alternative mixer ejector jet noise prediction method under contract to GEAE that has also been incorporated into the HSRNOISE prediction code. Algorithms for determining propagation effects and calculating noise metrics were taken from the NASA Aircraft Noise Prediction Program.

19. RNA Secondary Structure Prediction by Using Discrete Mathematics: An Interdisciplinary Research Experience for Undergraduate Students

ERIC Educational Resources Information Center

Ellington, Roni; Wachira, James; Nkwanta, Asamoah

2010-01-01

The focus of this Research Experience for Undergraduates (REU) project was on RNA secondary structure prediction by using a lattice walk approach. The lattice walk approach is a combinatorial and computational biology method used to enumerate possible secondary structures and predict RNA secondary structure from RNA sequences. The method uses…
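The excerpt does not give the REU's exact lattice-walk formulation, but a standard combinatorial sketch of the idea counts the noncrossing base pairings of an n-base sequence (ignoring base identity and minimum loop size), which yields the Motzkin numbers:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def count_structures(n):
    """Count noncrossing pairings of n bases (Motzkin numbers).

    The last base is either unpaired (count_structures(n-1) ways)
    or paired with some earlier base k, which splits the sequence
    into the region before base k and the region inside the pair.
    """
    if n <= 1:
        return 1
    total = count_structures(n - 1)        # last base unpaired
    for k in range(n - 1):                 # last base pairs with base k
        total += count_structures(k) * count_structures(n - k - 2)
    return total
```

The same recursion, read as steps of a lattice walk (up for an opening base, down for its partner, level for an unpaired base), is what connects secondary-structure enumeration to walk counting.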

20. Stress drops of induced and tectonic earthquakes in the central United States are indistinguishable.

PubMed

Huang, Yihe; Ellsworth, William L; Beroza, Gregory C

2017-08-01

Induced earthquakes currently pose a significant hazard in the central United States, but there is considerable uncertainty about the severity of their ground motions. We measure stress drops of 39 moderate-magnitude induced and tectonic earthquakes in the central United States and eastern North America. Induced earthquakes, more than half of which are shallower than 5 km, show a comparable median stress drop to tectonic earthquakes in the central United States that are dominantly strike-slip but a lower median stress drop than that of tectonic earthquakes in eastern North America that are dominantly reverse-faulting. This suggests that ground motion prediction equations developed for tectonic earthquakes can be applied to induced earthquakes if the effects of depth and faulting style are properly considered. Our observation leads to the notion that, similar to tectonic earthquakes, induced earthquakes are driven by tectonic stresses.

USGS Publications Warehouse

Haeussler, Peter J.; Plafker, George

1995-01-01

Earthquake risk is high in much of the southern half of Alaska, but it is not the same everywhere. This map shows the overall geologic setting in Alaska that produces earthquakes. The Pacific plate (darker blue) is sliding northwestward past southeastern Alaska and then dives beneath the North American plate (light blue, green, and brown) in southern Alaska, the Alaska Peninsula, and the Aleutian Islands. Most earthquakes are produced where these two plates come into contact and slide past each other. Major earthquakes also occur throughout much of interior Alaska as a result of collision of a piece of crust with the southern margin.

3. Earthquakes, November-December 1973

USGS Publications Warehouse

Person, W.J.

1974-01-01

Other parts of the world suffered fatalities and significant damage from earthquakes. In Iran, an earthquake killed one person, injured many, and destroyed a number of homes. Earthquake fatalities also occurred in the Azores and in Algeria.

4. A Method for Estimation of Death Tolls in Disastrous Earthquake

Pai, C.; Tien, Y.; Teng, T.

2004-12-01

whether the districts are more urbanized or not. As far as present research is concerned, there is no good and reliable relationship between mortality and the characteristics of ground motions. We propose the concept of Equal Population Gaps to resolve the influence of mortality in rural versus urban districts and to determine the weighting function for each district. The relationship between PGA Index and mortality determined in this study can be expressed as: M = 28.9 / [1 + exp(1.67 - 0.0029 × PGA)]. Here M is mortality in %, and PGA is the PGA Index in gals. The corresponding curve matches the data reasonably well, with R² = 0.91. We process the estimation for districts at different scales to verify the feasibility of the method. The mortality based on PGA Index is particularly useful in real-time application for death-toll prediction and assessment, a piece of information most critical for post-earthquake emergency response operations.
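The fitted logistic curve above can be evaluated directly; the district-level death-toll helper below is a sketch with hypothetical inputs (per-district population and PGA Index), not part of the study:

```python
import math

def mortality_percent(pga_gal):
    """Mortality (%) from the study's fitted logistic curve:
    M = 28.9 / (1 + exp(1.67 - 0.0029 * PGA)), PGA in gals."""
    return 28.9 / (1.0 + math.exp(1.67 - 0.0029 * pga_gal))

def estimated_deaths(pga_gal, population):
    """Death-toll estimate for one district (hypothetical inputs:
    the district's population and its PGA Index in gals)."""
    return population * mortality_percent(pga_gal) / 100.0
```

The curve saturates at 28.9% mortality for very strong shaking and falls toward a few percent as PGA goes to zero, which is what makes it usable as a quick real-time estimator.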

5. Predicting self-reported research misconduct and questionable research practices in university students using an augmented Theory of Planned Behavior

PubMed Central

Rajah-Kanagasabai, Camilla J.; Roberts, Lynne D.

2015-01-01

This study examined the utility of the Theory of Planned Behavior model, augmented by descriptive norms and justifications, for predicting self-reported research misconduct and questionable research practices in university students. A convenience sample of 205 research-active Western Australian university students (47 male, 158 female, ages 18–53 years, M = 22, SD = 4.78) completed an online survey. There was a low level of