
Article

Peter Robinson

Long memory models are statistical models that describe strong correlation or dependence across time series data. This kind of phenomenon is often referred to as “long memory” or “long-range dependence.” It refers to persisting correlation between distant observations in a time series. For scalar time series observed at equal intervals of time that are covariance stationary, so that the mean, variance, and autocovariances (between observations separated by a lag j) do not vary over time, it typically implies that the autocovariances decay so slowly, as j increases, that they are not absolutely summable. However, it can also refer to certain nonstationary time series, including ones with an autoregressive unit root, that exhibit even stronger correlation at long lags. Evidence of long memory has often been found in economic and financial time series, where the noted extension to possible nonstationarity can cover many macroeconomic time series, as well as in such fields as astronomy, agriculture, geophysics, and chemistry. As long memory is now a technically well developed topic, formal definitions are needed. But by way of partial motivation, long memory models can be thought of as complementary to the very well known and widely applied stationary and invertible autoregressive and moving average (ARMA) models, whose autocovariances are not only summable but decay exponentially fast as a function of lag j. Such models are often referred to as “short memory” models, because there is negligible correlation across distant time intervals. These models are often combined with the most basic long memory ones, however, because together they offer the ability to describe both short and long memory features in many time series.
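As an illustrative sketch (not part of the abstract; function names and parameter values are invented here), the short versus long memory contrast can be made concrete by comparing autocorrelations: an AR(1) autocorrelation phi**j decays exponentially in the lag j, while fractionally integrated ARFIMA(0, d, 0) noise has autocorrelation rho(j) = Γ(1−d)Γ(j+d) / (Γ(d)Γ(j+1−d)), which decays roughly like j**(2d−1) and is not absolutely summable for 0 < d < 1/2:

```python
import math

def ar1_acf(phi, j):
    """AR(1) 'short memory': autocorrelation decays exponentially in lag j."""
    return phi ** j

def arfima_acf(d, j):
    """ARFIMA(0, d, 0) 'long memory':
    rho(j) = Gamma(1-d)Gamma(j+d) / (Gamma(d)Gamma(j+1-d)),
    which behaves like j**(2d - 1) for large j -- too slow to be absolutely
    summable when 0 < d < 1/2.  Computed via log-gamma for stability."""
    if j == 0:
        return 1.0
    lg = math.lgamma
    return math.exp(lg(1 - d) + lg(j + d) - lg(d) - lg(j + 1 - d))

# At lag 100 the long memory correlation is still sizable,
# while the short memory one is essentially zero.
print(ar1_acf(0.9, 100))     # ~2.7e-05
print(arfima_acf(0.4, 100))  # ~0.27
```

The slow hyperbolic decay is exactly the non-summability property the abstract describes, while the AR(1) line illustrates the exponentially decaying "short memory" benchmark.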

Article

Can inclusion and special education achieve education for all? The answer: It depends. What has been called “special education” began its rounds in schools as early as the late 19th century. Inclusive education first appeared in policy documents and mission statements nearly a century later, most notably and possibly most influentially in UNESCO documents and goals of Education For All, beginning in 2002. Both vary extensively in terms of approaches to instruction, service location, vocational background and training for teachers and support personnel, and in terms of who gets included and who gets excluded, to name a few variables. The views of both also often vary by role: those of parents, teachers, administrators, and government officials, for example. Both also evince major differences depending on the cultural contexts, economic resources, and historical traditions and views regarding education writ large. Exploring these variations and conditions provides insights for addressing the difficulties that face collaboration or merger of special education and inclusive education in order to achieve education for all. After these difficulties have been acknowledged, an essential starting point for change in the direction of education for all entails finding common ground between special education and inclusive education in terms of purposes and end-goals. A human rights approach to common ground, purposes, and end goals provides an essential framework.

Article

Many nonlinear time series models have been around for a long time and have originated outside of time series econometrics. The popular univariate, dynamic single-equation, and vector autoregressive stochastic models are presented and their properties considered. Deterministic nonlinear models are not reviewed. The use of nonlinear vector autoregressive models in macroeconometrics seems to be increasing, and because this may be viewed as a rather recent development, they receive somewhat more attention than their univariate counterparts. Vector threshold autoregressive, smooth transition autoregressive, Markov-switching, and random coefficient autoregressive models are covered along with nonlinear generalizations of vector autoregressive models with cointegrated variables. Two nonlinear panel models, although they cannot be argued to be typically macroeconometric models, have nevertheless been frequently applied to macroeconomic data as well. The use of all these models in macroeconomics is highlighted with applications in which model selection, an often difficult issue in nonlinear models, has received due attention. Given the large number of nonlinear time series models, no unique best method of choosing between them seems to be available.
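As a hedged sketch (the function name and parameter values are invented for illustration), one of the model classes named above, the logistic smooth transition autoregressive (LSTAR) model of order one, lets the autoregressive coefficient move smoothly between two regimes as a function of the lagged level:

```python
import math
import random

def simulate_lstar(n, phi1=0.8, phi2=-0.5, gamma=5.0, c=0.0, seed=1):
    """Logistic smooth transition AR(1):
        y_t = [phi1*(1 - G_t) + phi2*G_t] * y_{t-1} + eps_t,
        G_t = 1 / (1 + exp(-gamma * (y_{t-1} - c))).
    The dynamics interpolate smoothly between an AR(1) with coefficient
    phi1 (when y_{t-1} is well below the threshold c) and one with
    coefficient phi2 (when y_{t-1} is well above c); gamma controls how
    sharp the transition is (gamma -> infinity gives a threshold model)."""
    rng = random.Random(seed)
    y = [0.0]
    for _ in range(n - 1):
        g = 1.0 / (1.0 + math.exp(-gamma * (y[-1] - c)))
        y.append((phi1 * (1.0 - g) + phi2 * g) * y[-1] + rng.gauss(0.0, 1.0))
    return y
```

Setting gamma very large recovers the threshold autoregressive case, which is one reason these model families are usually treated together.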

Article

Tropical cyclones (TCs) in their most intense expression (hurricanes or typhoons) are among the most destructive natural hazards known to humankind. The impressive socioeconomic consequences for countries dealing with TCs make our ability to model these organized convective structures a key issue for better understanding their nature and their interaction with the climate system. The destructive effects of TCs are mainly caused by three factors: strong wind, storm surge, and extreme precipitation. These TC-induced effects contribute to annual worldwide damage on the order of billions of dollars and a death toll of thousands of people. Together with the development of tools able to simulate TCs, an accurate estimate of the impact of global warming on TC activity is thus not only of academic interest but also has important implications from a societal and economic point of view. The aim of this article is to provide a description of the TC modeling implementations available to investigate present and future climate scenarios. The two main approaches to dynamically model TCs under a climate perspective are through hurricane models and climate models. Both classes of models evaluate the numerical equations governing the climate system. A hurricane model is an objective tool, designed to simulate the behavior of a tropical cyclone representing the detailed time evolution of the vortex. Considering the global scale, a climate model can be an atmosphere (or ocean)-only general circulation model (GCM) or a fully coupled general circulation model (CGCM). To improve the ability of a climate model in representing small-scale features, instead of a general circulation model, a regional model (RM) can be used: this approach makes it possible to increase the spatial resolution by reducing the extension of the domain considered.
In order to represent the tropical cyclone structure, a climate model needs a sufficiently high horizontal resolution (of the order of tens of kilometers), which requires a great deal of computational power. Both tools can be used to evaluate TC behavior under different climate conditions. The added value of a climate model is its ability to represent the interplay of TCs with the climate system, namely two-way relationships with both atmosphere and ocean dynamics and thermodynamics. In particular, CGCMs are able to take into account the well-known feedback between atmosphere and ocean components induced by TC activity and also the TC-related remote impacts on large-scale atmospheric circulation. The science surrounding TCs has developed in parallel with the increasing complexity of the mentioned tools, both in terms of progress in explaining the physical processes involved and the increased availability of computational power. Many climate research groups around the world, dealing with such numerical models, continuously provide data sets to the scientific community, feeding this branch of climate change science.

Article

Mahesh Prakash, James Hilton, Claire Miller, Vincent Lemiale, Raymond Cohen, and Yunze Wang

Remotely sensed data for the observation and analysis of natural hazards is becoming increasingly commonplace and accessible. Furthermore, the accuracy and coverage of such data is rapidly improving. In parallel with this growth are ongoing developments in computational methods to store, process, and analyze these data for a variety of geospatial needs. One such use of this geospatial data is as input and calibration data for the modeling of natural hazards, such as the spread of wildfires, flooding, tidal inundation, and landslides. Computational models for natural hazards show increasing real-world applicability, and it is only recently that the full potential of using remotely sensed data in these models is being understood and investigated. Some examples of geospatial data required for natural hazard modeling include:

• elevation models derived from RADAR and Light Detection and Ranging (LIDAR) techniques for flooding, landslide, and wildfire spread models
• accurate vertical datum calculations from geodetic measurements for flooding and tidal inundation models
• multispectral imaging techniques to provide land cover information for fuel types in wildfire models or roughness maps for flood inundation studies

Accurate modeling of such natural hazards allows a qualitative and quantitative estimate of the risks associated with such events. With increasing spatial and temporal resolution, there is also an opportunity to investigate further value-added usage of remotely sensed data in the disaster modeling context. Improving spatial data resolution allows greater fidelity in models, enabling, for example, the impact of fires or flooding on individual households to be determined. Improving temporal data allows short- and long-term trends to be incorporated into models, such as the changing conditions through a fire season or the changing depth and meander of a water channel.

Article

Michael Colaresi and Jude C. Hays

Time and space are two dimensions that are likely to provide the paths—either singly or in tandem—by which international policy decisions are interdependent. There are several reasons to expect international relations processes to be interdependent across space, time, or both dimensions. Theoretical approaches such as rational expectations models, bureaucratic models of decision-making, and psychological explanations of international phenomena at least implicitly assume—and in many cases explicitly predict—dependence structures within data. One approach that researchers can use to test whether their international processes of interest are marked by dependence across time, space, or both time and space is to explicitly model and interpret the hypothesized underlying dependence structures. There are two areas of spatial modeling at the research frontier: spatial models with qualitative and limited dependent variables, and co-evolution models of structure and behavior. These models have theoretical implications that are likely to be useful for international relations research. However, a gap remains between the kinds of empirical models demanded by international relations data and theory and the supply of time series and spatial econometric models that are available to those doing applied research. There is a need to develop appropriate models of temporal and spatial interdependence for qualitative and limited dependent variables, and for better models in which outcomes and structures of interdependence are jointly endogenous.
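As an illustrative sketch (the function name and numbers are hypothetical, not from the article), the canonical way to encode spatial interdependence is the spatial lag model y = ρWy + Xβ + ε, where W is a row-normalized connectivity matrix; for |ρ| < 1 its reduced form y = (I − ρW)⁻¹(Xβ + ε) can be obtained by simple fixed-point iteration rather than an explicit matrix inverse:

```python
def solve_spatial_lag(W, xb_eps, rho, iters=500):
    """Iterate y <- rho*W*y + (X*beta + eps).  For row-normalized W and
    |rho| < 1 this is a contraction, converging to the reduced form
    y = (I - rho*W)^{-1} (X*beta + eps)."""
    n = len(W)
    y = [0.0] * n
    for _ in range(iters):
        y = [rho * sum(W[i][j] * y[j] for j in range(n)) + xb_eps[i]
             for i in range(n)]
    return y

# Three units on a line; each row of W averages over a unit's neighbors
# (row-normalized, as is standard in spatial econometrics).
W = [[0.0, 1.0, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 1.0, 0.0]]
y = solve_spatial_lag(W, xb_eps=[1.0, 0.0, -1.0], rho=0.5)
```

In this symmetric example the middle unit's two spillovers cancel (y = [1, 0, −1]), illustrating how the equilibrium outcome of each unit depends on the whole structure of interdependence, not just its own covariates.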

Article

Romel W. Mackelprang

Characteristics that we now define as disabilities have existed in the human population from earliest recorded history. Societal explanations for disability have varied greatly across time and the populations in which disabilities have occurred. At various times in history, disability has been viewed as a blessing from deity or the deities, a punishment for sin, or a medical problem. Social workers have worked with persons with disabilities from the inception of the profession, and in recent years social work has begun to embrace the concept of disability as diversity and to welcome persons with disabilities as fully participating members of society, including as valuable members of the profession.

Article

Shenyang Guo

This entry describes the definition, history, theories, and applications of quantitative methods in social work research. Unlike qualitative research, quantitative research emphasizes precise, objective, and generalizable findings. Quantitative methods are based on numerous probability and statistical theories, with rigorous proofs and support from both simulated and empirical data. Regression analysis plays a central role in contemporary statistical methods, which include event history analysis, generalized linear modeling, hierarchical linear modeling, propensity score matching, and structural equation modeling. Quantitative methods can be employed in all stages of a scientific inquiry, ranging from sample selection to final data analysis.

Article

Erik Kjellström and Ole Bøssing Christensen

Regional climate models (RCMs) are commonly used to provide detailed regional to local information for climate change assessments, impact studies, and work on climate change adaptation. The Baltic Sea region is well suited for RCM evaluation due to its complexity and good availability of observations. Evaluation of RCM performance over the Baltic Sea region suggests that:

• Given appropriate boundary conditions, RCMs can reproduce many aspects of the climate in the Baltic Sea region.
• High resolution improves the ability of RCMs to simulate significant processes in a realistic way.
• When forced by global climate models (GCMs) with errors in their representation of the large-scale atmospheric circulation and/or sea surface conditions, the performance of RCMs deteriorates.
• Compared to GCMs, RCMs can add value on the regional scale, related to both the atmosphere and other parts of the climate system, such as the Baltic Sea, if appropriate coupled regional model systems are used.

Future directions for regional climate modeling in the Baltic Sea region involve testing and applying even higher-resolution, convection-permitting models to better represent climate features like heavy precipitation extremes. Phenomena more specific to the Baltic Sea region, such as convective snowbands over the sea in winter, are also expected to benefit from higher resolution. Continued work on better describing the fully coupled regional climate system involving the atmosphere and its interaction with the sea surface and land areas is also foreseen as beneficial. In this respect, atmospheric aerosols are important components that deserve more attention.

Article

A growing body of research uses computational models to study political decision making and behavior such as voter turnout, vote choice, party competition, social networks, and cooperation in social dilemmas. Advances in the computational modeling of political decision making are closely related to the idea of bounded rationality. In effect, models of full rationality can usually be analyzed by hand, but models of bounded rationality are complex and require computer-assisted analysis. Most computational models used in the literature are agent based, that is, they specify how decisions are made by autonomous, interacting computational objects called “agents.” However, an important distinction can be made between two classes of models based on the approaches they take: behavioral and information processing. Behavioral models specify relatively simple behavioral rules to relax the standard rationality assumption and investigate the system-level consequences of these rules in conjunction with deductive, game-theoretic analysis. In contrast, information-processing models specify the underlying information processes of decision making—the way political actors receive, store, retrieve, and use information to make judgments and choices—within the structural constraints on human cognition, and examine whether and how these processes produce the observed behavior in question at the individual or aggregate level. Compared to behavioral models, information-processing computational models are relatively rare, new to political scientists, and underexplored. However, by focusing on the underlying mental processes of decision making that must occur within the structural constraints on human cognition, they have the potential to provide a more general, psychologically realistic account for political decision making and behavior.
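A minimal sketch of the behavioral class of models (the payoff rule, parameters, and function name are invented for illustration, not drawn from any cited model): each agent follows a simple aspiration-based "win-stay, lose-shift" rule instead of optimizing, and the analyst inspects the aggregate participation rate that emerges from the interaction of these rules:

```python
import random

def run_behavioral_model(n_agents=100, rounds=200, switch_p=0.5, seed=42):
    """Agents repeatedly choose to participate (True) or abstain (False).
    Invented payoff rule: participating pays 1 only when fewer than half the
    population participates, and abstaining pays 1 otherwise, so each action
    is attractive when it is rare.  Win-stay, lose-shift: a payoff of 1 keeps
    the current choice; a payoff of 0 triggers a switch with prob switch_p.
    Returns the final share of participants -- the system-level outcome."""
    rng = random.Random(seed)
    choices = [rng.random() < 0.5 for _ in range(n_agents)]
    for _ in range(rounds):
        share = sum(choices) / n_agents
        payoffs = [1 if (c and share < 0.5) or (not c and share >= 0.5) else 0
                   for c in choices]
        choices = [c if p == 1 or rng.random() >= switch_p else not c
                   for c, p in zip(choices, payoffs)]
    return sum(choices) / n_agents
```

Even this toy rule produces a nontrivial aggregate pattern (the participation share oscillates around one half), illustrating how behavioral models derive system-level consequences from simple individual rules.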

Article

Longitudinal structural equation modeling (LSEM) is used to answer lifespan relevant questions such as (a) what is the effect of one variable on change in another, (b) what is the average trajectory or growth rate of some psychological variable, and (c) what variability is there in average trajectories and what predicts this variability. The first of these questions is often answered by an LSEM called an autoregressive cross-lagged (ACL) model. The other two questions are most typically answered by an LSEM called a latent growth curve (LGC). These models can be applied to a few time waves (measured over several years) or to many time waves (such as those present in diary studies) and can be altered, expanded, or even integrated. However, decisions on what model to use must be driven by the research question. The right tool for the job is not always the most complex, and, more importantly, the right tool must be matched to the best possible research design. Sometimes in lifespan research the right tool is LSEM. However, researchers should prioritize research design as well as careful specification of the processes and mechanisms they are interested in rather than simply choosing the most complicated LSEM they can find.
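As an illustrative sketch (coefficient values and the function name are invented), the data-generating structure an ACL model is built to capture relates each variable at wave t to both variables at wave t−1, which is what lets the model separate stability (autoregressive paths) from cross-variable influence (cross-lagged paths):

```python
import random

def simulate_acl(n_waves, ax=0.6, ay=0.5, cxy=0.2, cyx=0.1, seed=0):
    """Bivariate autoregressive cross-lagged (ACL) process:
        x_t = ax * x_{t-1} + cyx * y_{t-1} + e_x,
        y_t = ay * y_{t-1} + cxy * x_{t-1} + e_y,
    where ax, ay are the autoregressive (stability) paths and cxy, cyx are
    the cross-lagged paths -- the effect of one variable on change in the
    other -- that an ACL model is designed to estimate."""
    rng = random.Random(seed)
    xs, ys = [rng.gauss(0, 1)], [rng.gauss(0, 1)]
    for _ in range(n_waves - 1):
        x_prev, y_prev = xs[-1], ys[-1]
        xs.append(ax * x_prev + cyx * y_prev + rng.gauss(0, 1))
        ys.append(ay * y_prev + cxy * x_prev + rng.gauss(0, 1))
    return xs, ys
```

Fitting an ACL model to panel data amounts to recovering ax, ay, cxy, and cyx (typically with latent variables and measurement models on top of this core structure).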

Article

Gabriele Gramelsberger

Climate and simulation have become interwoven concepts during the past decades because, on the one hand, climate scientists cannot experiment with the real climate and, on the other hand, societies want to know how climate will change in the next decades. Both in-silico experiments for a better understanding of climatic processes and forecasts of possible futures can be achieved only by using climate models. The article investigates possibilities and problems of model-mediated knowledge for science and societies. It explores historically how climate became a subject of science and of simulation, what kind of infrastructure is required to apply models and simulations properly, and how model-mediated knowledge can be evaluated. In addition to an overview of the diversity of models in climate science, the article focuses on quasiheuristic climate models, with an emphasis on atmospheric models.

Article

Agent-based computational modeling (ABM, for short) is a formal and supplementary methodological approach used in international relations (IR) theory and research, based on the general ABM paradigm and computational methodology as applied to IR phenomena. ABM of such phenomena varies according to three fundamental dimensions: scale of organization—spanning foreign policy, international relations, regional systems, and global politics—as well as geospatial and temporal scales. ABM is part of the broader complexity science paradigm, although ABMs can also be applied without complexity concepts. There have been scores of peer-reviewed publications using ABM to develop IR theory in recent years, based on earlier pioneering work in computational IR that originated in the 1960s and predated agent-based methods. Main areas of theory and research using ABM in IR theory include dynamics of polity formation (politogenesis), foreign policy decision making, conflict dynamics, transnational terrorism, and environmental impacts such as climate change. Enduring challenges for ABM in IR theory include learning the applicable ABM methodology itself, publishing sufficiently complete models, accumulation of knowledge, evolving new standards and methodology, and the special demands of interdisciplinary research, among others. Besides further development of main themes identified thus far, future research directions include ABM applied to IR in political interaction domains of space and cyber; new integrated models of IR dynamics across domains of land, sea, air, space, and cyber; and world order and long-range models.

Article

Regional models were originally developed to serve weather forecasting and regional process studies. Typical simulations encompass time periods on the order of days or weeks. Thereafter, regional models were used more and more as regional climate models for longer integrations and climate change downscaling. Regional climate modeling, or regional dynamic downscaling (the terms are used interchangeably), has developed into its own branch of climate research since the end of the 1990s, out of the need to bridge the obvious inconsistencies at the interface of global climate research and climate impact research. The primary aim of regional downscaling is to provide consistent regional climate change scenarios with relevant spatial resolution to serve detailed climate impact assessments. Similar to global climate modeling, the early attempts at regional climate modeling were based on uncoupled atmospheric models or stand-alone ocean models, an approach that is still the most common on the regional scale. However, this approach has some fundamental limitations, since regional air-sea interaction remains unresolved and regional feedbacks are neglected. This is crucial when assessing climate change impacts in the coastal zone or the regional marine environment. To overcome these limitations, regional climate modeling is currently in a transition from uncoupled regional models to coupled atmosphere-ocean models, leading to fully integrated earth system models. Coupled ice-ocean-atmosphere models have been developed during the last decade and are now robust and well established on the regional scale. Their added value has been demonstrated for regional climate modeling in marine regions, and the importance of regional air-sea interaction has become obvious. Coupled atmosphere-ice-ocean models, but also coupled physical-biogeochemical modeling approaches, are increasingly used for the marine realm.
First attempts to couple these two approaches together with land surface models are underway. Physical coupled atmosphere-ocean modeling is also developing further, and the first model configurations resolving wave effects at the atmosphere-ocean interface are now available. These new developments open the way for improved regional assessments under broad consideration of local feedbacks and interactions between the regional atmosphere, cryosphere, hydrosphere, and biosphere.

Article

Alice Lieberman

Ann Weick was the dean of the School of Social Welfare, University of Kansas (1987–2006) and a principal developer of the underlying rationale for the strengths perspective in social work practice.

Article

Yoosun Park

This overview of the Japanese American community includes a brief history of the community in the United States, an overview of some distinct characteristics of the community, and a review of current literature highlighting the particular issues of the community salient to social work research and intervention.

Article

Haluk Soydan

This entry regards intervention research as an essential part of social work as a profession and research discipline. A brief history of intervention research reveals that use of intervention research for the betterment of human conditions is contemporary with the genesis of modern social science. Advances in intervention research are attributed to the comprehensive social programs launched during the 1960s in the United States. A contemporary and generic model of intervention research is described. It is argued that it is ethical to use intervention research and unethical not to use it. Assessment of some of the recent advances in policy making and science gives an optimistic picture of the future of intervention research.

Article

Gretchen J. Van Dyke

The United Nations and the European Union are extraordinarily complex institutions that pose considerable challenges for international studies faculty who work to expose their students to the theoretical, conceptual, and factual material associated with both entities. One way that faculty across the academic spectrum are bringing the two institutions “alive” for their students is by utilizing in-class and multi-institutional simulations of both the UN and the EU. Model United Nations (MUN) and Model European Union simulations are experiential learning tools used by an ever-increasing number of students. The roots of Model UN simulations can be traced to the student-led Model League of Nations simulations that began at Harvard University in the 1920s. Current secondary school MUN offerings include two initiatives, Global Classrooms and the Montessori Model United Nations (Montessori-MUN). Compared to the institutionalized MUNs, Model EU programs are relatively young. There are three long-standing, distinct, intercollegiate EU simulations in the United States: one in New York, one in the Mid-Atlantic region, and one in the Mid-West. As faculty continue to engage their students with Model UN and Model EU simulations, new scholarship is expected to continue documenting their experiences while emphasizing the value of active and experiential learning pedagogies. In addition, future research will highlight new technologies as critical tools in the Model UN and Model EU preparatory processes and offer quantitative data that support well-established qualitative conclusions about the positive educational value of these simulations.

Article

Lifespan development is embedded in multiple social systems and social relationships. Lifespan developmental and relationship researchers study individual codevelopment in various dyadic social relationships, such as dyads of parents and children or romantic partners. Dyadic data refers to data for which observations from both members of a dyad are available; because the two members influence each other, such observations are typically interdependent. The analysis of dyadic data therefore requires appropriate data-analytic methods that account for this interdependence. The standard actor-partner interdependence model, the dyadic growth curve model, and the dyadic dual change score model can be used to analyze data from dyads. These models allow examination of questions related to dyadic associations, such as whether individual differences in an outcome can be predicted by one’s own (actor effects) and the other dyad member’s (partner effects) level in another variable, correlated change between dyad members, and cross-lagged dyadic associations, that is, whether one dyad member’s change can be predicted by the previous levels of the other dyad member. The choice of a specific model should be guided by theoretical and conceptual considerations as well as by features of the data, such as the type of dyad, the number and spacing of observations, or the distributional properties of variables.
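As a sketch of the first of these models (the notation is generic, not taken from the article), the standard actor-partner interdependence model for a dyad with members A and B regresses each member's outcome on both members' predictors:

```latex
\begin{aligned}
Y_A &= a\,X_A + p\,X_B + e_A,\\
Y_B &= a\,X_B + p\,X_A + e_B,
\end{aligned}
```

where a is the actor effect (one's own predictor on one's own outcome) and p is the partner effect (the other member's predictor on one's outcome). For indistinguishable dyads (e.g., same-sex friends) a and p are constrained equal across members, as written here; for distinguishable dyads (e.g., parent and child) each member receives its own actor and partner coefficients.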

Article

Speech production is an important aspect of linguistic competence. An attempt to understand linguistic morphology without speech production would be incomplete. A central research question develops from this perspective: What is the role of morphology in speech production? Speech production researchers collect many different types of data, and much of that data has informed how linguists and psycholinguists characterize the role of linguistic morphology in speech production. Models of speech production play an important role in the investigation of linguistic morphology. These models provide a framework that allows researchers to explore the role of morphology in speech production. However, models of speech production generally focus on different aspects of the production process. They are split between phonetic models (which attempt to understand how the brain creates motor commands for uttering and articulating speech) and psycholinguistic models (which attempt to understand the cognitive processes and representations of the production process). Models that merge these two model types have the potential to allow researchers to make specific predictions about the effects of morphology on speech production. Many studies have explored models of speech production, but investigation of the role of morphology, and of how morphological properties may be represented in merged speech production models, remains limited.