
Test-Based Accountability in England  

Diego Santori

Since the 1980s, the English education system has been a site of experimentation and reform, with test-driven accountability as the predominant form of quality control. The high-stakes accountability system in England is the result of a complex articulation of standardized assessments, high-stakes end-of-secondary examinations, and a consequential inspection system that combines public display of performance data via rating systems and league tables. In primary school, Standard Assessment Tests (SATs) in English and math are used to measure pupils’ progress between Year 2 and Year 6, and schools’ effectiveness is determined on the basis of these scores, which are publicly available. In addition, a range of ad hoc focused tests or “checks” is scattered across primary schooling, such as the Phonics Screening Check (Year 1) and the Multiplication Tables Check (Year 4). The main assessment for Key Stage 4 (KS4) is a tiered exit qualification known as the General Certificate of Secondary Education (GCSE), which determines school and college sixth-form options (A levels) and subsequent eligibility for university courses. As data and metrics are increasingly privileged over teacher expertise and professional judgment, schools face tremendous pressure to comply with mounting data and inspection demands, resulting in homogeneous and rigid practices. Arguably, recent policy reforms at both ends of compulsory schooling, such as the Reception Baseline Assessment (2020) and Progress 8 (2016), were introduced with the aim of mitigating some of the negative effects that layers of test-based accountability had on teaching and learning. However, a closer look at the internal logic of these reforms reveals a further intensification of output-driven pedagogy at the expense of equity, well-being, and justice.


Bioeconomic Models  

Ihtiyor Bobojonov

Bioeconomic models are analytical tools that integrate biophysical and economic models. These models allow for analysis of the biological and economic changes caused by human activities. The biophysical and economic components of these models are developed based on historical observations or theoretical relations. Technically, these models may have various levels of complexity in terms of the equation systems considered, the activities modeled, and the programming languages used. Often, the biophysical components of the models include crop or hydrological models. The core economic components are optimization or simulation models established according to neoclassical economic theory. The models are often developed at farm, country, and global scales and are used in various fields, including the agriculture, fisheries, forestry, and environmental sectors. Bioeconomic models are commonly used in research on environmental externalities associated with policy reforms and technological modernization, including climate change impact analysis and exploration of the negative consequences of global warming. A large number of studies and reports on bioeconomic models exist, yet there is a lack of studies describing the multiple uses of these models across different disciplines.
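The coupling described above, a biophysical response function feeding a neoclassical optimization, can be sketched in a few lines. Everything below is a hypothetical illustration: the yield function, prices, and costs are assumed for the example and are not taken from any published bioeconomic model.

```python
import math

# Minimal bioeconomic sketch: a biophysical crop-yield response coupled
# with an economic profit maximization. All parameters are hypothetical.

def crop_yield(water_mm: float) -> float:
    """Biophysical component: diminishing-returns yield response to irrigation."""
    y_max = 8.0   # maximum attainable yield, t/ha (assumed)
    k = 0.01      # yield response rate per mm of water (assumed)
    return y_max * (1.0 - math.exp(-k * water_mm))

def profit(water_mm: float) -> float:
    """Economic component: revenue minus irrigation cost (the objective)."""
    crop_price = 200.0   # $/t (assumed)
    water_cost = 0.5     # $ per mm per ha (assumed)
    return crop_price * crop_yield(water_mm) - water_cost * water_mm

# A simple grid search stands in for the linear- or dynamic-programming
# solvers that full-scale bioeconomic models employ.
best_water = max(range(0, 601), key=profit)
print(f"profit-maximizing irrigation: {best_water} mm/ha")
```

In a full-scale model the biophysical side would be a crop or hydrological simulator and the economic side a mathematical programming model, but the optimization logic, choosing inputs to maximize an economic objective subject to a biophysical response, is the same.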


Dynamic Integration Theory  

Manfred Diehl, Eden Griffin, and Allyson Brothers

Dynamic integration theory (DIT) describes emotion development across the lifespan, from childhood to old age. In doing so, DIT draws on a number of perspectives, such as equilibrium theories, theories of cognitive development, and theories of behavioral adaptation, and takes a strong cognitive-developmental view on emotion experience and emotion regulation. Two propositions are at the core of DIT. First, the development of emotion experience and emotion regulation proceeds from simple and automatic reactions to increasingly complex and integrated cognitive-affective structures (i.e., schemas). These cognitive-affective structures can be ordered in terms of increasing levels of cognitive complexity and integration, with integration referring to a person’s ability to acknowledge both positive and negative affect states and to tolerate and reconcile the contradictions and tensions that these states generate. Second, DIT also postulates that the efficiency with which cognitive-affective systems work is a result of the dynamic interplay between contextual variables and person-specific characteristics. Three key factors contribute to this dynamic interplay between person and context: (1) the strength of the affective arousal, (2) the person’s cognitive resources for dealing with different affect states, and (3) pre-existing trait-like dispositions and reaction tendencies that may either hinder or facilitate emotion regulation. Thus, a person’s emotion experience and emotion regulation in a given situation are the product of the dynamic interaction of these factors. Considerable empirical evidence supports the theoretical propositions of DIT, including findings speaking to changes in emotion experience and emotion regulation in later life when declines in cognitive functioning tend to become normative.


Perceptual Learning: Perception and Experience  

Barbara Anne Dosher and Zhong-Lin Lu

Perceptual learning is the training-induced improvement in the accuracy or speed of relevant perceptual decisions about what is seen, heard, or felt. It occurs in all sensory modalities and in most tasks. The magnitude and generalizability of this learning may, however, depend on the stimulus modality, the level of sensory representation most aligned to the task, and the methods of training, including attention, feedback, reward, and the training protocol. Knowledge of perceptual learning across modalities has been advanced by behavioral studies, by consideration of physiology and brain imaging, and by theoretical and computational models that systematize and promote understanding of its complex patterns. Perceptual training might be used in translational applications, such as education, remediation of perceptual deficits, or maintenance of performance.


The Sociotechnical Approach to Work Organization  

David E. Guest

The sociotechnical approach, developed by psychologists at the Tavistock Institute of Human Relations in the 1950s, proposes that the design of work should seek to optimize both the social and the technical systems within organizations, offering a counter to ideas of technological determinism. It further suggests that organizations should be viewed as open systems, subject to sometimes unpredictable external and internal influences that create a need for adaptability. The work group is viewed as the most relevant unit of analysis, resulting in advocacy of autonomous work groups that offer members high levels of control over their work. Workers should participate in the design of their work and receive training and support to enable their involvement. This influential concept stimulated a large body of research in many countries. Despite some notable positive examples, outcomes were often mixed, reflecting the challenges of managing and sustaining significant change. The concept of joint optimization has also proved problematic, with psychologists tending to focus on the social system while engineers give greater emphasis to the technical system. The advent of digital technologies is providing new impetus to design work that optimizes both the social and technical systems, provoking renewed interest in the approach.


Water Resources Planning Under (Deep) Uncertainty  

Riddhi Singh

Public investments in water infrastructure continue to grow worldwide, with developed countries prioritizing investments in operation and maintenance while developing countries focus on infrastructure expansion. The returns from these investments are contingent on carefully assessed designs and operating strategies that consider the complexities inherent in water management problems. These complexities arise from several factors, including, but not limited to, the presence of multiple stakeholders with potentially conflicting preferences, lack of knowledge about appropriate system models or parameterizations, and large uncertainties regarding the evolution of the future conditions that these projects will confront. The water resources planning literature has therefore developed a variety of approaches for the quantitative treatment of planning problems. Beginning in the mid-20th century, quantitative design evaluations were based on a stochastic treatment of uncertainty, using probability distributions to determine expected costs or risk of failure. Several simulation–optimization frameworks were developed to identify optimal designs with techniques such as linear programming, dynamic programming, stochastic dynamic programming, and evolutionary algorithms. Uncertainty was incorporated within existing frameworks using probability theory, using fuzzy theory to represent ambiguity, or via scenario analysis to represent discrete possibilities for the future. As the effects of climate change became palpable and rapid socioeconomic transformations emerged as the norm, it became evident that existing techniques were unlikely to yield reliable designs: the conditions under which an optimal design is developed and tested may differ significantly from those it will face during its lifetime.
These uncertainties, wherein the analyst cannot identify the distributional forms of parameters, models, or forcing variables, are termed “deep uncertainties.” The concept of “robustness” was introduced around the 1980s to identify designs that trade off optimality against reduced sensitivity to such assumptions. However, it was not until the 21st century that robustness analysis became mainstream in the water resources planning literature and robustness definitions were expanded to include the preferences of multiple actors and sectors as well as their risk attitudes. Decision-analytical frameworks that focused on robustness evaluations included robust decision-making, decision scaling, multi-objective robust decision-making, and info-gap theory, among others. A complementary set of approaches focused on dynamic planning, allowing designs to respond to new information over time; examples included adaptive policymaking, dynamic adaptive policy pathways, and engineering options analysis. These novel frameworks provide a posteriori decision support to planners, aiding the design of water resources projects under deep uncertainty.
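The shift from expected-value optimality to robustness can be illustrated with a toy example. The designs, scenarios, and costs below are entirely hypothetical, and minimax regret is just one of several robustness metrics used in frameworks such as robust decision-making; the point is only that a probability-weighted optimum and a robust choice can diverge.

```python
# Hypothetical costs of three reservoir designs under three future scenarios.
costs = {
    "small_reservoir":  {"wet": 5,  "average": 10, "dry": 50},
    "medium_reservoir": {"wet": 15, "average": 20, "dry": 35},
    "large_reservoir":  {"wet": 25, "average": 25, "dry": 25},
}
scenarios = ["wet", "average", "dry"]

# Expected-cost optimum: assumes scenario probabilities are known
# (here, uniform for simplicity).
expected_best = min(
    costs, key=lambda d: sum(costs[d][s] for s in scenarios) / len(scenarios)
)

# Minimax regret: needs no probabilities. Regret is the gap to the
# best-performing design in each scenario; pick the design whose
# worst-case regret across scenarios is smallest.
best_in_scenario = {s: min(costs[d][s] for d in costs) for s in scenarios}
max_regret = {
    d: max(costs[d][s] - best_in_scenario[s] for s in scenarios) for d in costs
}
robust_best = min(max_regret, key=max_regret.get)

print(f"expected-cost optimum: {expected_best}")
print(f"minimax-regret choice: {robust_best}")
```

Here the small reservoir minimizes expected cost but performs badly in the dry future, so the regret criterion prefers the medium design, which is never far from the best choice in any scenario. Under deep uncertainty, where the scenario probabilities themselves are unknowable, such insensitivity to assumptions is the property being sought.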


A Century of Evolution of Modeling for River Basin Planning to the Next Generation of Models, Methods, and Concepts  

Caroline Rosello, Sondoss Elsawah, Joseph Guillaume, and Anthony Jakeman

River basin models to inform planning decisions have continued to evolve, largely based on predominant planning paradigms and progress in science and technology. From the Industrial Revolution to the first quarter of the 21st century, such modeling tools have shifted from supporting water resources development to integrated and adaptive water resources management. To account for the increasing complexity and uncertainty associated with the relevant socioecological systems in which planning should be embedded, river basin models have shifted from a supply development focus during the 19th century to include, by the 2000s–2020s, demand management approaches and all aspects of consumptive and non-consumptive uses, addressing sociocultural and environmental issues. With technological and scientific developments, the modeling has become increasingly quantitative, integrated, and interdisciplinary, attempting to capture more holistically multiple river basin issues, relevant cross-sectoral policy influences, and disciplinary perspectives. Additionally, in acknowledging the conflicts around ecological degradation and human impacts associated with intensive water resource development, the modeling has matured to embrace the need for adequate stakeholder engagement processes that support knowledge-sharing and trust-building and facilitate the appreciation of trade-offs across multiple types of impacts and associated uncertainties. River basin models are now evolving to anticipate uncertainty around plausible alternative futures, such as climate change and rapid sociotechnical transformations. The associated modeling now embraces the challenge of shifting from predictive to exploratory tools to support learning and reflection and to better inform adaptive management and planning.
Managing so-called deep uncertainty presents new challenges for river basin modeling associated with imperfect knowledge, integrating sociotechnical scales, regime shifts and human factors, and enabling collaborative modeling, infrastructure support, and management systems.