1-15 of 15 Results

  • Keywords: computational model

Article

Remote Sensing and Physical Modeling of Fires, Floods, and Landslides  

Mahesh Prakash, James Hilton, Claire Miller, Vincent Lemiale, Raymond Cohen, and Yunze Wang

Remotely sensed data for the observation and analysis of natural hazards is becoming increasingly commonplace and accessible, and its accuracy and coverage are rapidly improving. In parallel with this growth are ongoing developments in computational methods to store, process, and analyze these data for a variety of geospatial needs. One such use of geospatial data is as input and calibration for models of natural hazards, such as the spread of wildfires, flooding, tidal inundation, and landslides. Computational models for natural hazards show increasing real-world applicability, and it is only recently that the full potential of using remotely sensed data in these models has begun to be understood and investigated. Examples of geospatial data required for natural hazard modeling include:

• elevation models derived from RADAR and Light Detection and Ranging (LIDAR) techniques for flooding, landslide, and wildfire spread models
• accurate vertical datum calculations from geodetic measurements for flooding and tidal inundation models
• multispectral imaging techniques that provide land cover information for fuel types in wildfire models or roughness maps for flood inundation studies

Accurate modeling of such natural hazards allows a qualitative and quantitative estimate of the risks associated with such events. With increasing spatial and temporal resolution, there is also an opportunity to investigate further value-added uses of remotely sensed data in the disaster modeling context. Improving spatial resolution allows greater fidelity in models, making it possible, for example, to determine the impact of fires or flooding on individual households. Improving temporal resolution allows short- and long-term trends to be incorporated into models, such as the changing conditions through a fire season or the changing depth and meander of a water channel.
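
The abstract notes that elevation models derived from LIDAR or radar are a core input to flood, landslide, and wildfire spread models. As a minimal, self-contained illustration of how such gridded elevation data might be consumed, the sketch below computes slope and aspect from a synthetic digital elevation model with NumPy; the array and cell size are invented stand-ins for real remotely sensed data, not part of the article.

    import numpy as np

    # Synthetic digital elevation model (meters); a real workflow would load a
    # LIDAR- or radar-derived raster here instead (this grid is invented).
    rng = np.random.default_rng(0)
    dem = np.cumsum(rng.normal(0.0, 1.0, size=(100, 100)), axis=0) + 200.0
    cell_size = 30.0  # grid spacing in meters (assumed)

    # Finite-difference gradients of elevation in the y and x directions.
    dz_dy, dz_dx = np.gradient(dem, cell_size)

    # Slope (degrees) and aspect (radians): common derived layers that
    # landslide-susceptibility or overland-flow models consume.
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.arctan2(-dz_dx, dz_dy)

    print(f"mean slope: {slope_deg.mean():.1f} deg, max slope: {slope_deg.max():.1f} deg")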

Article

Computational Models of Political Decision Making  

Sung-youn Kim

A growing body of research uses computational models to study political decision making and behavior, such as voter turnout, vote choice, party competition, social networks, and cooperation in social dilemmas. Advances in the computational modeling of political decision making are closely related to the idea of bounded rationality. In effect, models of full rationality can usually be analyzed by hand, but models of bounded rationality are complex and require computer-assisted analysis. Most computational models used in the literature are agent based, that is, they specify how decisions are made by autonomous, interacting computational objects called “agents.” However, an important distinction can be made between two classes of models based on the approaches they take: behavioral and information processing. Behavioral models specify relatively simple behavioral rules to relax the standard rationality assumption and investigate the system-level consequences of these rules in conjunction with deductive, game-theoretic analysis. In contrast, information-processing models specify the underlying information processes of decision making—the way political actors receive, store, retrieve, and use information to make judgments and choices—within the structural constraints on human cognition, and examine whether and how these processes produce the observed behavior in question at the individual or aggregate level. Compared to behavioral models, information-processing computational models are relatively rare, new to political scientists, and underexplored. However, by focusing on the underlying mental processes of decision making that must occur within the structural constraints on human cognition, they have the potential to provide a more general, psychologically realistic account of political decision making and behavior.
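
As an illustration of the behavioral class of models described above, the sketch below implements a deliberately simple agent-based turnout dynamic in which agents imitate the participation of randomly sampled peers. It is a generic toy model written for this listing, not any specific model from the literature; the update rule and all parameters are invented.

    import random

    random.seed(42)

    N = 1000     # number of agents (assumed)
    STEPS = 50   # simulation length (assumed)
    SAMPLE = 10  # peers each agent observes per step (assumed)

    # Each agent starts with a random propensity to vote in [0, 1].
    propensity = [random.random() for _ in range(N)]

    def step(propensity):
        """One round: agents vote probabilistically, then nudge their propensity
        toward the turnout observed among a small sample of peers."""
        voted = [random.random() < p for p in propensity]
        new_propensity = []
        for p in propensity:
            peers = random.sample(range(N), SAMPLE)
            local_turnout = sum(voted[i] for i in peers) / SAMPLE
            # Bounded-rationality flavor: a small adjustment toward what peers did.
            new_propensity.append(0.9 * p + 0.1 * local_turnout)
        return new_propensity, sum(voted) / N

    for t in range(STEPS):
        propensity, turnout = step(propensity)

    print(f"turnout after {STEPS} steps: {turnout:.2f}")

The point of such a model is not the particular numbers but the system-level pattern that emerges when many agents follow the same simple rule.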

Article

Computing in Precollege Science, Engineering, and Mathematics Education  

Amy Voss Farris and Gözde Tosun

Computing is essential to disciplinary practices and discourses of science, engineering, and mathematics. In each of these broad disciplinary areas, technology creates new ways of making sense of the world and designing solutions to problems. Computation and computational thinking are synergistic with ways of knowing in mathematics and in science, a relationship known as reflexivity, first proposed by Harel and Papert. In precollege educational contexts (e.g., K-12 schooling), learners’ production of computational artifacts is deeply complementary to learning and participating in science, mathematics, and engineering, rather than an isolated set of competencies. In K-12 contexts of teaching and learning, students’ data practices, scientific modeling, and modeling with mathematics are primary forms through which computing mediates the epistemic work of science, mathematics, and engineering. Related literature in this area has contributed to scholarship concerning students’ development of computational literacies––the multiple literacies involved in the use and creation of computational tools and computer languages to support participation in particular communities. Computational thinking is a term used to describe analytic approaches to posing problems and solving them that are based on principles and practices in computer science. Computational thinking is frequently discussed as a key target for learning. However, reflexivity refocuses computational thinking on the synergistic nature between learning computing and the epistemic (knowledge-making) work of STEM disciplines. This refocusing is useful for building an understanding of computing in relation to how students generate and work with data in STEM disciplines and how they participate in scientific modeling and modeling in mathematics, and contributes to generative computational abstractions for learning and teaching in STEM domains. A heterogeneous vision of computational literacies within STEM education is essential for the advancement of a more just and more equitable STEM education for all students. Generative computational abstractions must engage learners’ personal and phenomenological recontextualizations of the problems that they are making sense of. A democratic vision of computing in STEM education also entails that teacher education must advance a more heterogeneous vision of computing for knowledge-making aims. Teachers’ ability to facilitate authentic learning experiences in which computing is positioned as reflexive, humane, and used authentically in service of learning goals in STEM domains is of central importance to learners’ understanding of the relationship of computing with STEM fields.

Article

Digital Shakespeare  

Toby Malone and Brett Greatley-Hirsch

Digital publishing, from early ventures in fixed media (diskette and CD-ROM) through to editions designed for the Web, tablets, and phones, radically transforms the creation, remediation, and dissemination of Shakespearean texts. Likewise, digital technologies reshape the performance of William Shakespeare’s plays through the introduction of new modes of capture and delivery, as well as the adaptation of social media, virtual reality, video gaming, and motion capture in stage and screen productions. With the aid of the computer, Shakespearean texts, places, and spaces can be “modeled” in new and sophisticated ways, including algorithmic approaches to questions of Shakespearean authorship and chronology, the virtual 3D reconstruction of now-lost playhouses, and historical geospatial mapping of Shakespeare’s London.
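
The algorithmic authorship work mentioned above typically rests on counting stylistic markers such as common function words. The following toy sketch, which is not drawn from the article, compares two text samples by the relative frequencies of a handful of function words using cosine similarity; the word list and sample strings are placeholders for full, carefully edited texts.

    import math
    from collections import Counter

    # A few function words of the kind used as stylometric features (illustrative list).
    FUNCTION_WORDS = ["the", "and", "of", "to", "in", "that", "but", "with", "for", "not"]

    def profile(text):
        """Relative frequency of each function word in a lowercased token stream."""
        tokens = text.lower().split()
        counts = Counter(tokens)
        total = max(len(tokens), 1)
        return [counts[w] / total for w in FUNCTION_WORDS]

    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
        return dot / norm if norm else 0.0

    # Placeholder samples; a real attribution study would use whole plays or scenes.
    sample_a = "to be or not to be that is the question"
    sample_b = "the course of true love never did run smooth and lovers cannot see"

    print(f"function-word similarity: {cosine(profile(sample_a), profile(sample_b)):.3f}")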

Article

Peripheral Vision: A Critical Component of Many Visual Tasks  

Ruth Rosenholtz

In understanding human visual perception, an important component consists of what people can perceive at a glance. If that glance provides the observer with sufficient task-relevant information, this affords efficient processing. If not, one must move one’s eyes and integrate information across glances and over time, which is necessarily slower and limited by both working memory and the ability to integrate that information. Vision at a glance has to do in large part with the strengths and limitations of peripheral vision, and in particular with visual crowding. Understanding peripheral vision has helped unify a number of aspects of vision.

Article

Agent-Based Computational Modeling and International Relations Theory: Quo Vadis?  

Claudio Cioffi-Revilla

Agent-based computational modeling (ABM, for short) is a formal and supplementary methodological approach used in international relations (IR) theory and research, based on the general ABM paradigm and computational methodology as applied to IR phenomena. ABM of such phenomena varies along three fundamental dimensions: the scale of organization (spanning foreign policy, international relations, regional systems, and global politics), geospatial scale, and temporal scale. ABM is part of the broader complexity science paradigm, although ABMs can also be applied without complexity concepts. There have been scores of peer-reviewed publications using ABM to develop IR theory in recent years, building on earlier, pre-agent-based pioneering work in computational IR that originated in the 1960s. Main areas of theory and research using ABM in IR theory include the dynamics of polity formation (politogenesis), foreign policy decision making, conflict dynamics, transnational terrorism, and environmental impacts such as climate change. Enduring challenges for ABM in IR theory include learning the applicable ABM methodology itself, publishing sufficiently complete models, accumulating knowledge, evolving new standards and methodology, and the special demands of interdisciplinary research, among others. Besides further development of the main themes identified thus far, future research directions include ABM applied to IR in the political interaction domains of space and cyber; new integrated models of IR dynamics across the domains of land, sea, air, space, and cyber; and world order and long-range models.

Article

New Computational Methods and the Study of the Romance Languages  

Basilio Calderone and Vito Pirrelli

Nowadays, computer models of human language are instrumental to millions of people, who use them every day with little if any awareness of their existence and role. Their exponential development has had a huge impact on daily life through practical applications like machine translation or automated dialogue systems. It has also deeply affected the way we think about language as an object of scientific inquiry. Computer modeling of Romance languages has helped scholars develop new theoretical frameworks and new ways of looking at traditional approaches. In particular, computer modeling of lexical phenomena has had a profound influence on some fundamental issues in human language processing, such as the purported dichotomy between rules and exceptions, or grammar and lexicon, the inherently probabilistic nature of speakers’ perception of analogy and word internal structure, and their ability to generalize to novel items from attested evidence. Although it is probably premature to anticipate and assess the prospects of these models, their current impact on language research can hardly be overestimated. In a few years, data-driven assessment of theoretical models is expected to play an irreplaceable role in pacing progress in all branches of language sciences, from typological and pragmatic approaches to cognitive and formal ones.

Article

Multisensory Integration and the Perception of Self-Motion  

Kathleen E. Cullen

As we go about our everyday activities, our brain computes accurate estimates of both our motion relative to the world and our orientation relative to gravity. Essential to this computation is the information provided by the vestibular system; it detects the rotational velocity and linear acceleration of our heads relative to space, making a fundamental contribution to our perception of self-motion and spatial orientation. Additionally, in everyday life, our perception of self-motion depends on the integration of both vestibular and nonvestibular cues, including visual and proprioceptive information. Furthermore, the integration of motor-related information is also required for perceptual stability, so that the brain can distinguish whether the experienced sensory inflow results from active self-motion through the world or from self-motion that is externally generated. To date, understanding how the brain encodes and integrates sensory cues with motor signals for the perception of self-motion during natural behaviors remains a major goal in neuroscience. Recent experiments have (i) provided new insights into the neural code used to represent sensory information in vestibular pathways, (ii) established that vestibular pathways are inherently multimodal at the earliest stages of processing, and (iii) revealed that self-motion information processing is adjusted to meet the needs of specific tasks. Our current level of understanding of how the brain integrates sensory information and motor-related signals to encode self-motion and ensure perceptual stability during everyday activities is reviewed.
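
The cue integration described above is often summarized, in the broader multisensory literature, by the standard reliability-weighted (maximum-likelihood) combination rule. The equations below give that textbook rule as background; they are not the specific neural-level models this article reviews, and the subscripts simply label a vestibular and a visual estimate.

    % Reliability-weighted combination of a vestibular estimate s_ves and a
    % visual estimate s_vis of self-motion, each with its own variance.
    \[
      \hat{s} \;=\; w_{\mathrm{ves}}\, s_{\mathrm{ves}} + w_{\mathrm{vis}}\, s_{\mathrm{vis}},
      \qquad
      w_{\mathrm{ves}} = \frac{1/\sigma_{\mathrm{ves}}^{2}}{1/\sigma_{\mathrm{ves}}^{2} + 1/\sigma_{\mathrm{vis}}^{2}},
      \quad
      w_{\mathrm{vis}} = \frac{1/\sigma_{\mathrm{vis}}^{2}}{1/\sigma_{\mathrm{ves}}^{2} + 1/\sigma_{\mathrm{vis}}^{2}},
    \]
    \[
      \frac{1}{\sigma_{\hat{s}}^{2}} \;=\; \frac{1}{\sigma_{\mathrm{ves}}^{2}} + \frac{1}{\sigma_{\mathrm{vis}}^{2}}.
    \]

Under this rule the combined estimate is never less reliable than the better single cue, which is one reason multisensory integration is advantageous for estimating self-motion.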

Article

Adiabatic Quantum Computing and Quantum Annealing  

Erica K. Grant and Travis S. Humble

Adiabatic quantum computing (AQC) is a model of computation that uses quantum mechanical processes operating under adiabatic conditions. As a form of universal quantum computation, AQC employs the principles of superposition, tunneling, and entanglement that manifest in quantum physical systems. The AQC model of quantum computing is distinguished by the use of dynamical evolution that is slow with respect to the time and energy scales of the underlying physical systems. This adiabatic condition enforces the promise that the quantum computational state will remain well-defined and controllable, thus enabling the development of new algorithmic approaches. Several notable algorithms developed within the AQC model include methods for solving unstructured search and combinatorial optimization problems. In an idealized setting, the asymptotic complexity analyses of these algorithms indicate that computational speed-ups may be possible relative to state-of-the-art conventional methods. However, the presence of non-ideal conditions, including non-adiabatic dynamics, residual thermal excitations, and physical noise, complicates the assessment of the potential computational performance. A relaxation of the adiabatic condition is captured in the complementary computational heuristic of quantum annealing (QA), which accommodates physical systems operating at finite temperature and in open environments. While QA provides a more accurate model for the behavior of actual quantum physical systems, the possibility of non-adiabatic effects obscures a clear separation from conventional computational complexity. A series of technological advances in the control of quantum physical systems has enabled experimental AQC and QA. Prominent examples include demonstrations using superconducting electronics, which encode quantum information in the magnetic flux induced by a weak current operating at cryogenic temperatures. A family of devices developed specifically for unconstrained optimization problems has been applied to solve problems in specific domains including logistics, finance, materials science, machine learning, and numerical analysis. An accompanying infrastructure has also developed to support these experimental demonstrations and to enable access for a broader community of users. Although AQC is most commonly realized in superconducting technologies, alternative approaches include optically trapped neutral atoms and ion-trap systems. The significant progress in the understanding of AQC has revealed several open topics that continue to motivate research into this model of quantum computation. Foremost is the development of methods for fault-tolerant operation that will ensure the scalability of AQC for solving large-scale problems. In addition, unequivocal experimental demonstrations that differentiate the computational power of AQC and its variants from conventional computing approaches are needed. This will also require advances in the fabrication and control of quantum physical systems under the adiabatic restrictions.
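
The slow evolution at the heart of AQC is conventionally written as a time-dependent Hamiltonian interpolating from an easy-to-prepare initial Hamiltonian to a problem Hamiltonian whose ground state encodes the answer. The equations below give this standard form together with one common heuristic version of the adiabatic runtime condition; they are textbook background rather than results specific to this article.

    % Standard AQC interpolation: s = t/T runs from 0 to 1 over total time T.
    % H_0 has an easily prepared ground state; the ground state of H_P encodes
    % the solution of the computational problem.
    \[
      H(s) \;=\; (1 - s)\, H_0 \;+\; s\, H_P, \qquad s = t/T \in [0, 1].
    \]
    % One common heuristic form of the adiabatic condition: the sweep time T
    % must be large compared with the coupling between the two lowest
    % instantaneous eigenstates divided by the square of the minimum gap.
    \[
      T \;\gg\; \max_{s \in [0,1]}
        \frac{\bigl|\langle 1(s) \,|\, \partial_s H(s) \,|\, 0(s)\rangle\bigr|}{g_{\min}^{2}},
      \qquad
      g_{\min} \;=\; \min_{s}\bigl[E_1(s) - E_0(s)\bigr].
    \]

When the minimum gap closes rapidly with problem size, the required runtime grows accordingly, which is why gap analysis dominates complexity arguments for AQC and quantum annealing.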

Article

Computer Simulations in the Classroom  

Andrew Blum

Computer simulations can be defined in three categories: computational modeling simulations, human-computer simulations, and computer-mediated simulations. These categories of simulations are defined primarily by the role computers take and by the role humans take in the implementation of the simulation. The literature on the use of simulations in the international studies classroom considers under what circumstances and in what ways the use of simulations creates pedagogical benefits when compared with other teaching methods. But another issue to consider is under what circumstances and in what ways the use of computers can add (or subtract) pedagogical value when compared to other methods for implementing simulations. The alleged benefits of using simulations include encouraging cognitive and affective learning, enhancing student motivation, creating opportunities for longer-term learning, increasing personal efficiency, and promoting student-teacher relations. Moreover, in regard to the use of computer simulations, there is a set of good practices to consider. The first good practice emerges out of a realization of the unequal level of access to technology. The second good practice emerges from a clear understanding of the strengths and weaknesses of a computer-assisted simulation. The final and perhaps most fundamental good practice emerges from the idea that computers and technology more generally are not ends in themselves, but a means to help instructors reach a set of pedagogical goals.

Article

Measurement-Based Quantum Computation  

Tzu-Chieh Wei

Measurement-based quantum computation is a framework of quantum computation in which entanglement is used as a resource and local measurements on qubits are used to drive the computation. It originates from the one-way quantum computer of Raussendorf and Briegel, who introduced the so-called cluster state as the underlying entangled resource state and showed that any quantum circuit could be executed by performing only local measurements on individual qubits. The randomness in the measurement outcomes can be dealt with by adapting future measurement axes so that the computation is deterministic. Subsequent works have expanded the discussion of measurement-based quantum computation to various subjects, including the quantification of entanglement for such a measurement-based scheme, the search for resource states beyond cluster states, and computational phases of matter. In addition, the measurement-based framework provides useful connections to the emergence of time ordering, computational complexity and classical spin models, blind quantum computation, and so on, and has given an alternative, resource-efficient approach to implementing the original linear-optics quantum computation of Knill, Laflamme, and Milburn. Cluster states and a few other resource states have been created experimentally in various physical systems, and the measurement-based approach offers a potential alternative to the standard circuit approach to realizing a practical quantum computer.
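
A single step of this measurement-driven scheme can be made concrete with the standard one-qubit teleportation identity: entangle the input with a |+⟩ ancilla via a controlled-Z gate, measure the input in a rotated basis, and a rotation is applied to the ancilla up to a known Pauli byproduct. The derivation below uses one common sign convention; conventions differ across presentations, and this is offered as textbook background rather than material from the article.

    % Input |psi> = a|0> + b|1> on qubit 1, ancilla |+> on qubit 2; apply CZ,
    % then measure qubit 1 in the basis (|0> ± e^{i\theta}|1>)/\sqrt{2},
    % obtaining outcome m in {0,1}. The post-measurement state of qubit 2 is
    \[
      |\mathrm{out}\rangle \;=\; X^{m}\, H\, R_z(-\theta)\, |\psi\rangle,
      \qquad R_z(\varphi) = \mathrm{diag}\bigl(1, e^{i\varphi}\bigr),
    \]
    % so the random byproduct X^m can be compensated by adapting the angles of
    % later measurements, which is how deterministic computation is recovered.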

Article

Models of Human Sentence Comprehension in Computational Psycholinguistics  

John Hale

Computational models of human sentence comprehension help researchers reason about how grammar might actually be used in the understanding process. Taking a cognitivist approach, this article relates computational psycholinguistics to neighboring fields (such as linguistics), surveys important precedents, and catalogs open problems.

Article

Managing Critical Infrastructures in Crisis  

Louise K. Comfort

The management of critical infrastructures presents a specific set of challenges to crisis managers. Critical infrastructures include electrical power; communications; transportation; and water, wastewater, and gas line distribution systems. Designed for efficiency, these technical systems operate interdependently, thus making them vulnerable to the stress of extreme events. Changes in population, demographics, land use, and economic and social conditions of communities exposed to hazards have resulted in a significantly increased number of people dependent on critical infrastructures in regions at risk. Advances in science, technology, and engineering have introduced new possibilities for the redesign, maintenance, and retrofit of built infrastructure to withstand extreme events. However, most public and private agencies are not capable of anticipating the potential risk and making investments needed to upgrade infrastructures before damage occurs. Computational modeling facilitates the exploration of alternative approaches to managing risk. Sensors, telemetry, and graphic display of changing performance for critical infrastructure provide accurate information to reduce uncertainty in crisis events. These technologies enable crisis managers to track more accurately the impact of extreme events on the populations and infrastructures of communities at risk and to anticipate the likely consequences of future hazardous events. Crisis managers strive to create a continual learning process that enables residents to monitor their changing environment, use systematically collected data as the basis for analysis and change, and modify policies and practice based on valid evidence from actual environments at risk. For communities seeking to reduce risk, investment in information technologies to enable rapid, community-wide access to interactive communication constitutes a major step toward building capacity not only for managing risk to critical infrastructure but also in maintaining continuity of operations for the whole community in extreme events.

Article

Computational Phonology  

Jane Chandlee and Jeffrey Heinz

Computational phonology studies the nature of the computations necessary and sufficient for characterizing phonological knowledge. As a field it is informed by the theories of computation and phonology. The computational nature of phonological knowledge is important because at a fundamental level it is about the psychological nature of memory as it pertains to phonological knowledge. Different types of phonological knowledge can be characterized as computational problems, and the solutions to these problems reveal their computational nature. In contrast to syntactic knowledge, there is clear evidence that phonological knowledge is computationally bounded to the so-called regular classes of sets and relations. These classes have multiple mathematical characterizations in terms of logic, automata, and algebra with significant implications for the nature of memory. In fact, there is evidence that phonological knowledge is bounded by particular subregular classes, with more restrictive logical, automata-theoretic, and algebraic characterizations, and thus by weaker models of memory.
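
The claim that much phonological knowledge falls into restrictive subregular classes, with correspondingly weak memory requirements, can be illustrated with a toy example that is not taken from the article: word-final obstruent devoicing written as a left-to-right mapping that buffers only a single pending segment. The segment inventory and rule are simplified placeholders.

    # Toy illustration of a subregular (subsequential-style) mapping: devoice an
    # obstruent when it occurs word-finally. The streaming loop buffers exactly
    # one segment at a time, so memory stays bounded regardless of word length.

    DEVOICE = {"b": "p", "d": "t", "g": "k", "z": "s", "v": "f"}

    def final_devoicing(word):
        """Map e.g. 'hund' -> 'hunt'; non-final segments pass through unchanged."""
        output = []
        pending = None  # the single buffered segment (bounded memory)
        for segment in word:
            if pending is not None:
                output.append(pending)  # pending segment was not final: emit as-is
            pending = segment
        if pending is not None:
            output.append(DEVOICE.get(pending, pending))  # final segment: devoice
        return "".join(output)

    for w in ["hund", "tag", "lob", "mana"]:
        print(w, "->", final_devoicing(w))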

Article

Discriminative Learning and the Lexicon: NDL and LDL  

Yu-Ying Chuang and R. Harald Baayen

Naive discriminative learning (NDL) and linear discriminative learning (LDL) are simple computational algorithms for lexical learning and lexical processing. Both NDL and LDL assume that learning is discriminative, driven by prediction error, and that it is this error that calibrates the association strength between input and output representations. Both words’ forms and their meanings are represented by numeric vectors, and mappings between forms and meanings are set up. For comprehension, form vectors predict meaning vectors. For production, meaning vectors map onto form vectors. These mappings can be learned incrementally, approximating how children learn the words of their language. Alternatively, optimal mappings representing the end state of learning can be estimated. The NDL and LDL algorithms are incorporated in a computational theory of the mental lexicon, the ‘discriminative lexicon’. The model shows good performance with respect to both production and comprehension accuracy, and for predicting aspects of lexical processing, including morphological processing, across a wide range of experiments. Since, mathematically, NDL and LDL implement multivariate multiple regression, the ‘discriminative lexicon’ provides a cognitively motivated statistical modeling approach to lexical processing.
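
Because the endstate mappings described above are linear (mathematically, multivariate multiple regression), the core idea can be sketched in a few lines of NumPy. The toy form and meaning vectors below are invented for illustration; the actual models derive such vectors from letter trigrams, semantic embeddings, and related representations.

    import numpy as np

    # Toy data: four "words", each with a 6-dimensional form vector (rows of C)
    # and a 4-dimensional meaning vector (rows of S). All numbers are invented.
    C = np.array([[1, 0, 1, 0, 0, 1],
                  [0, 1, 1, 0, 1, 0],
                  [1, 1, 0, 1, 0, 0],
                  [0, 0, 1, 1, 1, 1]], dtype=float)
    S = np.array([[0.9, 0.1, 0.0, 0.2],
                  [0.1, 0.8, 0.3, 0.0],
                  [0.0, 0.2, 0.9, 0.1],
                  [0.3, 0.0, 0.1, 0.8]])

    # Endstate comprehension mapping F: solve C @ F ≈ S by least squares
    # (the regression view of discriminative learning at the end state).
    F, *_ = np.linalg.lstsq(C, S, rcond=None)

    # Comprehension: predicted meaning vectors for the known forms.
    S_hat = C @ F
    print("max comprehension error:", np.abs(S_hat - S).max())

    # Production runs the other way: a mapping G from meanings to forms.
    G, *_ = np.linalg.lstsq(S, C, rcond=None)
    print("predicted form for word 0:", np.round(S[0] @ G, 2))

Incremental learning of the same mappings, rather than this one-shot least-squares estimate, is what approximates how children acquire the words of their language.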