Dyslexia, or a reading disability, occurs when an individual has great difficulty at the level of word reading and decoding. Comprehension of text, writing, and spelling are also affected. The diagnosis of dyslexia involves the use of reading tests, but the continuum of reading performance means that any cutoff point is arbitrary. The IQ score does not play a role in the diagnosis of dyslexia. Dyslexia is a language-based learning disability. The cognitive difficulties of dyslexics include problems with recognizing and manipulating the basic sounds in a language, language memory, and learning the sounds of letters. Dyslexia is a neurological condition with a genetic basis. There are abnormalities in the brains of dyslexic individuals. There are also differences in the electrophysiological and structural characteristics of the brains of dyslexics. Hope for dyslexia involves early detection and intervention and evidence-based instruction.
Philip Parker and Robert Brockman
Longitudinal structural equation modeling (LSEM) is used to answer lifespan-relevant questions such as (a) what is the effect of one variable on change in another, (b) what is the average trajectory or growth rate of some psychological variable, and (c) what variability is there in average trajectories and what predicts this variability. The first of these questions is often answered by an LSEM called an autoregressive cross-lagged (ACL) model. The other two questions are most typically answered by an LSEM called a latent growth curve (LGC). These models can be applied to a few time waves (measured over several years) or to many time waves (such as in diary studies) and can be altered, expanded, or even integrated. However, decisions on what model to use must be driven by the research question. The right tool for the job is not always the most complex. And, more importantly, the right tool must be matched to the best possible research design. Sometimes in lifespan research the right tool is LSEM. However, researchers should prioritize research design, as well as careful specification of the processes and mechanisms they are interested in, rather than simply choosing the most complicated LSEM they can find.
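As a minimal sketch of the two models named above (the notation is conventional and assumed here, not taken from the original): for variables x and y measured on person i at wave t, a basic ACL model regresses each variable at wave t on the previous wave's values of both variables, while a linear LGC expresses each person's score as a person-specific intercept and slope over time.

```latex
% Autoregressive cross-lagged (ACL) model:
x_{it} = \alpha_x + \beta_x\, x_{i,t-1} + \gamma_x\, y_{i,t-1} + \varepsilon_{x,it}
\qquad
y_{it} = \alpha_y + \beta_y\, y_{i,t-1} + \gamma_y\, x_{i,t-1} + \varepsilon_{y,it}

% Linear latent growth curve (LGC), with time scores \lambda_t (e.g., 0, 1, 2, \dots):
y_{it} = \eta_{0i} + \eta_{1i}\,\lambda_t + \varepsilon_{it},
\qquad
\eta_{0i} = \mu_0 + \zeta_{0i}, \quad \eta_{1i} = \mu_1 + \zeta_{1i}
```

In this sketch, the cross-lagged coefficients (the gammas) address question (a), the mean slope mu_1 addresses question (b), and the variances of the intercept and slope residuals zeta_0i and zeta_1i address question (c), since they quantify individual differences in trajectories that can themselves be regressed on predictors.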
Kathleen Someah, Christopher Edwards, and Larry E. Beutler
There are many approaches to psychotherapy, commonly called “schools” or “theories.” These schools range from psychoanalytic, to variations of insight- and conflict-based approaches, through behavioral and cognitive behavioral approaches, to humanistic/existential approaches, and finally to integrative and eclectic approaches. Different and seemingly new approaches typically have been informed by older and more established ones. For instance, cognitive behavioral therapy (CBT), one of the more widely used approaches, evolved from traditional behavior therapy but has become sufficiently distinct by adding its own complex variations so as functionally to represent an approach of its own. New approaches abound both in number and in complexity. Modern clinicians have had to become increasingly widely read and creative in trying to understand the ways in which patients may be helped. The sheer number of approaches, which has climbed into the hundreds, has challenged the field to find ways of ensuring that the treatments presented are effective. The advent of Evidence-Based Practices (EBP) throughout the healthcare fields has placed the responsibility on those who advocate for particular types of treatment to demonstrate their efficacy and effectiveness scientifically. While this movement has brought standards to the field and has offered some assurance that psychotherapy is usually helpful, there remains much debate about whether the many different schools produce different results from one another. The debate about how best to optimize the positive effects of psychotherapy continues, and there remain many questions to be asked of psychotherapy theories and of research on these approaches.
David R. Shanks, Hilary J. Don, Shaun Boustani, and Chunliang Yang
Tests following learning serve several important functions, including enabling students to monitor their progress and identify knowledge gaps, but they are also learning events in their own right. Testing is a powerful strategy to consolidate retention of studied information, by comparison with restudying and other elaborative strategies, and facilitates subsequent learning of new information. Moreover, the testing effect generalizes to different test formats, study-test intervals, and material types, and has been robustly demonstrated not only in the laboratory but also in classroom settings. Pretesting can promote subsequent learning of tested information, but its effect on non-pre-questioned information remains unclear. Although the beneficial effects of testing on learning and memory are substantial, learners tend to underappreciate the merits of practice tests, leading to their underuse. Lack of motivation or insufficient knowledge about how best to exploit testing may be factors that suppress its use. However, some promising interventions have been developed to promote learners’ employment of self-testing. Whether these interventions can be effective in high-stakes classroom or online learning is an important issue for future research. Importantly, research suggests that frequent low-stakes testing may be an effective method of reducing test anxiety. Although the testing effect is very general, testing can also have negative consequences, such as when choosing an incorrect answer in a multiple-choice test stamps that incorrect answer into memory and increases its likelihood of being recalled later. Understanding the conditions under which positive or negative consequences of testing arise is of considerable importance for the theoretical understanding of test-enhanced learning.
Characterizing, understanding, and exploiting the multifaceted effects of tests on long-term learning has provided a rich and deep challenge to researchers in psychology, education, cognitive science, neuroscience, and related fields.
Various self-concepts constitute major keywords in both psychological science and liberal political discourse. They have been central to psychology’s public-facing, policy-oriented role in the United States, dating back to the mid-19th century. Psychologists’ articulations of self-concept include an understanding of the individual, society, and the interventions needed to augment them both. Psychologists’ early enthusiasm for self-esteem has given way to competing concepts of the individual, namely self-regulation and self-control. Self-esteem in a modern sense coalesced out of the deprivation of the Great Depression and the political crises it provoked. The fate of self-esteem became tied to the capacities of the liberal welfare state to improve the psychic capacities of its citizens, in order to render them both more equal under the law and more productive in their daily existence. Western democracies, especially the United States, hit peak self-esteem in the early 1990s. Since then, psychologists have lost faith in the capacity of giving away self-worth to improve society. Instead, psychologists in the 21st century have preached a neo-Victorian gospel of self-reliance. At the very historical juncture when social mobility became more difficult, when inherited social inequality became more entrenched, psychologists abandoned their Keynesian model of human capital and embraced its neoliberal counterpart.
Ekaterina Zavershneva and René van der Veer
Lev Semyonovich Vygotsky (real name Lev Simkhovich Vygodsky; Orsha 1896–Moscow 1934) was a Russian psychologist who created cultural-historical theory, which proved influential in developmental psychology and other psychological disciplines. Vygotsky characterized his approach as “height psychology” (as opposed to “depth psychology”) and posited that the higher forms of mind should be the starting point for the study of human development. In his view it was essential to study psychological processes in their historical dynamics; these dynamics could be unraveled with the causal-genetic approach he developed, which involved the guided formation of mind in the course of its study or the experimental unfolding of ontogeny. Vygotsky claimed that the mechanisms of human development are not genetically determined and that their source must be found in culture and the social environment. Human development is mediated by cultural artifacts and sign systems, which are mastered in a dialogue with other people in spontaneous or guided interaction, which stimulates development by creating a zone of proximal development. The major means of the transformation of innate mind into higher mind is language, which enables us to preserve and transmit the experience of generations. In this process of cultural development the person develops a system of higher psychological functions that are social in origin, voluntary and mediated in nature, and form part of a systemic whole. The process of ontogeny goes through a series of stable periods and crises that correspond with specific conditions of the social situation of development and the developmental tasks. Age periods are completed with the development of neoformations, which are not merely results of development but also prerequisites for further development.
With the development of verbal thinking and the mastery of cultural means of behavior the person masters her/his innate mind and becomes a personality, whose main characteristic is freedom of behavior.
Tom Hartley and Graham J. Hitch
Working memory is an aspect of human memory that permits the maintenance and manipulation of temporary information in the service of goal-directed behavior. Its apparently inelastic capacity limits impose constraints on a huge range of activities from language learning to planning, problem-solving, and decision-making. A substantial body of empirical research has revealed reliable benchmark effects that extend to a wide range of different tasks and modalities. These effects support the view that working memory comprises distinct components responsible for attention-like control and for short-term storage. However, the nature of these components, their potential subdivision, and their interrelationships with long-term memory and other aspects of cognition, such as perception and action, remain controversial and are still under investigation. Although working memory has so far resisted theoretical consensus and even a clear-cut definition, research findings demonstrate its critical role in both enabling and limiting human cognition and behavior.