
Eileen S. Johnson

Action research has become a common practice among educational administrators. The term “action research” was first coined by Kurt Lewin in the 1930s, although teachers and school administrators had long engaged in the process that Lewin described and formally named. Alternatively known as practitioner research, self-study, action science, site-based inquiry, emancipatory praxis, and so on, action research is essentially a collaborative, democratic, and participatory approach to systematic inquiry into a problem of practice within a local context. Action research has become prevalent in many fields and disciplines, including education, health sciences, nursing, social work, and anthropology. This prevalence can be understood in the way action research lends itself to action-based inquiry, participation, collaboration, and the development of solutions to problems of everyday practice in local contexts. In particular, action research has become commonplace in educational administration preparation programs because of its alignment and natural fit with the nature of education and with the decision making and action planning necessary within local school contexts. Although there is no single prescribed way to engage in action research, and multiple approaches to it exist, action research generally follows a systematic and cyclical pattern of reflection, planning, action, observation and data collection, and evaluation that then repeats in an iterative and ongoing manner. The goal of action research is not to add to a general body of knowledge but, rather, to inform local practice, engage in professional learning, build a community of practice, solve a problem or understand a process or phenomenon within a particular context, or empower participants to generate self-knowledge.


During the 18th and 19th centuries, medical spending in the United States rose slowly, on average about 0.25% faster than gross domestic product (GDP), and varied widely between rural and urban regions. Accumulating scientific advances caused spending to accelerate by 1910. From 1930 to 1955, rapid per-capita income growth accommodated major medical expansion while keeping the health share of GDP almost constant. During the 1950s and 1960s, prosperity and investment in research, the workforce, and hospitals caused a rapid surge in spending and consolidated a truly national health system. Excess growth rates (above GDP growth) exceeded +5% per year from 1966 to 1970, which would have doubled the health-sector share in fifteen years had growth not moderated, falling under +3% in the 1980s, +2% in the 1990s, and +1.5% since 2005. The question of when national health expenditure growth can be brought into line with GDP and made sustainable for the long run is still open. A review of historical data over three centuries forces confrontation with issues regarding what to include and how long events continue to affect national health accounting and policy. Empirical analysis at a national scale over multiple decades fails to support the position that many of the commonly discussed variables (obesity, aging, mortality rates, coinsurance) cause significant shifts in expenditure trends. What does become clear is that there are long and variable lags before macroeconomic and technological events affect spending: three to six years for business cycles and multiple decades for major recessions, scientific discoveries, and organizational change. Health-financing mechanisms, such as employer-based health insurance, Medicare, and the Affordable Care Act (Obamacare), are seen to be both cause and effect, taking years to develop and affecting spending for decades to come.