1-4 of 4 Results

  • Keywords: formalization

Article

John Lavagnino

Digital textuality has its roots in the most familiar digital system, the alphabet. By defining rules for which aspects of an inscription carry information, the alphabet makes exact copying of writing possible; such exact copying is the fundamental digital characteristic, without which digital machinery could not work. But copyability can have practical limitations when more complex forms are built up out of basic digital elements: documents, in particular, often assume particular concepts and systems. Digital document systems can be based on many different theories of documents, and typically combine incompatible theories in one document; they also hide considerable amounts of information from users. Very different digital approaches to texts are found in databases, which atomize texts and render all relationships explicit; this degree of formalization is not common in the humanities, but it enables the creation of widely used research tools (such as library catalogues). The principal innovation in digital documents so far is the hypertextual link, which, by connecting texts more closely together, created new possibilities for expression and exploration. The creation of vast amounts of digital text led to the unexpected importance of searching, which was made more usable by exploiting the information provided by links. Searching has overturned ancient hierarchies of importance and attention by making forgotten texts as accessible as canonical ones.
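To make the contrast concrete in code: the sketch below is a minimal, hypothetical illustration (the schema, titles, and link data are all invented, not drawn from any system the article describes) of a database that atomizes texts into rows, records every link relationship explicitly, and ranks search hits by incoming links, echoing the point that link information made searching more usable and that forgotten texts can thereby surface alongside canonical ones.

# Minimal sketch (hypothetical schema): texts atomized into database rows,
# links between texts made explicit, and search results ranked by how often
# a text is linked to. All titles, bodies, and links are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE texts (id INTEGER PRIMARY KEY, title TEXT, body TEXT);
CREATE TABLE links (source INTEGER, target INTEGER);  -- explicit relationships
""")
conn.executemany("INSERT INTO texts VALUES (?, ?, ?)", [
    (1, "Canonical essay", "digital copying preserves every letter exactly"),
    (2, "Forgotten pamphlet", "exact digital copying of the alphabet"),
    (3, "Library catalogue entry", "records atomized for retrieval"),
])
conn.executemany("INSERT INTO links VALUES (?, ?)", [(1, 2), (3, 2), (1, 3)])

def search(term):
    # Matching texts, most-linked-to first: a toy stand-in for link-based ranking.
    return conn.execute("""
        SELECT t.title, COUNT(l.source) AS inbound
        FROM texts t LEFT JOIN links l ON l.target = t.id
        WHERE t.body LIKE '%' || ? || '%'
        GROUP BY t.id
        ORDER BY inbound DESC
    """, (term,)).fetchall()

print(search("copying"))
# -> [('Forgotten pamphlet', 2), ('Canonical essay', 0)]

Note that the much-linked pamphlet outranks the little-linked essay purely on the basis of explicit relationships, not any prior canon.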

Article

Alan Smart

Squatting is one of the most important forms of housing for the world’s poor, accommodating perhaps a billion people, with the numbers continuing to grow. Squatters occupy vacant land or buildings without the consent of the owner. Squatting in existing buildings is more common in the Global North, particularly in Europe, and tends to be more political, often explicitly anticapitalist, than squatting on vacant land, which accounts for the vast majority of squatters, particularly in the Global South. Urban squatter housing needs to be seen as valuable housing rather than just as a social problem. Housing generally has exchange value, a price on housing markets, as well as use value, its utility for those who live in it. Early research dealt primarily with use value because of the emphasis on self-building and collectively organized invasions of land. Demand for scarce stocks of affordable housing leads to market prices despite governmental denial of the possibility of ownership of illegal dwellings. Squatter housing often meets the needs of poor people more effectively than public housing, and policy initiatives around the world are attempting to enhance the utility of informally built and regulated housing while mitigating the environmental problems that it can cause. Formalizing informal housing is a key but controversial policy. Research has revealed that residents often consider informal tenure security adequate, resulting in lower-than-expected demand for squatter titling. Formalization may also lead to gentrification and thus diminish the ability of informal housing to provide affordable accommodation.

Article

Béatrice Godart-Wendling

The term “philosophy of language” is intrinsically paradoxical: it denominates the main philosophical current of the 20th century but is devoid of any univocal definition. While the emergence of this current was based on the idea that philosophical questions were only language problems that could be elucidated through logico-linguistic analysis, interest in this approach gave rise to philosophical theories that, although some of them share points of convergence, developed very different philosophical conceptions. The only constant across all these theories is the recognition that this current of thought originated in the work of Gottlob Frege (1848–1925), thus marking what was to be called “the linguistic turn.” Despite the theoretical diversity within the philosophy of language, the history of this current can nevertheless be traced through four stages. The first began in 1892 with Frege’s paper “Über Sinn und Bedeutung” (“On Sense and Reference”) and aimed to clarify language by using the rules of logic. The Fregean principle underpinning this program was that psychological considerations must be banished from linguistic analysis in order to avoid associating the meaning of words with mental pictures or states. The work of Frege, Bertrand Russell (1872–1970), G. E. Moore (1873–1958), the early Ludwig Wittgenstein (1889–1951) of the Tractatus (1921), Rudolf Carnap (1891–1970), and Willard Van Orman Quine (1908–2000) is representative of this period. In this logicist point of view, the questions raised mainly concerned syntax and semantics, since the goal was to define a formalism able to represent the structure of propositions and to explain how language can describe the world by mirroring it. The problem specific to this period was therefore the function of representing the world by language, which placed the notions of reference, meaning, and truth at the heart of the philosophical debate. The second phase of the philosophy of language was adumbrated in the 1930s in the courses Wittgenstein gave in Cambridge (The Blue and Brown Books), but it did not really take off until 1950–1960 with the work of Peter Strawson (1919–2006), the later Wittgenstein of the Philosophical Investigations (1953), John Austin (1911–1960), and John Searle (1932–). In spite of the very different approaches developed by these theorists, two main ideas characterized this period: first, that only the examination of natural (also called “ordinary”) language can give access to an understanding of how language functions, and second, that the specificity of this language resides in its ability to perform actions. It was therefore no longer a question of analyzing language in logical terms, but rather of considering it in itself, by examining the meaning of statements as they are used in given contexts. In this perspective, the pivotal concepts explored by philosophers became those of (situated) meaning, felicity conditions, use, and context. The beginning of the 1970s initiated the third phase of this movement by orienting research in two quite distinct directions. The first, resulting from the work on proper names, natural-kind words, and indexicals undertaken by the logician philosophers Saul Kripke (1940–), David Lewis (1941–2001), Hilary Putnam (1926–2016), and David Kaplan (1933–), brought credibility to the semantics of possible worlds. The second, conducted by Paul Grice (1913–1988) on human communicational rationality, harked back to the psychologism dismissed by Frege and conceived of the functioning of language as highly dependent on a theory of mind. The focus then shifted to the inferences that the different protagonists in a linguistic exchange construct by recognizing hidden intentions in the discourse of others. In this perspective, the concepts of implicitness, relevance, and cognitive efficiency became central, and accounting for them required involving a greater number of contextual parameters. In the wake of this research, many theorists turned to the philosophy of mind, as evidenced in the late 1980s by the work on relevance by Dan Sperber (1942–) and Deirdre Wilson (1941–). The contemporary period, marked by the thinking of Robert Brandom (1950–) and Charles Travis (1943–), is characterized by its orientation toward a radical contextualism and by the return of an inferentialism that draws strongly on Frege. Within these theoretical frameworks, the notions of truth and reference no longer fall within the field of semantics but rather of pragmatics. The emphasis is placed on the commitment speakers make when they speak, as well as on their responsibility with respect to their utterances.
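The “semantics of possible worlds” named above has a standard formal core that a short program can make concrete. The following is a minimal, illustrative sketch of a Kripke-style model (the worlds, accessibility relation, and valuation are invented for the example, and this is textbook modal semantics rather than a reconstruction of Kripke’s, Lewis’s, Putnam’s, or Kaplan’s own systems): a proposition is necessary at a world when it holds in every accessible world, and possible when it holds in at least one.

# Minimal Kripke-model sketch: worlds, an accessibility relation, and a
# valuation; "necessarily p" and "possibly p" quantify over accessible
# worlds. All data here is invented for illustration.
WORLDS = {"w0", "w1", "w2"}
ACCESS = {"w0": {"w1", "w2"}, "w1": {"w1"}, "w2": set()}  # R(w, v)
VALUATION = {"p": {"w0", "w1"}, "q": {"w1", "w2"}}        # worlds where each atom is true

def holds(world, atom):
    # Truth of an atomic proposition at a world.
    return world in VALUATION[atom]

def necessarily(world, atom):
    # True iff the atom holds in every world accessible from `world`.
    return all(holds(v, atom) for v in ACCESS[world])

def possibly(world, atom):
    # True iff the atom holds in some world accessible from `world`.
    return any(holds(v, atom) for v in ACCESS[world])

print(necessarily("w0", "q"))  # True: q holds in w1 and w2
print(necessarily("w0", "p"))  # False: p fails in w2
print(possibly("w0", "p"))     # True: p holds in w1
print(necessarily("w2", "p"))  # True (vacuously): no accessible worlds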

Article

Roxanne M. Mitchell

Schools are organizations with a formal bureaucratic structure. Hoy and Sweetland applied to schools the work of Gouldner, who viewed organizational structure as ranging from representative to punishment centered, and of Adler and Borys, who viewed bureaucracy as ranging from enabling to coercive. They coined the term “enabling school structures” (ESS), which they defined as “a hierarchy of authority and a system of rules and regulations that help rather than hinder the teaching-learning mission of the school.” Hoy and Sweetland then developed and validated a reliable instrument to measure this construct, which spawned a considerable body of research on the antecedents and consequences of ESS. A comprehensive literature review covering 2000 to 2018 produced 22 articles that utilized ESS as conceptualized and operationalized by Hoy and Sweetland; this review did not include book chapters or unpublished dissertations. Findings from the research suggest that ESS fosters trust relationships and collaboration among teachers, helps to establish a culture of academic optimism, and promotes the development of professional learning communities. ESS has been shown to have both a direct and an indirect effect on student achievement. ESS is correlated with a host of factors deemed important in schools, such as teacher and principal authenticity, collective teacher efficacy, teacher professionalism, and collective responsibility. It is negatively associated with dependence on superiors, dependence on rules, truth spinning, role conflict, and illegitimate politics. It appears to be higher in smaller schools, particularly schools situated in rural areas. Studies conducted in China, Turkey, and South and Central America have lent credence to the notion that ESS has applicability beyond the United States, where the work was originally conceptualized. ESS was not affected by socioeconomic status in schools in the United States, and it may therefore serve as one way to ameliorate the negative effects of poverty on student success.
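
The claim that ESS has both a direct and an indirect effect on student achievement presupposes a standard mediation structure. The toy sketch below uses simulated data (all variable names and coefficients are invented; no real ESS instrument scores are involved) to show the usual decomposition of a total effect into a direct path and an indirect path through a mediator such as collective teacher efficacy.

# Toy mediation sketch with simulated data: ESS -> mediator -> achievement.
# Variable names and coefficients are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
ess = rng.normal(size=n)                      # enabling school structure score
mediator = 0.5 * ess + rng.normal(size=n)     # e.g. collective teacher efficacy
achievement = 0.3 * ess + 0.4 * mediator + rng.normal(size=n)

def ols(y, *xs):
    # Least-squares slopes (with intercept); returns one slope per predictor.
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

(a,) = ols(mediator, ess)                     # path a: ESS -> mediator
direct, b = ols(achievement, ess, mediator)   # paths c' (direct) and b
indirect = a * b
print(f"direct {direct:.2f}, indirect {indirect:.2f}, total {direct + indirect:.2f}")
# roughly: direct 0.30, indirect 0.20 (= 0.5 * 0.4), total 0.50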