1-11 of 11 Results

  • Keywords: artificial intelligence

Article

Internet-based services that build on automated algorithmic selection processes, for example, search engines, computational advertising, and recommender systems, are booming, and platform companies that provide such services are among the most valuable corporations worldwide. Algorithms on and beyond the Internet are increasingly influencing, aiding, or replacing human decision-making in many life domains. Their far-reaching, multifaceted economic and social impact, which results from the governance by algorithms, is widely acknowledged. However, suitable policy reactions, that is, the governance of algorithms, are the subject of controversy in academia, politics, industry, and civil society. This governance by and of algorithms is to be understood in the wider context of current technical and societal change, and in connection with other emerging trends. In particular, the expanding algorithmization of life domains is closely interrelated with and dependent on growing datafication and big data on the one hand, and rising automation and artificial intelligence in modern, digitized societies on the other. Consequently, the assessments and debates of these central developmental trends in digitized societies overlap extensively. Research on the governance by and of algorithms is highly interdisciplinary. Communication studies contributes to the formation of so-called “critical algorithm studies” through its wide set of subfields and approaches and by applying qualitative and quantitative methods. Its contributions focus on the impact of algorithmic systems on traditional media, journalism, and the public sphere, and also cover effect analyses and risk assessments of algorithmic-selection applications in many domains of everyday life. The latter includes the whole range of public and private governance options to counter or reduce these risks or to safeguard ethical standards and human rights, including communication rights in a digital age.

Article

Martin Obschonka and Christian Fisch

Advances in Artificial Intelligence (AI) are intensively shaping businesses and the economy as a whole, and AI-related research is exploding in many domains of business and management research. In contrast, AI has received relatively little attention within the domain of entrepreneurship research, even though many entrepreneurship scholars agree that AI will likely shape entrepreneurship research in deep, disruptive ways. A summary of both the existing entrepreneurship literature on AI and potential avenues for future research shows that the growing relevance of AI for entrepreneurship research manifests itself along two dimensions. First, AI applications in the real world establish a distinct research topic (e.g., whether and how entrepreneurs and entrepreneurial ventures use and develop AI-based technologies, or how AI can function as an external enabler that generates and enhances entrepreneurial outcomes). In other words, AI is changing the research object in entrepreneurship research. The second dimension refers to drawing on AI-based research methods, such as big data techniques or AI-based forecasting methods. Such AI-based methods open several avenues for researchers to gain new, influential insights into entrepreneurs and entrepreneurial ventures that are more difficult to assess using traditional methods. In other words, AI is changing the research methods. Given that human intelligence has so far been unable to fully uncover and comprehend the secrets behind the entrepreneurial process, which is so deeply embedded in uncertainty and opportunity, AI-supported research methods might achieve new breakthrough discoveries. We conclude that the field needs to embrace AI as a topic and research method more enthusiastically while maintaining the essential research standards and scientific rigor that guarantee the field’s well-being, reputation, and impact.

Article

Since the beginning of space exploration, outer space has fascinated, captivated, and intrigued people’s minds. The launch of the first artificial satellite—Sputnik—in 1957 by the Soviet Union and the first man on the Moon in 1969 represent two significant missions in the history of space exploration. In 1972, Apollo 17 marked the last human mission to the lunar surface. Nevertheless, several robotic spacecraft have traveled to the Moon, such as the Soviet Luna 24 in 1976 or, more recently, China’s Chang’e 4 in 2019, which touched down on its far side, a first for any space vehicle. The international space community is currently assessing a return to the Moon in 2024 and, in the coming decades, journeys even beyond, toward the Red Planet, Mars. Robots and rovers, for instance Curiosity, Philae, Rosetta, or Perseverance, will continue to play a major role in space exploration by paving the way for future long-duration missions on celestial bodies. Landing humans on the Moon, Mars, or other celestial bodies requires robotics because there are significant technological and physiological challenges to overcome. Therefore, the support of machines and artificial intelligence is essential for developing future deep space programs as well as for achieving sustainable space exploration. One can imagine future circumstances where robots and humans collaborate on the Moon’s surface or on other celestial bodies to undertake scientific research, to extract and analyze space resources for possible in situ utilization, and to build sites for human habitation and work. Indeed, different situations can be considered: (a) a robot, located on a celestial body, operated by a human on Earth or aboard a space station; (b) the in situ operation of a robot by an astronaut; (c) the interaction between a robot in outer space, manipulated from Earth, and an astronaut; (d) the interaction between a robot operated from space and an astronaut; (e) the interaction between a robot with an artificial intelligence component and an astronaut; (f) the interaction between two robots in the case of on-orbit servicing. The principles of free exploration and cooperation are two core concepts in the international space legal framework. Hence, it is necessary to analyze the provisions of the five United Nations space treaties in the context of “human-robotic” cooperation. In addition, the development of a Code of Conduct for space exploration involving humans and robots might be needed in order to clearly identify the missions using robotic systems (e.g., a mission’s purpose, area of operations) and to foresee scenarios of responsibility and liability in case of damage. Lastly, a review of the dispute settlement mechanisms is particularly relevant, as international claims related to human–robot activities will inevitably occur given that their collaboration will increase as more missions are planned on celestial bodies.

Article

Noreen Herzfeld

Cybernetics is the study of systems of control and communication. While often used to refer to control systems in or by machines, such as computers, cybernetic theory can be applied to control and communication within a variety of areas, including human interaction and systems of production, distribution, or design, systems that may be composed of humans, machines, or a combination of the two. A cybernetic view of any system focuses on information and the flow of information, for that is what effects both control and communication. While cybernetics is a concept that can be used to describe any system through which information flows, today most human-generated information flows through computers or computer-controlled networks; thus, in the popular mind, cybernetics is frequently used to refer to anything pertaining to computer design, use, and human-computer interaction. A cybernetic view of the human person finds each person’s identity in the information comprising our memories, feelings, emotions, and thoughts. Human beings are considered in this view to be biological machines, each of whose unique identity is found in the patterns stored in the neuronal structures of the brain. In such an anthropology, there is no soul. Each of us is merely a vast and ever-changing collection of information. However, there is the possibility of a form of immortality effected by uploading the human brain to a computer. Cybernetics is, historically, closely associated with the field of artificial intelligence. Though the field experienced initial successes in areas such as game playing or mathematics, producing a full, human-like intelligence has so far been limited by two difficult problems: giving a robot a body similar to ours, so that it can experience the world as we do, and the necessity of emotion for true cognition and autonomous decision making. We have come closer to realizing the dreams of cybernetics by using the computer to mediate human-to-human relationships, especially through social media such as Facebook and Twitter. This has implications for religion, in that the widespread dissemination of a variety of religious materials and discussions has led to increased contact with other religions, increased conversions, and an increase in fundamentalism. Cybernetic theories can also be used to describe the origin of religion and the development of ethical systems. In general, a cybernetic view of the development of religion focuses on religion as an adaptive mechanism for the survival of groups as they evolve and change in an atmosphere of physical and social competition.
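
As a purely illustrative aside, the control-and-communication loop at the heart of cybernetics can be sketched in a few lines of code: information about a system's state flows back to a controller, which uses the error between state and goal to adjust its output. The Python sketch below models a hypothetical thermostat this way; the temperatures, gain, and heat-loss figures are invented for illustration and are not drawn from the article.

```python
# Hypothetical thermostat as a minimal cybernetic feedback loop: information
# about the room's state (its temperature) flows back to a controller, which
# responds in proportion to the error signal. All numbers are illustrative.
def simulate_thermostat(setpoint=21.0, temp=15.0, gain=0.3, heat_loss=0.5, steps=20):
    history = []
    for _ in range(steps):
        error = setpoint - temp           # communication: measured state vs. goal
        heating = max(0.0, gain * error)  # control: respond proportionally to the error
        temp = temp + heating - heat_loss
        history.append(round(temp, 2))
    return history

# The temperature rises and settles just below the setpoint,
# the classic steady-state offset of purely proportional control.
print(simulate_thermostat())
```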

Article

Over the last six decades, discussions and approaches to communication and development have evolved considerably. Some of these changes particularly focus on the transformation of the nation-state role, from its initial conception to its current formation, as well as the transition from the study of political and economic progress to the analysis of cultural components and social development today. These major approaches include modernization, diffusion of innovation, dependency paradigm, monistic-emancipatory approach, institutional theory approach, industrial policy, strategic restructuring model, evolutionary paradigm, interorganizational approach, ecosystem approach, and an approach that highlights culture, power, age, gender, and disability dimensions. Part of this investigation includes research trends in communication and development. Scholarship identifying such trends highlights newer technologies as well as a continuing presence of digital inequalities. Additional research is needed to capture processes such as cross-organizational and cross-cultural learning and improvisation in terms of communication and development, and to recognize the roles of power and culture in these domains. Furthermore, taking a co-processes approach prevents one from assuming that there is only one correct pathway in the field of communication and development.

Article

Fei Yang

Predictive policing, also known as crime forecasting, is a set of advanced technologies that aid the police in solving past crimes and pre-emptively fighting and preventing future ones. With the right deployment of such technologies, law enforcement agencies can combat and control crime more efficiently, with time and resources better employed and allocated. The current practices of predictive policing include the integration of various technologies, ranging from predictive crime maps and surveillance cameras to sophisticated computer software and artificial intelligence. Predictive analytics help the police make predictions about where and when future crime is most likely to happen and who will be the perpetrator and who the potential victim. The underpinning logic behind such predictions is the predictability of criminal behavior and crime patterns based on criminological research and theories such as rational choice and deterrence theories, routine activities theory, and broken windows theory. Currently, many jurisdictions in the United States have deployed or have been experimenting with various predictive policing technologies. The most widely adopted applications include CompStat, PredPol, HunchLab, Strategic Subject List (SSL), Beware, Domain Awareness System (DAS), and Palantir. The realization of these predictive policing analytics systems relies heavily on the technological assistance provided by data collection and integration software, facial/vehicle identification and tracking tools, and surveillance technologies that keep tabs on individual activities both in the physical environment and in the digital world. Some examples of these assisting technologies include Automatic License Plate Recognition (ALPR), the Next Generation Identification (NGI) System, the Global Positioning System (GPS), Automatic Vehicle Location (AVL), next-generation police body-worn cameras (BWC) with facial recognition and tracking functions, aerial cameras and unmanned aircraft systems, DeepFace, Persistent Surveillance Systems, Stingrays/D(i)RT-Box/International Mobile Subscriber Identity catchers, and SnapTrends, which monitors and analyzes feeds on Twitter, Facebook, Instagram, Picasa, Flickr, and YouTube. This new practice of using predictive analytics in policing has elicited extensive doubt and criticism since its inception. Whereas scholarly evaluation research shows mixed findings about how effectively predictive policing actually works to help reduce crime, other concerns center on legal and civil rights issues (including privacy protection and the legitimacy of mass surveillance), inequality (stratified surveillance), the cost-effectiveness of the technologies, the militarization of the police and its implications (such as a worsened relationship and weakened trust between the police and the public), and epistemological challenges to understanding crime. To make the best use of the technologies and avoid their pitfalls at the same time, policymakers need to consider the hotly debated controversies raised in the evolution of predictive policing.
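
To make the "where and when" prediction concrete, the following Python snippet is a minimal, purely hypothetical sketch of a grid-based hotspot score in which recent incidents raise the predicted risk of their grid cell; it does not reproduce the proprietary algorithms of PredPol, HunchLab, or any other system named above, and all coordinates and parameters are invented for illustration.

```python
# Hypothetical illustration only; not the algorithm of any named system.
# Scores grid cells by exponentially decayed counts of past incidents,
# so recent crime in a cell raises that cell's predicted risk.
from collections import defaultdict
from math import exp

def hotspot_scores(incidents, cell_size=0.01, decay_days=14.0):
    """incidents: list of (latitude, longitude, days_ago) tuples."""
    scores = defaultdict(float)
    for lat, lon, days_ago in incidents:
        cell = (round(lat / cell_size), round(lon / cell_size))
        scores[cell] += exp(-days_ago / decay_days)  # older incidents count less
    return dict(scores)

# Example: three burglaries, two of them recent and in the same cell.
example = [(41.881, -87.623, 2), (41.882, -87.624, 5), (41.900, -87.650, 60)]
ranked = sorted(hotspot_scores(example).items(), key=lambda kv: -kv[1])
print(ranked)  # highest-risk cells first
```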

Article

Empirical-statistical downscaling (ESD) models use statistical relationships to infer local climate information from large-scale climate information produced by global climate models (GCMs), as an alternative to the dynamical downscaling provided by regional climate models (RCMs). Among the various statistical downscaling approaches, nonlinear methods are mainly used to construct downscaling models for local variables that strongly deviate from linearity and normality, such as daily precipitation. These approaches are also appropriate for handling the downscaling of extreme rainfall. Nonlinear downscaling techniques come in various complexities. The simplest is the analog method, which originated in the late 1960s from the need to obtain local details of short-term weather forecasts for various variables (air temperature, precipitation, wind, etc.). Its first application as a statistical downscaling approach in climate science was carried out in the late 1990s. More sophisticated statistical downscaling models have been developed based on a wide range of nonlinear functions. Among them, the artificial neural network (ANN) was the first nonlinear regression-type method used as a statistical downscaling technique in climate science, in the late 1990s. The ANN was inspired by the human brain, and it was used early on in artificial intelligence and robotics. The impressive development of machine learning algorithms that can automatically extract information from a vast amount of data, usually through nonlinear multivariate models, has contributed to improvements of ANN downscaling models and to the development of other new machine learning-based downscaling models, such as support vector machine and random forest techniques, which overcome some ANN drawbacks. Mixed models combining various machine learning downscaling approaches maximize the downscaling skill in local climate change applications, especially for extreme rainfall indices. Other nonlinear statistical downscaling approaches are conditional weather generators, which combine a standard weather generator (WG) with a separate statistical downscaling model by conditioning the WG parameters on large-scale predictors via a nonlinear approach. The most popular ways to condition the WG parameters are the weather-type approach and generalized linear models. This article discusses various aspects of nonlinear statistical downscaling approaches, their strengths and weaknesses, as well as their comparison with linear statistical downscaling models. A proper validation of nonlinear statistical downscaling models is an important issue, allowing selection of an appropriate model to obtain credible information on local climate change. Selection of large-scale predictors, the model’s ability to reproduce historical trends and extreme events, and the uncertainty related to future downscaled changes are important issues to be addressed. A better estimation of the uncertainty related to downscaled climate change projections can be achieved by using ensembles of multiple GCMs as drivers, including their ability to simulate the input to the downscaling models. Comparison between multiple future statistically downscaled climate change signals and those derived from dynamical downscaling driven by the same global model, including a complex validation of the RCMs, gives a measure of the reliability of downscaled regional climate changes.
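
As an illustration of the machine learning-based downscaling models discussed here, the following Python sketch fits a random forest regression from synthetic large-scale predictors to a synthetic local daily precipitation series using scikit-learn; the data, predictor choice, and hyperparameters are placeholder assumptions, not those of any published downscaling model.

```python
# Minimal sketch of a nonlinear statistical downscaling step with a random
# forest, one of the machine learning methods mentioned above. Predictor
# fields, the local series, and hyperparameters are illustrative placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "large-scale predictors" (e.g., gridded pressure, humidity, winds
# flattened to features) and a synthetic local daily precipitation series.
X = rng.normal(size=(3000, 12))  # 3000 days, 12 large-scale predictors
y = np.maximum(0.0, 2 * X[:, 0] + X[:, 1] ** 2 + rng.normal(scale=1.0, size=3000))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)  # calibrate on the "historical" period
print("validation R^2:", round(model.score(X_test, y_test), 3))
```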

Article

Gina Griffin

As technological advances continue to develop, delivering macro human services through social work innovations becomes a new priority for the discipline. Digital technologies offer potential applications using tablets, smartphones, cloud computing, artificial intelligence, and wearable technology to enable whole new possibilities for human services. As a result, policymakers and community organizers alike can access existing information much faster and potentially connect with hard-to-reach communities to make meaningful decisions. Ways of incorporating the latest digital trends from business and industry settings into macro social work practice are highlighted. By utilizing digital technology, human service organizations can become more proactive and citizen-centered, potentially transforming personal and economic capacity.

Article

Global climate models (GCMs) are fundamental tools for weather forecasting and climate predictions at different time scales, from intraseasonal prediction to climate change projections. Their design allows GCMs to simulate the global climate adequately, but they are not able to skillfully simulate local/regional climates. Consequently, downscaling and bias correction methods are increasingly needed and applied for generating useful local and regional climate information from the coarse GCM resolution. Empirical-statistical downscaling (ESD) methods generate climate information at the local scale, or with a greater resolution than that achieved by GCMs, by means of empirical or statistical relationships between large-scale atmospheric variables and the local observed climate. As a counterpart approach, dynamical downscaling is based on regional climate models that simulate regional climate processes with a greater spatial resolution, using GCM fields as initial or boundary conditions. Various ESD methods can be classified according to different criteria, depending on their approach, implementation, and application. In general terms, ESD methods can be categorized into subgroups that include transfer functions or regression models (either linear or nonlinear), weather generators, and weather typing methods and analogs. Although these methods can be grouped into different categories, they can also be combined to generate more sophisticated downscaling methods. In the last group, weather typing and analogs, the methods relate the occurrence of particular weather classes to local and regional weather conditions. In particular, the analog method is based on finding atmospheric states in the historical record that are similar to the atmospheric state on a given target day. The corresponding historical local weather conditions are then used to estimate local weather conditions on the target day. The analog method is a relatively simple technique that has been extensively used as a benchmark method in statistical downscaling applications. Easy to construct and applicable to any predictand variable, it has been shown to perform as well as other, more sophisticated methods. These attributes have inspired its application in diverse studies around the world that explore its ability to simulate different characteristics of regional climates.
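
The analog method described above can be sketched directly in code. The following is a minimal Python sketch under stated assumptions (synthetic predictor fields, synthetic local observations, standardized Euclidean distance as the similarity measure): the historical day whose large-scale state is closest to the target day's state supplies the downscaled local value.

```python
# Minimal sketch of the analog method: find the historical day whose
# large-scale atmospheric state is closest to the target day's state and
# reuse that day's observed local weather as the downscaled estimate.
# The fields, observations, and distance metric are illustrative assumptions.
import numpy as np

def analog_downscale(target_state, historical_states, historical_local_obs):
    """
    target_state: 1-D array of large-scale predictors for the target day.
    historical_states: 2-D array (days x predictors) from the historical record.
    historical_local_obs: 1-D array of local observations (e.g., daily precipitation).
    """
    # Standardize so each predictor contributes comparably to the distance.
    mean = historical_states.mean(axis=0)
    std = historical_states.std(axis=0) + 1e-12
    z_hist = (historical_states - mean) / std
    z_target = (target_state - mean) / std

    distances = np.linalg.norm(z_hist - z_target, axis=1)  # Euclidean distance
    best = int(np.argmin(distances))                       # index of the closest analog
    return historical_local_obs[best]

# Tiny synthetic example.
rng = np.random.default_rng(1)
hist = rng.normal(size=(1000, 8))        # 1000 historical days, 8 predictors
local = rng.gamma(2.0, 2.0, size=1000)   # e.g., observed daily precipitation (mm)
print(analog_downscale(hist[42] + 0.01, hist, local))  # recovers ~day 42's local value
```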

Article

The application of digital technologies within interdisciplinary environments is enabling the development of more efficient methods and techniques for analyzing historical corpora at scales that were not feasible before. The project “Digging into Early Colonial Mexico” is an example of cooperation among archaeologists, historians, computer scientists, and geographers engaged in designing and implementing methods for text mining and large-scale analysis of primary and secondary historical sources, specifically the automated identification of vital analytical concepts linked to locational references, revealing the spatial and geographic context of the historical narrative. As a case study, the project focuses on the Relaciones Geográficas de la Nueva España (Geographic Reports of New Spain, or RGs). This is a corpus of textual and pictographic documents produced in 1577–1585 ce that provides one of the most complete and extensive accounts of Mexico and Guatemala’s history and the social situation at the time. The research team is developing valuable digital tools and datasets, including (a) a comprehensive historical gazetteer containing thousands of georeferenced toponyms integrated within a geographical information system (GIS); (b) two digital versions of the RGs corpus, one fully annotated and ready for information extraction, and another suitable for further experimentation with algorithms of machine learning (ML), natural language processing (NLP), and corpus linguistics (CL) analyses; and (c) software tools that support a research method called geographical text analysis (GTA). GTA applies natural language processing based on deep learning algorithms for named entity recognition, disambiguation, and classification to enable the parsing of texts and the automatic mark-up of words referring to place names that are later associated with analytical concepts through a technique called geographic collocation analysis. By leveraging the benefits of the GTA methodology and resources, the research team is in the process of investigating questions related to the landscape and territorial transformations experienced during the colonization of Mexico, as well as the discovery of social, economic, political, and religious patterns in the way of life of Indigenous and Spanish communities of New Spain toward the last quarter of the 16th century. All datasets and research products will be released under an open-access license for the free use of scholars engaged in Latin American studies or interested in computational approaches to history.
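
As a rough illustration of the geographic collocation analysis step described above, the following Python sketch counts how often analytical concept terms occur within a fixed token window of place-name mentions; the tokens, place names, concept terms, and window size are invented for illustration, and in the actual GTA pipeline the place names would come from the NER and gazetteer components rather than a hand-supplied list.

```python
# Illustrative sketch of geographic collocation analysis: count how often
# concept terms occur within a token window of recognized place names.
# Place names would normally come from NER/gazetteer matching; here they
# are supplied directly as a hypothetical input.
from collections import Counter, defaultdict

def geographic_collocations(tokens, place_names, concept_terms, window=10):
    places = {p.lower() for p in place_names}
    concepts = {c.lower() for c in concept_terms}
    counts = defaultdict(Counter)
    for i, tok in enumerate(tokens):
        if tok.lower() in places:
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for other in tokens[lo:hi]:
                if other.lower() in concepts:
                    counts[tok][other.lower()] += 1
    return counts

# Toy example with invented text, place names, and concept terms.
text = "En Tlaxcala los tributos de maiz eran grandes y en Cholula el mercado era rico".split()
print(geographic_collocations(text, ["Tlaxcala", "Cholula"], ["tributos", "mercado", "maiz"]))
```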

Article

Gary L. Kreps

Ehealth, also known as E-health, is a relatively new area of health communication inquiry that examines the development, implementation, and application of a broad range of evolving health information technologies (HITs) in modern society to disseminate health information, deliver health care, and promote public health. Ehealth applications include (a) the widespread development of specialized health information websites (often hosted by government agencies, health care systems, corporations, professional societies, health advocacy organizations, and other for-profit and nonprofit organizations); (b) the widespread use of electronic health record (EHR) systems designed to preserve and disseminate health information for health care providers, administrators, and consumers; (c) an array of mobile health education and support applications that have often been developed for use with smartphones; (d) mobile health behavior monitoring, tracking, and alerting equipment (such as wearable devices and systems embedded in vehicles, clothing, and sporting equipment); (e) interactive telemedicine systems for collecting health data and delivering health care services remotely; (f) interactive, adaptive, tailored health information systems to support health education, motivate health behaviors, and inform health decision making; (g) online social support groups for health care consumers, caregivers, and providers; (h) health promotion-focused digital games that engage consumers in health education and train both providers and consumers in health-promoting procedures; (i) dedicated computer portals that can deliver a variety of digital health information tools and functions to consumers, caregivers, and providers; and (j) interactive and adaptive virtual human agent systems that can gather and provide relevant health information, virtual reality programs that can simulate health environments for training and therapeutic purposes, and an ever-increasing number of digital applications (apps) for addressing a range of health conditions and activities. As information technology evolves, new ehealth applications and programs are being developed and introduced to provide a wide range of powerful ehealth systems to assist with health care and health promotion. Ehealth technologies have been found by many researchers, practitioners, and consumers to hold tremendous promise for enhancing the delivery of health care and the promotion of health, ultimately improving health outcomes. Many popularly adopted ehealth applications (such as health websites, health care portals, decision support systems, and wearable health information devices) are transforming the modern health care system by supplementing and extending traditional channels for health communication. The use of new ehealth applications enables the broad dissemination of relevant health information that can be personalized to the unique communication orientations, backgrounds, and information needs of individuals. New ehealth communication channels can provide health care consumers and providers with the relevant health information that they need to make informed health care decisions. These ehealth communication channels can provide this information to people exactly when and where they need it, which is especially important for addressing fast-moving and dangerous health threats.
Yet, with all the promise of ehealth communication, there is still a tremendous amount of work to be done to make the wide array of new ehealth applications as useful as possible for promoting health with different audiences. This article describes the current state of knowledge about the development and use of HITs, as well as about strategies for improving ehealth communication applications to enhance the delivery of health care and the promotion of public health.