Technology and society
From Wikipedia, the free encyclopedia
Technology and society (also technology, society and life or technology and culture) refers to the interdependence, co-influence, and co-production of technology and society. Evidence for this synergy has been found since humanity first started using simple tools. The inter-relationship has continued as modern technologies such as the printing press and computers have helped shape society. The first scientific approach to this relationship occurred with the development of tektology, the "science of organization", in early twentieth-century Imperial Russia.[1] In modern academia, the interdisciplinary study of the mutual impacts of science, technology, and society is called science and technology studies.
The simplest form of technology is the development and use of basic tools. The prehistoric discovery of how to control fire and the later Neolithic Revolution increased the available sources of food, and the invention of the wheel helped humans to travel in and control their environment. Developments in historic times, such as the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale.
Technology has developed advanced economies, such as the modern global economy, and has led to the rise of a leisure class. Many technological processes produce by-products known as pollution, and deplete natural resources to the detriment of Earth's environment. Innovations influence the values of society and raise new questions in the ethics of technology. Examples include the rise of the notion of efficiency in terms of human productivity, and the challenges of bioethics.
Philosophical debates have arisen over the use of technology, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar reactionary movements criticize the pervasiveness of technology, arguing that it harms the environment and alienates people. However, proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition.
The importance of stone tools, circa 2.5 million years ago, is considered fundamental to human development in the hunting hypothesis.[citation needed]
Primatologist Richard Wrangham theorizes that the control of fire by early humans and the associated development of cooking was the spark that radically changed human evolution.[2] Texts such as Guns, Germs, and Steel suggest that early advances in plant agriculture and animal husbandry fundamentally shifted the way that collective groups of individuals, and eventually societies, developed.
Technology has taken a large role in society and day-to-day life. As societies learn more about the development of a technology, they become better able to take advantage of it. Once an innovation reaches a certain point after being introduced and promoted, it becomes part of society. The use of technology in education provides students with technology literacy, information literacy, the capacity for life-long learning, and other skills necessary for the 21st-century workplace.[3] Digital technology has entered every process and activity of the social system; in fact, it has constructed an additional worldwide communication system alongside the original one.[4]
A 1982 study by The New York Times described a technology assessment study by the Institute for the Future, "peering into the future of an electronic world." The study focused on the emerging videotex industry, formed by the marriage of two older technologies: communications and computing. It estimated that 40 percent of American households would have two-way videotex service by the end of the century. By comparison, it took television 16 years to penetrate 90 percent of households from the time commercial service began.
The creation of computers enabled a far better way to transmit and store data. Digital technology became commonly used for downloading music and watching movies at home, whether on DVD or purchased online. Digital music recordings are not quite the same as traditional recording media, because digital recordings are reproducible, portable, and free.[5]
Around the globe, many schools have implemented educational technology in primary schools, universities, and colleges. In the early 1990s, Internet use in schools averaged 2–3 percent.[citation needed] By the end of the 1990s, as technology evolved rapidly, the figure reached 60 percent, and by 2008 nearly 100 percent of schools used the Internet for educational purposes. According to ISTE researchers, technological improvements can lead to numerous achievements in classrooms: e-learning systems, student collaboration on project-based learning, and the development of technological skills for the future all contribute to student motivation.[citation needed]
Although these previous examples show only a few of the positive aspects of technology in society, there are negative side effects as well.[6] Within this virtual realm, social media platforms such as Instagram, Facebook, and Snapchat have altered the way Generation Y understands the world and, thus, how its members view themselves. In recent years, there has been more research on the development of social media depression in users of such sites. "Facebook depression" occurs when users are so affected by their friends' posts and lives that jealousy depletes their sense of self-worth. They compare themselves to the posts made by their peers and feel unworthy or monotonous because they feel their lives are not nearly as exciting as the lives of others.[3]
Technology also has a serious effect on the health of youth. Overuse of technology is said to be associated with sleep deprivation, which is linked to obesity and poor academic performance among adolescents.[7]
In ancient history, economics began when the spontaneous exchange of goods and services was replaced over time by deliberate trade structures. Makers of arrowheads, for example, might have realized they could do better by concentrating on making arrowheads and bartering for their other needs. Regardless of the goods and services bartered, some amount of technology was involved, if no more than in the making of shell and bead jewelry. Even the shaman's potions and sacred objects can be said to have involved some technology. So, from the very beginning, technology can be said to have spurred the development of more elaborate economies. Technology is seen as a primary driver of economic development.[8]
Technological advancement and economic growth are related to each other: the level of technology helps determine the rate of economic growth, and it is technological progress that keeps the economy moving.
In the modern world, superior technologies, resources, geography, and history give rise to robust economies; and in a well-functioning, robust economy, economic excess naturally flows into greater use of technology. Moreover, because technology is such an inseparable part of human society, especially in its economic aspects, funding sources for (new) technological endeavors are virtually unlimited. However, while early technological investment involved little more than the time, efforts, and skills of one or a few people, today such investment may involve the collective labor and skills of many millions.
Most recently, because of the COVID-19 pandemic, the proportion of firms employing advanced digital technology in their operations expanded dramatically. Firms that adopted technology were found to be better prepared to deal with the pandemic's disruptions, and adaptation strategies such as remote working, 3D printing, and the use of big data analytics and AI to plan activities helped ensure positive job growth.[9][10][11]
Consequently, the sources of funding for large technological efforts have dramatically narrowed, since few have ready access to the collective labor of a whole society, or even a large part of it. It is conventional to divide funding sources into governmental (involving whole, or nearly whole, social enterprises) and private (involving more limited, but generally more sharply focused) business or individual enterprises.
The government is a major contributor to the development of new technology in many ways. In the United States alone, many government agencies specifically invest billions of dollars in new technology.
In 1980, the UK government invested just over six million pounds in a four-year program, later extended to six years, called the Microelectronics Education Programme (MEP), which was intended to give every school in Britain at least one computer, software, training materials, and extensive teacher training. Similar programs have been instituted by governments around the world.
Technology has frequently been driven by the military, with many modern applications developed for the military before being adapted for civilian use. However, this has always been a two-way flow, with industry often developing a technology that is only later adopted by the military.
Entire government agencies are specifically dedicated to research, such as America's National Science Foundation, the United Kingdom's scientific research institutes, and America's Small Business Innovation Research program. Many other government agencies dedicate a major portion of their budget to research and development.
Research and development is one of the smallest areas of investment made by corporations toward new and innovative technology.[citation needed]
Many foundations and other nonprofit organizations contribute to the development of technology. In the OECD, about two-thirds of research and development in scientific and technical fields is carried out by industry, with about 20 percent and 10 percent carried out by universities and government, respectively. But in poorer countries such as Portugal and Mexico, the industry contribution is significantly lower. The U.S. government spends more than other countries on military research and development, although the proportion has fallen from about 30 percent in the 1980s to less than 10 percent.[12]
Since its founding in 2009, Kickstarter has allowed individuals to receive funding via crowdfunding for many technology-related products, including new physical creations as well as documentaries, films, and web series that focus on technology management. This circumvents the corporate or government oversight most inventors and artists struggle against, but it leaves accountability for the project entirely with the individual receiving the funds.
The relationship between science and technology can be complex. Science may drive technological development by generating demand for new instruments to address a scientific question, or by illustrating technical possibilities previously unconsidered. An environment that encourages science will also produce the scientists, engineers, and technical schools capable of taking advantage of the existing science, which in turn encourages innovation and entrepreneurship. In fact, it is recognized that "innovators, like scientists, do require access to technical information and ideas" and "must know enough to recognize useful knowledge when they see it."[13] Science spillover also contributes to greater technological diffusion.[14] A strong policy supporting basic science gives a country access to a strong knowledge base, leaving it "ready to exploit unforeseen developments in technology"[15] when needed in times of crisis.
For most of human history, technological improvements were arrived at by chance, trial and error, or spontaneous inspiration. Stokes referred to these innovators as "'improvers of technology'…who knew no science and would not have been helped by it if they had."[15] This idea is supported by Diamond who further indicated that these individuals are "more likely to achieve a breakthrough if [they do] not hold the currently dominant theory in too high regard."[16] Research and development directed towards immediate technical application is a relatively recent occurrence, arising with the Industrial Revolution and becoming commonplace in the 20th century. In addition, there are examples of economies that do not emphasize science research that have been shown to be technological leaders despite this. For example, the United States relied on the scientific output of Europe in the early 20th century, though it was regarded as a leader in innovation. Another example is the technological advancement of Japan in the latter part of the same century, which emphasized more applied science (directly applicable to technology).[15]
Though the link between science and technology needs further clarification, what is known is that sufficient building blocks to encourage this link are critical for a society. A nation without an emphasis on science is likely to eventually stagnate technologically and risk losing its competitive advantage. The most critical areas of focus for policymakers are: discouraging excessive job-security protections, which reduce workforce mobility;[17] encouraging the reliable availability of sufficient low-cost capital for investment in R&D through favorable economic and tax policies;[18] and supporting higher education in the sciences to produce scientists and engineers.[18]
The implementation of technology influences the values of a society by changing expectations and realities. The implementation of technology is also influenced by values. There are (at least) three major, interrelated values that inform, and are informed by, technological innovations:
Technology often enables organizational and bureaucratic group structures that otherwise and heretofore were simply not possible. Examples of this might include:
Technology enables greater knowledge of international issues, values, and cultures. Due mostly to mass transportation and mass media, the world seems to be a much smaller place, due to the following:[21]
Technology can provide understanding of and appreciation for the world around us, and it can enable sustainability and improve environmental conditions, but it can also degrade the environment and facilitate unsustainability.
Some polities may conclude that certain technologies' environmental detriments and other risks outweigh their benefits, especially if or once substitute technologies have been or can be invented, leading to directed technological phase-outs such as the fossil fuel phase-out and the nuclear fission power phase-out.
Most modern technological processes produce unwanted byproducts, known as waste and pollution, in addition to the desired products. While material waste is often reused in industrial processes, many processes release waste into the environment, with negative side effects such as pollution and reduced sustainability.
Some technologies are designed specifically with the environment in mind, but most are designed first for financial or economic effects such as the free market's profit motive.[22] The effects of a specific technology are often dependent not only on how it is used (its usage context) but also on the technology's design or characteristics, as in the theory of "the medium is the message," which relates to media technologies in particular. In many cases, such predetermined or built-in implications may vary depending on contextual contemporary conditions such as human biology, international relations, and socioeconomics. However, many technologies may be harmful to the environment only when used in specific contexts or for specific purposes that do not necessarily result from the nature of the technology.
Historically, from the perspective of economic agents' responsibility, an increased value placed on healthy environments and more efficient productive processes (as of 2021, a commonly theoretical and informal value) may result from an increase in the wealth of society. Once people are able to provide for their basic needs, they can not only afford more environmentally destructive products and services, but may also be able to put effort, motivated for example by individual morality, into valuing less tangible goods such as clean air and water, provided that information about products, alternatives, consequences, and services is adequate.
From the perspective of systems science and cybernetics, economic actors and sectors make decisions based upon a range of system-internal factors and structures, or forms of leveraging existing structures. Other outcomes would result from other architectures, or system-level configurations of the existing designs, which are considered possible in the sense that they could be modeled, tested, assessed in advance, developed, and studied.
The effects of technology on the environment are both obvious and subtle. The more obvious effects include the depletion of nonrenewable natural resources (such as petroleum, coal, and ores) and the added pollution of air, water, and land. The more subtle effects may include long-term effects (e.g. global warming, deforestation, natural habitat destruction, and coastal wetland loss).
Each wave of technology creates types of waste previously unknown to humans: toxic waste, radioactive waste, electronic waste, plastic waste, and space debris.
Electronic waste creates direct environmental impacts through producing and maintaining the infrastructure necessary for using technology, and indirect impacts by breaking down barriers to global interaction through the use of information and communications technology.[23] Certain usages of information technology and infrastructure maintenance consume energy that contributes to global warming. This includes software designs such as international cryptocurrencies[24] and most hardware powered by nonrenewable sources.
One of the main problems is that contemporary societal decision-making processes, such as the economy and politics, fail to implement, at large scale and expediently, existing and potential efficient ways to remove, recycle, and prevent these pollutants.
Digital technologies, however, are important in achieving the green transition, and specifically the environmental targets of the Sustainable Development Goals (SDGs) and the European Green Deal. Emerging digital technologies, if correctly applied, have the potential to play a critical role in addressing environmental issues. A few examples are smart city mobility, precision agriculture, sustainable supply chains, environmental monitoring, and catastrophe prediction.[25][26]
Society also controls technology through the choices it makes. These choices not only include consumer demands; they also include:
According to Williams and Edge,[27] the construction and shaping of technology includes the concept of choice (and not necessarily conscious choice). Choice is inherent in both the design of individual artifacts and systems, and in the making of those artifacts and systems.
The idea here is that a single technology may not emerge from the unfolding of a predetermined logic or a single determinant; technology could be a garden of forking paths, with different paths potentially leading to different technological outcomes. This position has been developed in detail by Judy Wajcman. Therefore, choices could have differing implications for society and for particular social groups.
In one line of thought, technology develops autonomously, in other words, technology seems to feed on itself, moving forward with a force irresistible by humans. To these individuals, technology is "inherently dynamic and self-augmenting."[28]
Jacques Ellul is one proponent of the irresistibility of technology to humans. He espouses the idea that humanity cannot resist the temptation of expanding our knowledge and our technological abilities. However, he does not believe that this seeming autonomy of technology is inherent; rather, the perceived autonomy exists because humans do not adequately consider the responsibility that is inherent in technological processes.
Langdon Winner critiques the idea that technological evolution is essentially beyond the control of individuals or society in his book Autonomous Technology. He argues instead that the apparent autonomy of technology is a result of "technological somnambulism," the tendency of people to uncritically and unreflectively embrace and utilize new technologies without regard for their broader social and political effects.
In 1980, Mike Cooley published a critique of the automation and computerisation of engineering work under the title Architect or Bee? The human/technology relationship. The title alludes to a comparison made by Karl Marx on the issue of the creative achievements of human imaginative power.[29] According to Cooley, "Scientific and technological developments have invariably proved to be double-edged. They produced the beauty of Venice and the hideousness of Chernobyl; the caring therapies of Röntgen's X-rays and the destruction of Hiroshima."[30]
Individuals rely on governmental assistance to control the side effects and negative consequences of technology.
Recently, the social shaping of technology has had new influence in the fields of e-science and e-social science in the United Kingdom, where funding bodies have made centers focusing on the social shaping of science and technology a central part of their programs.