User:SteRos7/sandbox/Approaches to Knowledge/Seminar 1/History

History of Sociology
The roots of modern sociological thought have often been located in the European Age of Enlightenment and its influence on positivist thinking, with writers such as Henri de Saint-Simon cited as providing the foundations of a 'new science of society' in the aftermath of the French Revolution. Auguste Comte, similarly, has been said to have set the foundations for the science of society with his Course of Positive Philosophy in the early 19th century, outlining a theory of progressive stages of societal development and focusing on issues including the division of labour, religion, and social cohesion. Other prominent writers have also been noted for their contributions to the beginnings of sociological thought, including Marx, for his work on political economy and his materialist conception of history, and Weber, for his focus on the rise of modern Western society, the cultural determinants of capitalism, bureaucracy, and the development of the modern state.

Distinctions have been made between early sociological thought, characterised by narratives of progressive history, and classical sociological thought, characterised by the application of the scientific method to the study of society.

Despite these multiple early influences on sociological concepts, it is Durkheim, known most famously for his concepts of anomie and solidarity, who has been most widely recognised as the founding father of sociology as a discipline. The process of institutionalisation has been described as a slow one owing to resistance from other dominant disciplines of the time. It was Durkheim himself who founded L'Année Sociologique, an academic sociological journal in which he, along with others, published several writings, including his famous Le Suicide in 1897. René Worms also played a role in establishing sociology in France at this time with the International Institute of Sociology and other journals.

In England, the first sociological society (the Institute of Sociology) was founded following the death of the then influential Herbert Spencer in 1903. The first university courses were offered at the University of London, beginning in 1907, under Professors Leonard Hobhouse and Edward Westermarck.

In the United States, the beginnings of sociology have often been attributed to Albion Small at the University of Chicago in 1892. However, this interpretation has been disputed, with recognition of the fact that Frank Wilson Blackmar initiated his 'Elements of Sociology' class at the University of Kansas as early as 1890.

More recently within the discipline, proposals have been made to recognise earlier influential origins of sociological thought outside Europe, with debates over the extent to which the Islamic scholar Ibn Khaldun could justifiably be recognised as one of its founding fathers. It has been argued that he "discovered the essentials of sociology such as the systematic analysis of social structure and group behaviour...some five centuries before Comte coined the word". Similarly, efforts have been made to recognise other non-Western and female influences which have hitherto been underrepresented in the traditional sociological canon.

Understanding How Biology Evolved as a Discipline
Knowledge and concepts related to the contemporary understanding of biology can be found in Ancient Egyptian manuscripts dating back to 200 BCE. However, it was not until the 19th century that the discipline began to take shape, as paradigm shifts and scientific breakthroughs formed the basis of our modern understanding of biology.

Before that, the natural sciences had close ties to theistic explanations, with notable biologists and academics such as William Paley, Louis and Alexander Agassiz, and George McCready Price subscribing to the creationist explanation that we are the product of an intelligent designer: a god. Because of this, when Darwin published his theory of evolution in 'On the Origin of Species' in late 1859, much of the scientific community stuck to what it was familiar with, rejecting his explanation of how modern organisms emerged via natural selection. Moreover, like his peers, Darwin believed in blending inheritance, a now invalidated theory which prevented him from understanding his observations on how genetic variation in an animal population could be preserved, so that he could not offer a complete mechanism to support his theory.

At the turn of the century, in 1900, scientists rediscovered an 1866 paper by Gregor Mendel on his studies with pea plants, filling the gaps in Darwin's theory with what we now know as the Mendelian ratios and explaining how favourable traits persisted into the next generation, thus lending the theory more credibility. The following decades reformed scientists' understanding of our origins with the publication of Dobzhansky's '[https://en.wikipedia.org/wiki/Genetics_and_the_Origin_of_Species Genetics and the Origin of Species]'. The book draws from both Darwin's and Mendel's observations, offering a comprehensive explanation of evolution and ultimately leading Darwin's theory to become universally accepted.
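Mendel's monohybrid ratios are simple enough to enumerate directly. The following minimal Python sketch (an illustration only, not drawn from the sources above) crosses two heterozygous parents and recovers the familiar 1:2:1 genotype and 3:1 phenotype ratios:

```python
import itertools

# Monohybrid cross of two heterozygous (Aa) parents, where allele 'A'
# (e.g. round seed) is dominant over 'a' (wrinkled seed).
def cross(parent1, parent2):
    """Return the equally likely offspring genotypes, one allele from each parent."""
    return [a1 + a2 for a1, a2 in itertools.product(parent1, parent2)]

offspring = cross("Aa", "Aa")
dominant = sum("A" in genotype for genotype in offspring)
recessive = len(offspring) - dominant

print(offspring)                 # ['AA', 'Aa', 'aA', 'aa'] -> 1 AA : 2 Aa : 1 aa
print(dominant, ":", recessive)  # 3 : 1 phenotype ratio
```

Enumerating the four equally likely allele pairings is exactly why three quarters of offspring show the dominant trait, which is how favourable variants persist rather than blending away.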

This paradigm shift was the result of various pieces of information from different lines of research over time, all contributing to the consolidation of a prominent theory within biology. Subsequently, a new sub-discipline, evolutionary biology, gradually formed, with evolutionary biologists looking to the past whilst using contemporary theories and methods to reconstruct the events and processes that led to evolution. Such methods have proved beneficial to related disciplines, especially medicine, in tracing the "origins of pandemic-causing viruses, informing research in cancer treatment and determining susceptibility to specific diseases". In parallel, by assessing the discipline's history, one can see how knowledge in biology, or in the natural sciences for that matter, can change drastically through time, welcoming understood truths supported by replicable evidence whilst refuting false assumptions as new information is attained.

The rise of a new discipline: The Philosophy of Chemistry
Disciplines can die out through a lack of inputs or advancements, and a brutal change of direction, emphasis or approach can likewise be dramatic for a discipline. Conversely, new disciplines can emerge from the inputs of related areas. The key factor is the emergence of new knowledge when multiple disciplines formally come together to focus on a common issue or problem. This interdisciplinarity allows new fields to appear, as is the case for the philosophy of chemistry. Philosophy has long been related to chemistry: in the past, great philosophers such as Hegel, Comte or Engels claimed that chemistry was a source of fundamental insights into physical reality. At first, however, philosophers of science neglected the discipline, preferring "grand, unifying theoretical visions instead of complicated local sights". The physicist Dirac even claimed in 1929 that chemistry was nothing more than applied quantum mechanics.

However, philosophers eventually began to focus their attention on chemistry. Chemistry first received widespread attention regarding its reduction to physics. For a long time, scientists and philosophers believed that any chemical phenomenon could be deduced from the Schrödinger equation. It turned out, however, that applying quantum mechanics to problems of molecular structure requires some basic knowledge of chemistry, which suggests that chemistry is an autonomous discipline distinct from physics.

This discipline treats time as more than an axis on a graph: the temporal and spatial dimensions must be accorded the same complexity to explain important phenomena. In contrast with physics, chemistry has its own distinct language, organised and classified in the famous periodic table, as well as unique visual representations of molecules. This uniqueness means it deserves its own philosophical investigation. Models of nature are central in chemistry, as chemists do not deal directly with nature. Because models are never absolutely true or false, they are only more or less effective for a given task, and chemists rely heavily on them to explain experimental observations.

The phrase “natural kind” denotes a group of objects that share a significant characteristic or property; examples of natural kinds often used by philosophers are fundamental physical particles and chemical elements. Synthetic chemists, moreover, are able to produce artificial compounds identical to natural ones, and can also create new compounds that have never existed before. These achievements show that the impact of chemistry on society goes far beyond its substances, and the boundary between the natural and the unnatural that they unsettle has been debated for centuries in religious and philosophical discussions. This is how this new discipline emerged, joining philosophy and chemistry, two areas that are closely linked to each other.

Before the 1980s the term philosophy of chemistry would have been considered irrelevant. However, with the evolution of its content and of the way philosophers approach the discipline of chemistry, this new discipline nowadays makes sense. Indeed, two journals are devoted to the field: Hyle and Foundations of Chemistry. In addition, the society for the philosophy of chemistry emerged in 1997 and has held meetings each year since then to discuss the philosophy of science. The link between philosophy and chemistry is further illustrated and explained in the following article: Physical Chemistry

History of Clinical Psychology
The field of clinical psychology, understood as the assessment and treatment of persons with emotional, cognitive and behavioural problems, is widely considered to have originated in 1896, when an American psychologist, Lightner Witmer, the founding father of clinical psychology, established the first facility treating patients with mental disorders at the University of Pennsylvania. Witmer was one of the first advocates of applying psychology to treat patients rather than only carrying out academic research, and he devoted his early work to studying and helping children with learning disabilities. The discipline thus had a somewhat autonomous emergence as professional practice rather than purely academic research. Although 1896 is formally acknowledged as the birth of clinical psychology, many earlier events of great importance contributed to the development of the discipline.

The earliest recorded approaches to treating mental issues (from as early as 7000 years ago) combined religious, magical and medical techniques. Trephination, for instance, was one of the ancient procedures believed to improve one's mental soundness.

History of Behavioural Economics
Behavioural economics as an idea originally arose during the classical period of economics, a school of economic thought of the late 18th century, as a way of explaining people's behaviour in economic decisions through psychological underpinnings. Adam Smith, a key figure in classical economics, drew inspiration from psychology for his first work, The Theory of Moral Sentiments, published in 1759. Other notable figures of the era, such as Jeremy Bentham, also wrote extensively on the psychological aspects of utility. These ideas were further studied and elaborated upon, developing into a field of their own during the neoclassical period of economics from the late 19th century onwards, when economic psychology emerged and the concept of homo economicus was born. Later work demonstrated how, in many instances, humans do not behave as the rational beings this concept assumes, an assumption which was the basis of many historical economic models.

Herbert A. Simon introduced bounded rationality as an alternative basis for decision-making: the idea that humans are limited in their capacity to process information and tend to settle for satisfactory rather than optimal outcomes. Bounded rationality became the basis of later works and models such as Gerd Gigerenzer's "Fast and Frugal Heuristics", and Richard Thaler and Cass Sunstein's 2008 book "Nudge: Improving Decisions About Health, Wealth, and Happiness".

In 1979, Prospect Theory was published by Amos Tversky and Daniel Kahneman, undermining mainstream economic ideas of the time. They used cognitive psychology to explain how decision-making is not always optimal, focusing on principles including loss aversion, reference dependence and non-linear probability weighting. They revisited this in 1992, in the Journal of Risk and Uncertainty, with cumulative prospect theory, examining traits such as overconfidence and projection bias.
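Loss aversion and non-linear probability weighting can be made concrete with a short sketch. The Python below uses the functional forms and the commonly cited parameter estimates from Tversky and Kahneman's 1992 paper (an illustration under those assumptions, not code from the text above):

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of an outcome x relative to a reference point of 0.
    With lam > 1, losses loom larger than equivalent gains (loss aversion)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: small probabilities are overweighted,
    moderate-to-large probabilities underweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# A $100 loss hurts more than a $100 gain pleases:
assert abs(value(-100)) > value(100)
# A 1% chance feels bigger than 1%; a 90% chance feels smaller than 90%:
assert weight(0.01) > 0.01 and weight(0.9) < 0.9
```

Reference dependence is captured by evaluating outcomes as gains or losses relative to the zero reference point rather than as final wealth levels.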

These behavioural models combine ideas of psychology and neuroscience into microeconomic theory to better understand market decisions and public choice.

History of Demarcation between Scientific and Pseudoscientific Disciplines
Definitions

Science and pseudoscience group together similar disciplines. Many study similar topics, yet have different rhetoric, methodologies and tests to validate their truths.

Science - Disciplines with knowledge gained objectively through established rules and evidence. This knowledge is used to further our understanding of the world.

Pseudoscience - Disciplines that are incompatible with modern scientific content and methodologies yet state or imply that they are grounded in them. Examples of classified pseudosciences are alternative medicine, parapsychology, the transcendental meditation movement and paranormal studies. Theories rather than disciplines, such as flat-earth conspiracies, climate change denial, aryanism and ancient astronauts, are included as well.

History

In 1796 J. P. Andrews used 'fantastical pseudoscience' to describe alchemy, the first known use of the word. Frequent use began in the 1880s, and the term has carried a defamatory meaning throughout, so those who work within pseudoscientific disciplines do not declare their work as such because of the word's connotations.

It was in the 1960s that Karl Popper put forward 'falsifiability' to address the demarcation challenge of separating scientific disciplines from pseudoscientific ones and from religious or metaphysical claims. If a discipline or theory makes claims that cannot in principle be disproved (that are not falsifiable), then it can be classed as pseudoscience. In 1978, Thagard put forward additional criteria: a field can be classed as a pseudoscience if its practitioners are selective in 'considering confirmations and disconfirmations', and if there is a lack of progression or of comparative evaluation of the theory.

The classification of disciplines can be difficult due to their historical nature. Many disciplines related to the pseudosciences began in the pre-scientific era, and some scientific disciplines derive from pseudoscientific ones.

On a surface level, categorisation as pseudoscience can occur when disciplines are 'bad science' and fail the tests that scientific disciplines use to validate their truths. The separation can also occur when a discipline is unrelated to science but its content threatens the public's understanding of science, as with eugenics or creationism, or when a discipline poses as and claims to be science.

The Need to Redefine Pseudoscience for the Sake of Interdisciplinarity

I believe such a short history of demarcating disciplines into pseudoscience and science shows there is work to do before we reach a better way of categorising, one that keeps dangerous theories separate from science without losing and demonising disciplines that can help and inform others in interdisciplinary ways.

Many disciplines are categorised as pseudoscience because their methodologies and motives do not match those of modern science and they cannot be falsified by modern Western scientific methods, not because they are a threat. This leads to disciplines being grouped with theories that do not belong together. For example, Chinese medicine, an established discipline since the 3rd century BC that still has formalised teaching globally, societies and published journals, cannot be falsified by modern scientific methods and so is grouped with the clear threat to humanity that is aryanism. This is problematic because pseudoscience is a negative relational category to science, so any disciplines classed as pseudoscience are automatically put in a negative light, generalised as 'quackery' or 'dangerous' and discredited. Obviously this is correct in the case of theories like aryanism, but I believe the negative relational category means we will miss content and ways of thinking that are unique to non-threatening pseudoscience disciplines and that could inform other disciplines in new and exciting ways.

As with Chinese medicine, many disciplines within pseudoscience are Eastern, and I believe it is problematic to categorise them with such a derogatory term just because they do not match modern science standards stemming from the European 'Age of Enlightenment' and the '[https://en.wikipedia.org/wiki/Scientific_Revolution Scientific Revolution]'. I believe this kind of hierarchical grouping of disciplines is counterproductive to interdisciplinary approaches and encourages academic [https://en.wikipedia.org/wiki/Orientalism orientalism]. For example, Vastu Shastra, the ancient Hindu system of architecture, is classed as pseudoscience, as are Feng Shui (the astrology-based Chinese aesthetic system) and crystal healing (of Indian, Chinese and Native American origins).

I also believe the relational categorisation of pseudoscience goes against an interdisciplinary approach, as it perpetuates a hierarchical categorisation of disciplines rather than promoting a spectrum on which they all inform and intersect. It also means the category can be used by scientific academics (whether consciously or subconsciously) as a manifestation of the sociological idea of intergroup comparison, whereby academics gain self-esteem from their in-group (science) being presented in a positive light and the out-group (pseudoscience) in a negative one, and want this to continue so as to keep gaining self-esteem. This is counterproductive to free thinking, exploration across discipline boundaries and open-mindedness.

Artificial Intelligence: A trendy discipline
Artificial Intelligence (AI) are two words that make most of those who hear them dream. Inventors are no longer trying to replace muscles and bodies, as was the case during the Industrial Revolution with strong machines capable of acting faster and sometimes better than humans, but brains and minds. If machines took over every manual task in society, humans would still have their minds, their brains, their capacity to think intelligently to differentiate themselves. Artificial Intelligence is about imitating the human brain and consciousness: inventing a machine able to think and therefore conscious that it is, that it (or he?) exists. AI is clearly a discipline: there are departments of AI, such as the DAI in the School of Informatics of Edinburgh.

What is the history of this discipline? The term was first used by John McCarthy in 1956 during an academic conference. Yet one of the first scientists to take an interest in AI was Alan Turing, famous for breaking the Enigma cipher. He wrote a paper in 1950 on the concept of Artificial Intelligence as we define it: machines capable of “being” human. AI's progress has gone up and down from 1957 to this day. From 1957 to 1974, inventors were able to create more and more advanced machines (which can be called robots) thanks to innovations in computers, electronic systems and coding. Nevertheless, the scientists' objectives were not met, mostly because computer memory was insufficient. The discipline therefore stagnated for about a decade, until the 1980s, when computers became more powerful and funding skyrocketed. From 1982 to 1990, the improvement in AI was not as great as inventors had expected, but some progress was made. During this period, the rise of expert systems, the knowledge revolution and financial returns enabled a boom in the field of AI. As a result, in 1997, IBM's Deep Blue computer, which was essentially an AI, won against the world chess champion.
Machine learning became achievable: in 2012, a Google Brain network managed to train itself to recognise a cat by watching millions of videos. This discipline inspires many young scientists and inventors, so its future seems assured as technological inventions are made every day. How did AI become so well known? The history of AI was told to the general public mainly through movies and books that make people fantasise about it. For example, in 1950 Isaac Asimov wrote I, Robot; in 1984 the famous movie Terminator by James Cameron was released; and in 2001 A.I. Artificial Intelligence by Steven Spielberg was a hit. These works helped the discipline become well known on the one hand, but also poorly understood on the other: they made people think the concept of AI was readily achievable, whereas for now, and for many years before, it has been in a difficult position. One of the main reasons for this is the ethics related to AI. Will robots (the physical incarnation of AI) be subject to laws like humans? If they can think and have feelings, will they be able to revolt against humans when they do not want to do what they are supposed to do?

History of Epistemology
Today, epistemology is considered a major branch of philosophy, but it was not until the 17th century that it was referred to with a single word for the first time, indicating its recognition as a unified field of study. Yet the nature of knowledge is such a central philosophical problem that philosophers as early as the pre-Socratics held the position that knowledge was unchangeable, even though their work did not directly focus on it – indeed, any inquiry must directly or indirectly address epistemological issues.

Later, Plato built a theory of knowledge that grappled more directly with epistemological issues, as did Aristotle. A more radical development stemmed from the Ancient Skeptics, such as Arcesilaus, Carneades and Pyrrho, who held that knowledge was impossible. In the centuries since, philosophers such as Thomas Aquinas, René Descartes, John Locke and Bertrand Russell have all contributed to this millennia-old field of philosophy. Of course, such a central philosophical problem was not tackled by Western philosophy alone; epistemology also featured heavily in ancient Indian and Buddhist thought, for instance.

History of Epistemology and Historical Epistemology

With such a rich and storied history, there is much debate about how historical perspectives on epistemology should be considered in the present day. Some hold that the problems of epistemology are so fundamental and unchanging that the perspectives, definitions and arguments of philosophers throughout history have value even within a modern framework – a sort of ‘historical epistemology’ – while others believe that the discipline has built upon and progressed past historical work in epistemology. While every discipline is informed by historical perspectives, it is often not to such a large extent as the philosophical disciplines. The fact that such a debate continues at all may suggest that while the literature has undeniably grown and become more complex, we are barely closer to answering foundational questions about knowledge than we were millennia ago.

Art History as a Discipline
Art history is the study of various objects of art in their respective historical contexts. Traditionally, this has meant material objects such as paintings, sculptures, and buildings. However, art history has since evolved to include different art forms and media, such as fashion, photography, and performance art.

It is important to distinguish art history as a discipline from art criticism and philosophy of art. Art criticism concerns itself with interpreting meaning and evaluating specific works of art, i.e. it asks the question “is this work of art good?”. It serves to help pass judgement on artworks. Meanwhile, philosophy of art as a discipline concerns itself with the fundamental nature of art and aesthetics. It asks the questions “what is art?” and “what makes something beautiful?”. On the other hand, art history does not deal with these questions. It uses the historical method to analyze objects of art within their time period. Art history concerns itself with questions of “why and for whom was this work of art created?” and “why was this artwork created this particular way and by this particular artist?”. It is essential that this distinction be made to prevent any confusion between disciplines. Art history requires the rigour of historical analysis along with the visual skills required to interpret works of art and examine their form.

History of Art History

The roots of art history can be traced back to passages in Pliny the Elder's Natural History about the development of Greek sculpture and painting. Pliny the Elder wrote extensively about the history of sculpture and painting, and it is believed that his writing was strongly influenced by the work of Xenokrates, one of the world's first art historians.

Giorgio Vasari wrote Lives of the Most Excellent Painters, Sculptors, and Architects, which is known as “the first important book on art history”. Vasari’s writing has been described as “by far the most influential single text for the history of Renaissance art”. However, Vasari’s work holds biased views and contains various inaccuracies.

Heinrich Wölfflin is known as the “father” of modern art history. He introduced a scientific approach to art history and was instrumental in devising a method for tracing the development of style over time and making distinctions between different artistic styles. Furthermore, unlike Vasari, Wölfflin was uninterested in the biographies of artists and proposed the creation of an “art history without names”.

We can observe how over time, art history developed to become a more rigorous, scientific, and accurate discipline.

History of Interdisciplinarity in the study of Artificial Intelligence
The formal study of AI as an academic discipline began in 1956 at the Dartmouth conference, when a team of scientists convened to attempt technical advances in a field which had been purely theoretical until then. This team was composed of scientists from a wide variety of disciplines, including mathematics, computer science, neuroscience, psychology and economics. From its formal inception as an academic discipline it has been an interdisciplinary field, but even before the study of AI began it had been a topic discussed in disciplines such as philosophy and literature.

The question of whether all intelligence could be reduced to mathematical reasoning was discussed by philosophers such as Leibniz, Thomas Hobbes and Descartes in the 17th century. In 1950 Alan Turing gave grounding to the formerly abstract question of what constitutes intelligence. In his paper, titled Computing Machinery and Intelligence, he described a test called ‘The imitation game’ in which a machine attempts to deceive an interrogator into believing it is human. He proposed that a machine’s success in this game would be a valid metric for intelligence.

Biological principles have been responsible for many major breakthroughs in the field of Artificial Intelligence. In 1943, modelling of the human brain by neuroscientists inspired McCulloch and Pitts to design neural networks, which are now a staple technique in AI development. Another notable example is the evolutionary algorithms pioneered by Lawrence J. Fogel in 1960, which use random mutation and selection iteratively to fine-tune algorithms for a specific task.
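The mutation-and-selection idea can be shown with a toy Python loop (purely illustrative: the bit-string task, population size and mutation rate are invented for the example and are not Fogel's original formulation):

```python
import random

random.seed(0)
TARGET_LEN = 20  # evolve a bit string toward all ones

def fitness(genome):
    """Fitness = number of 1-bits; the all-ones string scores TARGET_LEN."""
    return sum(genome)

def mutate(genome, rate=0.05):
    """Flip each bit independently with the given small probability."""
    return [bit ^ (random.random() < rate) for bit in genome]

# Random initial population of 30 candidate bit strings.
population = [[random.randint(0, 1) for _ in range(TARGET_LEN)] for _ in range(30)]

for generation in range(200):
    # Selection: keep the fitter half, refill by mutating random survivors.
    population.sort(key=fitness, reverse=True)
    survivors = population[:15]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(15)]
    if fitness(population[0]) == TARGET_LEN:
        break

best = max(population, key=fitness)
print(fitness(best))  # best fitness reached by the loop
```

Because the fittest candidates survive unchanged each generation, the best score never decreases, and random mutation supplies the variation that selection then accumulates toward the target.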

In the 21st century, as AI has become ubiquitous in society, there have been warnings from respected academics, such as Stephen Hawking, about the potential dangers that a rise in automation may bring. An inherent flaw of AI is that the techniques used to develop it often leave its creators with very little understanding of how it works and of what unforeseen consequences may arise from its implementation. A notable example is an algorithm called COMPAS, which was used to assist judges in determining sentence length. Unbeknownst to its creators, racial biases from the data set of previous rulings on which it was trained were embedded in the algorithm. This highlights the need for research in ethics and the social sciences to combat the moral pitfalls of AI.

History of Mathematics
Mathematics can be generally divided into four categories: number, space, symbols and inference. It is generally assumed that a common mode of thought called "mathematics" is hardwired in humans: counting and common shapes, such as squares and rectangles, have the same meaning to everyone. The practice of mathematics can be dated back to pre-history. For example, a wolf bone discovered in Czechoslovakia in 1937 bears two series of notches, grouped in fives, indicating that people were counting with the bone. There were disagreements on how old the bone was; some believed it could be 30,000 years old. Throughout human history, the recognition and application of numbers, space, symbols and inference can be detected in archaeological evidence and primary sources.

However, the application of mathematics cannot be equated with the beginning of mathematics as a discipline. It was not until later in history that people began to develop more sophisticated and systematic methods of dealing with numbers, space, symbols and inference, leading to arithmetic, geometry, logic and beyond, which then formed mathematics as a discipline. Some groups of people began to devote themselves to learning and studying those mathematical areas, and to pass on, acquire and improve the knowledge, which marked the beginning of mathematics as a discipline.

The birth of mathematics as a discipline differed across countries and regions. The earliest sophisticated studies of mathematics in its own right can be traced back around 4,000 years in some areas, but the earliest proposals for, or practice of, establishing mathematics as a discipline within educational systems can only be dated back around 2,000 years in some countries. For example, in Plato's Republic, written more than 2,000 years ago, Socrates decided on four subjects the leaders of the ideal state must learn, namely arithmetic, geometry, astronomy and music, three of which fall within mathematics. This can be seen as an early recognition of mathematics as a discipline. In China, the earliest surviving book of mathematical study is more than 2,000 years old, but it was not until the Tang Dynasty in the 7th century that mathematics became part of the curriculum in a standardised educational system for the training of civil servants.