User:TimRJordan/sandbox/Approaches to Knowledge/2020-21/Seminar group 4/History

The History of Psychology as a Branch of Medicine
Psychology can be defined simply as the study of experience and behaviour, according to Sonja Hunt in The relationship between psychology and medicine, which gives it a point in common with medicine: an interest in human functioning. The involvement of psychology in medicine has been neither continuous nor unanimously encouraged, and the interdependence of body and mind was questioned over the centuries in parallel with the interdependence of psychology and medicine. In Ancient times, mind and body were believed to be linked. The saying "Mens sana in corpore sano", a quotation from Juvenal's Satires, is still used today; it shows how deeply rooted the idea is that well-being requires not only good physical shape but also a good state of mind. The link between the two disciplines was questioned, however, when the view of Man changed: body and mind became two distinct areas of study in the 17th century, notably through the work of Descartes (Meditationes de prima philosophia, Meditation VI).

The view of the link between mind and body has since changed. Psychology is now considered essential and has become a medical discipline, even though it does not rely on prescriptions and medicines as psychiatry does. It may still be marginalised, however, because of its sometimes abstract character, less concrete than other branches of medicine, which Hunt argues can undermine the credibility of the discipline. Nevertheless, psychology is increasingly recognised, especially with the growing awareness of the importance of mental health.

The History of Material Culture
Analysing objects is not a recent way of understanding other communities, as it has always been implicit in ethnographic work. Its recognition as a discipline, however, may only be dated to the late 1990s, with the creation of the Journal of Material Culture in 1996, first edited by members of the UCL Department of Anthropology. This marks the point at which academics began to share their conceptions of material culture, and thus evidences its beginning as a discipline in its own right.

At first, in anthropological research, the aim of material culture studies was to prove the 'modernity' of Western culture by comparing its 'evolved' objects with the 'primitive' objects of non-Western cultures: European culture was presented as superior. Colonialism thus entrenched the supremacy of Western assumptions, which also led to a 'masculine' hierarchy of the senses placing sight at the top of the scale. This traditional 'ocularcentric' mode of analysis has been questioned since the assertion of non-Western cultures through decolonisation and the emergence of material culture studies as a distinct discipline. The development of material culture studies makes it possible not to discriminate against any culture or ethnic group, by criticising a single approach to objects. Social change in new pluricultural communities has increasingly challenged the conventional way of interacting with objects, especially in museums, where only sight is engaged. Although each culture has its own sensory model, only the Western model was considered relevant, which again showed the desire to distinguish European culture from others. Material culture studies as a discipline seeks to teach people to apprehend objects from different societies across the senses, "not only to see objects but to sense objects", as stated in Sensible Objects: Colonialism, Museums and Material Culture.

Material culture studies will therefore keep changing, challenging traditional modes of analysis rooted in colonial history. The question of moving beyond a purely visual approach to artefacts exhibited in museums is a complex one, since the issue of their physical preservation also has to be raised.

Logic as discipline
The word "logic" originates in the Greek word logos. Traditionally, logos is translated as "reason", but this translation has been contested by scholars and, as a result, its entry in A Greek-English Lexicon (Liddell–Scott–Jones, LSJ) has over 60 translations, including "speech", "word", and "argument". A broad definition of logic as "the appraisal and analysis of arguments" has also been proposed. Whilst humans have always reasoned and engaged with arguments, and therefore used logic, medieval universities distinguished logica utens (the use of logic in thought, speech, and writing) from logica docens (the formal study of logic as a discipline), with Aristotle considered the founder of the latter in the West.
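The definition of logic as "the appraisal and analysis of arguments" can be made concrete with a small worked example. The sketch below is my own illustration, not drawn from any work cited here: in propositional logic, an argument form is valid when no assignment of truth values makes every premise true while the conclusion is false, and this can be checked exhaustively.

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    """Material implication: p -> q is false only when p is true and q is false."""
    return (not p) or q

def is_valid(premises, conclusion) -> bool:
    """An argument form over two variables is valid if no truth assignment
    makes all premises true and the conclusion false."""
    for p, q in product([True, False], repeat=2):
        if all(f(p, q) for f in premises) and not conclusion(p, q):
            return False  # found a counterexample
    return True

# Modus ponens (P, P -> Q, therefore Q) is valid:
print(is_valid([lambda p, q: p, implies], lambda p, q: q))  # True
# Affirming the consequent (Q, P -> Q, therefore P) is not:
print(is_valid([lambda p, q: q, implies], lambda p, q: p))  # False
```

This is logica docens in miniature: the validity of the form is appraised independently of any particular content substituted for P and Q.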

Significant Work
Aristotle wrote a total of six books on logic, collectively called the Organon. In the Topics, his first attempt at a logic textbook, he discusses the invention of arguments based on endoxa, "consensus", and calls this the art of dialectic. Aristotelian logic continued to be studied in medieval Europe, and most works on logic remained based on Aristotle until the mid-thirteenth to mid-fourteenth century, when more original work was developed.

The most influential works in logic after Aristotle include the Port-Royal Logic, published in 1662 by Antoine Arnauld and Pierre Nicole and considered the start of traditional logic, and John Stuart Mill's A System of Logic, published in 1843, which prompted a new perspective on logic as a branch of psychology. Since then, several modern logical systems have been developed, especially in mathematical logic, which after the Second World War split into model theory, proof theory, computability theory, and set theory.

Logic in Higher Education
Whilst logic classes and courses are common at universities, logic degrees are relatively rare and new. These are usually interdisciplinary and taught in departments of Philosophy or Mathematics, and more recently also Computer Science.

History of Artificial Intelligence within Computer Science
Oxford Languages defines artificial intelligence (AI) as 'the theory and development of computer systems able to perform tasks normally requiring human intelligence'.

The Dartmouth Summer Research Project on Artificial Intelligence (DSRPAI), held in 1956, saw the emergence of AI as a discipline. It was here where John McCarthy, who first coined the term 'artificial intelligence', met with other academics such as Marvin Minsky, Nathaniel Rochester, and Claude Shannon to research and discuss AI.

Before its establishment as a discipline, AI was a popular subject amongst researchers. Perhaps the most notable work was the Logic Theorist, a computer program created by Allen Newell, Herbert A. Simon, and Cliff Shaw in 1955, which was later presented at the DSRPAI.

Significant Work
Alan Turing was a British mathematician and computer scientist who famously cracked the Enigma code during the Second World War and achieved feats in the field of AI before its formal debut. In his 1950 paper 'Computing Machinery and Intelligence', he details the Turing test, which sought to answer the question "Can machines think?". Turing describes 'the imitation game', in which an examiner questions a computer and a human and tries to tell one from the other. Inspired by Turing's work, the CAPTCHA is now widely used to determine whether a user is a computer or a human.

At the MIT Artificial Intelligence Laboratory, Joseph Weizenbaum developed ELIZA, a natural language processing computer program released in 1966. Many users thought ELIZA could understand them, but it was simply "pattern matching". Despite this, ELIZA was one of the first programs capable of attempting the Turing test.
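The "pattern matching" that ELIZA relied on can be sketched in a few lines. The rules below are hypothetical stand-ins, not Weizenbaum's original DOCTOR script: each rule pairs a pattern with a response template, and the captured text is echoed back, giving the appearance of understanding without any.

```python
import re

# Toy ELIZA-style rules (illustrative only, not the historical script):
# each regex captures part of the user's utterance to reuse in the reply.
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]
DEFAULT = "Please go on."

def respond(utterance: str) -> str:
    """Return the first matching template, echoing the captured text."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(match.group(1).rstrip(".!?"))
    return DEFAULT

print(respond("I am feeling anxious"))  # How long have you been feeling anxious?
print(respond("The weather is nice"))   # Please go on.
```

The fallback reply for unmatched input is what made ELIZA's Rogerian-therapist framing so effective: a non-committal prompt is always plausible in that register.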

In 1997, the reigning world chess champion Garry Kasparov was defeated by IBM's supercomputer Deep Blue in a six-game match. Of the six games, three were drawn, one was won by Kasparov, and two by Deep Blue.

Education and the Future of AI
In higher education, artificial intelligence is primarily taught as part of a computer science degree, but it is now increasingly offered as a standalone course. AI has roots in many subject areas, including philosophy, mathematics, psychology, and biology. Within psychology, for example, building artificially intelligent machines requires an understanding of how human brains function. The versatility of AI as a discipline is reflected in its many sub-fields: neural networks, robotics, speech processing, machine learning, and so on.

As AI expands as a discipline, the worldwide AI software market grows year on year (by approximately 54% from 2019 to 2020). However, there are concerns about AI and what it means for our future, often centred on unemployment, security, inequality, and the singularity. These concerns point to a different direction for AI, one of interest to the public sector.

History of Forensic Linguistics as a Discipline
[https://en.wikipedia.org/wiki/Forensic_linguistics Forensic linguistics] is a subsidiary of applied linguistics which involves applying linguistic understanding as forensic evidence in legal proceedings. As with most sciences, themes of forensic linguistics can be traced back to Ancient Greece, and author identification (a key pillar of the discipline) has been a topic of public debate since literature began. There were attempts to establish means of author attribution in the nineteenth and early twentieth centuries, with methods rooted in mathematics and statistics using quantitative features such as average word and sentence length, but these fall outside the definition of forensic linguistics, lacking both forensic application and established linguistic technique.
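The early quantitative attribution methods mentioned above amounted to computing a few simple statistics over a text. A minimal sketch (my own illustration of the general approach, with a made-up sample sentence) shows how little linguistics they involved:

```python
import re

def stylometric_features(text: str) -> dict:
    """Compute the simple quantitative features used by early author-attribution
    studies: average word length and average sentence length (words/sentence)."""
    # Split on sentence-final punctuation; drop empty trailing fragments.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    # A crude word tokeniser: runs of letters and apostrophes.
    words = re.findall(r"[A-Za-z']+", text)
    return {
        "avg_word_len": sum(len(w) for w in words) / len(words),
        "avg_sentence_len": len(words) / len(sentences),
    }

sample = "The cat sat. The dog barked loudly!"
print(stylometric_features(sample))  # 7 words over 2 sentences
```

Two authors' texts would then be compared by these averages alone, which is exactly why such methods lack the "established linguistic technique" required of forensic linguistics proper.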

The term forensic linguistics itself was coined in 1968 by Jan Svartvik, when he was commissioned to analyse a body of statements given at Notting Hill Police Station in the case of Timothy John Evans, who was accused of murdering his wife and child and was convicted and hanged in 1950. Evans' statements had troubled many experts, who questioned their authenticity, and Svartvik's analysis showed that they could not have been dictated by him. The case demonstrated the effectiveness of methodically examining bodies of text, and the technique began to gain traction.

The American linguist Roger Shuy is widely regarded as the founder of modern forensic linguistics, and his work, along with that of others, has shaped many aspects of civil and criminal practice, solidifying the discipline's place and use. The International Association of Forensic Linguists (IAFL) was formed in the late 20th century and is the primary organisation for forensic linguists. The IAFL edits the International Journal of Speech, Language and the Law, a peer-reviewed journal focused on all aspects of forensic language and audio analysis, and also holds biennial conferences. Additionally, forensic linguistics is taught internationally at universities, sometimes as a stand-alone degree but more often within joint honours degrees or specific modules.

The Evolution of Beauty and Visual Culture within the Arts
Many of us spend hours every day curating and maintaining our social media presence; Instagram is a conduit of modern visual culture. Our perceptions of ourselves, society, and the world are all mediated by consumption and aesthetic pleasure. Western society has a long history with the visual, and it is interwoven into many aspects of our worldview. This history permeates our everyday lives through the influence of magazines, YouTube, Instagram, and television on our self-esteem, how we feel about our bodies, and our identity. The cultural anthropologist Seremetakis argues that the Western fetishisation of sight and aestheticism is seductive and pushes the other senses to our cultural periphery. The documentation of social expectations drives our digital economy: what one must purchase and achieve in order to be 'normal'.

Literature and art have changed and manipulated our perspective on beauty over time, not only of ourselves but also of objects and architecture. These disciplines act as gateways to new perspectives on popular visual culture, such as new genres of literature or new art movements like the Bauhaus. In his poem Ode on a Grecian Urn, John Keats states that 'beauty is truth, truth beauty', suggesting that a more realistic representation of life and human experience should be the principal subject of art.

However, this realism within the creative disciplines was interrupted by the Aesthetes of the Pre-Raphaelite era (such as Dante Gabriel Rossetti and William Morris). This movement upturned expectations of art's subject, holding that art exists simply "for art's sake" and that its primary function is to exemplify beauty and please the senses. Danto theorised that once beauty is made the epicentre of an art piece, the work becomes redundant in terms of meaning, the disclosure of knowledge, and its function within capitalism. The history of the creative disciplines is filled with discussion and dispute over the function of the discipline itself. Beauty has been moulded and shaped by literature through a wide number of texts; Virginia Woolf's Orlando, for instance, discloses the privilege of beauty and how it can be attained and maintained through wealth and power. Woolf's main subject is the beauty of the human body, seen through the eyes of a gender-fluid protagonist and their possessions.

In an age and milieu where the importance of the visual, beautiful or not, galvanises disciplines such as art and literature while dividing smaller subcategories such as the Aesthetes and the Didactic school, the concept of beauty can be seen as both limiting and liberating. The history and progression of visual culture gives academics in the discipline the tools to analyse the current visual pandemic of Instagram, the central platform for visual culture in the current epoch. When one considers that 500 million people globally post an Instagram story each day, there is no doubt that academic and statistical analysis of the visual is fertile ground.

The History of Inuit Studies
The study of the Inuit people, previously known as 'Eskimology', can be traced back to 1745, to the missionary works of Hans Egede in Greenland, as well as those of Ivan Veniaminov (1840) in Alaska. At this time, studies were dominated exclusively by non-native intellectuals, emphasising 'facts' instead of theoretical discourse about how these details might best be represented. The history of the discipline shows how the direction of change can be 'influenced by the dilemma of being loyal to the colonised society ... studied while remaining a part of the ... colonising society.'

Between the 1850s and 1920s, the study shifted from dispersed, independent research to a more coherent scholarly community with increased exchange of knowledge. It mostly comprised early scientific knowledge based on the explorations of natural scientists and colonial administrators. In 1894, Denmark extended its colonisation of Greenland to the eastern region, and an imperial project directed research initiatives in Greenland in the 19th and early 20th centuries. Consequently, throughout its history the discipline has been a tool for colonialism as well as, more recently, for decolonisation.

From the 1920s to the 1950s 'Eskimology' focused on the origins of Inuit culture and people, drawing prehistory and archaeology to its core. In 1920, 'Eskimology' was institutionalised by the University of Copenhagen when a teaching position was given to William Thalbitzer, an ethnographer and philologist, introducing ethnographic and philological approaches to 'Eskimology.'

Between the 1950s and 1980s, 'Eskimology' transformed into 'Inuit studies'. The name 'Eskimo' was rejected by natives because it had been imposed on the people by colonisers and is commonly said to mean 'eaters of raw meat' (though this etymology is a matter of contention). The change in the discipline's very nomenclature 'reflects a shift in the status of the Inuit – being (tentatively) reversed from objects to subjects.' The establishment of a Department of Eskimology at the University of Copenhagen in 1967 furthered the discipline's engagement with contemporary political developments in Greenland, such as indigenous rights. Inuit people are now 'acknowledged as co-producers of knowledge and project initiators' within the discipline. The department also introduced discourse around cultural identity and ethnicity, taking a more anthropological approach. These developments were driven by social pressure, including endogenous pressure from students and lecturers, as indigenous rights and decolonisation became increasingly integrated into mainstream consciousness.

However, the significant delay in the discipline’s institutionalisation and introduction of ethnographic, anthropological approaches may be symbolic of intentional silencing of Inuit culture and the colonial process of ‘cultural conversion’.

Rise of Digital Anthropology
The development of online and digital phenomena such as social media, online politics, big data, search engines, and artificial intelligence opened a completely new chapter in the work of human scientists. The diversity and dynamism of internet environments led to a quick differentiation of cybercultures, which became a subject of scientific inquiry. As a result, the Digital Revolution gave birth to a number of new sub-disciplines, among which we can distinguish digital anthropology, which aims to understand how digital environments influence the way people communicate, relate, and interact with each other.

According to Philipp Budka and Manfred Kremser, the beginnings of digital anthropology can be traced back to Arturo Escobar's 1994 article 'Welcome to Cyberia', published in the journal Current Anthropology. In the article, Escobar formulated several fundamental questions regarding cybercultures and created basic guidelines for ethnographic research in cyberspace.

Areas of Study
Despite the relatively short existence of the discipline, it is now possible to distinguish several main areas of anthropological investigation in digital spaces. In his article for the Cambridge Encyclopedia of Anthropology, Daniel Miller characterised the following trends in digital studies:


 * The study of technologies themselves via specific populations, such as hackers and creators, which focuses on the exploration of closed online communities. Examples include Gabriella Coleman's investigation of hacker cultures (2014) and Tom Boellstorff's ethnography of the community of the online computer game Second Life (2008).
 * The study of ubiquitous digital platforms, such as social media, among ordinary populations, which focuses on producing traditional holistic ethnographies of people whose lives have been substantially influenced by digital technologies. Examples include Madianou and Miller's research on transnational communication between mothers and children (2012) and Nicolescu's ethnography of Southern Italians, which explored the impact of social media platforms on the public sphere and the interests of people in Southeast Italy (2016).
 * The study of digital technologies for anthropological methodology. The digital revolution implied changes within the discipline regarding the methods of gaining and gathering knowledge: access to new technology (e.g. visual and audio recordings, downloads) introduced new ways of conducting anthropological investigations, such as online ethnographies, which are not subject to space and time constraints.

The History of Entrepreneurship
Early developments
The word 'entrepreneurship' has always been linked with the word 'entrepreneur', which comes from the thirteenth-century French verb 'entreprendre', meaning "to undertake". It was not until the 18th and 19th centuries, however, that the concept of entrepreneurship entered academia, most notably through the economists Richard Cantillon, Jean-Baptiste Say, and John Stuart Mill. Owing to its origins in classical economics, entrepreneurship was considered part of that field, tightly connected with economic concepts such as productivity, risk-taking, and management.

Entrepreneurship as a field of study
"In 1934, Schumpeter first identified entrepreneurs as distinct from business owners and managers". Thanks to his account, entrepreneurship began to be considered necessary for economic growth and development. By the end of the 20th century, the amount of theory behind entrepreneurship and the number of studies in the field distinguished entrepreneurship as a discipline in its own right. Consequently, entrepreneurship departments can today be found in many social science faculties and business schools around the world.

Further developments
Entrepreneurship research has since diverged from studying the effects of entrepreneurship in business to studying entrepreneurial processes in various contexts. While its strong tie to business fields cannot be disregarded, it can be argued that the study of entrepreneurship differs greatly from generic and small-business management, and it no longer focuses only on the supply side or the individual traits that drive business formation. Salient implications that extend beyond understanding business and the functioning of complex organisations include those for inequality, social mobility, transition economies, social networks, family, and working life.

History of Ethics as a Branch of Philosophy
Ethics is a discipline that deals with questions of morality, of what is good and what is bad, and tries to establish a set of moral values; it "involves systematizing, defending, and recommending concepts of right and wrong behavior". The Ancient Greek word ēthos means "habitual character and disposition; moral character; habit, custom". The origin of ethics as a system of moral norms cannot be described in the same sense as the origin of, for example, science or philosophy: there was no particular point on a timeline when morality arose. Morality was inherent in society, in one form or another, at all stages of its development; people living together were bound by moral norms shaped by accepted beliefs, tendencies, and presuppositions.

In the Western tradition, philosophical reflection on morality is considered to have started with the Sophists in the fifth century B.C.E. The Sophists were teachers who taught a wide range of subjects, from philosophy and rhetoric to mathematics, and focused on arete, "virtue" or "excellence". Socrates challenged the subjectivism of Sophist ethics: he held that ethical principles were universal and could be identified, examined, and improved within the individual. Our knowledge of Socrates derives mainly from Plato's dialogues, such as the Republic and the Gorgias. Plato's student Aristotle built his view of ethics upon his teacher's beliefs, but with significant differences: unlike Plato, he viewed a flourishing life (eudaimonia), rather than the good itself, as the highest good, and held that eudaimonia can only be achieved through a virtuous life.

As the discipline developed, philosophers came to distinguish between deontological ethics (from Greek δέον, 'obligation, duty', and λόγος, 'study') and consequentialism: the first holds that the morality of an action is determined by the action itself, while the second focuses on the consequences or results of an action. In this sense, deontology is often associated with Immanuel Kant, who believed that ethical actions follow universal laws, while utilitarians such as Jeremy Bentham and John Stuart Mill judged actions by the maximisation of utility.

Ethics now forms part of most degrees in philosophy; however, it is rarely offered as a separate degree.

The History of Psychoanalysis
Psychoanalysis is a theory of the human mind and a method of psychotherapy developed by the Austrian neurologist Sigmund Freud. It is based on the idea that there are conscious and subconscious parts of the mind, and that they interact with each other. Psychoanalytical therapy consists of one-on-one sessions with a psychoanalyst, during which the patient speaks freely of their past and present experiences, uncovering repressed subconscious thoughts which are discussed with the therapist. Freud began theorising psychoanalysis in the 1880s and 90s, after working with Josef Breuer on the treatment of hysteria under hypnosis. Their collaboration ended, however, and for almost a decade Freud was the only person working on psychoanalysis. He published The Interpretation of Dreams, considered to be his most important work, in 1899. In 1902, Freud was granted a professorship in neuropathology at the University of Vienna, which he held until his exile in 1938. In the same year, Freud also started the "Wednesday Psychological Society", in which he met with 5 to 15 members every week to discuss psychoanalysis; the International Psychoanalytical Association was officially founded in 1910 (an international psychoanalytical congress, the "First Congress for Freudian Psychology", had already been held in 1907). Psychoanalysis was becoming a more widespread discipline and was slowly gaining acceptance in the academic realm, attracting more and more scholars and theorists; its concepts were even used in other disciplines, such as the history of art.

Criticisms of Psychoanalysis
Along with this growing prominence, however, came many criticisms, such as those formulated by Karl Popper, an epistemologist who considered psychoanalysis a "pseudo-science" because its theories could not be proven or disproven. Indeed, psychoanalytical methods were, despite Freud's claims, not entirely scientific: psychoanalysts would often start with a theory and try to find an example in a clinical case, rather than starting with an observation and reaching conclusions empirically. As psychology shifted to a more scientific approach in the second half of the 20th century, psychoanalysis started losing its prestige.

Psychoanalysis in Higher Education
Today, some universities still offer courses and degrees relating to psychoanalysis, such as the University of Essex, the University of Oxford, University College London, University College Dublin, and the International University for Graduate Studies in Dominica. Among these, only the last offers a doctorate in "Psychoanalysis"; the others use different names, such as the "BA Psychosocial and Psychoanalytical Studies" at Essex or the "MSc in Developmental Psychology and Clinical Practice" at UCL. The University of Oxford offers a course in "Psychodynamic Counselling", but as a Postgraduate Certificate rather than a Master's degree. Some of these courses are approved by the IPA and allow students to become psychoanalysts, but nowadays psychoanalysis is mostly studied in relation to other disciplines. The Psychoanalysis Unit at UCL states: "Our mission is to break the mould of traditional approaches to psychoanalysis, taking inspiration from the discipline's ideas to meet the challenges of the modern world. Our interdisciplinary research applies psychoanalysis to contemporary issues, including mental and physical health, financial instability, gender, technology and the arts."

The role of Psychoanalysis in Psychiatry
Despite these controversies, psychoanalysis continues to influence the field of psychiatry (the NHS lists it as a possible treatment for depression), and it remains an important part of our collective psyche: Time magazine named Freud one of the 100 most influential people of the past century.

Defining Parkour
Defining parkour is a complex task. Parkour is an acrobatic discipline: a visually striking sport consisting of jumping, running, rolling, climbing, and balancing in an urban environment. As David Belle, one of its founders, puts it, parkour is a "type of freedom". The purpose of the sport is to move from one point to another as quickly as possible whilst expressing and developing one's freedom. It is also an art form, in that its values and philosophy are built on teamwork, solidarity, fraternity, trust, rigorous physical training, and a mindset that pushes back the body's boundaries.

The beginning
The predecessor of parkour was developed by Georges Hébert, a French naval officer, who described in his book La Méthode Naturelle the physical fitness of the indigenous peoples he encountered in Africa. Hébert established a "natural method" consisting of activities such as running, climbing, and jumping, through which people could become physically fit, and an obstacle course for the French military was created based on his writings. During the First Indochina War in Vietnam, a French soldier named Raymond Belle trained on this obstacle course for hours, and his stamina and physical aptitude were said to have surged because of it, empowering him to achieve exploits in the army and the fire service. It was his son, David Belle, originally a gymnast, who is considered the pioneer of the parkour discipline.

The 1980s
The English word parkour derives from the French word parcours, meaning pathway. In a documented interview, David Belle recounted asking his father, Raymond Belle, to characterise what the word meant when he created the sport. Belle senior replied: "Parcours is like life: you have obstacles and you train to overcome them, you search for the best technique, you try all the techniques, you keep doing your best, you repeat it, and then you get better." From this point on, David Belle and fellow practitioners created a group named "Yamakasi", which means "strong spirit" in Lingala, and the discipline was officially practised as "parkour".

Parkour today
Today, there are parkour associations and groups internationally, and the discipline's popularity has risen significantly, with many people around the world, particularly amongst younger generations, learning about it. The International Parkour Federation (or World Freerunning and Parkour Federation, WFPF), established in 2014, also encourages the teaching of parkour in war zones and underprivileged areas, to bring communities together through physical activity and the teaching of some of parkour's values. It is also the WFPF that licenses parkour teachers.

The original philosophical construct and mindset of its practice, with hard training, stamina, and brotherhood, now tends to be more anecdotal, as the vast majority of practitioners consider parkour a sport or hobby rather than an art form. In other words, parkour today is more an acrobatic sport than a discipline, which is what it was considered to be in the 1980s and 1990s with the Yamakasi group's training and mindset.