User:LGreg/sandbox/Approaches to Knowledge (LG seminar 2020/21)/Seminar 9/History

This is the sandbox page for the issue: History.

The Roots of Ancient Psychology
The study of the mind and behaviour can be traced back to the Pre-Socratic Greek philosophers, who developed concepts of the psyche and soul. Hippocratic medicine and Plato's tripartite theory of the soul then asserted the brain as the seat of mental processes.

[https://en.wikipedia.org/wiki/Vedas#:~:text=Composed%20in%20Vedic%20Sanskrit%2C%20the,the%20Samaveda%20and%20the%20Atharvaveda. The Vedic Scriptures] also convey complex ideas of consciousness, mental wellbeing and practical psychology, many of which complement contemporary Western thought. While the Vedas were transcribed, the oral tradition carries far more weight; this differs from the Greeks, who wrote books and treatises that form cornerstones of modern psychological thinking. Aristotle's psychology studied the mind's interaction with the body, the 'soul' and the intellect, which he believed humans possess but plants and animals do not. This was an early example of interdisciplinarity: he combined psychology and biology into biopsychology, known today as behavioural neuroscience.

Psychology since the 19th Century
The discipline of Psychology began in earnest with the emergence of experimental psychology in Germany in the mid 19th century. Hermann von Helmholtz attempted to study the mind empirically, investigating sensory perception by measuring the speed of neural transmission. In offering a scientific study of the mind rather than a purely philosophical one, von Helmholtz paved the way for the experimental psychologists who followed him, most notably Wilhelm Wundt. Wundt emphasised the experimental study of the mind and established the first experimental psychology laboratory in Leipzig in 1879, formalising the field as an independent discipline that bridged the gap between philosophical speculation and physiology. With the establishment of formal study, psychological schools of thought began to emerge, both developing the ideas that came before them and rejecting them in favour of more probable hypotheses.

Edward Titchener (one of Wundt's students) conceived the Structuralist school of thought, which offered a reductionist view of consciousness: the idea that mental processes could be broken down repeatedly until a subject's responses had been reduced to the most basic perceptions. While the school was notable for its emphasis on scientific research, its methods were criticised as subjective and unreliable. It later gave way to William James' notion of Functionalism.

While both sought an understanding of human consciousness, Functionalist thought saw conscious experience as a constantly changing process rather than one that can be reduced to static components. Its methods observed behaviour directly in order to provide an empirical explanation of its utility in everyday life. Conversely, Structuralist introspection relied on the subject's self-analysis to provide an understanding of conscious experience.

Psychological thought continued to develop over the course of the 20th century, both building upon and rejecting previous ideas. Freud revolutionised the field with his psychoanalytic theory, which focused on unconscious forces rather than the questions of perception raised by his predecessors, publishing The Interpretation of Dreams in 1899. While it is debatable how relevant his theories are in a modern climate, the impact he had on the field as a whole is undeniable.

Modern Psychology
It has become popular to "catastrophise" our state of being, or the events that happen to us. In 1998 Martin Seligman, building on his theory of Learned Helplessness (in which clinical depression and mental illness are "accepted" because the individual feels no control over them), founded a new branch of psychology called Positive Psychology. He did this because psychologists had created for themselves a large manual, the "Diagnostic and Statistical Manual of Mental Disorders", which diagnosed every mental illness and behavioural tic but had no language to talk about the good things that humans do, their "health, talent and possibility." Seligman's first step was to write a manual celebrating strengths and virtues that would be valid in any human socio-cultural environment. Jonathan Haidt is an American social psychologist at New York University who has done much to address modern cultural psychological problems, in books such as "The Coddling of the American Mind: How Good Intentions and Bad Ideas are Setting Up a Generation for Failure" and "The Happiness Hypothesis". He argues that "safetyism" (the cult of eliminating threats, both real and imagined) and "concept creep" damage our psychology by ignoring human "antifragility", and he explains why meditation, cognitive behavioural therapy and Prozac are among the most effective treatments for depression.

The History of Marketing as a discipline


The study of Marketing demonstrates how different disciplines, with diverse approaches, can work together to arrive at new insights. Marketing is a field of study that lies at the intersection of Economics, Sociology and Psychology, and sometimes even Law or Computer Science. It includes advertising, selling, and delivering products to consumers or other businesses. It is a vast subject, which is why, for example, King’s College London offers a four-year course in marketing. The practice of marketing has existed for thousands of years, but the term "marketing", used to describe the commercial activity of buying and selling a product or service, became popular in the late 19th century. The study of marketing as an academic field appeared at the beginning of the 20th century.

Dr. Philip Kotler defines marketing as “the science and art of exploring, creating, and delivering value to satisfy the needs of a target market at a profit” (Marketing Management, Philip Kotler). He believes that an equal weighting of science and art can help achieve marketing purposes.

Which factors contributed to the creation of the discipline?
During the industrial revolution of the 18th and 19th centuries, technological and scientific innovations resulted in markets saturated by competition. The discipline was created to attract and keep customers. It was at that time that it became easier for consumers to buy a good than to make it themselves. Producers had to find new and better ways to develop the products that customers needed, to convince them to buy those products, and to take a more sophisticated approach to educating them about those products. People had to be qualified to meet this growing need, especially, much later, with the creation of the Internet: websites became an essential tool that allowed companies to sell their products. This is how the discipline of marketing was born.

When did the academic field emerge?
In the 1904–1905 academic year, the University of Pennsylvania started teaching marketing. Other universities soon followed, including the Harvard Business School. Prior to the emergence of marketing courses, marketing was not recognized as a discipline; rather, it was treated as a branch of economics and was often referred to as applied economics. Early marketing theories were described as modifications or adaptations of economic theories.

Marketing matured as a discipline in the 1960s. Dr. Philip Kotler was the first to suggest the concept of societal marketing, in an article published in 1972 by the Harvard Business Review. Societal marketing is also known as sustainable marketing, a form that is increasingly important today in the context of global warming. This example shows how the teaching of marketing has changed compared to 50 years ago: the discipline evolves constantly, in step with the evolution of society.

This is where interdisciplinarity plays a key role. Marketers must have an interdisciplinary background because they will have to deal with many different topics. They need economics courses to keep up with current events and understand how a business operates, law courses to know what is legal, and psychology courses to know how consumers think. But this is also why marketers disagree about the way marketing practice has evolved over time.

The Paradigm shift
Science Communication is a discipline encompassing the dynamics between 'public awareness of science, public understanding of science, scientific culture and scientific literacy,' as well as the use of skills, media and dialogue to evoke a response to science within society.

Although science communication has been seen by some academics as an emerging discipline, it is well established within institutions such as the European Science Communication Institute, and well-established master’s courses run at a wide range of universities – the MSc Science Communication at Imperial College London is just one example. This is a subject with a secure knowledge base, concepts and theories and numerous publications, all of which indicate that science communication is a discipline in its own right. The Bodmer Report is an example of such publications, although at that point science communication was still an emerging discipline, not yet moulded into the framework of society.

These examples indicate that the field of science communication is already established, and already a discipline in its own right. This can be dated back to the Bodmer Report published by the Royal Society in 1985, which concluded that public understanding of science needed vast improvement, calling for scientists to learn to communicate better with their audiences.

Although an element of science communication has always featured in society, there has been heavy criticism of the methods and tools through which the communication is delivered, and models have been devised to describe the shift in communication over time. The first model was known as the 'deficit model', where the public were seen as ignorant and with a "knowledge deficit." Scientific literacy was equated with an appreciation and understanding of science, whereas scepticism and questioning were seen by scientists as ignorance. The models gradually shifted over time, but despite the publication of the Bodmer Report in 1985, no real change was implemented until the House of Lords published a report on Science and Technology in 2000. This report led to a shift from the deficit model to the 'Science and Society' paradigm, which advocated two-way dialogue, public engagement and the rebuilding of public trust. The shift in beliefs led to the acknowledgement that the public have valuable input and should be respected; scientists and the public became equals, offering different yet equally valid perspectives. This was the end of the hierarchical deficit model and the start of a scientific egalitarianism – everyone’s opinions were equally valid. Once the need for public engagement with science, as opposed to mere awareness of science, had been truly established, science communication became imperative and high on the public agenda: a firm shift from scientific dictatorship to a stream of two-way communication. The 2000 report was pivotal, and was the point from which science communication emerged as a discipline in itself. Coupled with continuing publications, recognition by institutions, models and theories, and the firm knowledge base from which ideas are still growing, science communication has now become an autonomous discipline in its own right.
Science communication has since become an integral part of all other scientific disciplines, making the subject matter accessible to the public as a whole, as opposed to just a select few, highlighting just how critical a discipline it is in the world today.

The Application of Science Communication
Science Communication helps make science interesting and exciting through education and awareness-raising among non-scientists, to whom science can often seem esoteric (think Brian Cox). It is the general practice of communicating science to non-experts. In 2020, with much of the public hearing about epidemiology because of COVID-19, it is important that scientific facts are communicated to the general public so that they understand why they are not merely being "nudged" into new behaviour patterns, but "forced" into them by the Government.

Science Communication can therefore take place in many different ways. In the UK, the country's chief scientists hold daily briefings that are broadcast nationwide. These form the basis of UK Government policy, which Alan Wilson classifies as the "articulation of objectives" in his 2010 book Knowledge Power. In addition, newspaper articles, media blogs and television programmes all help to communicate as much information as possible to the general public. In a time when "fake news" is prevalent, scientific data must be communicated clearly to help the public understand and accept what is being asked of them. Although Science Communication emerged as a discipline in its own right after the House of Lords report in 2000, this shows that the discipline is constantly evolving and that its history is still being written.

Introduction
In the twenty-first century it has become more appealing to collaborate across disciplines to solve big problems, and a holistic approach to health and well-being has been adopted in recent years. In Western culture, mind and body have traditionally been treated as separate. However, evidence that self-healing can happen given the right tools has led to a paradigm shift in approach. In the UK, an extensive list of NHS evidence is available.

Under the new discipline of Mind-Body Science, ancient traditions such as yoga, tai chi, qi gong, pranayama (breathwork) and meditation, together with the more modern cognitive behavioural therapy and a healthy diet, have been seen to bring benefits in managing stress-related non-communicable diseases such as cardiovascular disease and diabetes.

Benefits of Mind Body Disciplines
Two large-scale UK government studies of civil servants found that workers in low-level jobs with little autonomy, and therefore high levels of stress, were more than twice as likely to suffer from life-threatening illness as more highly skilled workers with greater autonomy.

The Mind-Body approach reduces the release of stress-related hormones such as adrenaline and cortisol by lowering the autonomic nervous system response. Acute stress is heavily linked to socio-economic group, and there is medical acceptance that holistic, non-clinical intervention can lead to a happier, healthier life. Ancient texts such as the Bhagavad Gita, and present-day proponents such as Jon Kabat-Zinn, remain relevant in 2020.

Although these ancient traditions have existed for thousands of years, Western medicine has consistently separated the mind from the body. René Descartes believed in mind-body dualism, the separation of the physical body from the mind. The established connection between mood and emotions and susceptibility to illness shows how the neurological, immune and endocrine systems interact, demonstrating the effectiveness of this new holistic discipline and refuting the Cartesian argument.

The Future of the Discipline
Mental health issues increased sharply in 2020, due in large part to lockdowns in many countries. These have led to high stress levels, and the data show increases in domestic abuse, child abuse, substance abuse and untreated illness. Mind-Body Disciplines will only grow, becoming more widely prescribed and more easily accessible to help reduce and prevent illness, integrated with medical interventions in a multi-disciplinary approach to effective treatment.

Historical Background
Economic thought has existed in various forms across the world and throughout history, and was often intertwined with the study of social science subjects such as philosophy, history and politics. Economics as a separate discipline can be said to have emerged in the late 18th century, most notably with the publication of Adam Smith’s 1776 treatise "The Wealth of Nations". This was a period during which modern capitalism was forming, with colonialism and the [https://en.wikipedia.org/wiki/Industrial_Revolution#:~:text=The%20Industrial%20Revolution%2C%20now%20also,sometime%20between%201820%20and%201840. Industrial Revolution] at their heights; these created complex exchanges and systems of trade and dependency. The discipline has undergone several transformations since then, and has seen the rise of differing perspectives and contending schools of thought.

Paradigm shifts in Economics in the 20th & 21st century
Societal changes through the 20th century directly influenced and gave rise to a range of prevailing economic theories and paradigm shifts, all of which argued for different focal points in economics. By the 20th century, significant contributions had been made to the discipline by Thomas Malthus and Karl Marx, both of whom disagreed with Smith’s ideas. Malthus’ work changed the focus of economics from the concept of demand to the scarcity of goods. Marx argued the means of production should be the primary focus of study, and criticised ideas of capitalism. Economists such as Léon Walras and Alfred Marshall provided formal lexicon, and mathematical and statistical tools respectively to help advance economic theory.

The state began to play a larger role in economic theory due to the social shifts of the 20th century. Keynesian economics, developed by the British economist John Maynard Keynes, was highly influential from the start of the Great Depression in the 1930s up to the 1970s, when economic policy faced new challenges. Keynes’ ideas were a direct response to the economic downturn of the Depression era, calling for full employment to be the core tenet of the study and for greater government intervention. His theory argued that governments could be key actors in combating recession by pursuing expansionary monetary and fiscal policies.

A new neoliberal framework formed in the discipline in the 1970s and 1980s, when major disruptions – the oil crises, a move towards greater free-market policies, the phenomenon of ‘stagflation’ and the election of influential conservative leaders such as Ronald Reagan and Margaret Thatcher among other new governments – made room for a new school of thought. At the forefront of this were Milton Friedman and the Chicago School, who opposed many of Keynes’ fundamental theories. Friedman believed that the free market should be the core mechanism of economic theory and policy, and that economic efficiency would be achieved only by market forces, not state action. In academia, neoliberal assumptions, such as deregulation being the most effective means of resource allocation, became accepted truths and had major social and cultural effects. This viewpoint remained prevalent into the 21st century, until the financial crisis of 2008-2009 led to a form of Keynesian resurgence: the need for government regulation and bailouts once again shifted opinion, even among academics, towards a more critical view of free-market capitalism.

Further Developments in Economics Today
A major development in economics during the last few decades has been the concept of ‘freakonomics’ - the application of economic concepts to everyday problems and unusual issues, first named in the book "Freakonomics" by Stephen J. Dubner and Steven Levitt (2005). A notable characteristic of this trend has been the emergence of popular works and the ability of a wider general audience to understand economic concepts as they become “increasingly exposed to economic ways of thinking.” (Leeson, 2008) Economics has also been combined with other social and natural sciences to help solve social and global issues in the 21st century. For example, behavioural economics draws on the cognitive, psychological, cultural and social behaviours of individuals and institutions that underlie economic decision-making, covering areas as diverse as financial markets and domestic economics.

The History of Artificial Intelligence
Artificial Intelligence is a field of knowledge which has seen rapid development since its conception as an academic discipline by computer scientist John McCarthy in 1956. Due to its complex and multidisciplinary nature, the specialty has distinct strands in relation to science and to humanities, which coalesce into creating an interdisciplinary approach to knowledge garnered from these varied fields.

The Foundations of the Discipline
It is generally agreed upon that the 1956 Dartmouth Summer Research Project on Artificial Intelligence was the true genesis of artificial intelligence, an event where twenty individuals with backgrounds in mathematics, physics and psychology gathered for eight weeks to discuss the trajectory of the field. Precursors to this event include the creation of the Turing test for machine intelligence in 1950, and various works of science fiction, including Mary Shelley’s Frankenstein (1818) which acted as a cautionary tale against creating sentience.

Artificial Intelligence Today as a Scientific Discipline
On the side of scientific research, the primary aim of artificial intelligence is to simulate the cognitive behaviours of the human brain. Contemporary research focuses on mimicking the process of the brain to solve computational problems, such as reading live visual data in self-driving cars, or analysis and translation of human language. This pursuit brings a variety of sciences together, namely mathematical logic, statistics, probabilities, neurobiology and computer science. In this sense, the neural network strand of AI research can be seen as a form of biomimetics.

After the Dartmouth workshop, attendees began to lead AI research at Carnegie Mellon and MIT, producing programs such as Logic Theorist, an AI that could prove mathematical theorems. This work was aided by increased funding from the US and UK governments, which was then cut in 1974 in the wake of the Lighthill Report, leading to the first of several periods of limited progress colloquially known as AI winters. Despite this setback, the 1980s saw developments in neural networks via backpropagation algorithms, a technology that continues to influence the discipline: in the 2010s, experiments in neural networks by large companies including Google and Microsoft became the basis for their use in solving real-world issues.

Links between AI, the Arts and Humanities
AI research also has a strong connection to the humanities. Recent advancements in the field have allowed for the creation of human-aided computer generated art. On the other hand, there is an emergent branch of philosophy which looks at the various ethical issues of AI. The possibility of an artificial general intelligence gives credence to the computational theory of mind (that consciousness is an emergent property of the brain as an information processing system), a theory which was established by various American philosophers and mathematicians including Walter Pitts, Warren Sturgis McCulloch and Hilary Putnam from the 1940s to the 1980s.

Self-driving cars, one of the most practical current applications of AI, present ethical dilemmas of their own: should a trolley-problem-like scenario arise in front of an autonomous vehicle, the AI has to make a decision over a human life.

Early Origins
The birth of international relations as a phenomenon (rather than as a discipline) is widely regarded as being intrinsically linked to the end of the Thirty Years’ War, or the declaration of the Peace of Westphalia in 1648 and the establishment of the modern state system. International relations (IR) was formalized as a field of study following the First World War when Aberystwyth University established the first department of IR in 1919. The same year, the United States saw the founding of its first IR faculty at Georgetown University.

There are several earlier works in documented history in which similar subject matter had been considered, such as Thucydides’s History of the Peloponnesian War and Sun Tzu's The Art of War, though such texts precede the base conception of nation-states, instead referring to intraregional relations between the earlier notion of city-states.

History of International Relations Theory
IR employs a robust theoretical framework, its two most prominent traditional strands being Liberalism and Realism. Although related to the Utopianism of the interwar period, modern liberal IR theory has much earlier roots in Immanuel Kant's "To Perpetual Peace," in which Kant posits several conditions necessary to achieve perpetual peace. These conditions formed the basis for the dominant post-WWII liberal theory in IR, and from this base conception various renditions of liberal theory emerged from Kant's work. They include Neoliberalism, or liberal institutionalism, which urges the maintenance of international institutions as a necessary peace-keeping measure, and commercial liberalism, derived from Kant's "universal hospitality," which supports the economic interdependence of nation-states as an effective peace-keeping measure.

Realist theory in IR takes a more conflictual approach, with a seemingly more pessimistic view of the world than the idealist collaboration characteristic of liberal IR theory. Realist IR theory has its historical roots, at least indirectly, in Thucydides' early account of the Peloponnesian War. These roots are influences only, of course, as IR was not formalized as a discipline until much later, in the 20th century. The many similarities between the modern and ancient worlds meant such early works could be drawn upon as the basis of the later formalized realist theory. They include Niccolò Machiavelli's "The Prince," though early works such as this focused primarily on unprovable features of human nature in regard to political relations. Kenneth Waltz, in his 1979 "Theory of International Politics," moved realist thought away from such methods and towards the contemporary understanding of realist theory, with more objective assessments of the international system as a whole.

History of International Thought (HIT)
As an aside, a significant subfield of IR, the history of international thought (HIT), has gained prominence in the last two decades. It is important to note that the history of international thought is distinct from the aforementioned history of theory in international relations more broadly. HIT has largely remained a peripheral subfield of IR, however, which may be attributed to historians of international thought applying a "contextualist" approach. The central idea underlying contextualism is that historians should appreciate historical works in the circumstances and lexicon of the time in which they were created. In theory, this allows for an authentic understanding of the text, rather than insights that benefit from contemporary hindsight.

Theory and Practice of International Relations
International Relations (IR) is a multidisciplinary social science that combines politics, law and economics in the study of relations between nation states. The socio-historic basis of the discipline means that interpretations can be nationalistic and subjective in nature, rather than reflecting the "global dimensions of international life" described by Hayo Krombach in 1992. IR is sensitive to cultural and intellectual influences and biases, and is therefore an intrinsic part of understanding how the world works on a global basis. The United Nations and its many component parts (including the WHO, UNHCR and UNDP) indicate the extent to which the international community has attempted to collaborate since the organisation's inception in the aftermath of the Second World War.

Origins
Criminology studies the phenomenon of deviance, its actors, and its impact on society. In the past, criminology was described as the set of procedures for dealing with criminal conduct. This type of criminology can be traced back to primitive societies, which already contained people who deviated from the norms and rules in force. In Europe, during the late 1700s, there was a shift in the approach to criminology: a general concern emerged regarding the unfair treatment and poor living conditions of criminals in prisons and courts, and with it a common will to humanize the justice system.

The emergence of Criminology as a discipline
As new tensions emerged at the beginning of the 1800s, people decided that, to solve the problem, they should try to understand the systems of deviance; torture and punishment were no longer the only answers in criminology. Cesare Lombroso's research in the late 1800s suggested that criminality itself was worthy of study. He argued that there was a psychological aspect behind the criminal and deviance: felonies are committed out of free will, and it was the origins of this free will that had to be investigated in order to build a better justice system.

Criminology: an independent discipline
Slowly, criminology started to develop as a subject across disciplines in different countries. In Britain, for example, criminology was studied for medical purposes, whereas in France Durkheim used it in sociological studies. Criminology's relevance to such varied subjects made scholars realize that it should be considered an independent discipline, and institutions all over the world began to treat it as one. In the United States, the first textbook, “Criminology”, was published by Maurice Parmelee in 1918. In Britain, Cambridge opened its Department of Criminal Science in 1941, and the “Association for the Scientific Treatment of Delinquency” was inaugurated at King's College London in 1942.

Today: Criminology as a discipline
Today the discipline of criminology has evolved. In Britain there are now some 80 criminology degrees, which focus on developing different approaches to criminals and crime. The study of deviance is now shaped by the biological, sociological, political, environmental and psychological context in which it is set. Examining new ways of preventing crime, and deepening understanding of the criminal world and the various subjects linked to it, are also part of the discipline.

Origins
The birth of street art, or urban art, is difficult to place in a defined period. The discipline includes diverse and interdisciplinary forms of artistic expression such as graffiti, street installations, prints and mural paintings. One could argue that the first artistic expression in an urban context can be found in Vienna at the beginning of the 19th century. The author of these inscriptions was Joseph Kyselak, who used city walls as an artistic medium, writing his name on them throughout the Austro-Hungarian Empire. This culture of using urban infrastructure as a canvas manifested itself in other ways through the 19th century: in the 1890s the writer Jack London observed inscriptions and drawings on trains, later confirmed to be the work of "hobos" who marked their names, sometimes embellished with ornaments and/or poetry. During World War II, visual urban culture continued to spread through a graffito known as 'Kilroy Was Here', an inscription and doodle drawn and carved on walls by soldiers stationed in different areas. The practice of tagging became increasingly popular in New York, especially among gangs in the 20s and 30s. Trains, cars, walls: everything became a medium for artistic expression.

A Turning Point: The 1970s and 1980s.
It wasn't until the 1970s that street art took a different turn, due to the socio-political context and the reactions it aroused. Through its revolutionary character, street art became increasingly rooted in culture. The desire to challenge power and authority, which constitutes one of the very principles of street art, aligned perfectly with the mood of the 70s and 80s. Individuals used this artistic method to revolt and to raise awareness of the political and social issues of their times, while upsetting the established system and its norms.

The Issues of Understanding the History of Morality
Moral philosophy has historically been defined through large paradigm shifts in ideas, both in our concept of ethics and in ethics in practice. Despite there often being large schools of specific ethical thought throughout history, analysing the history of ethical thought is precarious. There is a fine balance to be struck between looking back at ethical thought through our own modern preconceptions and avoiding a Hegelian view of history. A Hegelian view can be problematic because it leads to searching for the wrong ideas and concepts, as Hegel sees history as acting purely towards forming the present. Conversely, a view coloured by our own preconceptions can leave out many trains of ethical thought that we simply would not regard as ethics today. To further this idea, Schneewind argues that it is impossible to approach the history of moral philosophy without some form of established view. He draws upon two traditional but opposing views: the first being the study of morals on a Socratic basis, where we aim to find moral truths; the second being that moral truths already exist and always have, whether for secular or religious reasons. When we study morals, we tend to take one of these ideas with us. However, whilst Schneewind accepts that these beliefs can influence how we perceive the history of morality, they can also help us answer different questions. He argues that the interrelationship between these ideas, and how the two clash or reconcile, can explain a great deal about contemporary norms, values and issues.

The Birth of Moral Philosophy
The first usage of what we would call ethics began with the earliest ascriptions of positive behaviour in the earliest forms of literature, such as Homer's Iliad. However, it took until the 5th century BC for the actual discipline to begin to form under Socrates, with the introduction of thought intended to help develop a way of life as a form of care for the soul. It is Socrates who introduced a vital aspect of moral philosophy that distinguishes him from his predecessors. Ancient Greek works such as the Iliad ascribe specific moral traits to specific classes in society, e.g. a king should have X traits, a warrior should have Y traits. Thus, the concept of the good (ἀγαθός) was only relevant to one's social standing. To put it simply, Socrates identified changes in society and aimed to adapt morals into the more universal approach that we see today. It is through this change in perspective that many scholars ascribe the title of "the first moral philosopher" to Socrates.

In the seventeenth century, Thomas Hobbes gave much thought to human nature and to questions of morality and political philosophy. In his book Leviathan, he stated that the state of nature was one in which the individual was naturally in a state of war, and therefore gave up his rights for the greater good, to keep peace under an absolute sovereign. Is this morally correct? John Locke, on the other hand, wrote that the state of nature was socially harmonious. In these social contracts, we can see where questions of morality begin to form, and where the two thinkers clearly differ and overlap. For Hobbes, moral and political philosophy were inseparable. Locke disagreed: he believed that absolutism was impossible in a civil society and that anything of the sort would be tyrannical. Moral philosophy asks questions of individuals, prompting consideration of what is right and wrong, or good and bad, and of how individuals should live together peacefully in society. These are eternal questions that evolve with their context, and disciplines will therefore continue to apply moral philosophy to answer contemporary questions.

History of the Culinary Arts
Culinary arts is the study of the preparation, cooking and presentation of food. Whilst cooking itself has been around for at least a million years, the emergence of culinary arts as a discipline is more recent.

Many traditions have used cooking and the knowledge of ingredients for medicine. In Ayurveda, for example, great emphasis is placed on the prevention of illness through diet and the use of herbs and spices. How a food tastes, its digestive effect and the post-digestive effect of an ingredient have been studied for over 5,000 years.

Deipnosophistae
Deipnosophistae, by Athenaeus, is a fictionalised account of several banquets set in Rome, written in the early 3rd century AD. It is seen as a useful window into the culture of the Greek symposium, and into the culture of gourmets and food. Within the account, there are references to cooks and their learning.

The training of chefs is not a new practice: “What—do you consider me less well-trained than the famous cooks in the old days, whom the comic poets discuss?” The students of cooks are also referred to multiple times, as in a quote from Posidippus's “Χορεύουσαι” (“Dancing Girls”), spoken by a cook: “Leucon my student.” Athenaeus quotes Sosipater's “False Accuser”, which presents 'Sicon' as a 'pioneer' of culinary teaching and explains the multidisciplinary demands, and thus the multi-faceted learning (in astrology, architecture and military strategy), that cooks must undertake.

Tenzo Kyokun
Written in 1237, the Tenzo Kyōkun (典座教訓), typically translated as 'Instructions for the Cook', is an essay by Dogen, founder of the Sōtō school of Zen Buddhism, that gave instructions for the tenzo and established standards for their cooking. However, some of Dogen's followers view the text more as a spiritual doctrine than as a set of instructions. The preparation of food was associated with spiritual practice, so that the practice of one was seen as the practice of both.

Guilds
The reign of Louis IX (1226-1270) oversaw the organisation of guilds, documented by Etienne Boileau in his [https://fr.wikipedia.org/wiki/Livre_des_m%C3%A9tiers#:~:text=Le%20Livre%20des%20m%C3%A9tiers%2C%20r%C3%A9dig%C3%A9,un%20%C2%AB%20ordre%20social%20chr%C3%A9tien%20%C2%BB. Livre des Metiers]. In it he describes the “cuisiniers-oyers”, who were allowed to “prepare and sell boiled or roasted meats” and who upheld standards of meat quality. The 'Livre des Metiers' establishes a framework for apprenticeship in this area: to work directly, one had to be the son of a master or have two years' practical experience; anyone not related to a master, or an untrained son of a master, had to learn under an employee of the master. In France, the system of guilds and the master-apprentice relationship continued from the Middle Ages until the French Revolution.

Shift from apprenticeships to formal schooling
In the late 19th century, schools with a focus on cookery began to be established. The National Training School of Cookery, established in 1873, taught Plain Cookery (covering general cookery skills and preparing students to go on to teach them) and High Cookery for those who would go on to work in middle- and upper-class households. In Boston in 1879, the Boston Cooking School opened following the success of its cookbook, The Boston Cooking-School Cook Book, which its author Fannie Farmer prefaced with the wish that it be more than a regular cookbook and also “lead to deeper thought and broader study of what to eat”. In 1895, Le Cordon Bleu opened in Paris with goals similar to those of the National Training School of Cookery.

Modern Study
In the modern era, the culinary arts can be studied not only through apprenticeships but also through many degree programmes worldwide. The culinary world continues to develop, evolve and explore; interest in the discipline is growing and programmes are numerous. However, criticisms have arisen regarding the lack of creativity in these very conventional curricula, since according to many practitioners creativity is a crucial and essential element of apprenticeship in the culinary arts.