User:TimRJordan/sandbox/Approaches to Knowledge/2020-21/Seminar group 3/History

Computer Art as a sub-discipline
The words “computer art” bring several pictures to mind, but certainly not an image of a Renaissance painter with a blank canvas in front of them. Perhaps an artist at a computer screen, developing algorithms that generate experimental art. Or maybe even an Artificial Intelligence (AI) robot that creates paintings just as good as the ones in the National Gallery. All of the above are examples of Computer Art in the modern world. Art and computer science as disciplines use contrasting methodological approaches to arrive at knowledge. In recent years, however, a unique overlap between the two has led to the formation of a sub-discipline -- Computer Art -- an increasingly popular bachelor's programme at universities across the world.

Computer Art could be viewed as a branch of both art and computer science. As a branch of art, the sub-discipline explores the development of new art forms through the use of technology. In the context of computer science, it focuses on the development of technology (for example, algorithms) to create art. An increasing number of artists nowadays learn to code in order to develop algorithms for aesthetic purposes. To go even further, AI advancements have allowed for the production of AI-generated paintings, which can be viewed online and even purchased. Every day, the intersection between art and computer science grows and history is made.
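The kind of algorithmic image-making described above can be illustrated with a short sketch. The following Python example is purely illustrative (the function name and its wave-based rule are invented for this sketch, not drawn from any particular artist's practice): a deterministic rule maps each point of a text "canvas" to a mark, and the visual form emerges from the rule rather than from the artist's hand.

```python
import math

def generate_pattern(width=40, height=12):
    """Build a text 'canvas' where each cell is chosen by a rule.

    The rule used here (interference of two sine waves) is only an
    example: any deterministic function of (x, y) could stand in for
    the algorithms artists write to generate visual forms.
    """
    palette = " .:-=+*#"  # characters ordered from light to dark
    rows = []
    for y in range(height):
        row = []
        for x in range(width):
            # Combine two waves; the value lies in [-2, 2].
            v = math.sin(x * 0.35) + math.cos(y * 0.55 + x * 0.1)
            # Map the value onto an index into the palette.
            idx = int((v + 2) / 4 * (len(palette) - 1))
            row.append(palette[idx])
        rows.append("".join(row))
    return "\n".join(rows)

print(generate_pattern())
```

Changing the rule, the palette, or the canvas size produces an entirely different image, which is precisely the generative mode of working the paragraph above describes.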

Formation of the sub-discipline and its development
The exact point in time when Computer Art first emerged as a sub-discipline is a matter of academic debate. However, a possible answer would be the founding of the non-governmental organization “Experiments in Art and Technology” (E.A.T.) in 1966 by the electrical engineers Billy Klüver and Fred Waldhauer and the artists Robert Rauschenberg and Robert Whitman. E.A.T. sought to forge collaboration between engineers and artists through industrial projects. The organization quickly gained popularity, and a few years after its establishment it had over 4,000 participants: artists, engineers, programmers, researchers, and scientists, whose collaborative efforts led to the development of new computer art forms, such as video and digital image. A disclaimer should be made that earlier examples of technology in art exist, but as they do not bear on the history of the sub-discipline and its methodologies, they are not discussed here.

Nowadays, we consider digital image and video (and the technology involved in producing them) an integral part of art; sixty years ago, however, they were a revolutionary concept. Coding, as a means of creating art, is also being normalized. Today, we tremble with excitement (and fear) at the thought of AI using complex algorithms to produce art by itself. In sixty years, however, we may see AI as an inseparable part of art.

Definition of Museology
Museology, in general, is the theoretical study of the history and social roles of museums, as well as of the activities involved in their operation and management, including preservation, curating, and other practices.

Origin of Museums
The term "museum", from the Greek "museion" meaning "the seat of the Muses", emerged during the Classical period. However, human beings have a much longer history of collecting objects for the purposes of inquiry and acquisition. The earliest evidence can be seen in the large quantities of grave goods found in Palaeolithic burials. Founded by Ptolemy Soter in the 3rd century BCE, the great museum at Alexandria contained various collections related to botany and zoology; its primary function, however, was that of a philosophical institute supporting scholars in their studies. The prototype of the modern museum originated in the cabinets of curiosity that appeared in Europe from the 16th century onwards. As colonization proceeded, more overseas objects entered Europe, forming a large basis for collections. In the 18th century there was a boom in museum founding, including the British Museum and the Louvre in France. In the 19th century the function of museums shifted to educating and “civilizing“ the general public, and several world's fairs were held to serve this purpose.

Establishment of the Discipline
Though the history of collections is long, the systematic study of museums is a relatively new discipline. In the 16th century, a Belgian doctor published a book concerning museum practices, providing guidance for organizing a collection. The term museology was first introduced by Georg Rathgeber, who formulated methods for assembling art collections in museums in one of his books. The first attempt at formal education in museum practice can be traced back to 1856 in Spain, when the government established an institution to train professionals in the preservation of national heritage. In 1889, the Museums Association was founded in London, and annual conferences were held to discuss museological topics. In 1901, the association published the Museums Journal, the first academic journal in the field.

Problems and Development of Museology
The research interests and methodology of museology underwent large changes in the second half of the 20th century, shifting focus from museum operations to museums' social roles. Stimulated by social unrest in Western countries and the popularisation of environmentalism, political activism, and postmodernism, researchers started to rethink the role of museums in society. It was pointed out that the ideology of traditional museums was isolated from the public and tended to be elitist. Moreover, most museums were curator-centred and building-bound, placing them above their audiences. In 1980, the idea of “new museology’’ was introduced by the French museologist André Desvallées in contrast to the “old museology’’, focusing on the roles of museums in social and political contexts and on the engagement of the whole community in curatorial practices. There was also the “ecomuseum’’ movement, which sought to reconstruct museums as democratic places without boundaries, rooted in the local community, thus increasing the involvement of the general public.

Another contemporary field in museology is critical museology, which emerged in the late 20th and early 21st centuries. Since many collections in Western museums were historically tied to colonization or were taken as spoils of war, it is important to re-evaluate the history of these collections and to reconstruct museum practices critically. New methodologies are being developed, through interdisciplinary approaches, to remove the culturally superior stance of traditional museology. For example, with help from historians and anthropologists, repatriation efforts are under way on a global scale to decolonize museums.

Future Fields of Research
The methodologies and research scope of museology are still actively changing through the incorporation of elements from other disciplines. For example, with the rapid development of information technology, the study of digital museology is becoming a trend. Many virtual museums are under construction, allowing more people to see collections through the Internet and thus improving the efficiency with which knowledge is conveyed. Documenting collections with digital technologies can also help with the organization of museums and the preservation of collections.

The Loss of Stenography as a sub-discipline
Stenography, the practice of shorthand, is a method of abbreviated writing that allows dictation to be taken in real time. Classes in stenography were highly respected and common in the mid-1900s, since shorthand was considered a valuable skill for multiple careers, such as police work, journalism, and law. These classes also became gateways for women to enter the clerical workforce in the '50s. Since there have been classes with students and teachers, and academic writings on the importance and history of shorthand, I am inclined to state that it was once a discipline, or perhaps a sub-discipline of English language studies.

Shorthand has played an important role in documenting history since perhaps around 400 BC, as implied when Diogenes Laërtius stated that Xenophon “was the first who took down conversations as they occurred”. Through historians’ analyses of the different branches of shorthand, we have been able to translate some of the works of Cicero, who is known as one of the greatest orators and who also introduced shorthand writers into his senate house; Samuel Pepys’ diary, which gave us an insight into English politics and the daily life of a Member of Parliament in the 1600s; and even some accounts of Sir Isaac Newton’s works, which gave way to some of the greatest advancements in mathematics and physics.

Whilst there are still a few people trained in shorthand, it is very rarely used in today’s society. Shorthand is no longer a desirable or necessary skill in almost any profession. So, what changed? What has replaced the need for shorthand? Starting with the acoustic engineer Homer Dudley of Bell Laboratories in the 1930s through to commercial dictation software such as Dragon Dictate in the 1990s, automated speech recognition has developed from being able to process short sounds to handling fluent and sophisticated speech covering a wide range of dialects and languages. This development was accelerated by WWII, when Dudley focused his efforts on finding a secure method of sending voice transmissions.

Now in most computers, phones, courtrooms, offices - and even homes, with the likes of Amazon’s Alexa - we have highly accurate automated speech recognition devices and software that can record and take dictation with great speed and minimal clarification required. There is no need for shorthand anymore. The emergence of, and developments in, the disciplines of artificial intelligence, technology, and acoustic engineering have made shorthand redundant and have replaced it as a discipline in the modern day.

The History of Art Conservation as a scientific discipline
Conservation of art is a multi-disciplinary study incorporating fine art, chemistry and scientific techniques. It has only emerged as a discipline in the last century, alongside developments in science. Prior to this scientific revolution, conservation of art was considered a craft, and the focus was placed on cleaning and repairing rather than on scientific methods of conserving artwork.

The discipline of art conservation, and philosophies around its intentions, came to the fore in the late eighteenth and nineteenth centuries. During this time there was a change of emphasis, looking beyond the mere physical rectification of the artwork or artefact to its material history and the future value of its cultural heritage.

One of the first notable conservation theorists was John Ruskin. In his 1849 book 'The Seven Lamps of Architecture', Ruskin argued against repairing and rebuilding old gothic buildings, introducing the idea of 'trusteeship'. His views countered those of the contemporaneous architect Eugène Viollet-le-Duc, who believed that these buildings should be restored and kept in the best condition possible. The two men thus presented opposing approaches to conservation.

Harold Plenderleith, in his 1998 journal article 'A History of Conservation', attributed the origin of scientific conservation to post First World War Britain. Historical artefacts from the British Museum, which were temporarily held in the London Underground system, showed signs of serious damage. The scale of the subsequent operation to restore and preserve these artefacts was unlike any previous restoration project and, for the first time, was assisted by the Department of Scientific and Industrial Research. An emergency laboratory was set up under the instruction of Alexander Scott. This laboratory was officially incorporated as a Department of the Museum in 1931.

By this stage, more sophisticated scientific equipment and examination techniques were being used as a means to study artworks and artefacts. One such example is X-radiography, a technique used to examine the composition and condition of paintings and various objects. X-radiography was championed by Edward W. Forbes, an art historian and director of the Fogg Art Museum between 1909 and 1944.

By 1950, the science of art conservation had developed further with the founding of the International Institute for the Conservation of Museum Objects (renamed in 1959 the International Institute for Conservation of Historic and Artistic Works).

Innovative, science-based conservation techniques have continued to evolve. One example is the newfound use of nanotechnology, first applied in the restoration of the Brancacci Chapel in Florence, where a microemulsion was used as an alternative to solvent cleaning for the removal of beeswax. As a scientific discipline, art conservation now has to take into account both the health and safety of conservators and the environmental impact of conservation processes.

The birth of computer science
Though advances in computing go back to the work of Charles Babbage and Ada Lovelace in the 1830s, the emergence of computer science as a discipline occurred much later. Before it was recognized as a discipline in its own right, the mathematical foundations of computing were explored within mathematics. In 1928, the Entscheidungsproblem challenge was posed by the German mathematician David Hilbert. It asked whether there exists an algorithm that can take a statement and a set of axioms as inputs and then determine whether the statement is true or false, i.e., whether it follows from the axioms. This challenge in the field of mathematical logic prompted the work of Alan Turing, who set out to prove that such an algorithm could not exist. Turing broke computations down step by step and explained how these steps could be executed by a theoretical machine, defining algorithms as computations that could be carried out by such machines. These machines were later termed Turing machines by Alonzo Church in 1937. Turing’s work on the theory of computation within mathematics is considered the foundation of the theory behind modern-day computers.
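The step-by-step decomposition Turing described can be sketched in a few lines of code. The following Python snippet is an illustrative simulation only (the rule format and function name are this sketch's own conventions, not anything from Turing's paper): the machine repeatedly reads the symbol under a head, writes a symbol, moves left or right, and changes state, halting when no rule applies.

```python
def run_turing_machine(rules, tape, state="q0", blank="_", max_steps=1000):
    """Simulate a one-tape Turing machine.

    `rules` maps (state, symbol) -> (new_state, write_symbol, move),
    where move is +1 (right) or -1 (left). The machine halts when no
    rule matches the current (state, symbol) pair.
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in rules:
            break  # no applicable rule: halt
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += move
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1))

# A two-rule machine that inverts every bit, then halts on the blank.
invert = {
    ("q0", "0"): ("q0", "1", +1),
    ("q0", "1"): ("q0", "0", +1),
}
print(run_turing_machine(invert, "10110"))  # -> 01001
```

Despite its simplicity, a table of such rules is enough to express any algorithm in Turing's sense, which is why this model underpins the theory of computation described above.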

The formalization of computer science as a discipline in its own right occurred in the 1940s. Spurred by the work of Alan Turing and the construction of the Turing-complete computer ENIAC in 1945, computing rose to prominence as an academic interest. In 1961, George E. Forsythe coined the term "computer science". Forsythe identified computer science as the study of "the theory of programming, numerical analysis, data processing, and the design of computer systems", setting its disciplinary identity apart from fields it was historically intertwined with, such as mathematics and engineering. In this period there was increased demand for instruction in computing, and the first degree programs and academic departments dedicated to computer science were established. In 1962, the first computer science department was officially formed at Purdue University, and in 1965, Richard Wexelblat of the University of Pennsylvania became the first person to receive a PhD from a computer science department.

Until the mid-1980s, the focus of the computer science field was on advancing the power of computers and increasing their efficiency and effectiveness. However, as the personal computer and the Internet became more ubiquitous, questions about how computers interact with other fields of study arose and demanded academic attention. As computers became tools of study and fundamentally changed how work was carried out across disciplines, ethical discussions about their use began to take place.

Ethics in the computer science discipline
Work in computing has been around for nearly two centuries; however, the discussion around computer ethics is relatively new, having only started in the 1940s. Computer ethics as a concept originated during WWII, when MIT professor Norbert Wiener was investigating the science of information feedback systems that enabled different parts of a cannon to communicate with each other. This new branch of science, which Wiener termed “cybernetics”, would later influence artificial intelligence. In 1950, in his book The Human Use of Human Beings, Wiener warned against the negative consequences of technology on society and encouraged the development of technology that enhances human well-being.

There was little academic interest in this new area of applied ethics until the mid-1960s, when there was a series of computer-enabled bank robberies and privacy invasions by authoritarian government agencies. As the social consequences of technology became apparent, academic interest in computer-related ethical issues grew. The term ‘computer ethics’ was coined in 1976 by Walter Maner, when he noticed that the use of computers in the medical field created a whole new branch of ethical considerations. He defined this new branch of applied ethics as the study of ethical problems “aggravated, transformed or created by computer technology”, and made efforts to encourage the teaching of computer ethics at university, developing courses and conducting workshops. By the early 1980s, the concept of computer ethics had caught the attention of other scholars, who began to contribute to the new field.

Recently, infamous controversies such as the Cambridge Analytica scandal, which gained widespread mainstream attention, have only emphasized the importance of including computer ethics education in the computer science discipline.

The History of Neuroscience as a Discipline
Neuroscience focuses on the study of the body's nervous system, especially the brain.

The Slow Study of Neuroscience
It is believed that the Egyptians, around 1700 BC, were the first to study the brain and its function. Indeed, the Edwin Smith Papyrus contains evidence of their early knowledge of several parts of the nervous system. Over the centuries, many scientists continued to investigate this complex part of the human body, but they were greatly limited by a lack of adequate materials. In 1543, Vesalius published an influential anatomical textbook that included detailed study of the brain. During the 18th century, philosophers reflecting on the link between the mind and the body encouraged further research, and scientists started to understand the concept of neurons and to perform neurosurgeries.

20th Century: Neuroscience's rise as a discipline
While neuroscience was once limited to the study of the nervous system, in the 20th century it became a discipline in its own right, encompassing other fields of study such as computer science, biology, chemistry, medicine, psychology, linguistics and mathematics. Scientists now also work on the link between the mind and human behaviour, and on the influence of our environment on the brain. Up to 13 different branches of neuroscience have appeared, from cognitive neuroscience to neurophysiology. The rise of neuroscience owes much to the discoveries of nuclear magnetic resonance in 1938 and functional magnetic resonance imaging in 1992, which allowed neuroscientists to carry out more precise research. Progress was made on the structure and activity of the brain, and many Nobel Prizes were awarded. The term “neuroscience” finally appeared in the 1960s, and in 1969 the Society for Neuroscience was founded, marking the emergence of this new discipline.

The Society for Neuroscience and Interdisciplinarity
Neuroscience has been interdisciplinary from the start, as shown by the creation of the Society for Neuroscience (SfN) in 1969, which marked the institutionalisation and recognition of neuroscience as a formal discipline. Indeed, the Society was a product of growing scientific interest in the nervous system across a blurred set of disciplines, along with new technology, which led the Committee on Brain Sciences (CBS) to create this inclusive new organisation. The SfN's interest in attracting scientists from different but linked fields was apparent in the thought given to the naming of the society itself. The CBS wanted the name to represent the wide scope of disciplines it sought to welcome, encompassing behavioural, biological and psychiatric aspects of neuroscience among others, without suggesting any hierarchy between them. There was debate, for example, as to whether the name should include the term “brain”; it was decided that such a name would not be representative of the whole community the Society was aimed at, its most urgent aims being to gain members and funding for new interdisciplinary research on the nervous system.

Neuroscience and Philosophy: The Birth of New Disciplines
The development of neuroscience as a discipline also led to the emergence of new disciplines in the late 20th century, with particular links being made between neuroscience and philosophy. Indeed, new studies and research about the brain, which is recognised as being associated with consciousness and “the self”, were of interest to philosophy in exploring corresponding old philosophical questions. A new discipline known as neurophilosophy thus emerged, accompanied by other new fields such as neuroethics, which is linked to both neuroscience and philosophy. Links with marketing also developed, with neuromarketing itself emerging as a new discipline.

The Future of Neuroscience
Today, neuroscience has a bright future ahead. It is playing a key role in the understanding of many diseases, such as Alzheimer's and Parkinson's, as well as of mental disabilities. In association with other disciplines such as computer science, neuroscientists also contribute significantly to artificial intelligence, a field that tries to recreate natural intelligence and the structure of the brain in machines.

Introduction
Philosophy takes its origin from Ancient Greece and the word φιλοσοφία, which transliterates as philo + sophia, meaning the ‘love or desire for wisdom’. It is important to note that the meanings of the words ‘desire' and ‘love’ for the ancient Greeks were slightly different from our definitions today. For them, ‘desire’, or ‘love’ in this case, described things that were unattainable. Philosophy was a state of great wisdom they would pursue, getting as close as possible to it while never being able to fully achieve it.

In that regard, philosophy tried to answer the great questions of our world: what it means to be human, what the world is made of, what the different elements are, what truth and reality are, and so on. While many of these questions can never truly be answered, philosophy teaches every one of us to develop our thinking skills and to challenge what we know.

Early Days
Unlike many other fields of study, where the discipline was practised long before it was taught at school or university level, philosophy has always closely interlinked practitioners and tutors. One of the first well-known philosophers, Socrates, is famous not only as the father of Western philosophy but also for his status as a teacher of the discipline. He himself never actually wrote any text, but focused on transmitting his knowledge and teaching his disciples the fundamentals of philosophy. His student Plato expanded this idea of teaching the discipline by creating the Platonic Academy, where students could come to study and think among other scholars and philosophers. The trend continued with Plato’s student Aristotle opening his own school, known as the Peripatetic School or Lyceum, and with Epicurus opening another school in his garden, known as Epicurus' Garden. The same principle held in the Far East, where Confucian schools were created in honour of Confucius to keep his teachings alive and reflect on his ideas.

Through the centuries until today
Philosophy as a discipline has changed very little since its birth around 600 BCE in Ancient Greece. Because of the nature of the subject, many of the questions asked by Plato, Epicurus, or Aristotle are very similar to questions faced by humans through the centuries and even today. As a result, the way of teaching philosophy has remained more or less the same. While in Europe philosophical writings became less common after the decline of Greece's influence, owing to a decline in literacy, Greek philosophy continued to be studied in the Islamic world and translated into Arabic. Al-Kindi, a famous philosopher and mathematician, is known to have translated and taught many philosophical writings around the 9th century AD in the Library of Baghdad, known as the House of Wisdom. In Europe, the discipline continued to be practised and taught, but now with a strong Christian influence, as in the Italian Accademia Platonica and the resurgence of Plato's ideas in 15th-century Florence. In England, the department of philosophy made its debut at the University of Oxford at the start of the 1620s, marking the resurgence of philosophy being taught formally. Ever since, the discipline has continuously expanded to most universities around the world and is greatly respected in academia. Today UCL hosts one of the biggest philosophy departments in the UK.

History of Law as a discipline
The Cambridge Dictionary defines law as the system of rules of a particular country, group, or area of activity. It is a highly interdisciplinary field, as it sets the rules of our societies and has therefore, in some form, always existed.

Origins
The origins of law as an academic discipline may go back to the 5th century BC in Athens, where citizens could be taught related subjects such as the philosophy of law and argumentation. Around 160 AD, Gaius, a Roman jurist, wrote the Institutes, a teaching book inspired by Roman law and Greek philosophy and intended for future lawyers. Law has different roots in different places. In England, for instance, the famous universities of Oxford and Cambridge followed the new system of adjudication introduced after the Norman Conquest, which became the Common Law. It can be argued that law as an academic discipline has existed since the creation of the very concept of the university: it was taught at the University of Constantinople (founded in 425), considered by some the first university in the world.

Evolution of Law since the 16th century
By the 16th century, European countries had started to realize that law had to change alongside history, and many countries wrote their own codes, including Grotius's Introduction to Dutch Law (1619-1621) and, later, Napoleon's Civil Code (1804). Around the 18th century, as law-based schools were created, a wave of criticism of legal education emerged along with new ideas. Some, like Sir William Blackstone, tried to bring about change, but no significant reform occurred before the 19th century. The US model is now the most commonly used system for teaching law, offering its students both an academic and a professional approach.

The History of History as a Discipline
It is difficult to pinpoint the emergence of history as a discipline because engagement with the past has always been a fundamental aspect of human culture. Chronographic texts such as king lists and annals can be classified as historical documents, and these can be found among the archaeological evidence of civilisations as early as Ancient Egypt and Mesopotamia. One artefact that can be classified as evidence of historical thinking among ancient civilisations is the Palermo Stone, a fragment of an Ancient Egyptian stele inscribed with a list of five dynasties of rulers (c. 2925–c. 2325 BCE) and a year-by-year record of significant events. Such texts relied on written language to record the past, so although humans may have engaged with their past to a certain extent through oral tradition and storytelling, the elementary signs of the emergence of history as a discipline in a recognisable form can arguably be traced back no further than the invention of writing.

Ancient Greece
The Ancient Greeks are credited with significant contributions to the discipline of history in the traditional sense. They helped cultivate the discipline particularly through their contribution to the genre of historical writing, as they developed forms of recording the past that permitted more sophisticated accounts than the chronographic texts of earlier civilisations. Although the epic poetry of texts such as Homer's Iliad and Odyssey conveys the importance the Greeks placed on the stories of their ancestors, it is Herodotus' Histories that is widely regarded as the foundational text of the genre of historical writing. It is with Herodotus that the word "history", deriving from the Greek historia, meaning "inquiry", first appears in the context of the study of the past. Herodotus set out in the Histories to investigate the events of the Persian Wars and to apply this knowledge to an understanding of his contemporary world. The examination of cause-and-effect relationships and the idea that the past helps illuminate the present are features of Herodotus' work that still play a defining role in the discipline of history.

Thucydides was another prominent historian to come out of Ancient Greece. Influenced by Herodotus, his History of the Peloponnesian War recorded the events of a war in a factual and analytic manner. However, Thucydides differed from Herodotus in that he was a contemporary of the period he wrote about. Thucydides was highly critical of Herodotus' attempt to record the nature of foreign places and societies, as well as events that he had not personally witnessed. Thucydides' criticisms raised the issue of reliability in history, which remains an important consideration within the discipline today. Other historians were inclined to agree with Thucydides' suspicion that Herodotus was more of a storyteller than a credible source of knowledge about the past. As a result of Thucydides' influence, for much of antiquity historians did not concern themselves with the direct acquisition of knowledge of the past, as this was perceived as futile; instead, the practice of history focused on building on previous historical work and on recording contemporary events to which the historians had themselves borne witness. Therefore, although the origins of the discipline are evident in the analytical perspective the Ancient Greeks applied to studying the past, as well as in their focus on core historical issues such as cause and effect and reliability, a large part of what today constitutes history was neglected.

The Enlightenment
The emphasis on reason and empiricism that dominated the pursuit of knowledge during the Enlightenment period had a significant impact on the discipline of history as key thinkers pushed for a more scientific approach to the study of the past.

History of Animal Magnetism
Though Edgar Allan Poe’s Facts in the Case of M. Valdemar was a work of fiction, when it was published in 1845 many saw it as a scientific report about Mesmerism, also known as Animal Magnetism. Mesmerism had been a popular discipline from the mid-eighteenth century through to the beginning of the nineteenth century, emerging with the publication of De planetarum influxu in corpus humanum ("The influence of the planets on the human body") by Franz Mesmer, a German doctor. By the mid-1800s, new phenomena such as electromagnetism and hypnosis were gaining in popularity. As these were on much firmer scientific footing, Mesmerism became side-lined and was no longer considered a credible theory.

According to the Encyclopædia Britannica, Animal Magnetism is a “presumed intangible or mysterious force that is said to influence human beings”. It is a form of alternative medicine which gained popularity throughout Europe from the mid-18th century to the beginning of the 19th century. Mesmer defined it as the ability to heal others thanks to a natural fluid, a force present within ourselves which links man, the earth and the universe.

Although Animal Magnetism was never endorsed by the scientific establishment, many works and reviews were published on the subject, such as Puységur's Du magnétisme animal considéré dans ses rapports avec diverses branches de la physique générale (Animal magnetism considered in its relations with diverse branches of general physics) in 1807, or, in 1814, the Annales du Magnétisme (Annals of Magnetism) by François Deleuze, a review of European experiments at the time. Furthermore, in 1782 the "Société de l'Harmonie Universelle" (Society of Universal Harmony) was created to ensure the future of the doctrine, which was under threat from academics and the French government for lacking scientific evidence. Animal Magnetism was also taught in several universities in Germany and in England, attracting public interest until the 1840s as works and reviews were translated into many different languages.

Promoters of Animal Magnetism were keen to ensure that the discipline was not seen as lacking a scientific approach; instead, they advocated a new form of rationality that set aside the distinction between the possible and the impossible. With the advances of medicine and physics during the 19th century, the theories underlying Animal Magnetism were disproved, and Mesmerism began to be discredited.

Today, Animal Magnetism has lost its past influence and is considered an obscure alternative medicine, often associated with hypnosis. In recent years, however, the Law of Attraction theory has gained momentum, picking up on a concept central to Magnetism.

The History of Feminism and Feminist Literature as sub-disciplines
Feminist Literature can be seen as a sub-discipline of literature, and it also plays an important role in gender studies, the discipline devoted to studying feminism.

The beginnings of feminist literature
The roots of feminist literature lie in convents (around the 11th century), as religious women were among the only women taught to read and write, unlike other women of the time, who were destined for marriage. The first feminist authors were therefore religious women, who were able to begin questioning the social hierarchy of their (European) societies, like Jane Anger, who wrote in 1589 that women, contrary to what the vast majority of Europeans believed during the 16th century, might actually be superior to men. Indeed, her interpretation of the creation of Eve in the Bible was that, since Eve was created from Adam's rib whereas Adam was created from dirt, Eve, and therefore women, represented a better version of the human being.

The development of feminism outside the religious context, however, was much harder, as women intellectuals were not tolerated unless they were seen as gifted with a "divine inspiration", as in the earlier religious context.

First Circles of Women Intellectuals
Circles of women intellectuals nevertheless gradually appeared, such as that of Mary Astell, an author of the 17th and 18th centuries. Through the literature emerging from these circles, women were encouraged to develop their own judgements and ways of thinking, as in Astell's A Serious Proposal to the Ladies, for the Advancement of Their True and Greatest Interest (1694), which called for the intellectual emancipation of women and challenged contemporary ideas about women's education.

The birth of a real “movement”
The emergence of a real movement became clearer at the end of the 19th century, centred on the education of girls, the legal situation of married women, and women's lack of access to employment. Another cause then became central to the movement: the right to vote, and with it the term "suffragette". One of the first countries to give women the right to vote was New Zealand, in 1893. The majority of European countries followed at the end of the First World War: Britain and Germany in 1918, Austria and the Netherlands in 1919, and the United States in 1920. Since the introduction of women's suffrage in Saudi Arabia in 2015, women have the right to vote in every country where elections take place.

Importance of World War I
The First World War was a key moment for feminism. As women engaged in the war effort, the world realized both the possibility of and the need for women contributing to society in the world of labour as well. The war challenged the idea of women's inferiority anchored in Western societies and made it hard to maintain that women were unfit to vote. Finally, it allowed women to enter the "public arena" from which they had been excluded for centuries. The feminist movement and its literature continued to evolve through the 20th century with 'second-wave' feminism, and the UN established the Commission on the Status of Women in 1946, proof of the movement's progress.

Diverse Feminism(s)
As the movement gained importance and reached women across the entire world, the necessity for a diversity of feminisms, answering the different issues faced by different women, became clear. This need was expressed, for instance, by Simone de Beauvoir in The Second Sex (1949), which explores a wide variety of categories of women in chapters on the girl child, the wife, the mother, the prostitute, the narcissist, the lesbian, and the woman in love.

This necessity to recognize different forms of feminism was also expressed later by Ien Ang, who argued in 1995 that the lack of understanding between different feminists should be accepted rather than suppressed in the name of an illusory, unified feminism.

Studying Feminism
The increasing importance of Feminism in our world is also shown by the growing number of university courses focused on the discipline. For instance, the Gender Studies degree at SOAS focuses on issues from all around the world, emphasizing the importance of the diverse feminisms mentioned earlier. Additionally, the master's degree at Sussex includes feminist research in its programme, whereas the University of York focuses more on the cultural, historical, political, and sociological aspects, offering a master's in "Women's Studies" as well as a master's called "Women, Violence and Conflict". All these different programmes are proof of the expansion of feminism's importance in our society, and reflect the diversity of approaches to feminism.

Economics as a discipline

Definition
According to the Cambridge Dictionary, economics today is defined as "the study of the way in which economies work, for example, the way in which they make money and produce and distribute goods and services". Economics is a discipline covering not only international issues, through macroeconomics (imports/exports, the valuation and devaluation of currencies, tax barriers) and international organizations (the IMF, for example), but also the actions of individual people or firms, through microeconomics (supply and demand, consumer choice...). It is therefore a broad discipline connected to many others, such as Sociology, Politics, and International Relations.

Origins
Today, many consider Adam Smith (1723-1790) the founder of the discipline of economics. He was the first to articulate the now well-known notion of a liberal economy: he was against any intervention by governments in the market and believed that, in a free market, supply and demand would balance themselves on their own. This prompted many other economists to emerge, such as Marx, Malthus, and later Keynes, who either elaborated on or contested Smith's liberal point of view.

However, many argue that economics was studied long before Adam Smith, in Ancient Greece by philosophers such as Hesiod (who discussed the scarcity of resources and the concept of competition), Xenophon (who explored management and leadership), and Aristotle (for whom money is a substance with a telos, a unit that individuals have devised to supply a measure on the basis of which just exchange can take place). Furthermore, many of Adam Smith's ideas were developed in response to the French writers who explored Mercantilism, the doctrine that countries should grow richer through exports, pushing them to develop, become ever more productive, and grow less dependent; these writers also invented the notion of protectionism. The origins of economics therefore remain a matter of debate; nevertheless, Adam Smith is still known as the creator of modern economics, having united and expanded on the theories expounded before him.

Economics Today
Today, economics has become more mathematical and less purely theoretical. It is now possible to test many theories of the past through calculations and graphs. Furthermore, with the important advances technology has made, most trades (whether on the stock market or other markets) rely less on the instinct of individuals and more on calculations and predictions. Economics is thus now closely related to the discipline of mathematics and is bordering on science.

Moreover, the theories that remain most important are those of Adam Smith and John Maynard Keynes, used by many countries (Marxism lost much of its importance towards the end of the 20th century with the fall of the USSR). Smith's liberal view is applied mostly during times of economic prosperity, but it has been losing ground with the multiple crises that have hit the Western world. Keynes, by contrast, promotes government intervention during times of crisis, through the lowering of interest rates and the funding of firms in difficulty (the exact opposite of Smith's theory). The USA has used this method several times (the 1929 crisis, the 1970s oil shocks, the 2008 Wall Street crash). With the COVID-19 crisis, this theory has never been more applicable, as almost every country has provided aid to firms and individuals, for example through direct payments in place of employer wages, reductions in taxes, or delays on debt repayment.

Emergence of Alchemy
Alchemy emerged before the common era as a proto-scientific philosophical tradition. It is difficult to trace its evolution, but it is thought to have emerged independently at three different points: first in Hellenistic Egypt, second on the Indian subcontinent, potentially as early as the second millennium BCE, and third in China, potentially as early as the fourth century BCE, though the evidence for the latter is anecdotal, and linguistic study contradicts the claim, as there was no Chinese word for gold at that time. In Egypt, the earliest attributable author is Zosimos of Panopolis, who published his treatises c. 300 CE and in them emphasised the influence of Egyptian metallurgy and religion on the development of alchemy. Much alchemical knowledge is thought to have been lost when the Roman emperor Diocletian ordered the burning of alchemical texts following a revolt in Alexandria at the end of the third century CE.

Aims of Alchemy
Alchemy involved the production of "noble" metals from "base" metals, a common example being the transmutation of lead into gold (known as chrysopoeia). Across all alchemical schools of thought, the process of creating gold remained central, whether to gain wealth or, especially in the Chinese and Indian traditions, to produce an 'elixir of life' that would grant immortality. The transmutation of metals was also used as an allegory for the spiritual transformation of man.

Modern Alchemy
Alchemy is often thought of as a precursor to modern chemistry, and thanks to its close ties with the Church, which controlled much of European society well into the Industrial era, it existed alongside modern science before its claims could be fully refuted; Isaac Newton devoted much of his pre-industrial-era study to chrysopoeia, even alongside his far more fruitful study of physics. Alchemy has also become an important concept in modern fiction, with the idea of the 'Philosopher's Stone' being a core concept of the first Harry Potter book, and an exaggerated set of alchemical principles ruling the world of the popular anime 'Fullmetal Alchemist'.