User:LGreg/sandbox/Approaches to Knowledge (LG seminar 2020/21)/Seminar 12/History

This is the sandbox page for the issue: History.

The founding of biochemistry as a discipline
Biochemistry emerged as a subject when biology first met chemistry, especially during the 19th century, when new methods and technologies allowed chemists to study the molecules of living organisms. At this point, biochemistry was referred to as physiological or pathological chemistry.

However, the founding of the discipline of biochemistry, a mix of two existing disciplines, can itself be marked by the coinage of the name "biochemistry", a contraction of "biological chemistry". Although several sources attribute the creation and first use of this word to Carl Neuberg, one of the first and most important German biochemists, in 1903, the first real mention of "Biochemie" ("biochemistry" in German) can be found in the foreword of the first issue of the journal now called "Biological Chemistry", in January 1878. This journal was published by Felix Hoppe-Seyler, a German physiologist and chemist, and was originally titled "Zeitschrift für Physiologische Chemie" ("Journal of Physiological Chemistry").

The establishment of the discipline
In the late 19th and early 20th centuries, biochemistry really began to establish its roots as a discipline throughout the world. The following examples highlight some key dates of its establishment in the UK.

In 1902, the first department of Biochemistry in the UK was founded at the University of Liverpool. UCL is also the seat of the first Research School of Biochemistry in the UK. It was founded by Dr W.D. Halliburton, who then contributed to the creation of the first Society of Biochemistry with the help of J. A. Gardner and R. H. A. Plimmer in January 1911.

How has it changed?
Hikmet Geckil identified two advances that really pushed biochemistry forward as a discipline. The first is the discovery of the role of enzymes as catalysts, using yeast as an example and proving that enzymes are proteins. The second is the in-depth study of nucleic acids, ranging from the discovery of DNA as the genetic material of the cell (1940s) to its use in cancer research and brain sciences (2010s).

Hence, biochemistry developed from the study of extracellular chemistry (e.g. the chemistry of digestion) to intracellular chemistry. A key moment that may have contributed to this transformation was when Sanger, Berg and Gilbert received the Nobel Prize in Chemistry (1980) for their work on base sequences in nucleic acids, which aided the research that led to PCR. Molecular biochemistry in particular is becoming more and more favoured due to the clinical use of PCR, which is used to diagnose genetic diseases such as cystic fibrosis and to inform treatments such as gene therapy.

Who decides what the subject matter or approach of a discipline should be?
Combining everything together, biochemistry studies living processes and metabolism at a molecular level and applies this in medicine. This includes the roles of various structures, ranging from large complexes such as carbohydrates to smaller compounds like vitamins.

Therefore, looking at history, we can see that, Carl Neuberg aside, it is mainly the collection of successful research publications over time, such as Nobel Prize winner Sanger's experimentation across chemistry and biology with enzymes, that defines the discipline. As biochemistry originated from chemistry, it is the theories and methods of chemistry that are applied in an interdisciplinary way to solve problems in biology, hence why only the "bio" is added on.

History of Medical Physics
Medical physics, the discipline where applied physical techniques meet medicine, emerged from the linking of the two previously existing disciplines – physics, covering mechanics amongst other things, and medicine, covering the diagnosis and treatment of disease.

It is difficult to pinpoint an exact moment or incident in time when the discipline was born. Some links between physiology and physics already occurred in ancient Egypt and Greece, when tools were used to measure pulse rate or to detect differences in skin temperature. The first time, however, that a method known from the field of physics was applied to measure something concerning health was the weighing of the human body at the turn of the 17th century. After that, the idea emerged of thinking of the body as a machine explicable by physics and mathematics. This idea was adopted by an increasing number of scientists, who started explaining how the body works fully in mechanical terms. An important step in the discipline's evolution was also the application of electricity to subjects concerning health, once it became clear that electricity affected biological beings.

At the end of the 18th century, the French physician Jean-Noël Hallé became the first professor to teach medical physics, in Paris, which might mark the beginning of medical physics as a discipline in academia. He is also referred to as the "father of medical physics". In 1814 a definition of "medical physics" was published in a dictionary, and a few years later books on the subject began to be released: at first physics textbooks aimed at medical students, and later works devoted specifically to applied physics in medicine.

Medical physics went through a shift in focus at the end of the 19th century: while until then most of the discipline's weight had been put on how physics could explain how the human body works in the first place, in the 20th century it turned to the question of how physics could help to detect and cure diseases. Medical imaging developed and became more and more established, its methods including positron emission tomography, the use of X-rays and magnetic resonance imaging.

The founding of International Relations as a discipline
The roots of International Relations can be traced to at least 460-395 BC, when Thucydides, a Greek historian, chronicled war in his work "The History of the Peloponnesian War". Across its eight books, Thucydides describes the conflicts between the Peloponnesian League, led by Sparta, and the Delian League, led by Athens. While describing them, he also explains his point of view on the war, and to this day he is considered one of the fathers of a dominant International Relations theory: realism.

However, as a discipline, International Relations appeared for the first time in the aftermath of World War One. Indeed, the first Department of International Politics was founded in 1919 at the University College of Wales, Aberystwyth. The first holder of the Department's Woodrow Wilson Chair was Alfred Zimmern, a classical scholar and English historian. Founded in response to the horrors of World War One, the Department was created to help prevent a future war. Its members would search for the causes of war in order to find solutions; once the solutions were established, they would submit them to politicians to stop other conflicts from emerging. In addition, the publication of "The Twenty Years' Crisis" by Edward Hallett Carr in 1939 and "Politics Among Nations" by Hans Morgenthau and Kenneth Winfred Thompson in 1949 sharpened the focus on the discipline. These two publications introduced new concepts and new points of view and quickly became classics in International Relations.

Changes in International Relations
International Relations is a field that changes a great deal over time. Since its origins after World War One, many debates have emerged, creating new theories, new ideas, and new concepts.

Indeed, there are several main debates in International Relations, the first being between the Liberals and the Realists, around the question "How do we end all wars and conflicts?", a question asked just after the tragedies of World War One. Liberals assumed that conflicts could be ended and peace created. On the other hand, Realist scholars claimed that conflict was inherent and endemic in world affairs.

The debates around International Relations also changed during the Cold War. Because the first debate could not provide any solution or explanation as to why another conflict arose, a new question emerged: "If we can't eliminate all conflict, how best to manage it?". Realists and neo-realists said that the best way was to establish a balance of power and for great powers to have dominance over world affairs. Liberals and neo-liberals tended to believe in the creation of institutions to manage relations and emphasised the role of economics and free trade, as these would create links between states and thus fewer opportunities for conflict to arise. The Marxists, meanwhile, focused on economics, as they thought that economic inequalities needed to be resolved in order to have peace and stability.

Since the end of the Cold War, International Relations as a discipline has evolved immensely with society and its development. New topics are analysed, beyond conflict and warfare. As power dynamics shift, International Relations gives more space to theories such as Post-Colonialism. With worldwide societal movements like "Me Too", Feminism is being explored as a major subject. There is a diversification of focuses, different perspectives, and a new profound interest in Ethics.

Who decides how and when the discipline changes?
When talking about a change in International Relations as a discipline, we mean a change of subject matter or methods. It is hard to determine how it changes; it is more a question of why it changes. Indeed, there is no designated organization, corporation, or body that decides. Often, International Relations changes in response to world events: war, social movements, emerging powers... As analysed above, new theories appeared after World War One, World War Two, and the Cold War. However, a few organizations help manage world peace, which is one of the main focuses of International Relations. Founded in 1945, the United Nations is an international organization that counts 193 Member States today. Its focuses are "on the issues confronting humanity in the 21st century, such as peace and security, climate change, sustainable development, human rights, disarmament, terrorism, humanitarian and health emergencies, gender equality, governance, food production...". The United Nations can take action and help countries that find themselves in trouble physically (by sending aid) or economically (by sending money).

History of Psychosurgery
Psychosurgery, also known as psychiatric neurosurgery, is the practice of treating mental disorders by means of surgical intervention, traditionally by severing or debilitating specific areas of the brain. The discipline lies at the intersection of neurosurgery and psychiatry. The brain and the mind are closely linked; therefore, it seems intuitive to look to the former to solve problems with the latter. Nevertheless, psychosurgery is not currently widely used: it is under-researched and regarded with caution due to its unethical history.

Is Psychosurgery a Discipline?
Despite the controversy, the practice used to occupy a place in academia, with its own research and thought leaders such as Egas Moniz. While many of these people tended to be neurologists or psychiatrists rather than psychosurgeons specifically, that can be attributed to the rapid development and decline of the discipline, which began gaining traction in the 1930s and started falling out of public favour in the 1950s. In the early stages of a discipline, it is not uncommon for experts to come from other, already established fields. Moreover, as the different fields of medicine became more specific, where psychosurgery persisted, it split from neurosurgery, as the former dealt with mental disorders whilst the latter addressed disorders of the nervous system. For example, some establishments such as the Regional Neurosurgical Centre of the Brook General Hospital in London have psychosurgical units. As such, psychosurgery can be regarded as a discipline, albeit a niche one.

The History of Psychosurgery
The roots of psychosurgery go back as far as the prehistoric era, with trepanations that took place in the Neolithic period being some of the earliest recorded surgical interventions. These procedures involved creating a hole in the bones of the skull, which was believed to relieve pressure.

Modern psychosurgery was founded by Gottlieb Burckhardt, who conducted experiments on six subjects in 1888. He described three of the outcomes as effective; one of the other subjects did not survive the intervention. Burckhardt abandoned his studies as his ideas were not received favourably. Various surgeons conducted similar experiments, the most famous being Egas Moniz, who popularised the practice of ablating the frontal cortex, which later became known as lobotomy. His colleague, the neurosurgeon Almeida Lima, performed over a hundred early experimental lobotomies using a rod with a wire loop. The trials were considered a success after psychiatric and subjective examinations, although these were poorly recorded and many patients had to be returned to the asylum, their further fate unknown. However, this evidence was enough to help Moniz popularise the procedure, and he was awarded the Nobel Prize in Physiology or Medicine for this work in 1949. The procedure caught on in multiple countries. The American neurologist Walter Freeman and the neurosurgeon James Watts developed a method called the transorbital frontal lobotomy, in which, instead of incisions being made in the skull, the instrument was inserted through the eye socket. This made the procedure easier to perform; consequently, it came to be used by people with no surgical training and was promoted as an easy solution to the widespread, expensive, and little-understood problem of mental disorders.

Lobotomy
In modern times, the word "lobotomy" conjures up the ideas of barbaric and outdated medical practices, and rightly so. The frontal lobes of the brain are the region our personality stems from and the control center of our behaviour and emotions. They are responsible for expressive language and voluntary movement, as well as higher-level executive functions such as planning, initiative, self-monitoring, etc. The severing of these areas of the brain in most cases causes a decline in cognitive abilities. Some patients died as a result of the surgery, such as Josef Hassid, one of the greatest classical musicians of the 20th century. The mortality rate of lobotomies in the 1940s was 7.4%. In some cases, the subjects took their own lives following the surgery. Many survivors suffered disabling side effects, including paralysis, seizures and mutism, some reduced to a vegetative state. In most cases, the patients suffered effects such as stupor, a loss of personality, memory problems, diminished intellect and the inability to function independently. The doctors at the time described the procedure as a "surgically induced childhood". The effects were believed to wear off in time, but in most cases, they lasted for the rest of the patients' lives, although some made recoveries. Notably, the loss of cognitive function was not so much a side effect of the procedure as it was the goal, as it caused the reduction in symptoms and made the patients easier to manage for caretakers and asylums. Freeman suggested training adult patients with rewards and punishments as one would a child in order to achieve the desired behaviour.

The subjects of these surgeries also raise ethical concerns. Women made up 60% of lobotomy patients, and Freeman considered black women specifically the best candidates for the procedure. Lobotomies were sometimes performed on difficult children, even younger than six, with the consent of their guardians. Adult patients could also be subjected to the procedure without their consent, at the will of the doctors and guardians. While the original purpose of lobotomies was to make violent patients easier to manage and care for, they gained popularity and were prescribed as a treatment for disorders such as depression and PTSD, as well as a treatment for chronic pain. The procedure was often performed with electroconvulsive treatment used as anesthesia, and many of the doctors administering the treatment were unqualified.

Lobotomies fell out of favour in the 1950s as their dangerous effects became better understood and as new treatments such as pharmacologic therapy were developed. Other forms of psychotherapy were researched, although none were ever implemented in the same widespread manner. In the 1970s, the question of creating a less violent society by means of altering the limbic system was brought up but was met with much backlash due to ethical concerns, yet again contributing to the historical baggage of psychosurgery as a discipline.

Psychosurgery Today
Current methods and approaches to psychosurgery differ from those described above. Lobotomies have been phased out, the field is heavily regulated, and new techniques have replaced them, specifically anterior cingulotomy, subcaudate tractotomy, limbic leucotomy and anterior capsulotomy. These are much more targeted than lobotomies, and some of them use electrodes to reach specific areas of the brain. These methods have been shown to help patients with OCD and other mental disorders while having significantly reduced side effects. Some of these procedures, such as anterior cingulotomy, supposedly have very few side effects, while others, like subcaudate tractotomy, have temporary side effects (with no diminishing of cognitive abilities demonstrated). However, research in the field is sorely lacking, and the studies conducted into these techniques are likely to be unreliable; for instance, there have not been any placebo-controlled trials. After its fall from grace, the field of psychosurgery currently exists on the margins in the form of small niche studies and contradictory evidence. The discipline could hold great promise as a valuable tool in medicine, but all that currently exists is a dark history and an uncertain present. Psychosurgery as an idea holds merit in itself, but the irresponsible, profit-driven and unethical approach with which it has been handled throughout history has shaped it into a fringe research topic at best, and a life-ruining horror at worst.

What are social sciences
Social sciences began with the need to understand human behaviour across multiple aspects of society: how humans interact with each other and how they make decisions. The social sciences are a broad group of disciplines that includes anthropology, sociology, social psychology, political science and economics.

The social sciences go back to Ancient Greece and its efforts to understand human nature, morality and the social organisation of the state and its politics.

The term 'social sciences' was first used by William Thompson in 1824 in his book An Inquiry into the Principles of the Distribution of Wealth Most Conducive to Human Happiness; applied to the newly proposed system of voluntary equality of wealth.

From then on, the discipline grew, mainly through the work of the Enlightenment philosophers, including Denis Diderot, Baruch Spinoza, John Locke and Montesquieu.

Slowly the social sciences entered universities, at first through research and then through teaching, with master's students being taught at the end of the 19th and the beginning of the 20th century. The end of World War II marked an increase in demand to study the social sciences, and especially political thinking, in order to better understand the world.

History of Economics
Economics can be defined as the study of human behaviour when it comes to unlimited wants in a world where there is a limited supply. Economics probably began in Ancient Greece, when the poet Hesiod wrote economic precepts in his poetry. During the Middle Ages, thinkers like Thomas Aquinas discussed the importance of private property. However, economics was not a real discipline, or even a common subject of discussion, before the 18th century.

At the time, people talked about 'political economy' to understand how production and consumption were linked to the making of policies. The term 'political economy' gave way to 'economics' thanks to the textbook Principles of Economics by Alfred Marshall, with the rise of mathematical thinking as a way to understand economics better.

The first systematic study of economics dates back to the 16th century with the School of Salamanca in Spain.

Why is economics considered a social science?
Economics is considered the most scientific of the social sciences. Indeed, the tools used to understand human behaviour in economics are mainly scientific, with formulas, models, theories and hypotheses used to prove its points. Likewise, studying economics at university or school usually requires a strong background in mathematics.

However, economics is considered a social science because its main purpose is to analyse, using quantitative methods, human behaviour when it comes to having unlimited wants in a world where there is a limited supply. Economics sits at the boundary between the social sciences and the hard sciences, as it uses hard-science methods to study human behaviour.

History of Anatomy
The anatomy of the human body has interested scientists for a very long time. Therefore, what we know today has notably changed and evolved since the beginning.

The first recorded school of anatomy was based in Alexandria, Egypt, from approximately 300 BC to the 2nd century AD. Students were taught by two famous anatomists, Erasistratus and Herophilus, who would explain the structure of the human body by dissecting it publicly. These two anatomists were reportedly the last to perform a human dissection for educational purposes.

Galen (129-216 CE) was one of the most renowned anatomists in history. For centuries, scientists used Galen’s work to learn about human anatomy. However, during his time, the dissection of a dead body wasn’t legal. Consequently, all of his discoveries on human anatomy were based on the dissection of animals, such as cats, dogs and monkeys.

During the Renaissance, people's interest in human anatomy grew much more, and anatomists were permitted to perform human dissections. Andreas Vesalius, a physician, started comparing his discoveries to the work of Galen. Unsurprisingly, there were many inaccuracies in Galen's books, since he had relied on animal dissection. Vesalius released his own seven-volume book based only on human dissections, and was from then on known as the founder of modern anatomy.

Leonardo da Vinci also contributed to the discipline of anatomy with his drawings and annotations of the human body; however, much of da Vinci's work went unpublished for years. Thanks to his interdisciplinary talents as an artist, sculptor and engineer, his drawings and dissections were performed extremely skilfully, and he described the coronary sinuses almost 200 years before they were officially named.

Since that time, scientists have built on Andreas Vesalius' knowledge and discovered more about the human body. As surgery and anatomy moved into academia during the late 17th and 18th centuries, anatomy became more systematically studied, driven by a demand for more effective medicine and the legitimisation of surgery as a medical practice. Moreover, the teaching of anatomy has changed from centuries ago: in some schools dissection is seen as outdated, while in others it is still practised.

Colonial beginnings
Social anthropology as a discipline began in the early 20th century and gained momentum in the 1930s with an increase in financial backing. Social anthropology has its roots in colonialism, as funding for research in the UK came from the British Colonial Office. The relationship between anthropologists and colonial authorities has been described as "often uneasy, even occasionally combative", yet the colonial context dictated the research performed from the 1920s through to the 1970s. In this context, research was often conducted in "exotic" (in the eyes of European researchers) communities far from the researcher's country of origin. The power dynamics in these situations were skewed, and participants in the research were exoticised and commonly referred to as primitive, native and savage.

Key theoretical approaches
Inspired by the school of Durkheim and founded by anthropologists such as Malinowski and Radcliffe-Brown, structural functionalism was the social anthropological approach that dominated British research within the field in the middle of the 20th century. With a focus on empirical data, structural functionalism sought to understand the communities studied through a scientific approach.

The American school of social anthropology took a more interpretive approach. With the release of Clifford Geertz's book The Interpretation of Cultures (1973), social anthropologists welcomed a more flexible approach to research, now known as the postmodernist era of social anthropology. Relevant issues in the discipline today include migration, gender and ethnicity. Postmodernism in social anthropology also brought to light more critical approaches to ethnographic research, such as postcolonialism, critical race theory, feminism and Marxism.

The reflexive turn
For a long time, scholars and researchers within social anthropology were a homogenous group consisting primarily of white, European males. With the empirical approach of structural functionalism, these researchers aspired to be as objective as possible in their studies and were unreflexive in their research. This proved problematic for the production of knowledge as it excluded a crucial aspect of ethnographic research: the impact of the researchers' positionality on their study. The reflexive turn refers to a change in approach where ethnographers started recognising aspects of their intersectional identities that limit, challenge, assist or otherwise impact the methods of their research. This has resulted in a production of knowledge that can be described as “openly subjective”. This approach aims to reduce the unequal power dynamic as well as reflect on the anthropologist’s own cultural biases in relation to their research.

Summary - contemporary social anthropology
As a result of its history, here briefly explained, social anthropology as a contemporary discipline has moved away from its colonial roots, where the focus lay on understanding "primitive" and "exotic" societies from a place of superiority. Today the discipline focuses more on understanding cultures and communities: seeing research participants as collaborators in the production of knowledge rather than subjects of study, using critical theories to understand the colonial history of anthropology, and reflecting on the researcher's intersectional identity and positionality and the ways in which this affects the outcome of their ethnography.

Historical Prevalence
Environmental conservation aims to preserve and maintain various natural areas and resources, though there is debate over whether conservation should focus on maintaining environments for their own sake or in conjunction with humanity's use of them. Various forms of conservation have been employed throughout human history, such as crop rotations used to maintain the arability of land. Though some records date back further, Leviticus 25:4, written around the 6th century BC, references letting land lie fallow every seven years, a preliminary form of modern-day strategies. Before it became a discipline, environmental conservation was a response to population growth and therefore a necessary means of survival.

Evolution to a Discipline
With the rise of European colonialism and an increased need for building materials, deforestation became a concern for academics of the time. John Evelyn, an English diarist and tree specialist, published Sylva, or A Discourse of Forest-Trees and the Propagation of Timber in His Majesty's Dominions in 1664 to call for reforestation after observing the excessive timber harvesting carried out to keep up with military needs. It had support from the prestigious Royal Society. This was an influential work in forestry academia, as conservation shifted from a survival technique to a problem of potential scholarly merit. A century later, Reverend Jared Eliot published Essays Upon Field-Husbandry in New England, with conclusions, drawn from several years of experimentation and study, pertaining to adequate soil maintenance, the use of fertilizers, and the management of irrigation and water runoff. Ultimately, he correctly predicted problems that would occur as the population grew and land became less plentiful. As the issue worsened significantly with the Industrial Revolution and degradation began to show widespread effects, conservation's promotion to a discipline led to the creation of higher educational degrees in Environmental Studies and related fields. Syracuse University was one of the first to offer them in the early 1950s, followed by many other institutions, and such degrees are today a common staple at most multi-disciplinary universities.

Emergence Into Mainstream Concern
Environmental debate revived in the 1960s, as Rachel Carson's Silent Spring laid out the world's bleak future given its reliance on excessive production and chemical usage. Around the same time, social ecology evolved the discipline further, courtesy of the political philosopher Murray Bookchin, among others. The study of cultural and economic factors, among others, has become entrenched within the discipline as a valuable source of insight into cause and effect within climate change. Later, in the 1980s, after concerns about carbon emissions grew too large to ignore, the oil companies Shell and Exxon also carried out environmental research, which suggested a 2 degree Celsius temperature increase within the next 50-80 years. The decades-long burial of these figures, in the interests of corporate success, highlights economic underpinnings that hinder effective environmental research, further strengthening the case for social ecology. Beyond those corporate barriers, progress within the discipline has acquired another level of difficulty as governments across the world, such as those of the US and Brazil, also refuse to take science-backed action. Accounting for the enormous network of factors, setbacks, and misinformation occurring in the present day, environmental conservation has grown into a hugely complex field that, despite pushback, shows no sign of declining, given the current precipitous position of humanity in relation to the planet.

Psychology as a scientific discipline
Wilhelm Wundt is commonly referred to as the father of the modern discipline of psychology, having opened the Institute for Experimental Psychology at the University of Leipzig in 1879. His lab was the first psychology lab, and its opening is often considered to be the start of psychology as a separate scientific discipline rather than a vaguer philosophical one. Wundt's school of thought is known as "structuralism", to which Edward Titchener, who had studied under Wundt, also contributed.

Psychology as a philosophical discipline
It can also be argued that psychology as a holistic, more philosophical discipline began centuries ago in the Hellenic Age, with writers and philosophers such as Homer, who discussed the human condition within the Iliad and the Odyssey. However, at that time it was not regarded as a scientific study, so it is difficult to determine whether it could be classified as the discipline of psychology.

Other schools of thought in psychology
A famous school of thought in psychology, "psychoanalysis", was developed by Freud in the 1890s as an alternative to the experimental psychology pursued in Wundt's laboratory at the time. Freud then, along with four other men, founded the "Psychological Wednesday Society" in 1902. Another dominant school of psychology is "behaviourism", which became popular in the early 20th century. John B. Watson was one of the first psychologists to discuss the concept of behaviourism, partly as a rejection of the older forms of psychology that had more philosophical roots.

Origins of the discipline
Ecology, despite being a relatively young discipline, has roots as early as the 4th century BC, when Aristotle and his student Theophrastus both devoted a substantial part of their works to the nature and behaviour of living organisms. Even though the latter attempted to describe the dependencies between animals and their environment, which may appear to be a clearly ecological work, it cannot be classified as such in the modern sense. The reason is that, even while the importance of the surroundings was recognised, animals and other living organisms were thought about in terms of essence, which led to a relatively static conception of nature.

Early modern approach
The next major contributions to the subject of ecology were made in the 18th century, mostly due to the emergence of colonial superpowers. As they ran multiple expeditions to discover and conquer new lands, they were accompanied by many scientists whose job was to describe and classify new natural resources. The German naturalist Alexander von Humboldt was one of them. His work was of huge importance to the emergence of the discipline today known as ecology, as well as to closely related ones such as botany, biogeography and meteorology. Like his ancient predecessors, he focused on the relationship between organisms and their surroundings; however, his view was much broader and his methodologies more specific, as he benefited from all the previous discoveries and from entirely new lands. Humboldt was also one of the first to take an interest in the notion of climate and its changes.

Arguably the biggest breakthrough in the field came with the publication of Charles Darwin's On the Origin of Species in 1859. Although it did not result in immediate change within ecology, it made it possible to think about nature in terms of natural selection and about ecosystems as dynamic systems actively reacting to environmental change, which effectively overthrew the ideas of essentialism and idealism.

The term "ecology" was first used by Ernst Haeckel in 1866; he himself was a great advocate of acknowledging the links between the theory of evolution and the ecological sciences.

During that time or shortly after came several other important contributions, such as the discovery of photosynthesis and the description of symbiotic relationships and food chains, which led to rapid development in this freshly established field of science.

Contemporary view on Ecology
With the increasing level of knowledge came an increased level of generalisation, as (thanks to previous smaller-scale research) scientists were able to describe larger and larger systems. Soon the concern of ecology became whole ecosystems and relationships often extending beyond a single continent. In the 21st century, ecological attention focuses mostly on anthropogenic changes affecting the whole planet, such as global warming and the depletion of natural resources.

Historical ecology

Historical ecology is a sub-discipline of ecology that studies the past and future of the relationship between humans and the environment. It draws on other disciplines such as geography, politics and even sociology to understand how this relationship is shaping the world we live in. It is a pluridisciplinary research project that aims to gather decisions and outcomes from the past and put them to use for the future.

This discipline first emerged after World War II, as scholars became increasingly interested in grasping concepts from different points of view, which meant bringing different disciplines together around a subject.