User:GCooper316/sandbox/Approaches to Knowledge/2020-21/Seminar group 10/History

What is managerial economics?
Managerial economics, also known as Business Economics, refers to the application of economic theories, concepts and tools in the context of a business. It sits at the intersection of the disciplines of economics and management. The topic draws heavily on microeconomic theory, which deals with the behaviour of individual firms and consumers, as well as on quantitative tools such as statistical methods and calculus. It is often seen as the bridge between economics in theory and economics in practice within a business setting, with the goal of optimising a firm’s objectives under the economic condition of scarcity. For example, game theory addresses the uncertainty and lack of knowledge faced by businesses, and operations research provides scientific criteria to minimise cost and optimise profit.

Establishing managerial economics as a discipline
The establishment of a discipline can be defined as its formalisation in society, such as being taught at university level or officially researched by a group. Applying this to managerial economics, one can clearly see it is a discipline. Not only is it taught at most universities, but many universities also allow students to graduate with it as a major, UCLA being one example. In addition, the topic is commonly researched and published on, owing to the wide recognition of the uses of economic theories in business settings. A notable publication on this topic is the best-selling “Managerial Economics and Business Strategy” by Michael Baye. Overall, it is clear that managerial economics is a discipline.

The history of management and economics as separate disciplines
Economics refers to the study of the ways in which different parties such as firms, consumers, and governments allocate scarce resources and make choices. As a topic, it was fairly obscure prior to the 1870s, even with economic concepts such as “the invisible hand” having been introduced by Adam Smith in the 1750s. It was only from the 1870s to the 1900s that it received recognition as an independent discipline, gaining popularity amongst students. The subject also evolved in professional training, with dedicated degrees being established at universities, followed by a professional doctorate.

Meanwhile, management refers to the study of principles and practices of administration within a business organisation. Management as a discipline emerged when the Industrial Revolution gave organisations the need to develop detailed strategies for managing mass production, profit optimisation, and more. While business classes were occasionally taught during the 1800s, the first institute of higher education on the topic arrived in 1881, when the Wharton Business School was founded. Since then, the subject has undergone significant development in concepts and theories and is one of the most popular majors at universities in the modern age.

For much of business history, business practitioners resisted integrating economic concepts into their decisions, citing the restrictive assumptions that economic theories are often based upon. For example, one such assumption is that all parties are rational decision-makers who consider options from a purely logical perspective rather than being influenced by emotion or moral considerations.

The emergence of managerial economics from business management and economics
Relative to economics and management, managerial economics as a discipline is relatively new, due to the aforementioned avoidance of economics by business practitioners. This began to change in the mid-1950s, when economists started developing more realistic ways of applying economic theory to business situations. A key moment was economist Joel Dean’s publication of “Managerial Economics” in 1951, after which the subject began to gain popularity, with Dean later termed “the founder of business economics”. Dean’s focus on making economics more practical led economists in the 1970s and 1980s to develop new theories incorporating more realistic ideas of human and organisational behaviour. Since then, managerial economics has gained popularity amongst academics and is even taught as an essential part of many economics programmes, on top of being a separate major.

An Introduction to Applied Art as a discipline
Applied art, as the name suggests, is the field of study concerned with artistically producing functional everyday objects. It is its own distinct area of study, not to be confused with fine art, which is the practice of creating art purely for aesthetic value, with no emphasis placed on functionality. Some (sub)disciplines that fall under applied arts are architecture, industrial design, decorative art, and graphic design.

Though it is widely considered a category of disciplines, applied art can also be studied as a standalone course, with Bachelor of Applied Arts programmes offered at universities such as Central Michigan University and the University of New Brunswick. However, it is more common for practitioners to specialise in one discipline under applied arts.

History of Applied Art and the Industrial Revolution’s Impact
Various forms of applied art can be traced back to ancient times, long before applied art was established as a genre. One of the earliest examples is Chinese pottery, the oldest piece of which is considered to be 20,000 years old. Furthermore, some disciplines considered an applied art today gained recognition as individual professions before applied art itself. Architecture, for example: governing bodies for architects existed years before applied art was a commonly known term. Yet the recognition of applied art as an independent subject area is important for the disciplines that operate under it, as it provides each discipline with a sense of identity and clarity. This recognition came in the late 19th century, thanks to the Arts & Crafts movement.

The Arts & Crafts Movement (1860-1920) originated from a collective distaste for the poor craftsmanship and production circumstances to which decorative art was subjected, widely believed to be a consequence of the Industrial Revolution, especially in Britain. The technological feats of the Industrial Revolution created unrealistic demands for quantity, forcing large numbers of workers into terrible working conditions for which they were often unprepared. Criticism of this phenomenon was a significant part of the movement and led to the reformation of social and design principles. Rather than relying heavily on machines and exploitative commercial processes, people were encouraged to engage in production that created useful products with aesthetic intention. In other words, people were introduced to, and encouraged to participate in, the applied arts during this time.

Amidst the movement, the Art Workers’ Guild was formed in 1884. The Guild’s goal was “to create a meeting place for the fine arts and the applied arts on an equal footing”. It was one of the first organisations in which those involved in the applied arts could participate, and its formation likely contributed to the recognition and establishment of applied arts as a distinct discipline and subject area.

Technology’s Impact on the Applied Arts
Even though the history of applied art tells a tale of technology’s negative effects, it is worth mentioning that the disciplines under applied art have also benefited from technology.

Much of the development in the field of architecture is owed to technological advancement. In the late 18th century, the creation of various new construction materials was made possible by the technology of the Industrial Revolution, expanding the possibilities of the profession. Unsurprisingly, new machinery and methods of manufacture also made architectural design easier, and this was not confined to the Industrial Revolution: as technology keeps improving, design processes become more efficient and the capacity of the discipline grows.

Technology is also a necessity for some disciplines: without inventions like the computer or factory machinery, disciplines such as graphic design or industrial design would be highly inconvenient to pursue, or might cease to exist altogether.

What is Design Thinking?
Design thinking is an emerging approach to problem-solving that uses scientific tools as well as knowledge of human behaviour to address a variety of complex challenges. As defined by Tim Brown, the executive chair of the design consultancy IDEO, which popularised the term, design thinking is "a human-centred approach to innovation that draws from the designer’s toolkit to integrate the needs of people, the possibilities of technology, and the requirements for business success." Design thinking has been practised for centuries; as a methodology and a discipline, however, it is still in its infancy.

The History and Emergence of Design Thinking
Isambard Kingdom Brunel was perhaps the first unintentional practitioner of design thinking: when designing the Great Western Railway in 1833, he talked about wanting to create an *experience* for the passenger, rather than simply designing tunnels and rails across valleys. Although we might now call this the first work in the discipline, it does not mark the beginning of the history of design thinking as a discipline.

John E. Arnold, an American mechanical engineer, was the first to use the term 'design thinking', in his 1959 publication "Creative Engineering: Promoting Innovation by Thinking Differently". He outlined four main principles of design thinking: 1) novel functionality, 2) higher performance levels of a solution, 3) lower production costs, and 4) increased salability. Although innovation remains an important aspect of design thinking, Arnold did not mention another fundamental aim: bringing about social good and keeping design human-centred. This goal was outlined in another important publication in the history of design thinking: Ken Garland's First Things First manifesto (1964).

The Establishment of Design Thinking as a Discipline
The real turning point in establishing design thinking as a product of society — the collective effect of market needs, technology, and collaborative activity — and therefore a discipline came in 1978, when the design consultancy IDEO adopted design thinking as its learnable and teachable mindset. It teaches the methodologies of this new discipline by bringing together professionals from the humanities (such as anthropologists, psychologists and lawyers) with designers, engineers and architects to solve problems in human-centred ways. This teaching has introduced various industries to the practice of design thinking. The first academic symposium on Research in Design Thinking was held in 1991 at Delft University of Technology. Since then, design thinking has been incorporated into the International Baccalaureate Diploma Programme and is taught at Stanford University's design school and MIT D-Lab, among other notable institutions.

What is Textile Engineering?
Textile engineering can be defined as the combination of textile manufacturing and engineering, in which principles of science and technology are applied to textiles in order to create new fibres, products and apparel processes. Textile engineers work on the creation and development of new fibres and textiles, improve existing ones, and find new uses for yarns, machinery, fabrics, textiles and finishes.

Textile engineering as a discipline
Though the use of the term “textile engineering” can be traced back to the 20th century, the field is still underappreciated and not well recognised, as its principles coincide with those of many other engineering degrees. Textile engineering remains far more limited within academic and research environments than mechanical, civil, or electrical engineering; its programme of education is still unclear, which is why it is not taught at the majority of universities.

Yet the discipline is already taught at universities such as NC State University, the Technical University of Liberec and the RWTH International Academy. In most cases, the curriculum combines interdisciplinary content with engineering principles drawn from other engineering degrees and specialisations, since there is still not enough academic material to support a standalone programme. Nevertheless, the degree touches upon all aspects of creating fibres, textiles, and other materials needed for production. It teaches students not only how to produce existing materials but also gives them the base of knowledge to create new, not yet discovered textiles or natural-based fibres, which can later be applied in mechanics, biochemistry and bioengineering.

The need for Textile Engineering
The need for the discipline arose back in the 18th century, when technological progress and the Industrial Revolution created a demand for new materials and for techniques to produce them. This was the era in which cotton production was mechanised, later followed by the first man-made fibre, rayon. The industry was only just developing, discoveries of both natural and man-made fibres and textiles were taking place, and only a minority of people were educated well enough to participate in the creation of new materials. Nowadays, as textile engineering has developed into a major labour-based industry serving billions of people and generating trillions of dollars, the number of well-trained professionals is still insufficient for a sustainable and ever-changing sphere. That is why new specialists, and a high degree of specialisation within the profession, are necessary.

The emergence of History as a discipline
Multiple debates surround the emergence of the discipline of History. One could argue that the discipline emerged from the process by which Herodotus authored his work, The Histories. While there is evidence of historical records predating this period, the methods used by the historian, such as "distinguishing between more or less reliable accounts", constitute the first recorded use of analysis and evaluation, skills integral to the methodology of contemporary historical thought. This claim is further substantiated by Munslow's definition in his book What History Is: essentially, a discipline that utilizes a narrative form of writing to challenge, analyze and recount a series of past events.

On the other hand, some would argue that at this point history had not yet become an actual discipline and was instead just a form of narrative literature. On this view, contemporary historical thought only emerged with the work of Leopold von Ranke, whose 1824 book Geschichten der Romanischen und Germanischen Völker von 1494 bis 1514 (Histories of the Latin and Teutonic Peoples from 1494 to 1514) argued that history should be “collected carefully, examined objectively and put together with critical rigour”, essentially pioneering the methodology of critical historiography. Historians such as Fritz Stern argue that the creation of the discipline occurred in the "mid-nineteenth century", when "history had ceased to be a branch of literature and had become an academic discipline" through the creation of academic journals such as the Historische Zeitschrift in 1859 and the Revue Historique in 1876, which advocated approaching history objectively, as Ranke did. The Historische Zeitschrift in particular emphasized the significance of objectivity in the study of the discipline, its preface detailing that the “periodical should, above all, be a scientific one”.

Rather than arguing over which event better marks the emergence of the discipline, I suggest that these two events serve as the introduction of two different concepts within historiography: the emergence of the historical method and the creation of modern historical thought.

The Paradigm Shift in Historical Approaches in the 20th Century
In 1961, E.H. Carr released What is History?, whose arguments were antithetical to the mainstream empiricist historiography that had dominated the 19th and early 20th centuries. As a result, the theory of historiography, in Britain in particular, shifted towards a new equilibrium, pivoting away from the preexisting epistemological certainty. Keith Windschuttle, though a harsh critic of Carr's work, argues that What is History? was "one of the most influential books written about historiography, and that very few historians working in the English language since the 1960s had not read it".

In response to Carr’s attack on objectivist historical thought, G.R. Elton released The Practice of History in 1967 in defence of the Victorian style of writing, scathingly criticizing how Carr's work embodied "an extraordinarily arrogant attitude both to the past and to the place of the historian studying it”. While some may argue that this debate serves to demonstrate the prevailing tensions between the new and radical approach to historiography and the existing pillar of empiricism, it was one of many examples that delineated a paradigm shift in the approaches of historical thought. R.G. Collingwood, whose works heavily influenced Carr’s book, was one of the many historians who spearheaded the emergence of relativist ideas into conscious historical thought in the early 20th century. Carr's radical work served as just one example of how the predominant influence of empirical thought in history had already eroded.

Today, historians have opted for a middle-ground approach that maintains the skeletal form of appropriate analysis and evaluation but stops short of extreme empiricism. This development, spanning a whole century, suggests that, unlike in the more quantitative disciplines, there is no specific event in history that can be pinpointed as a "turning point" that instantly shifted the approaches and methodology applied in history; rather, a gradual development caused a shift in the then-existing hegemony of logical positivism.

What is sociology?
Sociology is a discipline within the social sciences that studies social behaviour, interaction and culture in order to understand how they work and to find social explanations for certain phenomena. Sociologists believe that our behaviour is forged by our social interactions and thus depends on the social groups we interact with.

The emergence of sociology as a discipline
The concern with understanding society was present long before the term sociology was invented, catching the interest of famous philosophers such as Aristotle and Plato. Among those in history who studied the interactions between individuals and society, we can identify the one considered the first sociologist: Ibn Khaldun (1332–1406). He researched many topics that are still significant in the present day, such as theories of social conflict, social cohesion, economy and power.

The nineteenth century was a real turning point for the emergence of sociology as a distinct discipline, principally due to the Industrial Revolution. This period of transition to modernity had a huge impact on many aspects of social life, bringing an upheaval in social classes and the rise of many issues (poverty, misery, disorder, disease, unemployment). Faced with this, theorists found it important to understand social mechanisms in order to help cure the social crises that were disrupting Europe. Some of the principal actors, mostly philosophers, who participated in the elaboration of sociological theories were Auguste Comte, Herbert Spencer, Emile Durkheim and Karl Marx. Auguste Comte popularized the word sociology and helped establish it as a discipline distinct from simple philosophical interest. However, the subject only began to be taught at universities in the 1890s, in the United States.

Since the 1980s, sociologists have been interested in how new paradigms, notably globalization and modernity, are changing the nature of our society, as explained by Ulrich Beck in his book “Risk Society”.

The emergence of Digital Sociology
Since their creation, digital technologies have changed our world. They gradually became integrated into our daily life until they were an inseparable part of it, and in doing so they transformed our way of communicating with others. Sociologists became interested in the influence of these increasingly important technological objects on our social life from the 1980s, when computers began to be widely adopted. It was the beginning of a new age, the digital age. Over time, digital technologies gained in importance, this subject of investigation became prominent in the news, and the work of sociologists multiplied accordingly. Even though none of them considered themselves digital sociologists, mostly because the term was not used at the time, their involvement in research allowed the emergence of this subdiscipline of sociology. In 2009, the term “digital sociology” appeared for the first time in an American journal, and in 2012 the British Sociological Association recognized the new discipline, which led to the creation of the first digital sociology master’s degree, at Goldsmiths, University of London. Since then, the discipline has been recognized worldwide.

Interdisciplinary work
According to the sociologist Deborah Lupton, “Sociology itself, like any other discipline, is a permeable and dynamic entity.” Indeed, the discipline has adapted to the demands of its time, and the emergence of the very recent sub-discipline of digital sociology is a good example. But sociology is not the only discipline to have undergone changes during this period: other sub-disciplines interested in different aspects of the digital revolution, such as digital anthropology and media studies, also emerged. These disciplines came together to analyze the impact of digitalization on society from different approaches (economic, social, cultural, and so on). Thanks to this collaboration, they were more effective in understanding the global issue and, individually, were able to benefit from the knowledge of other disciplines for their own research.

Definition of Neuropsychology
Neuropsychology is the study of the relationship between human psychological behaviour and the brain through neurological observations. It is a branch of psychology that looks at the correlation between the nervous system and cognitive functions including language, memory, and emotion.

Early Beginnings
For ages, the brain’s functions and their effects on human behaviour were ignored, until Hippocrates proposed that the brain, rather than the heart, is the centre of human behaviour. The philosopher René Descartes expanded on Hippocrates’ idea, but still lacked scientific evidence. It was Thomas Willis who first applied a scientific and physiological approach to the brain, by analyzing its specialized structures. Later, the neuroanatomist Franz Joseph Gall went a step further, theorizing about the relationship between personality and the structures of the brain.

The emergence of Neuropsychology as a discipline
Inspired by Jean-Baptiste Bouillaud’s study of the impact of the brain’s anterior region on speech, Paul Broca expanded this idea by studying the ways in which speech is comprehended and produced. This deepened the understanding of the left hemisphere of the brain and eventually helped neuropsychology become respected as a discipline. Later, Sigmund Freud emphasized the overall structure of the brain, rather than localizing deficits to specific areas, when studying impairments of the brain; his arguments thus contributed to the further development of “neuropsychology”. In 1944, Ralph Reitan laid down the groundwork of the field, including pathognomonic signs, right-left comparisons, and patterns of test results, which made him known as the father of neuropsychology. In the 1960s, neuropsychology was finally and formally accepted as an official field of scientific inquiry into human behaviour and brain function.