“I tell you: one must still have chaos within oneself, to give birth to a dancing star” (Nietzsche in Thus Spoke Zarathustra).
By conservative estimates, the European Renaissance spans the 15th and 16th centuries, an era when artists produced many of the greatest masterworks ever created. During this time Columbus landed in the New World, the Protestant Reformation was launched, and the Scientific Revolution was initiated. A bridge between the Middle Ages and the Modern World, the Renaissance was the result of a completely new perspective, indeed a completely new paradigm, of what it meant to be human. As elaborated by the modern philosopher Richard Tarnas:
“Man was now capable of penetrating and reflecting nature’s secrets, in art as well as science, with unparalleled mathematical sophistication, empirical precision, and numinous aesthetic power. He had immensely expanded the known world, discovered new continents, and rounded the globe. He could defy traditional authorities and assert a truth based on his own judgment. He could appreciate the riches of classical culture and yet also feel himself breaking beyond the ancient boundaries to reveal entirely new realms… Individual genius and independence were widely in evidence. No domain of knowledge, creativity, or exploration seemed beyond man’s reach.”
During the Middle Ages, the individual was considered virtually inconsequential, a mere shadow at the feet of political and religious institutions. But for the new man of the Renaissance, “human life in this world seemed to hold an immediate inherent value, an excitement and existential significance” (Tarnas 2010). Every society has as its foundation a particular worldview, a collection of beliefs and ideas that determine how groups of human beings perceive and experience all things. Professor Keiron Le Grice (2011) explains that “with regard to entire civilizations, a collective world view, at its deepest level, determines the prevailing understanding of the nature of reality itself.”
What then was the genesis of the worldview of the Renaissance? How did the conception of a human existence with meaningful potential and the idea that the secrets of nature were worth exploring penetrate the collective consciousness after the long slumber of the Dark Ages? In this article we will explore the theory that historical ages like the Renaissance represent a type of emergent phenomenon resulting from a confluence of temporal world events and simultaneous developments deep within the collective psyche of Western man.
Moynet lithograph of a cart loaded with plague victims. (Wellcome Images / CC BY 4.0)
The Apocalypse of the Fourteenth Century
Over the course of the fourteenth century, a series of disasters struck Europe that shattered the world of the Middle Ages. During the Great Famine of 1315-1322, crop failures and the mass death of cattle and sheep propelled society into a barbaric era of starvation, disease, cannibalism, and infanticide. Crime, especially rape and murder, ran rampant. The crises were compounded in October of 1347, when a group of ships arrived in the harbor of Messina in Sicily, bringing with them the scourge of the Black Plague.
The Italian poet Giovanni Boccaccio wrote in The Decameron that while some people “formed themselves into groups and lived in isolation from everyone else” to avoid the plague, there were also those who “maintained that an infallible way of warding off this appalling evil was to drink heavily, enjoy life to the full, go round singing and merry-making, gratify all of one’s cravings and shrug the whole thing off as one enormous joke.” The plague peaked in Europe by 1351, having killed, by some estimates, well over half the regional population. Among the English royal family, the average life expectancy dropped to the age of 29 during the Famine, and to the age of 17 with the arrival of the Plague. In his Cronaca Senese (1348), the Italian chronicler Agnolo di Tura recorded the terrifying reality of the Black Plague:
“Father abandoned child, wife husband, one brother another; for this illness seemed to strike through breath and sight. And so they died. None could be found to bury the dead for money or friendship. Members of a household brought their dead to a ditch as best they could, without priest, without divine offices. In many places in Siena great pits were dug and piled deep with the multitude of dead. And they died by the hundreds, both day and night, and all were thrown in those ditches and covered with earth. And as soon as those ditches were filled, more were dug. And I, Agnolo di Tura called the Fat, buried my five children with my own hands… there was no one who wept for any death, for all awaited death. And so many died that all believed it was the end of the world.”
The Triumph of Death by Pieter Brueghel the Elder.
During this chaotic period, many believed that the unstoppable scourge was punishment from God or even the end of the world, an apocalyptic view that spread quickly and inspired all manner of fanaticism. But there was also a growing sentiment that the Plague seriously undermined the legitimacy of the authority assumed by the institutionalized Church, as moral corruption within its own ranks also grew increasingly apparent. Out of this climate there arose movements that questioned the soundness of Catholic dogmas, hierarchy, and the Papacy itself. The Plague also unfolded against the backdrop of the Hundred Years War (1337-1453), as the long-standing tensions between the English and French crowns erupted into the longest armed conflict in European history. The war further contributed to the devastation, taking with it an estimated 2.3 to 3.3 million human lives.
The crises of the late Middle Ages also set many transformations in motion. Land and food costs plummeted, leading to the eventual destabilization of feudalism. There was also a new focus on the physical life of man and medical research, as well as a new demand for religious art and iconography. By the mid-fifteenth century, Europe had the first operational printing presses, which Tarnas explains enabled “rapid dissemination of new and often revolutionary ideas throughout Europe.” This advance “helped free the individual from traditional ways of thinking, and from collective control of thinking.” Complementing this intellectual advance was the new availability of gunpowder, which served to further erode the absolute power of the old feudal system and the Catholic Church.
In the midst of this incredible moment of transition, the small independent city-states of Italy became the center of coalescence for the forces that gave birth to the Renaissance. Here a culture of scholarship, artistic endeavor, loyalty to family, commercial activity, and the contemplation of eternal truths emerged in the wake of the tempests of the fourteenth century. The world of the Middle Ages was well and truly dead.
Several scholars have argued that the Renaissance began in Florence due to the role of wealthy patrons in stimulating the arts. Lorenzo de’ Medici, seen here in a painting by Giorgio Vasari, encouraged arts patronage as ruler of Florence.
The Seeds of Intellectual Rebirth
Thus far, we have only considered the worldly events that preceded the Renaissance in time and therefore conditioned Europe for its reception. This narrative is only half of the equation, however, for the great works of art, uncompromising individualism, scholastic and scientific genius, and even great commercial endeavors were manifestations of a new worldview, which championed individual potential, diverse interests, creativity, and progress. The seed of this worldview was the re-introduction of ancient Greek philosophy into Western consciousness.
“Implicit in all these activities was the half-inarticulate notion of a distant mythical golden age when all things had been known - the Garden of Eden, ancient classical times, a past era of great sages… just as in classical Athens the religion, art, and myth of the ancient Greeks met and interacted with the new and equally Greek spirit of rationalism and science” (Tarnas 2010).
The seeds of the restoration of ancient wisdom were actually planted in the fourteenth century by Francesco Petrarca (1304-1374). Better known in English as Petrarch, he recovered the letters of Cicero and invigorated a great movement to translate the philosophical texts of antiquity, which was enhanced by an influx of scholars and manuscripts from the collapsing Byzantine Empire in the East. Eventually, major philosophical works, including those of Plato and Plotinus, were in circulation among intellectual circles in Italy. During the fifteenth century, the wisdom of the old world would be synthesized with Western thought and religion by a philosopher whose work could be said to embody the very essence of the Renaissance: Marsilio Ficino (1433-1499).
Marsilio Ficino was an influential humanist philosopher of the early Italian Renaissance. He revived Neoplatonism and made several vital contributions to the history of Western thought. He can be seen here (on the left) in a fresco entitled Zachariah in the Temple by Domenico Ghirlandaio.
Although Ficino became a Catholic priest in 1473, his incredible range of interests included medicine, Platonic and Hermetic philosophy, and astrology. Ficino was adopted into the household of Cosimo de' Medici as a youth, and it was partly due to Cosimo’s patronage that he was able to make several vital contributions to the history of Western thought, including a Latin translation of the dialogues of Plato from Greek manuscripts published in 1484.
Cosimo himself was also immersed in philosophy, and the idealism of the age prompted him to found the Neoplatonic Florentine Academy, which was led by Ficino and included a range of Renaissance poets, philosophers, and scholars, such as Cristoforo Landino, Gentile de' Becchi, and Pico della Mirandola. Besides his translations of Plato, Ficino produced his own body of influential philosophical works, including Theologia Platonica (Platonic Theology) and De vita libri tres (Three Books on Life). Angela Voss (2006) explains the appeal of Platonic philosophy for Renaissance thinkers like Ficino:
“Plato was revered because he upheld the divinity and immortality of the soul—a soul which was free-ranging and self-willed, able to traverse all dimensions of existence...the human soul could dwell with the beasts or with the angels; it could live a life limited by the senses, or, through the cultivation of philosophy, liberate itself through self-knowledge. It could penetrate deeply into the true nature of things, or remain bound to a short-sighted vision of human affairs.”
Plato had discussed altered states of consciousness in his writings as the divine manias or frenzies. For Ficino, such states represented “the phenomenon of internal experience or internal ‘consciousness’… a heightened state of mind, experienced independently of and even in opposition to all outward events” (Kristeller 1943). Ficino associated these states with awakening to greater realities, as poetically described in Book 14 of Platonic Theology: “usually those are less deceived who at some time, as happens occasionally during sleep, become suspicious and say to themselves: ‘Perhaps those things are not true which now appear to us; perhaps we are now dreaming.’” As explained by esoteric scholar Wouter Hanegraaff (2015), Ficino’s philosophy sought a “superior knowledge” which “required an unusual, ecstatic or trance-like state.”
By the time of the Neoplatonic philosophers, the cosmos was conceived as a layering of multiple realms descending from Above to Below. All things proceed from the One, the Pythagorean Monad, as the highest source of all existence. Next in the dimensional hierarchy comes the intelligible realm of the Platonic ideas or archetypes, and then the intermediary realm of the fixed stars and planets, which exert influence over the lower elemental realm and serve as symbols for the qualities of moments of time. The invisible energies that shape the world descend from the highest intelligible realm to the material realm of earth below, passing through the domain and influence of the celestials. This was a living cosmos, an ongoing process of creation intended by the Creator to operate in complete harmony. Central to this cosmic scheme was the idea that man is a microcosm, containing within himself an interior reality reflecting all of the components of the “outer” cosmos. Man could therefore “know” or experience creation by turning inward.
These conceptions of the cosmos and man greatly inspired Renaissance philosophy, and informed the emerging concepts of human dignity and potential. Thinkers like Marsilio Ficino sought to overcome the false binary choice between philosophy and religion, studying the ancient writers while also practicing the Christian faith in the belief that man could enhance his vision of reality by drinking from both wells.
History of accounting
The early development of accounting dates to ancient Mesopotamia and is closely related to developments in writing, counting, and money, and to the early auditing systems of the ancient Egyptians and Babylonians. By the time of the Roman Empire, the government had access to detailed financial information.
In India, Chanakya wrote a manuscript similar to a financial management book during the period of the Mauryan Empire. His book Arthashastra contains a few detailed aspects of maintaining books of accounts for a sovereign state.
The Italian Luca Pacioli, recognized as the father of accounting and bookkeeping, was the first person to publish a work on double-entry bookkeeping, and he introduced the field in Italy.
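The principle Pacioli codified can be sketched in a few lines of Python: every transaction posts a matched debit and credit, so the ledger always balances. This is an illustrative sketch only, not a reconstruction of Pacioli's ledger format; the account names and amounts are hypothetical.

```python
from collections import defaultdict

# account name -> net balance (debits positive, credits negative)
ledger = defaultdict(int)

def post(debit_account, credit_account, amount):
    """Record one transaction as a matched debit/credit pair."""
    ledger[debit_account] += amount
    ledger[credit_account] -= amount

# A merchant buys goods for cash, then sells them on credit.
post("inventory", "cash", 100)
post("accounts_receivable", "sales", 150)

# The defining invariant of double-entry bookkeeping:
# total debits equal total credits, so all balances sum to zero.
assert sum(ledger.values()) == 0
```

Because every posting is paired, an unbalanced ledger immediately signals a recording error, which is a large part of why the method spread among Renaissance merchants.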
The modern profession of the chartered accountant originated in Scotland in the nineteenth century. Accountants often belonged to the same associations as solicitors, who often offered accounting services to their clients. Early modern accounting had similarities to today's forensic accounting. Accounting began to transition into an organized profession in the nineteenth century,  with local professional bodies in England merging to form the Institute of Chartered Accountants in England and Wales in 1880. 
History of biology

The earliest humans must have had and passed on knowledge about plants and animals to increase their chances of survival. This may have included knowledge of human and animal anatomy and aspects of animal behavior (such as migration patterns). However, the first major turning point in biological knowledge came with the Neolithic Revolution about 10,000 years ago. Humans first domesticated plants for farming, then livestock animals to accompany the resulting sedentary societies.
In around 3000 to 1200 BCE, the Ancient Egyptians and Mesopotamians made contributions to astronomy, mathematics, and medicine, which later entered and shaped Greek natural philosophy of classical antiquity, a period that profoundly influenced the development of what came to be known as biology.
Ancient Egypt
Over a dozen medical papyri have been preserved, most notably the Edwin Smith Papyrus (the oldest extant surgical handbook) and the Ebers Papyrus (a handbook of preparing and using materia medica for various diseases), both from around 1600 BCE. 
Ancient Egypt is also known for developing embalming, which was used for mummification, in order to preserve human remains and forestall decomposition. 
Mesopotamia

The Mesopotamians seem to have had little interest in the natural world as such, preferring to study how the gods had ordered the universe. Animal physiology was studied for divination, including especially the anatomy of the liver, seen as an important organ in haruspicy. Animal behavior too was studied for divinatory purposes. Most information about the training and domestication of animals was probably transmitted orally, but one text dealing with the training of horses has survived.
The ancient Mesopotamians had no distinction between "rational science" and magic. When a person became ill, doctors prescribed both magical formulas to be recited and medicinal treatments. The earliest medical prescriptions appear in Sumerian during the Third Dynasty of Ur (c. 2112 – c. 2004 BCE). The most extensive Babylonian medical text, however, is the Diagnostic Handbook written by the ummânū, or chief scholar, Esagil-kin-apli of Borsippa, during the reign of the Babylonian king Adad-apla-iddina (1069 – 1046 BCE). In East Semitic cultures, the main medicinal authority was an exorcist-healer known as an āšipu. The profession was passed down from father to son and was held in high regard. Of less frequent recourse was the asu, a healer who treated physical symptoms using remedies composed of herbs, animal products, and minerals, as well as potions, enemas, and ointments or poultices. These physicians, who could be either male or female, also dressed wounds, set limbs, and performed simple surgeries. The ancient Mesopotamians also practiced prophylaxis and took measures to prevent the spread of disease.
Observations and theories regarding nature and human health, separate from Western traditions, had emerged independently in other civilizations such as those in China and the Indian subcontinent.  In ancient China, earlier conceptions can be found dispersed across several different disciplines, including the work of herbologists, physicians, alchemists, and philosophers. The Taoist tradition of Chinese alchemy, for example, emphasized health (with the ultimate goal being the elixir of life). The system of classical Chinese medicine usually revolved around the theory of yin and yang, and the five phases.  Taoist philosophers, such as Zhuangzi in the 4th century BCE, also expressed ideas related to evolution, such as denying the fixity of biological species and speculating that species had developed differing attributes in response to differing environments. 
One of the oldest organised systems of medicine is known from the Indian subcontinent in the form of Ayurveda, which originated around 1500 BCE from Atharvaveda (one of the four most ancient books of Indian knowledge, wisdom and culture).
The ancient Indian Ayurveda tradition independently developed the concept of three humours, resembling that of the four humours of ancient Greek medicine, though the Ayurvedic system included further complications, such as the body being composed of five elements and seven basic tissues. Ayurvedic writers also classified living things into four categories based on the method of birth (from the womb, eggs, heat & moisture, and seeds) and explained the conception of a fetus in detail. They also made considerable advances in the field of surgery, often without the use of human dissection or animal vivisection.  One of the earliest Ayurvedic treatises was the Sushruta Samhita, attributed to Sushruta in the 6th century BCE. It was also an early materia medica, describing 700 medicinal plants, 64 preparations from mineral sources, and 57 preparations based on animal sources. 
The pre-Socratic philosophers asked many questions about life but produced little systematic knowledge of specifically biological interest—though the attempts of the atomists to explain life in purely physical terms would recur periodically through the history of biology. However, the medical theories of Hippocrates and his followers, especially humorism, had a lasting impact. 
The philosopher Aristotle was the most influential scholar of the living world from classical antiquity. Though his early work in natural philosophy was speculative, Aristotle's later biological writings were more empirical, focusing on biological causation and the diversity of life. He made countless observations of nature, especially the habits and attributes of plants and animals in the world around him, which he devoted considerable attention to categorizing. In all, Aristotle classified 540 animal species, and dissected at least 50. He believed that intellectual purposes, formal causes, guided all natural processes. 
Aristotle, and nearly all Western scholars after him until the 18th century, believed that creatures were arranged in a graded scale of perfection rising from plants on up to humans: the scala naturae or Great Chain of Being.  Aristotle's successor at the Lyceum, Theophrastus, wrote a series of books on botany—the History of Plants—which survived as the most important contribution of antiquity to botany, even into the Middle Ages. Many of Theophrastus' names survive into modern times, such as carpos for fruit, and pericarpion for seed vessel. Dioscorides wrote a pioneering and encyclopaedic pharmacopoeia, De Materia Medica, incorporating descriptions of some 600 plants and their uses in medicine. Pliny the Elder, in his Natural History, assembled a similarly encyclopaedic account of things in nature, including accounts of many plants and animals. 
A few scholars in the Hellenistic period under the Ptolemies—particularly Herophilus of Chalcedon and Erasistratus of Chios—amended Aristotle's physiological work, even performing dissections and vivisections.  Claudius Galen became the most important authority on medicine and anatomy. Though a few ancient atomists such as Lucretius challenged the teleological Aristotelian viewpoint that all aspects of life are the result of design or purpose, teleology (and after the rise of Christianity, natural theology) would remain central to biological thought essentially until the 18th and 19th centuries. Ernst W. Mayr argued that "Nothing of any real consequence happened in biology after Lucretius and Galen until the Renaissance."  The ideas of the Greek traditions of natural history and medicine survived, but they were generally taken unquestioningly in medieval Europe. 
The decline of the Roman Empire led to the disappearance or destruction of much knowledge, though physicians still incorporated many aspects of the Greek tradition into training and practice. In Byzantium and the Islamic world, many of the Greek works were translated into Arabic and many of the works of Aristotle were preserved. 
During the High Middle Ages, a few European scholars such as Hildegard of Bingen, Albertus Magnus and Frederick II wrote on natural history. The rise of European universities, though important for the development of physics and philosophy, had little impact on biological scholarship. 
The European Renaissance brought expanded interest in both empirical natural history and physiology. In 1543, Andreas Vesalius inaugurated the modern era of Western medicine with his seminal human anatomy treatise De humani corporis fabrica, which was based on dissection of corpses. Vesalius was the first in a series of anatomists who gradually replaced scholasticism with empiricism in physiology and medicine, relying on first-hand experience rather than authority and abstract reasoning. Via herbalism, medicine was also indirectly the source of renewed empiricism in the study of plants. Otto Brunfels, Hieronymus Bock and Leonhart Fuchs wrote extensively on wild plants, the beginning of a nature-based approach to the full range of plant life.  Bestiaries—a genre that combines both the natural and figurative knowledge of animals—also became more sophisticated, especially with the work of William Turner, Pierre Belon, Guillaume Rondelet, Conrad Gessner, and Ulisse Aldrovandi. 
Artists such as Albrecht Dürer and Leonardo da Vinci, often working with naturalists, were also interested in the bodies of animals and humans, studying physiology in detail and contributing to the growth of anatomical knowledge.  The traditions of alchemy and natural magic, especially in the work of Paracelsus, also laid claim to knowledge of the living world. Alchemists subjected organic matter to chemical analysis and experimented liberally with both biological and mineral pharmacology.  This was part of a larger transition in world views (the rise of the mechanical philosophy) that continued into the 17th century, as the traditional metaphor of nature as organism was replaced by the nature as machine metaphor. 
Systematizing, naming and classifying dominated natural history throughout much of the 17th and 18th centuries. Carl Linnaeus published a basic taxonomy for the natural world in 1735 (variations of which have been in use ever since), and in the 1750s introduced scientific names for all his species. While Linnaeus conceived of species as unchanging parts of a designed hierarchy, the other great naturalist of the 18th century, Georges-Louis Leclerc, Comte de Buffon, treated species as artificial categories and living forms as malleable—even suggesting the possibility of common descent. Though he was opposed to evolution, Buffon is a key figure in the history of evolutionary thought; his work would influence the evolutionary theories of both Lamarck and Darwin.
The discovery and description of new species and the collection of specimens became a passion of scientific gentlemen and a lucrative enterprise for entrepreneurs; many naturalists traveled the globe in search of scientific knowledge and adventure.
Extending the work of Vesalius into experiments on still living bodies (of both humans and animals), William Harvey and other natural philosophers investigated the roles of blood, veins and arteries. Harvey's De motu cordis in 1628 was the beginning of the end for Galenic theory, and alongside Santorio Santorio's studies of metabolism, it served as an influential model of quantitative approaches to physiology. 
In the early 17th century, the micro-world of biology was just beginning to open up. A few lensmakers and natural philosophers had been creating crude microscopes since the late 16th century, and Robert Hooke published the seminal Micrographia based on observations with his own compound microscope in 1665. But it was not until Antonie van Leeuwenhoek's dramatic improvements in lensmaking beginning in the 1670s—ultimately producing up to 200-fold magnification with a single lens—that scholars discovered spermatozoa, bacteria, infusoria and the sheer strangeness and diversity of microscopic life. Similar investigations by Jan Swammerdam led to new interest in entomology and built the basic techniques of microscopic dissection and staining. 
As the microscopic world was expanding, the macroscopic world was shrinking. Botanists such as John Ray worked to incorporate the flood of newly discovered organisms shipped from across the globe into a coherent taxonomy, and a coherent theology (natural theology). Debate over another flood, the Noachian, catalyzed the development of paleontology; in 1669 Nicholas Steno published an essay on how the remains of living organisms could be trapped in layers of sediment and mineralized to produce fossils. Although Steno's ideas about fossilization were well known and much debated among natural philosophers, an organic origin for all fossils would not be accepted by all naturalists until the end of the 18th century due to philosophical and theological debate about issues such as the age of the earth and extinction.
Up through the 19th century, the scope of biology was largely divided between medicine, which investigated questions of form and function (i.e., physiology), and natural history, which was concerned with the diversity of life and interactions among different forms of life and between life and non-life. By 1900, many of these domains overlapped, while natural history (and its counterpart natural philosophy) had largely given way to more specialized scientific disciplines—cytology, bacteriology, morphology, embryology, geography, and geology.
Use of the term biology
The term biology in its modern sense appears to have been introduced independently by Thomas Beddoes (in 1799),  Karl Friedrich Burdach (in 1800), Gottfried Reinhold Treviranus (Biologie oder Philosophie der lebenden Natur, 1802) and Jean-Baptiste Lamarck (Hydrogéologie, 1802).   The word itself appears in the title of Volume 3 of Michael Christoph Hanow's Philosophiae naturalis sive physicae dogmaticae: Geologia, biologia, phytologia generalis et dendrologia, published in 1766.
Before biology, there were several terms used for the study of animals and plants. Natural history referred to the descriptive aspects of biology, though it also included mineralogy and other non-biological fields; from the Middle Ages through the Renaissance, the unifying framework of natural history was the scala naturae or Great Chain of Being. Natural philosophy and natural theology encompassed the conceptual and metaphysical basis of plant and animal life, dealing with problems of why organisms exist and behave the way they do, though these subjects also included what is now geology, physics, chemistry, and astronomy. Physiology and (botanical) pharmacology were the province of medicine. Botany, zoology, and (in the case of fossils) geology replaced natural history and natural philosophy in the 18th and 19th centuries before biology was widely adopted. To this day, "botany" and "zoology" are widely used, although they have been joined by other sub-disciplines of biology.
Natural history and natural philosophy
Widespread travel by naturalists in the early-to-mid-19th century resulted in a wealth of new information about the diversity and distribution of living organisms. Of particular importance was the work of Alexander von Humboldt, which analyzed the relationship between organisms and their environment (i.e., the domain of natural history) using the quantitative approaches of natural philosophy (i.e., physics and chemistry). Humboldt's work laid the foundations of biogeography and inspired several generations of scientists. 
Geology and paleontology
The emerging discipline of geology also brought natural history and natural philosophy closer together; the establishment of the stratigraphic column linked the spatial distribution of organisms to their temporal distribution, a key precursor to concepts of evolution. Georges Cuvier and others made great strides in comparative anatomy and paleontology in the late 1790s and early 19th century. In a series of lectures and papers that made detailed comparisons between living mammals and fossil remains, Cuvier was able to establish that the fossils were remains of species that had become extinct—rather than being remains of species still alive elsewhere in the world, as had been widely believed. Fossils discovered and described by Gideon Mantell, William Buckland, Mary Anning, and Richard Owen among others helped establish that there had been an 'age of reptiles' that had preceded even the prehistoric mammals. These discoveries captured the public imagination and focused attention on the history of life on earth. Most of these geologists held to catastrophism, but Charles Lyell's influential Principles of Geology (1830) popularised Hutton's uniformitarianism, a theory that explained the geological past and present on equal terms.
Evolution and biogeography
The most significant evolutionary theory before Darwin's was that of Jean-Baptiste Lamarck; based on the inheritance of acquired characteristics (an inheritance mechanism that was widely accepted until the 20th century), it described a chain of development stretching from the lowliest microbe to humans. The British naturalist Charles Darwin, combining the biogeographical approach of Humboldt, the uniformitarian geology of Lyell, Thomas Malthus's writings on population growth, and his own morphological expertise, created a more successful evolutionary theory based on natural selection; similar evidence led Alfred Russel Wallace to independently reach the same conclusions.
The 1859 publication of Darwin's theory in On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life is often considered the central event in the history of modern biology. Darwin's established credibility as a naturalist, the sober tone of the work, and most of all the sheer strength and volume of evidence presented, allowed Origin to succeed where previous evolutionary works such as the anonymous Vestiges of Creation had failed. Most scientists were convinced of evolution and common descent by the end of the 19th century. However, natural selection would not be accepted as the primary mechanism of evolution until well into the 20th century, as most contemporary theories of heredity seemed incompatible with the inheritance of random variation. 
Wallace, following on earlier work by de Candolle, Humboldt and Darwin, made major contributions to zoogeography. Because of his interest in the transmutation hypothesis, he paid particular attention to the geographical distribution of closely allied species during his field work first in South America and then in the Malay archipelago. While in the archipelago he identified the Wallace line, which runs through the Spice Islands dividing the fauna of the archipelago between an Asian zone and a New Guinea/Australian zone. His key question, as to why the fauna of islands with such similar climates should be so different, could only be answered by considering their origin. In 1876 he wrote The Geographical Distribution of Animals, which was the standard reference work for over half a century, and a sequel, Island Life, in 1880 that focused on island biogeography. He extended the six-zone system developed by Philip Sclater for describing the geographical distribution of birds to animals of all kinds. His method of tabulating data on animal groups in geographic zones highlighted the discontinuities and his appreciation of evolution allowed him to propose rational explanations, which had not been done before.  
The scientific study of heredity grew rapidly in the wake of Darwin's Origin of Species with the work of Francis Galton and the biometricians. The origin of genetics is usually traced to the 1866 work of the monk Gregor Mendel, who would later be credited with the laws of inheritance. However, his work was not recognized as significant until 35 years afterward. In the meantime, a variety of theories of inheritance (based on pangenesis, orthogenesis, or other mechanisms) were debated and investigated vigorously.  Embryology and ecology also became central biological fields, especially as linked to evolution and popularized in the work of Ernst Haeckel. Most of the 19th century work on heredity, however, was not in the realm of natural history, but that of experimental physiology.
Over the course of the 19th century, the scope of physiology expanded greatly, from a primarily medically oriented field to a wide-ranging investigation of the physical and chemical processes of life—including plants, animals, and even microorganisms in addition to man. Living things as machines became a dominant metaphor in biological (and social) thinking. 
Cell theory, embryology and germ theory
Advances in microscopy also had a profound impact on biological thinking. In the early 19th century, a number of biologists pointed to the central importance of the cell. In 1838 and 1839, Schleiden and Schwann began promoting the ideas that (1) the basic unit of organisms is the cell and (2) that individual cells have all the characteristics of life, though they opposed the idea that (3) all cells come from the division of other cells. Thanks to the work of Robert Remak and Rudolf Virchow, however, by the 1860s most biologists accepted all three tenets of what came to be known as cell theory. 
Cell theory led biologists to re-envision individual organisms as interdependent assemblages of individual cells. Scientists in the rising field of cytology, armed with increasingly powerful microscopes and new staining methods, soon found that even single cells were far more complex than the homogeneous fluid-filled chambers described by earlier microscopists. Robert Brown had described the nucleus in 1831, and by the end of the 19th century cytologists identified many of the key cell components: chromosomes, centrosomes, mitochondria, chloroplasts, and other structures made visible through staining. Between 1874 and 1884 Walther Flemming described the discrete stages of mitosis, showing that they were not artifacts of staining but occurred in living cells, and moreover, that chromosomes doubled in number just before the cell divided and a daughter cell was produced. Much of the research on cell reproduction came together in August Weismann's theory of heredity: he identified the nucleus (in particular chromosomes) as the hereditary material, proposed the distinction between somatic cells and germ cells (arguing that chromosome number must be halved for germ cells, a precursor to the concept of meiosis), and adopted Hugo de Vries's theory of pangenes. Weismannism was extremely influential, especially in the new field of experimental embryology.
By the mid-1850s the miasma theory of disease was largely superseded by the germ theory of disease, creating extensive interest in microorganisms and their interactions with other forms of life. By the 1880s, bacteriology was becoming a coherent discipline, especially through the work of Robert Koch, who introduced methods for growing pure cultures on agar gels containing specific nutrients in Petri dishes. The long-held idea that living organisms could easily originate from nonliving matter (spontaneous generation) was attacked in a series of experiments carried out by Louis Pasteur, while debates over vitalism vs. mechanism (a perennial issue since the time of Aristotle and the Greek atomists) continued apace. 
Rise of organic chemistry and experimental physiology
In chemistry, one central issue was the distinction between organic and inorganic substances, especially in the context of organic transformations such as fermentation and putrefaction. Since Aristotle these had been considered essentially biological (vital) processes. However, Friedrich Wöhler, Justus Liebig and other pioneers of the rising field of organic chemistry—building on the work of Lavoisier—showed that the organic world could often be analyzed by physical and chemical methods. In 1828 Wöhler showed that the organic substance urea could be created by chemical means that do not involve life, providing a powerful challenge to vitalism. Cell extracts ("ferments") that could effect chemical transformations were discovered, beginning with diastase in 1833. By the end of the 19th century the concept of enzymes was well established, though equations of chemical kinetics would not be applied to enzymatic reactions until the early 20th century. 
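The kinetic treatment that eventually arrived in the early 20th century is the Michaelis–Menten rate law. A minimal sketch, with illustrative parameter values rather than data for any real enzyme:

```python
# Michaelis-Menten rate law: v = Vmax * [S] / (Km + [S])
# Parameter values here are illustrative only, not tied to a specific enzyme.
def reaction_rate(substrate_conc, vmax=1.0, km=0.5):
    """Initial reaction velocity as a function of substrate concentration."""
    return vmax * substrate_conc / (km + substrate_conc)

# Defining property of Km: at [S] = Km, the rate is exactly half of Vmax.
assert abs(reaction_rate(0.5) - 0.5) < 1e-9
```

The hyperbolic saturation curve this equation produces captured why enzymatic reactions, unlike simple chemical kinetics, level off at high substrate concentrations.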
Physiologists such as Claude Bernard explored (through vivisection and other experimental methods) the chemical and physical functions of living bodies to an unprecedented degree, laying the groundwork for endocrinology (a field that developed quickly after the discovery of the first hormone, secretin, in 1902), biomechanics, and the study of nutrition and digestion. The importance and diversity of experimental physiology methods, within both medicine and biology, grew dramatically over the second half of the 19th century. The control and manipulation of life processes became a central concern, and experiment was placed at the center of biological education. 
At the beginning of the 20th century, biological research was largely a professional endeavour. Most work was still done in the natural history mode, which emphasized morphological and phylogenetic analysis over experiment-based causal explanations. However, anti-vitalist experimental physiologists and embryologists, especially in Europe, were increasingly influential. The tremendous success of experimental approaches to development, heredity, and metabolism in the 1900s and 1910s demonstrated the power of experimentation in biology. In the following decades, experimental work replaced natural history as the dominant mode of research. 
Ecology and environmental science
In the early 20th century, naturalists were faced with increasing pressure to add rigor and preferably experimentation to their methods, as the newly prominent laboratory-based biological disciplines had done. Ecology had emerged as a combination of biogeography with the biogeochemical cycle concept pioneered by chemists; field biologists developed quantitative methods such as the quadrat and adapted laboratory instruments and cameras for the field to further set their work apart from traditional natural history. Zoologists and botanists did what they could to mitigate the unpredictability of the living world, performing laboratory experiments and studying semi-controlled natural environments such as gardens; new institutions like the Carnegie Station for Experimental Evolution and the Marine Biological Laboratory provided more controlled environments for studying organisms through their entire life cycles.
The ecological succession concept, pioneered in the 1900s and 1910s by Henry Chandler Cowles and Frederic Clements, was important in early plant ecology.  Alfred Lotka's predator-prey equations, G. Evelyn Hutchinson's studies of the biogeography and biogeochemical structure of lakes and rivers (limnology) and Charles Elton's studies of animal food chains were pioneers among the succession of quantitative methods that colonized the developing ecological specialties. Ecology became an independent discipline in the 1940s and 1950s after Eugene P. Odum synthesized many of the concepts of ecosystem ecology, placing relationships between groups of organisms (especially material and energy relationships) at the center of the field. 
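Lotka's predator–prey model can be sketched numerically; the coefficients and starting populations below are illustrative, not drawn from any real system:

```python
# Lotka-Volterra predator-prey model (illustrative coefficients):
#   dx/dt = a*x - b*x*y    (prey grow, are eaten)
#   dy/dt = -c*y + d*x*y   (predators starve, grow by eating prey)
def simulate(x, y, a=1.0, b=0.1, c=1.5, d=0.075, dt=0.001, steps=10000):
    """Crude forward-Euler integration; returns final populations."""
    for _ in range(steps):
        dx = (a * x - b * x * y) * dt
        dy = (-c * y + d * x * y) * dt
        x, y = x + dx, y + dy
    return x, y

x, y = simulate(10.0, 5.0)  # the two populations oscillate rather than settle
```

The coupled oscillations this system produces were among the first quantitative predictions in ecology that could be checked against field data such as Elton's food-chain records.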
In the 1960s, as evolutionary theorists explored the possibility of multiple units of selection, ecologists turned to evolutionary approaches. In population ecology, debate over group selection was brief but vigorous; by 1970, most biologists agreed that natural selection was rarely effective above the level of individual organisms. The evolution of ecosystems, however, became a lasting research focus. Ecology expanded rapidly with the rise of the environmental movement; the International Biological Program attempted to apply the methods of big science (which had been so successful in the physical sciences) to ecosystem ecology and pressing environmental issues, while smaller-scale independent efforts such as island biogeography and the Hubbard Brook Experimental Forest helped redefine the scope of an increasingly diverse discipline.
Classical genetics, the modern synthesis, and evolutionary theory
1900 marked the so-called rediscovery of Mendel: Hugo de Vries, Carl Correns, and Erich von Tschermak independently arrived at Mendel's laws (which were not actually present in Mendel's work). Soon after, cytologists (cell biologists) proposed that chromosomes were the hereditary material. Between 1910 and 1915, Thomas Hunt Morgan and the "Drosophilists" in his fly lab forged these two ideas—both controversial—into the "Mendelian-chromosome theory" of heredity. They quantified the phenomenon of genetic linkage and postulated that genes reside on chromosomes like beads on a string; they hypothesized crossing over to explain linkage and constructed genetic maps of the fruit fly Drosophila melanogaster, which became a widely used model organism.
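The fly lab's mapping method rested on a simple quantity: the fraction of offspring showing recombination between two markers, taken as a proxy for the distance between them on the chromosome (one map unit, or centimorgan, per 1% recombinants). A sketch with invented counts, not real Drosophila data:

```python
# Sturtevant-style two-point mapping: recombination frequency as distance.
# The offspring counts below are invented for illustration.
def map_distance_cM(recombinant_offspring, total_offspring):
    """Map distance in centimorgans = percent recombinant offspring."""
    return 100.0 * recombinant_offspring / total_offspring

print(map_distance_cM(17, 100))  # 17.0 centimorgans between the two markers
```

Ordering many such pairwise distances along a line is what produced the first genetic maps.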
Hugo de Vries tried to link the new genetics with evolution; building on his work with heredity and hybridization, he proposed a theory of mutationism, which was widely accepted in the early 20th century. Lamarckism, or the theory of inheritance of acquired characteristics, also had many adherents. Darwinism was seen as incompatible with the continuously variable traits studied by biometricians, which seemed only partially heritable. In the 1920s and 1930s—following the acceptance of the Mendelian-chromosome theory—the emergence of the discipline of population genetics, with the work of R.A. Fisher, J.B.S. Haldane, and Sewall Wright, unified the idea of evolution by natural selection with Mendelian genetics, producing the modern synthesis. The inheritance of acquired characters was rejected, while mutationism gave way as genetic theories matured.
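The core of that unification can be illustrated with the textbook one-locus selection model from population genetics; the fitness values below are illustrative, not taken from any study:

```python
# One-locus, two-allele selection: genotype fitnesses w_AA, w_Aa, w_aa.
# p is the frequency of allele A; the recurrence is the standard
# population-genetics result, with illustrative fitness values.
def next_gen_freq(p, w_AA=1.0, w_Aa=0.9, w_aa=0.8):
    q = 1.0 - p
    w_bar = p*p*w_AA + 2*p*q*w_Aa + q*q*w_aa   # mean fitness
    return (p*p*w_AA + p*q*w_Aa) / w_bar

p = 0.1
for _ in range(200):
    p = next_gen_freq(p)
# Directional selection drives the favoured allele toward fixation.
```

Models of this kind showed that even weak selection on discrete Mendelian alleles can produce the gradual, continuous change Darwinism required, dissolving the apparent conflict with the biometricians.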
In the second half of the century the ideas of population genetics began to be applied in the new discipline of the genetics of behavior, sociobiology, and, especially in humans, evolutionary psychology. In the 1960s W.D. Hamilton and others developed game theory approaches to explain altruism from an evolutionary perspective through kin selection. The possible origin of higher organisms through endosymbiosis, and contrasting approaches to molecular evolution in the gene-centered view (which held selection as the predominant cause of evolution) and the neutral theory (which made genetic drift a key factor) spawned perennial debates over the proper balance of adaptationism and contingency in evolutionary theory. 
In the 1970s Stephen Jay Gould and Niles Eldredge proposed the theory of punctuated equilibrium which holds that stasis is the most prominent feature of the fossil record, and that most evolutionary changes occur rapidly over relatively short periods of time.  In 1980 Luis Alvarez and Walter Alvarez proposed the hypothesis that an impact event was responsible for the Cretaceous–Paleogene extinction event.  Also in the early 1980s, statistical analysis of the fossil record of marine organisms published by Jack Sepkoski and David M. Raup led to a better appreciation of the importance of mass extinction events to the history of life on earth. 
Biochemistry, microbiology, and molecular biology
By the end of the 19th century all of the major pathways of drug metabolism had been discovered, along with the outlines of protein and fatty acid metabolism and urea synthesis.  In the early decades of the 20th century, the minor components of foods in human nutrition, the vitamins, began to be isolated and synthesized. Improved laboratory techniques such as chromatography and electrophoresis led to rapid advances in physiological chemistry, which—as biochemistry—began to achieve independence from its medical origins. In the 1920s and 1930s, biochemists—led by Hans Krebs and Carl and Gerty Cori—began to work out many of the central metabolic pathways of life: the citric acid cycle, glycogenesis and glycolysis, and the synthesis of steroids and porphyrins. Between the 1930s and 1950s, Fritz Lipmann and others established the role of ATP as the universal carrier of energy in the cell, and mitochondria as the powerhouse of the cell. Such traditionally biochemical work continued to be very actively pursued throughout the 20th century and into the 21st. 
Origins of molecular biology
Following the rise of classical genetics, many biologists—including a new wave of physical scientists in biology—pursued the question of the gene and its physical nature. Warren Weaver, head of the science division of the Rockefeller Foundation, issued grants to promote research that applied the methods of physics and chemistry to basic biological problems, coining the term molecular biology for this approach in 1938; many of the significant biological breakthroughs of the 1930s and 1940s were funded by the Rockefeller Foundation.
Like biochemistry, the overlapping disciplines of bacteriology and virology (later combined as microbiology), situated between science and medicine, developed rapidly in the early 20th century. Félix d'Herelle's isolation of bacteriophage during World War I initiated a long line of research focused on phage viruses and the bacteria they infect. 
The development of standard, genetically uniform organisms that could produce repeatable experimental results was essential for the development of molecular genetics. After early work with Drosophila and maize, the adoption of simpler model systems like the bread mold Neurospora crassa made it possible to connect genetics to biochemistry, most importantly with Beadle and Tatum's one gene-one enzyme hypothesis in 1941. Genetics experiments on even simpler systems like tobacco mosaic virus and bacteriophage, aided by the new technologies of electron microscopy and ultracentrifugation, forced scientists to re-evaluate the literal meaning of life: virus heredity and reproducing nucleoprotein cell structures outside the nucleus ("plasmagenes") complicated the accepted Mendelian-chromosome theory.
Oswald Avery showed in 1943 that DNA was likely the genetic material of the chromosome, not its protein; the issue was settled decisively with the 1952 Hershey–Chase experiment—one of many contributions from the so-called phage group centered around physicist-turned-biologist Max Delbrück. In 1953 James Watson and Francis Crick, building on the work of Maurice Wilkins and Rosalind Franklin, suggested that the structure of DNA was a double helix. In their famous paper "Molecular Structure of Nucleic Acids", Watson and Crick noted coyly, "It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material." After the 1958 Meselson–Stahl experiment confirmed the semiconservative replication of DNA, it was clear to most biologists that nucleic acid sequence must somehow determine amino acid sequence in proteins; physicist George Gamow proposed that a fixed genetic code connected proteins and DNA. Between 1953 and 1961, there were few known biological sequences—either DNA or protein—but an abundance of proposed code systems, a situation made even more complicated by expanding knowledge of the intermediate role of RNA. Actually deciphering the code took an extensive series of experiments in biochemistry and bacterial genetics between 1961 and 1966, most importantly the work of Nirenberg and Khorana.
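The code that Nirenberg, Khorana, and others worked out maps each RNA triplet to one amino acid. A toy translation using a handful of the real codon assignments (the table excerpt is only a tiny fraction of the 64-codon standard code):

```python
# A small excerpt of the standard genetic code (RNA codons -> amino acids).
# UUU -> Phe was the first codon decoded, by Nirenberg and Matthaei in 1961.
CODON_TABLE = {
    "AUG": "Met", "UUU": "Phe", "GGU": "Gly",
    "GCU": "Ala", "UAA": "STOP",
}

def translate(rna):
    """Read the RNA three bases at a time until a stop codon."""
    protein = []
    for i in range(0, len(rna) - 2, 3):
        amino_acid = CODON_TABLE[rna[i:i+3]]
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

print(translate("AUGUUUGGUUAA"))  # ['Met', 'Phe', 'Gly']
```

The fixed, non-overlapping triplet reading frame sketched here is exactly the kind of scheme Gamow's proposals anticipated and the 1961–1966 experiments confirmed.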
Expansion of molecular biology
In addition to the Division of Biology at Caltech, the Laboratory of Molecular Biology (and its precursors) at Cambridge, and a handful of other institutions, the Pasteur Institute became a major center for molecular biology research in the late 1950s. Scientists at Cambridge, led by Max Perutz and John Kendrew, focused on the rapidly developing field of structural biology, combining X-ray crystallography with molecular modelling and the new computational possibilities of digital computing (benefiting both directly and indirectly from the military funding of science). A number of biochemists led by Frederick Sanger later joined the Cambridge lab, bringing together the study of macromolecular structure and function. At the Pasteur Institute, François Jacob and Jacques Monod followed the 1959 PaJaMo experiment with a series of publications regarding the lac operon that established the concept of gene regulation and identified what came to be known as messenger RNA. By the mid-1960s, the intellectual core of molecular biology—a model for the molecular basis of metabolism and reproduction—was largely complete.
The late 1950s to the early 1970s was a period of intense research and institutional expansion for molecular biology, which had only recently become a somewhat coherent discipline. In what organismic biologist E. O. Wilson called "The Molecular Wars", the methods and practitioners of molecular biology spread rapidly, often coming to dominate departments and even entire disciplines.  Molecularization was particularly important in genetics, immunology, embryology, and neurobiology, while the idea that life is controlled by a "genetic program"—a metaphor Jacob and Monod introduced from the emerging fields of cybernetics and computer science—became an influential perspective throughout biology.  Immunology in particular became linked with molecular biology, with innovation flowing both ways: the clonal selection theory developed by Niels Jerne and Frank Macfarlane Burnet in the mid-1950s helped shed light on the general mechanisms of protein synthesis. 
Resistance to the growing influence of molecular biology was especially evident in evolutionary biology. Protein sequencing had great potential for the quantitative study of evolution (through the molecular clock hypothesis), but leading evolutionary biologists questioned the relevance of molecular biology for answering the big questions of evolutionary causation. Departments and disciplines fractured as organismic biologists asserted their importance and independence: Theodosius Dobzhansky made the famous statement that "nothing in biology makes sense except in the light of evolution" as a response to the molecular challenge. The issue became even more critical after 1968, when Motoo Kimura's neutral theory of molecular evolution suggested that natural selection was not the ubiquitous cause of evolution, at least at the molecular level, and that molecular evolution might be a fundamentally different process from morphological evolution. (Resolving this "molecular/morphological paradox" has been a central focus of molecular evolution research since the 1960s.)
Biotechnology, genetic engineering, and genomics
Biotechnology in the general sense has been an important part of biology since the late 19th century. With the industrialization of brewing and agriculture, chemists and biologists became aware of the great potential of human-controlled biological processes. In particular, fermentation proved a great boon to chemical industries. By the early 1970s, a wide range of biotechnologies were being developed, from drugs like penicillin and steroids to foods like Chlorella and single-cell protein to gasohol—as well as a wide range of hybrid high-yield crops and agricultural technologies, the basis for the Green Revolution. 
Recombinant DNA
Biotechnology in the modern sense of genetic engineering began in the 1970s, with the invention of recombinant DNA techniques.  Restriction enzymes were discovered and characterized in the late 1960s, following on the heels of the isolation, then duplication, then synthesis of viral genes. Beginning with the lab of Paul Berg in 1972 (aided by EcoRI from Herbert Boyer's lab, building on work with ligase by Arthur Kornberg's lab), molecular biologists put these pieces together to produce the first transgenic organisms. Soon after, others began using plasmid vectors and adding genes for antibiotic resistance, greatly increasing the reach of the recombinant techniques. 
Wary of the potential dangers (particularly the possibility of a prolific bacterium carrying a viral cancer-causing gene), the scientific community, as well as a wide range of scientific outsiders, reacted to these developments with both enthusiasm and fearful restraint. Prominent molecular biologists led by Berg suggested a temporary moratorium on recombinant DNA research until the dangers could be assessed and policies could be created. This moratorium was largely respected, until the participants in the 1975 Asilomar Conference on Recombinant DNA created policy recommendations and concluded that the technology could be used safely.
Following Asilomar, new genetic engineering techniques and applications developed rapidly. DNA sequencing methods improved greatly (pioneered by Frederick Sanger and Walter Gilbert), as did oligonucleotide synthesis and transfection techniques. Researchers learned to control the expression of transgenes, and were soon racing—in both academic and industrial contexts—to create organisms capable of expressing human genes for the production of human hormones. However, this was a more daunting task than molecular biologists had expected; developments between 1977 and 1980 showed that, due to the phenomena of split genes and splicing, higher organisms had a much more complex system of gene expression than the bacterial models of earlier studies. The first such race, for synthesizing human insulin, was won by Genentech. This marked the beginning of the biotech boom (and with it, the era of gene patents), with an unprecedented level of overlap between biology, industry, and law.
Molecular systematics and genomics
By the 1980s, protein sequencing had already transformed methods of scientific classification of organisms (especially cladistics), but biologists soon began to use RNA and DNA sequences as characters; this expanded the significance of molecular evolution within evolutionary biology, as the results of molecular systematics could be compared with traditional evolutionary trees based on morphology. Following the pioneering ideas of Lynn Margulis on endosymbiotic theory, which holds that some of the organelles of eukaryotic cells originated from free-living prokaryotic organisms through symbiotic relationships, even the overall division of the tree of life was revised. Into the 1990s, the five kingdoms (Plantae, Animalia, Fungi, Protista, and Monera) gave way to three domains (Archaea, Bacteria, and Eukarya) based on Carl Woese's pioneering molecular systematics work with 16S rRNA sequencing.
The development and popularization of the polymerase chain reaction (PCR) in the mid-1980s (by Kary Mullis and others at Cetus Corp.) marked another watershed in the history of modern biotechnology, greatly increasing the ease and speed of genetic analysis. Coupled with the use of expressed sequence tags, PCR led to the discovery of many more genes than could be found through traditional biochemical or genetic methods and opened the possibility of sequencing entire genomes.
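PCR's speed comes from exponential doubling: each thermal cycle copies every template present. A back-of-envelope sketch under the idealized assumption of perfect (100%) per-cycle efficiency:

```python
# Idealized PCR amplification: each thermal cycle doubles every template copy.
# Real reactions run below 100% efficiency; this is a best-case sketch.
def pcr_copies(initial_copies, cycles, efficiency=1.0):
    """Copies after n cycles; efficiency=1.0 means perfect doubling."""
    return initial_copies * (1 + efficiency) ** cycles

# 30 cycles turn a single DNA molecule into roughly a billion copies (2**30).
print(pcr_copies(1, 30))
```

That jump from one molecule to a detectable quantity in an afternoon is what made sequence analysis routine rather than heroic.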
The unity of much of the morphogenesis of organisms from fertilized egg to adult began to be unraveled after the discovery of the homeobox genes, first in fruit flies, then in other insects and animals, including humans. These developments led to advances in the field of evolutionary developmental biology towards understanding how the various body plans of the animal phyla have evolved and how they are related to one another. 
The Human Genome Project—the largest, most costly single biological study ever undertaken—began in 1988 under the leadership of James D. Watson, after preliminary work with genetically simpler model organisms such as E. coli, S. cerevisiae and C. elegans. Shotgun sequencing and gene discovery methods pioneered by Craig Venter—and fueled by the financial promise of gene patents with Celera Genomics— led to a public–private sequencing competition that ended in compromise with the first draft of the human DNA sequence announced in 2000. 
At the beginning of the 21st century, biological sciences converged with previously distinct disciplines such as physics, producing research fields like biophysics. Advances in analytical chemistry and physics instrumentation (improved sensors, optics, tracers, signal processing, networks, robotics, satellites, and computing power for data collection, storage, analysis, modeling, visualization, and simulation) enabled new theoretical and experimental research in molecular biochemistry, biological systems, and ecosystem science, along with worldwide access to measurements and models, open peer review, collaboration, and internet publication. New fields of biological sciences research emerged, including bioinformatics, neuroscience, theoretical biology, computational genomics, astrobiology, and synthetic biology.
How disabled people have been marginalized through the ages, and their present struggle for their human rights.
Bodily difference has for centuries determined social structures by defining certain bodies as the norm, and defining those which fall outside the norm as 'Other' with the degree of 'Otherness' being defined by the degree of variation from the norm. In doing this, we have created an artificial 'paradigm of humanity' into which some of us fit neatly, and others fit very badly. Life outside the paradigm of humanity is likely to be characterized by isolation and abuse.
The story we have recorded of the lives of people with disability is a story of life lived on the margins. For people with disability, their history is largely a history of silence. The lives of people with disability have not only been constructed as 'Other', but frequently as 'the Other' of 'the Other'. People with disability are marginalized even by those who are themselves marginalized.
While it is difficult to know where our constructions end and the reality begins (for the constructions shape the reality), it is clear that other stories and constructions which might have created different realities have been selectively 'forgotten'. Models of inclusion - for example, among the Maori in Aotearoa where it is suggested that disability is accepted as being normal - have been erased from Western disability history. Disability activists are now facing the task of re-creating a culture which celebrates and embraces difference. In the West, however, the script we have written for people with disability is a narrow one.
The history of disability in the West has been characterized by the progressive development of several models of disability: the religious model of disability, the medical/genetic model of disability, and the rights-based model of disability. These models, or constructions of disability, have set the parameters for our response to people with disability. Through time, these models have become more sophisticated, yet their essence remains constant - otherness.
The Religious Model of Disability
In Western Judeo-Christian society, the roots of understanding bodily difference have been grounded in Biblical references, the consequent responses and impacts of the Christian church, and the effect of the Enlightenment project underpinning the modern era. Bodily differences were seen as the result of evil spirits, the devil, witchcraft or God's displeasure. Alternatively, such people were also signified as reflecting the "suffering Christ", and were often perceived to be of angelic or beyond-human status, a blessing for others.
Therefore, themes which embrace notions of sin or sanctity, impurity and wholeness, undesirability and weakness, care and compassion, healing and burden have formed the dominant bases of Western conceptualisations of, and responses to, groups of people who, in a contemporary context, are described as disabled. In the past, various labels have been used for such people. These include crippled, lame, blind, dumb, deaf, mad, feeble, idiot, imbecile, and moron.
In the nomadic and/or agrarian societies of pre-industrialisation, when time was cyclic, people perceived as having limitations often lived with their families. They were ascribed roles and tasks in line with their capabilities, and which fulfilled the co-operative requirements for corporate survival. Others, though, could not stay with their families. Some were ostracised, and their survival threatened, because of a popular conception that such persons were monsters, and therefore unworthy of human status. Some became homeless and dislocated for other reasons, such as poverty or shame. Religious communities, often within the local precincts or parishes, responded to these groups of people in various ways. These included the promotion and seeking of cures through such actions as exorcisms, purging and rituals, or the provision of care, hospitality and service as acts of mercy and Christian duty to "needy strangers".
However, important changes were to occur with the emergence of the modern era, profoundly influenced by the Enlightenment and industrialisation. During this time, religious values and modes were challenged by the rise of reason and rationality.
The Medical Model of Disability
As medical and scientific knowledge expanded profusely, the doctor and the scientist replaced the priest as custodian of societal values and curing processes. Work and production became commodified, and time became linear. Human worth came to be determined by perceived work value and profitability, and lifestyles and lives became dictated by the mechanistic practices and institutions of the nation state. Universality replaced particularity, reason replaced mystery, and knowledge and state of the mind superseded the lived experience of the body. 'Normality', then, became determined by the ideal of the white, youthful, able, male body, and otherness to this ideal became hierarchically placed as inferiority. Therefore, difference became redefined as deviance commanding control.
Events of this era were to have a major impact on the lives of those with bodily limitations. The lives of such people were reduced to little more than a medical label, and their futures defined by a medical prognosis. People with disability then became a class requiring physical removal from the "able-bodied" norms of what was developing as an urbanised society. As some commentators note, this was the era when cripples disappeared and disability was created.
As certain groups of people came to be viewed as unproductive and incapable, institutions were established as places with a dual purpose: (a) where such people could be placed whilst other family members could meet workers' obligations and (b) where such people could be skilled to become productive members of society.
But, with the modern era, there was also an increasing emphasis on scientism and social Darwinism and this resulted in the roles of special institutions shifting from agents of reform to agents of custody for social control and institutional segregation for those now described as sub-normal. Institutions became the instruments for the facilitation of social death. Through a presumed scientific status, care for people with disability became depoliticised, technicalised and professionalised, predicated on notions of tragedy, burden and helpless dependency.
In the post-industrial and post-enlightenment era, disability, in Western society, has been regarded as an individual affliction predominantly cast within scientific and medical discourses. Therefore, "disability" has come to be defined and signified as a power-neutral, objectively observable attribute or characteristic of an "afflicted" person. According to this model, it is the individual, and not society, who has the problem, and different interventions aim to provide the person with the appropriate skills to rehabilitate or deal with it. However, in a culture, supported by modern Western medicine, and which idealises the idea that the body can be objectified and controlled, those who cannot control their bodies are seen as failures.
In recent years, and with the influence of normalisation principles since the 1970s, the locus of an individualised conceptualisation has shifted from the state-run (public) institution to community-based facilities and care. However, the medical perspective of disability remains wedded to the economy, whereby personal capacity and ability are often assessed as incapacity and inability so as to determine a person's eligibility for financial assistance and benefits, and access to personal resources. An economic view narrows the complexity of disability to limitations and restrictions, with implications for whether "flawed" people can be educated or productive.
Lack of access to adequate material resources perpetuates a charity discourse which depicts certain people as in need of help, as objects of pity, as personally tragic, and as dependent and eternal children. It is a discourse of benevolence and altruism; as with the responses of early Christian communities, it sustains a complementary relationship in which perceivably helpless people become instruments for the good and virtuous works of mercy and compassion performed by the more "privileged" members of society.
The Rights-Based Model of Disability
In more recent times, however, the notion of 'disability' has come to be conceptualised as a socio-political construct within a rights-based discourse. The emphasis has shifted from dependence to independence, as people with disability have sought a political voice and become politically active against social forces of disablism. Disability activists, in engaging in identity politics, have adopted the strategies used by other social movements demanding human and civil rights against such phenomena as sexism and racism. These strategies have brought gains, but within certain limitations.
From the mid-1980s, some Western countries, such as Australia, have enacted legislation which embraces a rights-based discourse rather than a custodial one, and which seeks to address issues of social justice and discrimination. The legislation also embraces the conceptual shift from disability being seen as an individualised 'medical problem' to being about community membership and participation, and access to regular societal activities such as employment, education and recreation. Where access is inappropriate, inadequate, difficult or ignored, advocacy processes have been initiated to address such situations and promote people's rights.
Yet, rights-based discourse, although employed as a political strategy, has also become a way of constructing disability by locking people with disability into an identity which is based upon membership of a minority group. Entitlements thus become contingent upon being able to define oneself as a person with disability. And the conceptual barrier between 'normal' and 'abnormal' goes unchallenged, so that while one may have entitlements legislatively guaranteed, 'community' which cannot be legislated for, remains elusive.
Looking to the Future
While rights-based discourse, at a strategic level, has brought some additional entitlements to people with disability, it has not significantly altered the way in which disability is constructed and so, despite legislative changes, some people's lives have not necessarily changed. In fact, new challenges such as genetic technology and reproductive technology threaten to further alienate the whole and integrated person (the body, mind and spirit) from the medically, or scientifically, diagnosed 'person' (the condition). We are now seeing the emergence of a genetic model of disability, a revamped medical model, which 'promises' to actually expand the population of people with disability to include people whose impairment is their 'bad' genes and their disability is the social response of avoidance, discrimination and even elimination which their impaired genes elicit in others.
Rights-based discourse fails to meet these challenges for, rather than seeking to dismantle the entire concept of disability, it actually relies upon such a construction to support its claims for rights and entitlements.
Some writers argue that we need to go beyond conceptions of constructed disability to a notion of universalism whereby, according to the Canadian writer Bickenbach, disability is actually a fluid and continuous condition which has no boundaries but which is, in fact, the essence of the human condition. And, as a condition which is experienced by us all at some stage in our lives, disability is actually normal. This view is also supported by the Indian philosopher Sarkar, who argues that bodily differences should not be allowed to mask our essential humanity.
At the level of our physical existence, diversity is a natural condition, and the need is for us to welcome and embrace diversity outside of a hierarchical classification of difference. Yet, at another level, difference is simply a construction of ideology, not a state of reality - since we are all interconnected and have flowing through each of us the same life force. According to Sarkar, "the force that guides the stars guides you too". Yet the history of disability has been a history of seeking to construct hierarchical difference out of an essential reality of oneness. The challenge is to create the reverse.
Baird, V., (1992) “Difference and Defiance,” The New Internationalist (Special Edition - Disabled Lives), Vol 233, July 1992, pp. 4-7.
Bickenbach, J., “Equity, Participation and the Politics of Disability”, Paper Presented at the Rehabilitation International 18th World Congress, Auckland, New Zealand, September, 1996.
Branson, J. and Miller, D., (1989), “Beyond Integration Policy - The Deconstruction of Disability,” in Barton, L., ed., Integration: Myth or Reality? London: Falmer Press.
Clapton, J., (1996) “Disability, Inclusion and the Christian Church”, Paper Presented at Disability, Religion and Health Conference, Brisbane, October 18-20, 1996.
Fitzgerald, J., (1996) “Geneticizing Disability: The Human Genome Project and the Commodification of Self”, Paper presented at the Rehabilitation International Congress, Auckland, New Zealand, September, 1996.
Fitzgerald, J., (1996) “Reclaiming the Whole: Self, Spirit and Society”, Paper Presented at Disability, Religion and Health Conference, Brisbane, October 18-20, 1996.
Funk, R., (1987) “Disability Rights: From Castle to Class in the Context of Civil Rights,” in Gartner, A. and Joe, T., eds., Images of the Disabled, Disabling Images. New York: Praeger.
Fontaine, C.R., (1994) “Roundtable Discussion: Women With Disabilities - A Challenge To Feminist Theology,” Journal of Feminist Studies in Religion. 10(2), 1994, pp. 99-134.
Higgins, P., (1992), Making Disability: Exploring the Social Transformation of Human Variation. Springfield, Illinois: Charles C. Thomas Publisher.
Miles, M., (1995), “Disability in an Eastern Religious Context: Historical Perspectives,” Disability and Society, 10(1).
Wendell, S., (1992), “Toward a Feminist Theory of Disability,” in Bequart Holmes, H. & Purdy, L. M., eds., Feminist Perspectives in Medical Ethics. Bloomington and Indianapolis: Indiana University Press.
The early Renaissance had two principal characteristics. Of these the first is humanism, a term that did not carry the present-day ethical or antireligious sense but instead referred to the intensive study of a revived Classical antiquity. Humanism comprised an intense concern with the studia humanitatis (“studies of humanity”)—that is, grammar, rhetoric, history, poetry, and moral philosophy as read in Classical Latin and, sometimes, Greek texts. As such, it represented not a philosophical system but rather an educational program that largely excluded those subjects taught in the universities: logic, natural philosophy, metaphysics, astronomy, medicine, law, and theology.
The origins of humanism date back to the Italy of the 1290s, in which one finds, in many cities, friends coming together informally to study the ancient world and attempting to reproduce something of the spirit of the Latin classics in their own writings. That the movement should have originated in Italy is not surprising. It was natural that Italians should look back to Rome, particularly since the ruins of Roman civilization still stood about them. In addition, the study of the great corpus of Roman law in the universities of Padua and Bologna led easily to a wish to understand the society that had produced it. Yet even beyond that, in the secular world of the city-states, where lay literates rather than clerics dominated intellectual life, the secular civilization of the Classical world had an irresistible appeal. It was not that the humanists were un-Christian, rather that their Christianity was a lay and, in some sense, secularized Christianity.
The movement advanced in the middle of the 14th century through the work of two men, eminent both as humanists and for their roles in Italian and European literature: Francesco Petrarca (Petrarch; 1304–74) and Giovanni Boccaccio (1313–75). It was consolidated at the end of the century, above all in Florence. Here in the 1390s the inspired teaching of the Byzantine Manuel Chrysoloras made the city the leading centre for the study of Classical Greek in Europe, while Coluccio Salutati (1331–1406) and Leonardo Bruni (1370–1444), both of whom served for some time as chancellors of the republic, claimed that the disciplines of humanism were particularly suitable for the service of the state as studies appropriate to the “active life” of a republican citizen.
Thenceforth humanism dominated intellectual life in the peninsula (and later in much of Europe), influencing vernacular literature, the writing of history, art, education, and style of life. During the 15th century, for the first time, Florentine Greek studies turned scholars from moral back to metaphysical philosophy. Marsilio Ficino (1433–99) translated all of Plato’s writings, together with important Neoplatonic texts and the Greek mystical Corpus Hermeticum. From these sources he went on to develop his own philosophy of Christian Hermeticism, or Neoplatonism. Subsequently modified and developed by Giovanni Pico della Mirandola (1463–94), whose best-known essay bears the significant title Oratio de hominis dignitate (1486; Oration on the Dignity of Man), this philosophy, which argued that human beings could independently determine their own salvation by following the natural impulses of love and beauty, presented an immensely optimistic view of humanity and its place in the universe. It was to exercise a strong fascination, particularly over artists and poets, in the following hundred years.
History of the environmental movement
Concern for the impact on human life of problems such as air and water pollution dates to at least Roman times. Pollution was associated with the spread of epidemic disease in Europe between the late 14th century and the mid-16th century, and soil conservation was practiced in China, India, and Peru as early as 2,000 years ago. In general, however, such concerns did not give rise to public activism.
The contemporary environmental movement arose primarily from concerns in the late 19th century about the protection of the countryside in Europe and the wilderness in the United States and the health consequences of pollution during the Industrial Revolution. In opposition to the dominant political philosophy of the time, liberalism—which held that all social problems, including environmental ones, could and should be solved through the free market—most early environmentalists believed that government rather than the market should be charged with protecting the environment and ensuring the conservation of resources. An early philosophy of resource conservation was developed by Gifford Pinchot (1865–1946), the first chief of the U.S. Forest Service, for whom conservation represented the wise and efficient use of resources. Also in the United States at about the same time, a more strongly biocentric approach arose in the preservationist philosophy of John Muir (1838–1914), founder of the Sierra Club, and Aldo Leopold (1887–1948), a professor of wildlife management who was pivotal in the designation of Gila National Forest in New Mexico in 1924 as America’s first national wilderness area. Leopold introduced the concept of a land ethic, arguing that humans should transform themselves from conquerors of nature into citizens of it; his essays, compiled posthumously in A Sand County Almanac (1949), had a significant influence on later biocentric environmentalists.
Environmental organizations established from the late 19th to the mid-20th century were primarily middle-class lobbying groups concerned with nature conservation, wildlife protection, and the pollution that arose from industrial development and urbanization. There were also scientific organizations concerned with natural history and with biological aspects of conservation efforts.
Although the United States led the world in such efforts during this time, other notable conservation developments were also occurring in Europe and Oceania. For example, a group of Swiss scientists and conservationists convinced the government to set aside 14,000 hectares (roughly 34,600 acres) of land in the Swiss Alps in 1914 as Europe’s first national park. In New Zealand, the Native Bird Protection Society (later the Royal Forest and Bird Protection Society, or Forest & Bird) arose in 1923 in response to the devastation of Kapiti Island by livestock.
Beginning in the 1960s, the various philosophical strands of environmentalism were given political expression through the establishment of “green” political movements in the form of activist nongovernmental organizations and environmentalist political parties. Despite the diversity of the environmental movement, four pillars provided a unifying theme to the broad goals of political ecology: protection of the environment, grassroots democracy, social justice, and nonviolence. However, for a small number of environmental groups and individual activists who engaged in ecoterrorism, violence was viewed as a justified response to what they considered the violent treatment of nature by some interests, particularly the logging and mining industries. The political goals of the contemporary green movement in the industrialized West focused on changing government policy and promoting environmental social values. Examples include the campaigns in Tasmania in the 1970s and ’80s to block the flooding of Lake Pedder and the damming of the Franklin River; protests in the United States and western Europe against nuclear power development, especially following the catastrophic accidents at Three Mile Island (1979) and Chernobyl (1986); the related decades-long controversy surrounding uranium mining in Australia’s Northern Territory, including at the Jabiluka mine; protests against deforestation in Indonesia and the Amazon basin; and campaigns in several countries to limit the volume of greenhouse gases released through human activities. In the less-industrialized or developing world, environmentalism has been more closely involved in “emancipatory” politics and grassroots activism on issues such as poverty, democratization, and political and human rights, including the rights of women and indigenous peoples.
Examples include the Chipko movement in India, which linked forest protection with the rights of women, and the Assembly of the Poor in Thailand, a coalition of movements fighting for the right to participate in environmental and development policies.
The early strategies of the contemporary environmental movement were self-consciously activist and unconventional, involving direct-protest actions designed to obstruct and to draw attention to environmentally harmful policies and projects. Other strategies included public-education and media campaigns, community-directed activities, and conventional lobbying of policy makers and political representatives. The movement also attempted to set public examples in order to increase awareness of and sensitivity to environmental issues. Such projects included recycling, green consumerism (also known as “buying green”), and the establishment of alternative communities, including self-sufficient farms, workers’ cooperatives, and cooperative-housing projects.
The electoral strategies of the environmental movement included the nomination of environmental candidates and the registration of green political parties. These parties were conceived of as a new kind of political organization that would bring the influence of the grassroots environmental movement directly to bear on the machinery of government, make the environment a central concern of public policy, and render the institutions of the state more democratic, transparent, and accountable. The world’s first green parties—the Values Party, a nationally based party in New Zealand, and the United Tasmania Group, organized in the Australian state of Tasmania—were founded in the early 1970s. The first explicitly green member of a national legislature was elected in Switzerland in 1979; later, in 1981, four greens won legislative seats in Belgium. Green parties also have been formed in the former Soviet bloc, where they were instrumental in the collapse of some communist regimes, and in some developing countries in Asia, South America, and Africa, though they have achieved little electoral success there.
The most successful environmental party has been the German Green Party (die Grünen), founded in 1980. Although it failed to win representation in federal elections that year, it entered the Bundestag (parliament) in both 1983 and 1987, winning 5.6 percent and 8.4 percent of the national vote, respectively. The party did not win representation in 1990, but in 1998 it formed a governing coalition with the Social Democratic Party, and the party’s leader, Joschka Fischer, was appointed as the country’s foreign minister.
Throughout the last two decades of the 20th century, green parties won national representation in a number of countries and even claimed the office of mayor in European capital cities such as Dublin and Rome in the mid-1990s. Outside Europe, New Zealand’s Green Party, which was reconstituted from the former Values Party in 1990, won 7 percent of the vote in the 1990 general election; its influence had grown to 9 of the country’s 121 parliamentary seats by 2002 and to 14 parliamentary seats by 2014.
By this time green parties had become broad political vehicles, though they continued to focus on the environment. In developing party policy, they attempted to apply the values of environmental philosophy to all issues facing their countries, including foreign policy, defense, and social and economic policies.
Despite the success of some environmental parties, environmentalists remained divided over the ultimate value of electoral politics. For some, participation in elections is essential because it increases the public’s awareness of environmental issues and encourages traditional political parties to address them. Others, however, have argued that the compromises necessary for electoral success invariably undermine the ethos of grassroots democracy and direct action. This tension was perhaps most pronounced in the German Green Party. The party’s Realos (realists) accepted the need for coalitions and compromise with other political parties, including traditional parties with views sometimes contrary to that of the Green Party. By contrast, the Fundis (fundamentalists) maintained that direct action should remain the major form of political action and that no pacts or alliances should be formed with other parties. Likewise, in Britain, where the Green Party achieved success in some local elections but failed to win representation at the national level (though it did win 15 percent of the vote in the 1989 European Parliament elections), this tension was evidenced in disputes between so-called “electoralists” and “radicals.”
The implementation of internal party democracy also caused fissures within environmental parties. In particular, earlier strategies such as continuous policy involvement by party members, grassroots control over all party institutions and decisions, and the legislative rotation of elected members to prevent the creation of career politicians were sometimes perceived as unhelpful and disruptive when green parties won representation to local, national, or regional assemblies.
By the late 1980s environmentalism had become a global as well as a national political force. Some environmental nongovernmental organizations (e.g., Greenpeace, Friends of the Earth, and the World Wildlife Fund) established a significant international presence, with offices throughout the world and centralized international headquarters to coordinate lobbying campaigns and to serve as campaign centres and information clearinghouses for their national affiliate organizations. Transnational coalition building was and remains another important strategy for environmental organizations and for grassroots movements in developing countries, primarily because it facilitates the exchange of information and expertise but also because it strengthens lobbying and direct-action campaigns at the international level.
Through its international activism, the environmental movement has influenced the agenda of international politics. Although a small number of bilateral and multilateral international environmental agreements were in force before the 1960s, since the 1972 United Nations Conference on the Human Environment in Stockholm, the variety of multilateral environmental agreements has increased to cover most aspects of environmental protection as well as many practices with environmental consequences, such as the burning of fossil fuels, the trade in endangered species, the management of hazardous waste, especially nuclear waste, and armed conflict. The changing nature of public debate on the environment was reflected also in the organization of the 1992 United Nations Conference on Environment and Development (the Earth Summit) in Rio de Janeiro, Brazil, which was attended by some 180 countries and various business groups, nongovernmental organizations, and the media. In the 21st century the environmental movement has combined the traditional concerns of conservation, preservation, and pollution with more contemporary concerns with the environmental consequences of economic practices as diverse as tourism, trade, financial investment, and the conduct of war. Environmentalists are likely to intensify the trends of the late 20th century, during which some environmental groups increasingly worked in coalition not just with other emancipatory organizations, such as human rights and indigenous-peoples groups, but also with corporations and other businesses.
The Birth of the Renaissance: Understanding the Genesis of a New Era
I have superimposed perspective lines illustrating the use of 1-point linear perspective in "View of an Ideal City", a painting by Piero della Francesca. The point of convergence is called the vanishing point.
The Italian Renaissance is considered by historians the beginning of the modern age. The name itself literally means "rebirth", an accurate description of this period of innovation in both the sciences and the arts. The literary arts were also given much attention, as Renaissance thinkers turned to the lost texts of the ancient world for new understanding. This renewed interest in history, literature, and the arts was the birth of a whole new way of thinking, one centered as much on the world of mankind as on the hereafter (which had been the sole concern of medieval man). This new way of thinking is called humanism, tracing back to the Greek concept of "man as the measure of all things". With the invention of movable type during the Renaissance, new ideas and ancient scholarship spread faster than ever before.
The general dates given for the Renaissance period are 1400-1550, and its birthplace was unmistakably Florence, a prosperous merchant town in Italy. It was necessary that the cultivation of great ideas and art would begin in a center of great wealth, for it required such prosperity to fund the building of great cathedrals, which were elaborately decorated by the best artists the region had to offer. Wealthy citizens often donated their money for specific art commissions, both religious and secular. The greatest art patrons in Florence were the Medici family, who decorated their city with sculptures brought from Greece and Rome, commissioned artists and architects, and also funded the first universities.
The most obvious changes during Renaissance times are seen in the paintings and sculptures. Though they continued the medieval tradition of using religious subjects, illustrating stories from the Bible, artists combined this interest with classical ideals of the human figure and an increased interest in depicting nature. Secular works were also popular, often inspired by Greek and Roman mythology. Artists began to experiment for the first time with oil-based paints, mixing powdered pigments with linseed oil (gradually abandoning the medieval technique of egg tempera). The paints dried slowly, and remained workable for a few months. The fresco technique was employed on plaster walls (reaching perfection with artists such as Michelangelo). Sculpture began to be conceived "in the round", instead of as relief decorations on cathedrals. Perspective and light were also introduced into art, perfecting the sense of three-dimensional reality. The artists of the Renaissance made such a dramatic impact in their concept of space and form that they have changed the way we look at the world for all time.
The Early Renaissance : Innovations in Linear Perspective and Human Anatomy
Giotto (1267-1337) is considered the "Father of the Renaissance." Characterized as a Proto-Renaissance painter, his work marks the transition from the late medieval (Gothic) style. His innovations were the use of approximate perspective, increased volume in his figures, and a depth of emotion that suggests human feeling instead of static, passive icons.
Filippo Brunelleschi (1377-1446) was a Florentine architect and engineer, the first to carry out a series of optical experiments that led to a mathematical theory of perspective. Brunelleschi devised the method of perspective for architectural purposes, but once it was published in 1435 (by Alberti), it had a dramatic impact on the depiction of three-dimensional space in the arts. See perspective illustration at top of page.
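Brunelleschi's geometric insight can be stated in modern terms: a point in space projects onto the picture plane by dividing its horizontal and vertical position by its depth, so parallel lines receding from the viewer converge toward a single vanishing point. The following is a minimal sketch of that idea; the function name and numbers are illustrative, not taken from any Renaissance treatise:

```python
# One-point linear perspective: a 3-D point (x, y, z) seen from the eye at the
# origin projects onto a picture plane at distance d as (d*x/z, d*y/z).

def project(x, y, z, d=1.0):
    """Project a 3-D point onto the picture plane a distance d from the eye."""
    return (d * x / z, d * y / z)

# Two parallel rails (x = -1 and x = +1) receding into the distance:
for z in (1, 10, 100, 1000):
    print(z, project(-1.0, 0.0, z), project(1.0, 0.0, z))
# As depth z grows, both rails approach (0, 0): the vanishing point on the horizon.
```

This division-by-depth is the same rule the Holy Trinity's barrel vault obeys: equal intervals in depth appear smaller and smaller on the wall, which is why the illusion reads as real architecture.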
Masaccio (1401-1428) was one of the first artists to apply the new method of linear perspective, in his fresco of the Holy Trinity. The barrel-vaulted ceiling imitates with precision the actual appearance of the architectural space as it would appear from the viewer's point of view. His figures are accurate in their description of human anatomy, influenced by the artist's study of sculpture.
In this painting, the vanishing point resides below the feet of Jesus. The illusion of the architecture is so real that one feels as if the wall has been opened up to reveal the scene. Jesus, the Father, and the Holy Ghost (symbolized by the dove) are joined by Mary and St. John the Evangelist. Flanking the sides are the donors (whose tomb was discovered beneath the mural). A painted skeleton lies on an illusionary sarcophagus below the inscription: "What you are, I once was; what I am, you will become."
The Holy Trinity , fresco
In The Tribute Money, Masaccio includes three different moments of the story in the same scene (a technique known as "continuous narrative"): at center, Peter asks Jesus why he should have to pay the tax collector, since his allegiance is only to God and not the Romans. Jesus's response is to "give to the Romans what is due to them and to the Lord what is due to Him." He instructs Peter to find the money by going fishing (at the left, Peter extracts a coin from the fish's mouth) and, at the right, Peter hands the tribute money to the tax collector in front of his house.
Piero della Francesca (1416-1492) was another early Renaissance artist obsessed with perspective. His work is characterized by carefully analyzed architectural spaces, a sensitivity to the geometric purity of shapes, and a sculptural understanding of the figure. So deep was his interest in perspective and geometry that he wrote several treatises on the subject.
Piero della Francesca, The Discovery and Proving of the True Cross, fresco, 1452-59
(Web Gallery of Art: http://www.kfki.hu/)
This is just one of several murals within a "cycle" depicting the legend of the "true cross." The cross is discovered along with the two crosses of the thieves who died beside Jesus; the true cross is identified by its power to bring a dead youth back to life.
Donatello (1386-1466) brought a new sense of naturalism to sculpture. His were among the earliest pieces to come off the walls of cathedrals and occupy three-dimensional space. His figures use the classical contrapposto stance (relaxed rather than rigid). His David is also believed to be the very first full-scale nude sculpture since ancient times. David is the biblical youth who conquers the giant Goliath; though difficult to see in this photograph, David stands with his left foot on top of Goliath's head. It is interesting to compare this sculpture with Michelangelo's later version.
David, cast bronze (158 cm), 1444-46
Andrea Mantegna (1430-1506) created unusual vantage points in his paintings, often looking at figures from below or, in Lamentation of the Dead Christ, from the feet of the subject, requiring deep foreshortening. The position is very effective in placing the viewer at the scene, adding to one's sense of empathy.
Lamentation of the Dead Christ, tempera on canvas, 1466
Sandro Botticelli (1445-1510) was the first artist to paint a full-length female nude, in his Birth of Venus. The figure recalls the exact pose of a Greek sculpture (the Venus de' Medici, to which he had access under the family's patronage), though he added flowing hair and elongated limbs. The figure occupies the center of the canvas, traditionally reserved for the Virgin alone. Referring to classical mythology, this is perhaps the most pagan image of the entire Renaissance. Primavera (Spring) is another painting of a classical subject commissioned by the Medici family.
Humanism was an intellectual movement embraced by scholars, writers, and civic leaders in 14th century Italy.
Assess how Humanism gave rise to the art of the Renaissance
- Humanists reacted against the utilitarian approach to education, seeking to create a citizenry who were able to speak and write with eloquence and thus able to engage the civic life of their communities.
- The movement was largely founded on the ideals of Italian scholar and poet Francesco Petrarca, which were often centered around humanity’s potential for achievement.
- While Humanism initially began as a predominantly literary movement, its influence quickly pervaded the general culture of the time, reintroducing classical Greek and Roman art forms and leading to the Renaissance .
- Donatello became renowned as the greatest sculptor of the Early Renaissance, known especially for his Humanist, and unusually erotic, statue of David.
- While medieval society viewed artists as servants and craftspeople, Renaissance artists were trained intellectuals, and their art reflected this newfound point of view.
- In humanist painting, the treatment of the elements of perspective and depiction of light became of particular concern.
- High Renaissance: The period in art history denoting the apogee of the visual arts in the Italian Renaissance. The High Renaissance period is traditionally thought to have begun in the 1490s—with Leonardo’s fresco of The Last Supper in Milan and the death of Lorenzo de’ Medici in Florence—and to have ended in 1527, with the Sack of Rome by the troops of Charles V.
Humanism, also known as Renaissance Humanism, was an intellectual movement embraced by scholars, writers, and civic leaders in 14th- and early-15th-century Italy. The movement developed in response to the medieval scholastic conventions in education at the time, which emphasized practical, pre-professional, and scientific studies engaged in solely for job preparation, and typically by men alone. Humanists reacted against this utilitarian approach, seeking to create a citizenry who were able to speak and write with eloquence and thus able to engage the civic life of their communities. This was to be accomplished through the study of the "studia humanitatis," known today as the humanities: grammar, rhetoric, history, poetry, and moral philosophy. Humanism introduced a program to revive the cultural—and particularly the literary—legacy and moral philosophy of classical antiquity. The movement was largely founded on the ideals of Italian scholar and poet Francesco Petrarca, which were often centered around humanity's potential for achievement.
While Humanism initially began as a predominantly literary movement, its influence quickly pervaded the general culture of the time, re-introducing classical Greek and Roman art forms and contributing to the development of the Renaissance. Humanists considered the ancient world to be the pinnacle of human achievement, and thought its accomplishments should serve as the model for contemporary Europe. There were important centers of Humanism in Florence, Naples, Rome, Venice, Genoa, Mantua, Ferrara, and Urbino.
Humanism was an optimistic philosophy that saw man as a rational and sentient being, with the ability to decide and think for himself. It saw man as inherently good by nature, which was in tension with the Christian view of man as the original sinner needing redemption. It provoked fresh insight into the nature of reality, questioning beyond God and spirituality, and provided knowledge about history beyond Christian history.
Renaissance Humanists saw no conflict between their study of the Ancients and Christianity. The lack of perceived conflict allowed Early Renaissance artists to combine classical forms, classical themes, and Christian theology freely. Early Renaissance sculpture is a great vehicle to explore the emerging Renaissance style. The leading artists of this medium were Donatello, Filippo Brunelleschi, and Lorenzo Ghiberti. Donatello became renowned as the greatest sculptor of the Early Renaissance, known especially for his classical, and unusually erotic, statue of David, which became one of the icons of the Florentine republic.
Donatello’s David: Donatello’s David is regarded as an iconic Humanist work of art.
Humanism affected the artistic community and how artists were perceived. While medieval society viewed artists as servants and craftspeople, Renaissance artists were trained intellectuals, and their art reflected this newfound point of view. Patronage of the arts became an important activity, and commissions included secular subject matter as well as religious. Important patrons, such as Cosimo de' Medici, emerged and contributed largely to the expanding artistic production of the time.
In painting, the treatment of the elements of perspective and light became of particular concern. Paolo Uccello, for example, who is best known for "The Battle of San Romano," was obsessed with perspective, and would stay up all night in his study trying to grasp the exact vanishing point. He used perspective in order to create a feeling of depth in his paintings. In addition, the use of oil paint had its beginnings in the 15th century, and its use continued to be explored extensively through the High Renaissance.
“The Battle of San Romano” by Paolo Uccello: Italian Humanist paintings were largely concerned with the depiction of perspective and light.
Some of the first Humanists were great collectors of antique manuscripts, including Petrarch, Giovanni Boccaccio, Coluccio Salutati, and Poggio Bracciolini. Of these, Petrarch was dubbed the "Father of Humanism" because of his devotion to Greek and Roman manuscripts. Many worked for the organized church and were in holy orders (like Petrarch), while others were lawyers and chancellors of Italian cities (such as Petrarch's disciple Salutati, the Chancellor of Florence) and thus had access to book-copying workshops.
In Italy, the Humanist educational program won rapid acceptance and, by the mid-15th century, many of the upper classes had received Humanist educations, possibly in addition to traditional scholastic ones. Some of the highest officials of the church were Humanists with the resources to amass important libraries. Such was Cardinal Basilios Bessarion, a convert to the Latin church from Greek Orthodoxy, who was considered for the papacy and was one of the most learned scholars of his time.
Following the fall of Constantinople to the Ottoman Turks and the end of the Byzantine Empire in 1453, the migration of Byzantine Greek scholars and émigrés, who had greater familiarity with ancient languages and works, furthered the revival of Greek and Roman literature and science.
Cesarean Section - A Brief History
Cesarean section has been part of human culture since ancient times and there are tales in both Western and non-Western cultures of this procedure resulting in live mothers and offspring. According to Greek mythology Apollo removed Asclepius, founder of the famous cult of religious medicine, from his mother's abdomen. Numerous references to cesarean section appear in ancient Hindu, Egyptian, Grecian, Roman, and other European folklore. Ancient Chinese etchings depict the procedure on apparently living women. The Mischnagoth and Talmud prohibited primogeniture when twins were born by cesarean section and waived the purification rituals for women delivered by surgery.
The extraction of Asclepius from the abdomen of his mother Coronis by his father Apollo. Woodcut from the 1549 edition of Alessandro Beneditti's De Re Medica.
Yet the early history of cesarean section remains shrouded in myth and is of dubious accuracy. Even the origin of "cesarean" has apparently been distorted over time. It is commonly believed to be derived from the surgical birth of Julius Caesar; however, this seems unlikely, since his mother Aurelia is reputed to have lived to hear of her son's invasion of Britain. At that time the procedure was performed only when the mother was dead or dying, as an attempt to save the child for a state wishing to increase its population. Roman law under Caesar decreed that all women who were so fated by childbirth must be cut open; hence, cesarean. Other possible Latin origins include the verb "caedare," meaning to cut, and the term "caesones," which was applied to infants born by postmortem operations. Ultimately, though, we cannot be sure where or when the term cesarean was derived. Until the sixteenth and seventeenth centuries the procedure was known as cesarean operation. This began to change following the publication in 1598 of Jacques Guillimeau's book on midwifery, in which he introduced the term "section." Increasingly thereafter "section" replaced "operation."
One of the earliest printed illustrations of Cesarean section. Purportedly the birth of Julius Caesar. A live infant being surgically removed from a dead woman. From Suetonius' Lives of the Twelve Caesars, 1506 woodcut.
During its evolution cesarean section has meant different things to different people at different times. The indications for it have changed dramatically from ancient to modern times. Despite rare references to the operation on living women, the initial purpose was essentially to retrieve the infant from a dead or dying mother; this was conducted either in the rather vain hope of saving the baby's life or, as commonly required by religious edicts, so the infant might be buried separately from the mother. Above all it was a measure of last resort, and the operation was not intended to preserve the mother's life. It was not until the nineteenth century that such a possibility really came within the grasp of the medical profession.
Cesarean section performed on a living woman by a female practitioner. Miniature from a fourteenth-century "Histoire Ancienne."
There were, though, sporadic early reports of heroic efforts to save women's lives. While the Middle Ages have been largely viewed as a period of stagnation in science and medicine, some of the stories of cesarean section actually helped to develop and sustain hopes that the operation could ultimately be accomplished. Perhaps the first written record we have of a mother and baby surviving a cesarean section comes from Switzerland in 1500, when a sow gelder, Jacob Nufer, performed the operation on his wife. After several days in labor and help from thirteen midwives, the woman was unable to deliver her baby. Her desperate husband eventually gained permission from the local authorities to attempt a cesarean. The mother lived and subsequently gave birth normally to five children, including twins. The cesarean baby lived to be 77 years old. Since this story was not recorded until 82 years later, historians question its accuracy. Similar skepticism might be applied to other early reports of abdominal delivery: those performed by women on themselves, and births resulting from attacks by horned livestock, during which the peritoneal cavity was ripped open.
The female pelvic anatomy. From Andreas Vesalius' De Humani Corporis Fabrica, 1543.
The history of cesarean section can be understood best in the broader context of the history of childbirth and general medicine, histories that have also been characterized by dramatic changes. Many of the earliest successful cesarean sections took place in remote rural areas lacking in medical staff and facilities. In the absence of strong medical communities, operations could be carried out without professional consultation. This meant that cesareans could be undertaken at an earlier stage in failing labor, when the mother was not near death and the fetus was less distressed. Under these circumstances the chances of one or both surviving were greater. These operations were performed on kitchen tables and beds, without access to hospital facilities, and this was probably an advantage until the late nineteenth century. Surgery in hospitals was bedeviled by infections passed between patients, often by the unclean hands of medical attendants. These factors may help to explain such successes as Jacob Nufer's.
By dint of his work in animal husbandry, Nufer also possessed a modicum of anatomical knowledge. One of the first steps in performing any operation is understanding the organs and tissues involved, knowledge that was scarcely obtainable until the modern era. During the sixteenth and seventeenth centuries, with the blossoming of the Renaissance, numerous works illustrated human anatomy in detail. Andreas Vesalius's monumental general anatomical text De Humani Corporis Fabrica, for example, published in 1543, depicts normal female genital and abdominal structures. In the eighteenth and early nineteenth centuries anatomists and surgeons substantially extended their knowledge of the normal and pathological anatomy of the human body. By the later 1800s, greater access to human cadavers and changing emphases in medical education permitted medical students to learn anatomy through personal dissection. This practical experience improved their understanding and better prepared them to undertake operations.
At the time, of course, this new type of medical education was still only available to men. With gathering momentum since the seventeenth century, female attendants had been demoted in the childbirth arena. In the early 1600s, the Chamberlen clan in England introduced obstetrical forceps to pull from the birth canal fetuses that otherwise might have been destroyed. Men's claims to authority over such instruments assisted them in establishing professional control over childbirth. Over the next three centuries or more, the male-midwife and obstetrician gradually wrested that control from the female midwife, thus diminishing her role.
Last reviewed: 08 April 2011
Last updated: 26 July 2013
First published: 27 April 1998
Petrarch was undoubtedly one of the most significant influences on the Renaissance, not only in Italy but throughout Europe. His poetry inspired poets of his period and later to examine their interior lives and emotions, to celebrate the natural world, and to see love as something spiritual. His literary forms, such as the sonnet and the autobiography, persuaded many writers to adopt a more personal style. Petrarch was also, if not the 'Father of Humanism,' certainly one of its leading lights.
His works and scholarship, for example, did much to encourage an appreciation of Graeco-Roman civilization. This was radical, as it helped to counter the stifling influence of the Church and Papacy. His writings and philosophy promoted a more secular and rational worldview, encouraging a belief that this world was important in itself and not merely a prelude to salvation. This fostered a rediscovery of the ancient world and a growing investigation of the world and society, leading to a more modern outlook no longer wholly shaped by Christianity.
Petrarch, F. My Secret Book (Secretum), translated by Nicholas Mann. Harvard University Press.
Petrarch, F. Canzoniere, translated by Anthony Mortimer (London: Penguin, 2002).
Minta, Stephen. Petrarch and Petrarchism: The English and French Traditions (Manchester: Manchester University Press, 1980).
Giustiniani, Vito. "Homo, Humanus, and the Meanings of Humanism." Journal of the History of Ideas 46 (1985), pp. 167–95.