THE INTELLECT THEORY - The Emerald Kingdom - joie de vivre


INTELLECT THEORY · INTELLECTUAL HUMAN · CULTURAL CAPITAL · SATELLITE

LETTERS · NUMBERS · WORDS · SHAPES · LINES · NAMES · TITLES · WRITING
A letter is a segmental symbol of a phonemic writing system. The inventory of all letters forms the alphabet.

Numbers can be represented in language with number words.

The concept of "word" is usually distinguished from that of a morpheme, which is the smallest unit of speech which has a meaning, even if it will not stand on its own.

A shape is the form of an object or its external boundary, outline, or external surface, as opposed to other properties such as color, texture or material type.

In graphics, a line can be described as a single point that continues for a distance, or as the connection between two points.
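The second description of a line, as the connection between two points, can be sketched in code. The helper below is invented for this illustration (it is not from the original text): it samples evenly spaced points along the segment between two endpoints by linear interpolation.

```python
def line_points(p0, p1, steps):
    """Approximate the segment from p0 to p1 as a sequence of points.

    Each point lies a fraction i/steps of the way from p0 to p1,
    illustrating a line as 'the connection between two points'."""
    (x0, y0), (x1, y1) = p0, p1
    return [
        (x0 + (x1 - x0) * i / steps, y0 + (y1 - y0) * i / steps)
        for i in range(steps + 1)
    ]

pts = line_points((0, 0), (4, 2), 4)
# pts == [(0.0, 0.0), (1.0, 0.5), (2.0, 1.0), (3.0, 1.5), (4.0, 2.0)]
```

Raising `steps` makes the sampled line arbitrarily dense, echoing the later remark that lines can continue indefinitely.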

A name is a term used for identification. Names can identify a class or category of things, or a single thing, either uniquely or within a given context.

A title is one or more words used before or after a person's name, in certain contexts. It may signify either generation, an official position, or a professional or academic qualification.

Writing is a medium of human communication that involves the representation of a language with symbols.

A letter can have multiple variants, or allographs, related to variation in style of handwriting or printing.

The first known system with place value was the Mesopotamian base-60 system (c. 3400 BC), and the earliest known base-10 system dates to 3100 BC in Egypt.
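Place value is what the Mesopotamian and Egyptian systems share: each position in a numeral multiplies its digit by a power of the base. A small sketch (the function name is invented for this example) decomposes an integer into its place-value digits for any base:

```python
def to_base(n, base):
    """Decompose a non-negative integer into place-value digits,
    most significant digit first (e.g. base 60, as in Mesopotamia)."""
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        digits.append(n % base)   # digit in the current place
        n //= base                # shift to the next higher place
    return digits[::-1]

to_base(3661, 60)   # [1, 1, 1] — one sixty-squared, one sixty, one unit
to_base(3661, 10)   # [3, 6, 6, 1]
```

The same quantity yields different digit strings in base 60 and base 10, which is exactly the point of a place-value system: the symbols are reused and position carries the magnitude.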

Philosophers have found words objects of fascination since at least the 5th century BC, with the foundation of the philosophy of language.

Combining these permutations gives S(v, w, u) = (1 − p)^(−1).

Anything longer than an individual point may be considered a line, and lines can continue indefinitely.

The entity identified by a name is called its referent. A personal name identifies, not necessarily uniquely, a specific individual human.

An advocate is a professional in the field of law. Different countries' legal systems use the term with somewhat differing meanings.

A syllabary is a set of written symbols that represent (or approximate) syllables.

Related topics: Phonology · Infinity · Utterance · Area · Straight lines · Names of names · Principal · Mesoamerica · Etymology · Subsets · Morphology · Geons · Curvilinear · Religious names · Advocate · Social media
WELCOME TO THE INTELLECT THEORY ~ English languages, I probed all the forms from the 11th Century (Old English). Although All Americans doesn't hold it the same. I have tumbled to trial the rare spellings and practice of letters. I don't particularly think that England consort. Yet this is where I understand it has arisen. Not yet here I counter raffish. And I ruled to adopt it as one of the core for the ciphers or method while transforming writing a new word problem. That went about when I recited how Arabic numbers were a conversion from; to English numbers. Fortunately for me, the first book I opened ever in the Library of Congress in my hands was Ancient Arabic Numbers. It feels amazing to have some overpasses to my expert of tool use. At the beginning styling the Intellect Theory; Hell on Earth. Individuals would drive furthermore while I inhaled cacophonies from traffic losing my sense of breathing. In a good way, I sway they had as if expressed. Helps with not affirming self-deductive fallacy. There, I'm counting my grid... typing what squares do in an algorithm.

I prevailed striving for innovation for years until I shifted fastest climber to pass the gate of understanding. Recrudescence in every thinking hang fire eye when we use the word bottom. Truly. I started with a snowmobile.
A systematic story, you know the compos mentis explanation to; response for opposite of. I marketed a Volkswagen. Starting from the beginning I got into a mess of trouble (bottom). The verdict for these evils: restitution, and probation. The first day of right I gleaned a job working at Dulles International Airport (IAD). This kitchen breathed a Hawaiian-owned facility so you could imagine the amount of good service. Except it wasn't all fun for me. I can vouch for driving past Innovation on the right headed towards Rt. 7 on Rt. 28. I trained for a surveying company for a year and they advanced me to the nature of American Greed.

I did get the etiquette of substantial nine to five responsibilities. Not quite the failure I wanted; recreation from being given free guidance. I wouldn't be a good citizen or become a husband first if I didn't wake up with what determined what happens next if it didn't signify. Or maybe it was befriend strategizes the next move. I would open my eyes agape when I pulled into the parking space at the side of the building. Just to walk past the all window side in brick surrounding. It was unique that everything goes up with a gun (theodolite). And reaches engineers to replete immanence. Not that America's big on exactness beyond lines God Himself. I mean, I'll attest it was moderately emboldened; how? But the man wanted man, machine and to be exact more which meant. Coffee, rest, and books. I met them all once on the corporate floor, and felt like the balcony was a warning if I didn't straighten out. Who cares, I couldn't stop falling. There was nothing that could change me other than a woman. At the same time, it was weakness. Being; or wanting to be a Christian was just too hard besides those books. I had one out of three because I didn't give much to coffee and rest, I did drugs the most.

Enough of that, all's well and ends. My first walk on dangerous grounds, losing my car. I quit. I found work at a temping service I think. Not important. All I know is that before that I owned a Ford sports car and that was repo'd because of being completely immoral. Trying to fit in with a shiny car is bogus when saving is rational. I got fired from the Wholesale club in Fairlakes. I complained to the new management that my back was hurting. Except the truth was I was puffing on a blunt roll of some really good stuff. She told me to go home. I was done. She was a looker. It didn't help any trying to tell myself to stay at the register or get a quick look at the good looking curly hair woman. Luckily for me the stuff must've worn off or at least that's what it felt like. I found a new job at Embassy on Rt. 28 in Manassas the same day driving home. Sign outside. Now Hiring. Pulled in and got hired the same day. The only damning regret of it: everything happened at the brink of winter and fall I ponder. IT WAS FREEZING EVERYDAY‼️ I sold my soul to the devil. That would be good when I get back to working in the office. I hope he didn't hear me because this story is going to tell you how I'm left with reaching to an office to make enough to upgrade to Promised land.

Welcome to the intellectual human. This is the good of expounding. I denominated the other MISUSE. But they're good and bad on this site. Intellectual Human: Good. EIII2_MISUSE: Bad (a cybersecurity algorithm). This is the harangue of what happened. Later I will interpret what is guilty and what is not innocent. Sounds heard of, and explained; it has no importance least. Either way, the winter has left and a quantitative of other events that are irrelevant. In other words, I hit rock bottom. It spun a web to me jumping on a rogue plane to South Florida, landing me in jail with.... I made it back home with the help of roots. I started working for day labor shops about eight years total. Off and on, while cleaning for a few local small business owners. Still do, but that's not of no concern for the word importance: CLASS ACTION. A dialect about that on the other page. Like I said: we're disclosing how? Afterwards, we; one of those business persons handed me to another and since the occupation was weather-affected. In the end I didn't work for two weeks. Me and my home got into a bit of a dissension and I was thrown out. At first it was for nothingness impetus to last being home alone and vomiting on the hearth. Only this time it was outrageous. There was no reason, no experiment to understand. Just superintend how life become shift. I retaliated with windows and stones {the same as now just its a work of art progression/ that and it's a computer and not a home and car} and Revelation 15.

After court and a duplicitous... don't need to get into that. I almost died.

Speaking of which, I essentially died this one occasion during all of these years of phenomena happened. A Dodge Ram came screeching into the back of the mid SUV and I was lying face forward in the crash. I opened my eyes the second my mortal essence agreed with my immaturity to play with death to slip.
Back to the case, earning money was major. Larger than it ever was. I got in this place and worked hours. For years {details || in Class Action details} until differences set. More differences than if any, somewhat trying to control these flaws. Management scouted to purge me out the door. Except a little before that, as mentioned. I hit Easterns Automotive. The dealer I illustrated everything about how I want to build a website; my work hours were 10 hour days; the airport and government just raised minimum wage. I had overtime and my choice if I wanted those three days off from the 4 10 hour days. I'm a business, I want to be able to work from my car. I was discussing the E Class Mercedes. However, I was 10 grand short with no down payment. He handed me a VW from the wreckage lot in the tail. Said they hadn't taken pictures of it yet. My guess is it would never be allowed to be on the platform. I bought any, with a few calls home to get auto insurance because my license was invalid at the time of sale so the insurance company that was risking to add on the invoice wouldn't provide me with coverage. After asking someone to help get it back to my residence. I managed to get to DMV too late with expired tags to register my vehicle and personalize my license plates. Next thing I know I'm the guy who bought a VW next to GMR exposing secret material and recipes.
While the car was still in admirable expectations both of what they sold me and affording to pay on time. I chose to leave this place doing the same siblings combine.

As again, the same confining's before having and before having again.
In the next words or more and/ symbol of and/ meaning; only just its one: Injustice. Though only I found another staffing company, I was at the bottom with what was on the table because of on the wrong tack. It came from Sunday worship and accepting the unfortunate and disfigured. From the beginning if I had followed my heritage and gave everyone the cold shoulder. Might just had I... I wouldn't be in this denouement: the theory we're watching and will try to with any genius.

My car was up for its Inspection. It had a few trips to the garage but again I won't explain that yet. Dejectedly, the VW dealership attained all of the information they needed. Then handed me to some presupposed Tech. Where later when you hear the results of the lesser end and VW Group of America. It'll mansion chaos before your eyes as such. Develop the two lost pies or pie.... the actual format sentencing you should be reading now. Ergo, we've been administratively bound ignorantly in a play of Battleship and I don't use that word out of lexicon, fore; the $9,000 Sentara bill added (as pain and suffering).
My car was fetched by me, with questions of leaving it to remain at the facility. There are ups and downs as to how much help could be there, except customer satisfaction was poorly expressed and trust was revolute. After sanctioning my car to sit in my lot for weeks, I lost my job and my car was repossessed. Everything was gone because a car with nothing wrong with it didn't qualify to pass Inspection.

I don't understand it either. So we're going to do something about it with a website and the worldwide web. And ask every business why am I declaring about your company on my company's website out|side: of mien.
WIKIPEDIA [GRID] EARLY MAN

Early modern human (EMH) or anatomically modern human (AMH) are terms used to distinguish Homo sapiens (the only extant human species) that are anatomically consistent with the range of phenotypes seen in contemporary humans from extinct archaic human species. This distinction is useful especially for times and regions where anatomically modern and archaic humans co-existed, for example, in Paleolithic Europe. As of 2017, the oldest known skeleton of an anatomically modern human is the Omo-Kibish I, which dates to about 196,000 years ago.

Extinct species of the genus Homo include Homo erectus (extant from roughly 2 to 0.1 million years ago) and a number of other species (by some authors considered subspecies of either H. sapiens or H. erectus). The divergence of the lineage leading to H. sapiens out of ancestral H. erectus (or an intermediate species such as Homo antecessor) is estimated to have occurred in Africa roughly 500,000 years ago. The earliest fossil evidence of early modern humans appears in Africa around 300,000 years ago, with the earliest genetic splits among modern people, according to some evidence, dating to around the same time. Sustained archaic human admixture with modern humans is known to have taken place both in Africa and (following the recent Out-Of-Africa expansion) in Eurasia, between about 100,000 and 30,000 years ago.


~Anatomically
The human body is the structure of a human being. It is composed of many different types of cells that together create tissues and subsequently organ systems. They ensure homeostasis and the viability of the human body.

It comprises a head, neck, trunk (which includes the thorax and abdomen), arms and hands, legs and feet.

The study of the human body involves anatomy, physiology, histology and embryology. The body varies anatomically in known ways. Physiology focuses on the systems and organs of the human body and their functions. Many systems and mechanisms interact in order to maintain homeostasis, with safe levels of substances such as sugar and oxygen in the blood.

The body is studied by health professionals, physiologists, anatomists, and by artists to assist them in their work.



Homo erectus (meaning 'upright man') is an extinct species of archaic human from the Pleistocene, with its earliest occurrence about 2 million years ago, and its specimens are among the first recognisable members of the genus Homo. H. erectus was the first human ancestor to spread throughout the Old World, having a distribution in Eurasia extending from the Iberian Peninsula to Java. African populations of H. erectus are likely to be the direct ancestors to several human species, such as H. heidelbergensis and H. antecessor, with the former generally considered to have been the direct ancestor to Neanderthals and Denisovans, and sometimes also modern humans. Asian populations of H. erectus may be ancestral to H. floresiensis and possibly to H. luzonensis. As a chronospecies, the time of the disappearance of H. erectus is a matter of contention. There are also several proposed subspecies with varying levels of recognition. The last known record of morphologically recognisable H. erectus are the Solo Man specimens from Java, around 117,000–108,000 years ago.

H. erectus had a humanlike gait and body proportions, and was the first human species to have exhibited a flat face, prominent nose, and possibly sparse body hair coverage. Brain capacity varied widely depending on the population, ranging from 546–1,251 cc (33.3–76.3 cu in), and maximum brain size was likely achieved early on in life, suggesting a shorter childhood and lesser parental care than in modern humans. Size also ranged widely from 146–185 cm (4 ft 9 in–6 ft 1 in) in height and 40–68 kg (88–150 lb) in weight. H. erectus men and women may have been roughly the same size as each other (exhibiting reduced sexual dimorphism) like modern humans, which could indicate monogamy in line with general trends exhibited in primates. Skin colour potentially varied with location.

H. erectus is associated with the Acheulean stone tool industry, and is thought to have been the earliest human ancestor capable of using fire, hunting and gathering in coordinated groups, caring for injured or sick group members, seafaring, and possibly art-making. Sites generally show consumption of medium to large animals, such as bovines or elephants, and a high reliance on meat is associated with increasing brain size. Though groups were more social than ancestor species, it is unclear if H. erectus was anatomically capable of speech, though it is postulated they communicated using some proto-language.



A society is a group of individuals involved in persistent social interaction, or a large social group sharing the same spatial or social territory, typically subject to the same political authority and dominant cultural expectations. Societies are characterized by patterns of relationships (social relations) between individuals who share a distinctive culture and institutions; a given society may be described as the sum total of such relationships among its constituent members. In the social sciences, a larger society often exhibits stratification or dominance patterns in subgroups.

Societies construct patterns of behavior by deeming certain actions or speech as acceptable or unacceptable. These patterns of behavior within a given society are known as societal norms. Societies, and their norms, undergo gradual and perpetual changes.

Insofar as it is collaborative, a society can enable its members to benefit in ways that would otherwise be difficult on an individual basis; both individual and social (common) benefits can thus be distinguished, or in many cases found to overlap. A society can also consist of like-minded people governed by their own norms and values within a dominant, larger society. This is sometimes referred to as a subculture, a term used extensively within criminology.

More broadly, and especially within structuralist thought, a society may be illustrated as an economic, social, industrial or cultural infrastructure, made up of, yet distinct from, a varied collection of individuals. In this regard society can mean the objective relationships people have with the material world and with other people, rather than "other people" beyond the individual and their familiar social environment.



Medieval philosophy is the philosophy that existed through the Middle Ages, the period roughly extending from the fall of the Western Roman Empire in the 5th century to the Renaissance in the 15th century. Medieval philosophy, understood as a project of independent philosophical inquiry, began in Baghdad, in the middle of the 8th century, and in France, in the itinerant court of Charlemagne, in the last quarter of the 8th century. It is defined partly by the process of rediscovering the ancient culture developed in Greece and Rome during the Classical period, and partly by the need to address theological problems and to integrate sacred doctrine with secular learning.

The history of medieval philosophy is traditionally divided into two main periods: the period in the Latin West following the Early Middle Ages until the 12th century, when the works of Aristotle and Plato were rediscovered, translated, and studied, and the "golden age" of the 12th, 13th and 14th centuries in the Latin West, which witnessed the culmination of the recovery of ancient philosophy, along with the reception of its Arabic commentators, and significant developments in the fields of philosophy of religion, logic, and metaphysics.

The Medieval Era was disparagingly treated by the Renaissance humanists, who saw it as a barbaric "middle period" between the Classical age of Greek and Roman culture, and the rebirth or renaissance of Classical culture. Modern historians consider the medieval era to be one of philosophical development, heavily influenced by Christian theology. One of the most notable thinkers of the era, Thomas Aquinas, never considered himself a philosopher, and criticized philosophers for always "falling short of the true and proper wisdom".

The problems discussed throughout this period are the relation of faith to reason, the existence and simplicity of God, the purpose of theology and metaphysics, and the problems of knowledge, of universals, and of individuation.



Greek East and Latin West are terms used to distinguish between the two parts of the Greco-Roman world, specifically the eastern regions where Greek was the lingua franca (Anatolia, Greece, the Balkans, the Levant and Egypt) and the western parts where Latin filled this role (The Maghreb, Central Europe, Gaul, Iberia, Italy and the British Isles). During the Roman Empire a divide had persisted between Latin- and Greek-speaking areas; this divide was encouraged by administrative changes in the empire's structure between the 3rd and 5th centuries, which led ultimately to the establishment of separate administrations for the Eastern and Western halves of the Empire.
[Map: The Roman Empire in 330 AD.]

After the fall of the Western Part, pars occidentalis, of the Empire, the terms Greek East and Latin West are applied to areas that were formerly part of the Eastern or Western Parts of the Empire, and also to areas that fell under the Greek or Latin cultural sphere but that had never been part of the Roman Empire. This has given rise to two modern dichotomies. The first is the split of Chalcedonian Christianity that developed in Europe between Western Christianity (the forerunner of Roman Catholicism which Protestantism split from in 1517) and Eastern Orthodoxy. Second, Europeans have traditionally viewed the Greco-Roman Mediterranean (extending from Spain to Syria) as having an East/West cultural split. Cultures associated with the historical Romance, Germanic, Scandinavian, Hungarians, Finns, Balts, Celts, Catholic Slavs and the historical Western Churches (Central and Western Europe) have traditionally been considered Western; these cultures adopted Latin as their lingua franca in the Middle Ages. Cultures associated with the Eastern Roman Empire and Russian Empire (Greeks, Orthodox Slavs, Romanians, Georgians and to a lesser extent Thracian and Anatolian Turks, Albanians and Bosniaks) have traditionally been considered Eastern; these cultures all used Greek or Old Church Slavonic as a lingua franca during the early Middle Ages.



Historians typically regard the Early Middle Ages or Early Medieval Period, sometimes referred to as the Dark Ages, as lasting from the 5th or 6th century to the 10th century. They marked the start of the Middle Ages of European history. The alternative term "Late Antiquity" emphasizes elements of continuity with the Roman Empire, while "Early Middle Ages" is used to emphasize developments characteristic of the earlier medieval period. As such the concept overlaps with Late Antiquity, following the decline of the Western Roman Empire, and precedes the High Middle Ages (c. 11th to 13th centuries).

The period saw a continuation of trends evident since late classical antiquity, including population decline, especially in urban centres, a decline of trade, a small rise in global warming and increased migration. In the 19th century the Early Middle Ages were often labelled the "Dark Ages", a characterization based on the relative scarcity of literary and cultural output from this time. However, the Eastern Roman Empire, or Byzantine Empire, continued to survive, though in the 7th century the Rashidun Caliphate and the Umayyad Caliphate conquered swathes of formerly Roman territory.

Many of the listed trends reversed later in the period. In 800 the title of "Emperor" was revived in Western Europe with Charlemagne, whose Carolingian Empire greatly affected later European social structure and history. Europe experienced a return to systematic agriculture in the form of the feudal system, which adopted such innovations as three-field planting and the heavy plough. Barbarian migration stabilized in much of Europe, although the Viking expansion greatly affected Northern Europe.



The "Dark Ages" is a historical periodization traditionally referring to the Middle Ages (c. 5th–15th century) that asserts that a demographic, cultural, and economic deterioration occurred in Western Europe following the decline of the Roman Empire.

The term employs traditional light-versus-darkness imagery to contrast the era's "darkness" (lack of records) with earlier and later periods of "light" (abundance of records). The concept of a "Dark Age" originated in the 1330s with the Italian scholar Petrarch, who regarded the post-Roman centuries as "dark" compared to the "light" of classical antiquity. The phrase "Dark Age" itself derives from the Latin saeculum obscurum, originally applied by Caesar Baronius in 1602 to a tumultuous period in the 10th and 11th centuries. The concept thus came to characterize the entire Middle Ages as a time of intellectual darkness in Europe between the fall of Rome and the Renaissance; this became especially popular during the 18th-century Age of Enlightenment.

As the accomplishments of the era came to be better understood in the 19th and 20th centuries, scholars began restricting the "Dark Ages" appellation to the Early Middle Ages (c. 5th–10th century), and now scholars also reject its usage in this period. The majority of modern scholars avoid the term altogether due to its negative connotations, finding it misleading and inaccurate. Petrarch's pejorative meaning remains in use, typically in popular culture which often mischaracterises the Middle Ages as a time of violence and backwardness.



Neuroscience (or neurobiology) is the scientific study of the nervous system. It is a multidisciplinary science that combines physiology, anatomy, molecular biology, developmental biology, cytology, mathematical modeling, and psychology to understand the fundamental and emergent properties of neurons and neural circuits. The understanding of the biological basis of learning, memory, behavior, perception, and consciousness has been described by Eric Kandel as the "ultimate challenge" of the biological sciences.

The scope of neuroscience has broadened over time to include different approaches used to study the nervous system at different scales and the techniques used by neuroscientists have expanded enormously, from molecular and cellular studies of individual neurons to imaging of sensory, motor and cognitive tasks in the brain.



Science (from the Latin word scientia, meaning "knowledge") is a systematic enterprise that builds and organizes knowledge in the form of testable explanations and predictions about the universe.

The earliest roots of science can be traced to Ancient Egypt and Mesopotamia in around 3500 to 3000 BCE. Their contributions to mathematics, astronomy, and medicine entered and shaped Greek natural philosophy of classical antiquity, whereby formal attempts were made to provide explanations of events in the physical world based on natural causes. After the fall of the Western Roman Empire, knowledge of Greek conceptions of the world deteriorated in Western Europe during the early centuries (400 to 1000 CE) of the Middle Ages but was preserved in the Muslim world during the Islamic Golden Age. The recovery and assimilation of Greek works and Islamic inquiries into Western Europe from the 10th to 13th century revived "natural philosophy", which was later transformed by the Scientific Revolution that began in the 16th century as new ideas and discoveries departed from previous Greek conceptions and traditions. The scientific method soon played a greater role in knowledge creation and it was not until the 19th century that many of the institutional and professional features of science began to take shape; along with the changing of "natural philosophy" to "natural science."

Modern science is typically divided into three major branches that consist of the natural sciences (e.g., biology, chemistry, and physics), which study nature in the broadest sense; the social sciences (e.g., economics, psychology, and sociology), which study individuals and societies; and the formal sciences (e.g., logic, mathematics, and theoretical computer science), which study abstract concepts. There is disagreement, however, on whether the formal sciences actually constitute a science as they do not rely on empirical evidence. Disciplines that use existing scientific knowledge for practical purposes, such as engineering and medicine, are described as applied sciences.

Science is based on research, which is commonly conducted in academic and research institutions as well as in government agencies and companies. The practical impact of scientific research has led to the emergence of science policies that seek to influence the scientific enterprise by prioritizing the development of commercial products, armaments, health care, and environmental protection.



A neural circuit is a population of neurons interconnected by synapses to carry out a specific function when activated. Neural circuits interconnect to one another to form large scale brain networks. Biological neural networks have inspired the design of artificial neural networks, but artificial neural networks are usually not strict copies of their biological counterparts.
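The contrast drawn here can be made concrete with a minimal sketch of an artificial "neuron": a weighted sum of inputs passed through a squashing function. This is an illustrative abstraction, not a model of any biological circuit, and all names below are invented for the example.

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: weighted sum of inputs plus a bias,
    passed through a sigmoid activation. A loose abstraction of a
    biological neuron integrating synaptic inputs, not a copy of one."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))   # squashes output into (0, 1)

# Two neurons feeding a third form a tiny feed-forward "circuit":
h1 = neuron([1.0, 0.0], [0.5, -0.3], 0.1)
h2 = neuron([1.0, 0.0], [-0.2, 0.8], 0.0)
out = neuron([h1, h2], [1.2, -0.7], 0.05)
```

The interconnection of such units into layers and networks is what the text means by artificial neural networks being "inspired by" but not "strict copies of" biological ones: real neurons signal with spike trains over synapses, not with smooth real-valued functions.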



A brain is an organ that serves as the center of the nervous system in all vertebrate and most invertebrate animals. It is located in the head, usually close to the sensory organs for senses such as vision. It is the most complex organ in a vertebrate's body. In a human, the cerebral cortex contains approximately 14–16 billion neurons, and the estimated number of neurons in the cerebellum is 55–70 billion. Each neuron is connected by synapses to several thousand other neurons. These neurons typically communicate with one another by means of long fibers called axons, which carry trains of signal pulses called action potentials to distant parts of the brain or body targeting specific recipient cells.

Physiologically, brains exert centralized control over a body's other organs. They act on the rest of the body both by generating patterns of muscle activity and by driving the secretion of chemicals called hormones. This centralized control allows rapid and coordinated responses to changes in the environment. Some basic types of responsiveness such as reflexes can be mediated by the spinal cord or peripheral ganglia, but sophisticated purposeful control of behavior based on complex sensory input requires the information integrating capabilities of a centralized brain.

The operations of individual brain cells are now understood in considerable detail but the way they cooperate in ensembles of millions is yet to be solved. Recent models in modern neuroscience treat the brain as a biological computer, very different in mechanism from an electronic computer, but similar in the sense that it acquires information from the surrounding world, stores it, and processes it in a variety of ways.

This article compares the properties of brains across the entire range of animal species, with the greatest attention to vertebrates. It deals with the human brain insofar as it shares the properties of other brains. The ways in which the human brain differs from other brains are covered in the human brain article. Several topics that might be covered here are instead covered there because much more can be said about them in a human context. The most important is brain disease and the effects of brain damage, that are covered in the human brain article.



The active intellect (Latin: intellectus agens; also translated as agent intellect, active intelligence, active reason, or productive intellect) is a concept in classical and medieval philosophy. The term refers to the formal (morphe) aspect of the intellect (nous), in accordance with the theory of hylomorphism.

The nature of the active intellect was the subject of intense discussion in medieval philosophy, as various Muslim, Jewish and Christian thinkers sought to reconcile their commitment to Aristotle's account of the body and soul with their own theological commitments. At stake in particular was in what way Aristotle's account of an incorporeal soul might contribute to an understanding of the nature of eternal life.



The theory of multiple intelligences proposes the differentiation of human intelligence into specific “modalities of intelligence”, rather than defining intelligence as a single, general ability. The theory has been criticized by mainstream psychology for its lack of empirical evidence, and its dependence on subjective judgement.



Human intelligence is the intellectual capability of humans, which is marked by complex cognitive feats and high levels of motivation and self-awareness.

Through intelligence, humans possess the cognitive abilities to learn, form concepts, understand, apply logic, and reason, including the capacities to recognize patterns, plan, innovate, solve problems, make decisions, retain information, and use language to communicate.



Reason is the capacity of consciously making sense of things, applying logic, and adapting or justifying practices, institutions, and beliefs based on new or existing information. It is closely associated with such characteristically human activities as philosophy, science, language, mathematics, and art, and is normally considered to be a distinguishing ability possessed by humans. Reason is sometimes referred to as rationality.

Reasoning is associated with the acts of thinking and cognition, and involves using one's intellect. The field of logic studies the ways in which humans can use formal reasoning to produce logically valid arguments. Reasoning may be subdivided into forms of logical reasoning, such as: deductive reasoning, inductive reasoning, and abductive reasoning. Aristotle drew a distinction between logical discursive reasoning (reason proper), and intuitive reasoning, in which the reasoning process through intuition—however valid—may tend toward the personal and the subjectively opaque. In some social and political settings logical and intuitive modes of reasoning may clash, while in other contexts intuition and formal reason are seen as complementary rather than adversarial. For example, in mathematics, intuition is often necessary for the creative processes involved with arriving at a formal proof, arguably the most difficult of formal reasoning tasks.

Reasoning, like habit or intuition, is one of the ways by which thinking moves from one idea to a related idea. For example, reasoning is the means by which rational individuals understand sensory information from their environments, or conceptualize abstract dichotomies such as cause and effect, truth and falsehood, or ideas regarding notions of good or evil. Reasoning, as a part of executive decision making, is also closely identified with the ability to self-consciously change, in terms of goals, beliefs, attitudes, traditions, and institutions, and therefore with the capacity for freedom and self-determination.

In contrast to the use of "reason" as an abstract noun, a reason is a consideration given which either explains or justifies events, phenomena, or behavior. Reasons justify decisions; reasons support explanations of natural phenomena; reasons can be given to explain the actions (conduct) of individuals.

Using reason, or reasoning, can also be described more plainly as providing good, or the best, reasons. For example, when evaluating a moral decision, "morality is, at the very least, the effort to guide one's conduct by reason—that is, doing what there are the best reasons for doing—while giving equal [and impartial] weight to the interests of all those affected by what one does."

Psychologists and cognitive scientists have attempted to study and explain how people reason, e.g. which cognitive and neural processes are engaged, and how cultural factors affect the inferences that people draw. The field of automated reasoning studies how reasoning may or may not be modeled computationally. Animal psychology considers the question of whether animals other than humans can reason.



Rationality is the quality or state of being rational – that is, being based on or agreeable to reason. Rationality implies the conformity of one's beliefs with one's reasons to believe, and of one's actions with one's reasons for action. "Rationality" has different specialized meanings in philosophy, economics, sociology, psychology, evolutionary biology, game theory and political science.

To determine what behavior is the most rational, one needs to make several key assumptions, and also needs a logical formulation of the problem. When the goal or problem involves making a decision, rationality factors in all information that is available (e.g. complete or incomplete knowledge). Collectively, the formulation and background assumptions are the models within which rationality applies. Rationality is relative: if one accepts a model in which benefiting oneself is optimal, then rationality is equated with behavior that is self-interested to the point of being selfish; whereas if one accepts a model in which benefiting the group is optimal, then purely selfish behavior is deemed irrational. It is thus meaningless to assert rationality without also specifying the background model assumptions describing how the problem is framed and formulated.



In psychology, decision-making (also spelled decision making and decisionmaking) is regarded as the cognitive process resulting in the selection of a belief or a course of action among several possible alternative options; it may be either rational or irrational. The decision-making process is a reasoning process based on assumptions of the values, preferences, and beliefs of the decision-maker. Every decision-making process produces a final choice, which may or may not prompt action.

Research about decision-making is also published under the label problem solving, particularly in European psychological research.



Empirical evidence is information that verifies the truth (which accurately corresponds to reality) or falsity (inaccuracy) of a claim. In the empiricist view, one can claim to have knowledge only when based on empirical evidence (although some empiricists believe that there are other ways of gaining knowledge). This stands in contrast to the rationalist view under which reason or reflection alone is considered evidence for the truth or falsity of some propositions. Empirical evidence is information acquired by observation or experimentation, in the form of recorded data, which may be the subject of analysis (e.g. by scientists). This is the primary source of empirical evidence. Secondary sources describe, discuss, interpret, comment upon, analyze, evaluate, summarize, and process primary sources. Secondary source materials can be articles in newspapers or popular magazines, book or movie reviews, or articles found in scholarly journals that discuss or evaluate someone else's original research.

In a second sense "empirical" in science may be synonymous with "experimental." In this sense, an empirical result is an experimental observation. In this context, the term semi-empirical is used for qualifying theoretical methods that use, in part, basic axioms or postulated scientific laws and experimental results. Such methods are opposed to theoretical ab initio methods, which are purely deductive and based on first principles. Typical examples of both ab initio and semi-empirical methods can be found in computational chemistry.

In science, empirical evidence is required for a hypothesis to gain acceptance in the scientific community. Normally, this validation is achieved by the scientific method of forming a hypothesis, experimental design, peer review, reproduction of results, conference presentation, and journal publication. This requires rigorous communication of hypothesis (usually expressed in mathematics), experimental constraints and controls (expressed necessarily in terms of standard experimental apparatus), and a common understanding of measurement.

Statements and arguments depending on empirical evidence are often referred to as a posteriori ("following experience") as distinguished from a priori (preceding it). A priori knowledge or justification is independent of experience (for example "All bachelors are unmarried"), whereas a posteriori knowledge or justification is dependent on experience or empirical evidence (for example "Some bachelors are very happy"). The notion that the distinction between a posteriori and a priori is tantamount to the distinction between empirical and non-empirical knowledge comes from Kant's Critique of Pure Reason.

The standard positivist view of empirically acquired information has been that observation, experience, and experiment serve as neutral arbiters between competing theories. However, since the 1960s, a persistent critique, most associated with Thomas Kuhn, has argued that these methods are influenced by prior beliefs and experiences. Consequently, it cannot be expected that two scientists observing, experiencing, or experimenting on the same event will make the same theory-neutral observations. The role of observation as a theory-neutral arbiter may not be possible. Theory-dependence of observation means that, even if there were agreed methods of inference and interpretation, scientists may still disagree on the nature of empirical data.



Truth is the property of being in accord with fact or reality. In everyday language, truth is typically ascribed to things that aim to represent reality or otherwise correspond to it, such as beliefs, propositions, and declarative sentences.

Truth is usually held to be the opposite of falsity. The concept of truth is discussed and debated in various contexts, including philosophy, art, theology, and science. Most human activities depend upon the concept, where its nature as a concept is assumed rather than being a subject of discussion; these include most of the sciences, law, journalism, and everyday life. Some philosophers view the concept of truth as basic, and unable to be explained in any terms that are more easily understood than the concept of truth itself. Most commonly, truth is viewed as the correspondence of language or thought to a mind-independent world. This is called the correspondence theory of truth.

Various theories and views of truth continue to be debated among scholars, philosophers, and theologians. There are many different questions about the nature of truth which are still the subject of contemporary debates, such as: How do we define truth? Is it even possible to give an informative definition of truth? What things are truthbearers and are therefore capable of being true or false? Are truth and falsity bivalent, or are there other truth values? What are the criteria of truth that allow us to identify it and to distinguish it from falsity? What role does truth play in constituting knowledge? And is truth always absolute, or can it be relative to one's perspective?



Reality is the sum or aggregate of all that is real or existent within a system, as opposed to that which is only imaginary. The term is also used to refer to the ontological status of things, indicating their existence. In physical terms, reality is the totality of a system, known and unknown. Philosophical questions about the nature of reality or existence or being are considered under the rubric of ontology, which is a major branch of metaphysics in the Western philosophical tradition. Ontological questions also feature in diverse branches of philosophy, including the philosophy of science, philosophy of religion, philosophy of mathematics, and philosophical logic. These include questions about whether only physical objects are real (i.e., Physicalism), whether reality is fundamentally immaterial (e.g., Idealism), whether hypothetical unobservable entities posited by scientific theories exist, whether God exists, whether numbers and other abstract objects exist, and whether possible worlds exist.



A fact is an occurrence in the real world. The usual test for a statement of fact is verifiability, that is, whether it can be demonstrated to correspond to experience. Standard reference works are often used to check facts. Scientific facts are verified by repeatable careful observation or measurement by experiments or other means.

For example, "This sentence contains words." is a linguistic fact, and "The sun is a star." is an astronomical fact. Further, "Abraham Lincoln was the 16th President of the United States." and "Abraham Lincoln was assassinated." are both historical facts. Generally speaking, facts are independent of belief.



Information can be thought of as the resolution of uncertainty; it is that which answers the question of "what an entity is" and thus defines both its essence and the nature of its characteristics. The concept of information has different meanings in different contexts. Thus the concept becomes related to notions of constraint, communication, control, data, form, education, knowledge, meaning, understanding, mental stimuli, pattern, perception, representation, and entropy.

Information is associated with data, as data represent values attributed to parameters, and information is data in context and with meaning attached. Information also relates to knowledge, as knowledge signifies understanding of an abstract or concrete concept.

In terms of communication, information is expressed either as the content of a message or through direct or indirect observation. That which is perceived can be construed as a message in its own right, and in that sense, information is always conveyed as the content of a message.

Information can be encoded into various forms for transmission and interpretation (for example, information may be encoded into a sequence of signs, or transmitted via a signal). It can also be encrypted for safe storage and communication.

The information conveyed by an event depends inversely on its probability of occurrence: the more uncertain (less probable) an event, the more information is required to resolve the uncertainty of that event. The bit is a typical unit of information, but other units such as the nat may be used. For example, the information encoded in one "fair" coin flip is log2(2/1) = 1 bit, and in two fair coin flips is log2(4/1) = 2 bits.
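The coin-flip arithmetic above can be checked directly. A minimal sketch in Python, using only the standard math module (the function name is illustrative, not from any particular library):

```python
import math

def self_information_bits(p: float) -> float:
    """Information (in bits) conveyed by an event of probability p: log2(1/p)."""
    return math.log2(1 / p)

# One fair coin flip: each outcome has probability 1/2.
print(self_information_bits(1 / 2))  # 1.0 bit
# Two fair coin flips: each joint outcome has probability 1/4.
print(self_information_bits(1 / 4))  # 2.0 bits
```

Halving the probability of an event adds exactly one bit of information, which is why n fair coin flips carry n bits.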



In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent in the variable's possible outcomes. The concept of information entropy was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication". As an example, consider a biased coin with probability p of landing on heads and probability 1-p of landing on tails. The maximum surprise is for p = 1/2, when there is no reason to expect one outcome over another, and in this case a coin flip has an entropy of one bit. The minimum surprise is when p = 0 or p = 1, when the event is known and the entropy is zero bits. Other values of p give different entropies between zero and one bits.

Given a discrete random variable X with possible outcomes x_1, ..., x_n, which occur with probabilities P(x_1), ..., P(x_n), the entropy of X is formally defined as:

H(X) = − Σ_{i=1}^{n} P(x_i) log P(x_i)

where Σ denotes the sum over the variable's possible values and log is the logarithm, the choice of base varying between different applications. Base 2 gives the unit of bits (or "shannons"), base e gives the "natural unit" nat, and base 10 gives a unit called "dits", "bans", or "hartleys". An equivalent definition of entropy is the expected value of the self-information of a variable.
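The definition just given can be sketched in a few lines of Python. This is a hypothetical helper, not a library function; its base argument selects the unit (2 for bits, e for nats, 10 for hartleys):

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum p_i * log(p_i), skipping zero-probability outcomes."""
    h = -sum(p * math.log(p, base) for p in probs if p > 0)
    return h + 0.0  # normalize IEEE -0.0 to 0.0 in the zero-entropy case

# Fair coin: maximum surprise, one bit.
print(entropy([0.5, 0.5]))  # 1.0
# Certain outcome: zero entropy.
print(entropy([1.0, 0.0]))  # 0.0
# Base e gives the answer in nats instead of bits.
print(entropy([0.5, 0.5], base=math.e))  # ≈ 0.693, i.e. ln 2
```

Note the biased-coin behavior described earlier: any probability vector other than [0.5, 0.5] over two outcomes yields an entropy strictly between 0 and 1 bit.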

Entropy was originally introduced by Shannon as part of his theory of communication, in which a data communication system is composed of three elements: a source of data, a communication channel, and a receiver. In Shannon's theory, the "fundamental problem of communication" – as expressed by Shannon – is for the receiver to be able to identify what data was generated by the source, based on the signal it receives through the channel. Shannon considered various ways to encode, compress, and transmit messages from a data source, and proved in his famous source coding theorem that the entropy represents an absolute mathematical limit on how well data from the source can be losslessly compressed onto a perfectly noiseless channel. Shannon strengthened this result considerably for noisy channels in his noisy-channel coding theorem.

Entropy in information theory is directly analogous to the entropy in statistical thermodynamics. Entropy has relevance to other areas of mathematics such as combinatorics. The definition can be derived from a set of axioms establishing that entropy should be a measure of how "surprising" the average outcome of a variable is. For a continuous random variable, differential entropy is analogous to entropy.


MENTAL STIMULI
Stimulation is the encouragement of development or the cause of activity generally. For example, "The press provides stimulation of political discourse." An interesting or fun activity can be described as "stimulating", regardless of its physical effects on senses. Stimulate means to act as a stimulus to; stimulus means something that rouses the recipient to activity; stimuli is the plural of stimulus.

A particular use of the term is physiological stimulation, which refers to sensory excitation, the action of various agents or forms of energy (stimuli) on receptors that generate impulses that travel through nerves to the brain (afferents).

There are sensory receptors on or near the surface of the body, such as photoreceptors in the retina of the eye, hair cells in the cochlea of the ear, touch receptors in the skin and chemical receptors in the mouth and nasal cavity. There are also sensory receptors in the muscles, joints, digestive tract, and membranes around organs such as the brain, the abdominal cavity, the bladder and the prostate (providing one source of sexual stimulation).

Stimulation to the external or internal senses may evoke involuntary activity or guide intentions in action. Such emotional or motivating stimulation typically is also experienced subjectively (enters awareness, is in consciousness). Perception can be regarded as conceptualised stimulation, used in reasoning and intending, for example. When bodily stimulation is perceived it is traditionally called a sensation, such as a kind of touch or a taste or smell, or a painful or pleasurable sensation. This can be thought of as psychological stimulation, which is a stimulus affecting a person's thinking or feeling processes.

In the study of the human mind, the term intellect refers to and identifies the ability of the mind to reach correct conclusions about what is true and what is false, and about how to solve problems. The term intellect derives from the Ancient Greek philosophical term nous, which translates to the Latin intellectus (from intelligere, "to understand") and into the French and English languages as intelligence. The intellect is discussed in two areas of knowledge, in which the terms intellect and intelligence are related.



The mind is the set of faculties including cognitive aspects such as consciousness, imagination, perception, thinking, intelligence, judgement, language and memory, as well as noncognitive aspects such as emotion and instinct. Under the scientific physicalist interpretation, the mind is produced at least in part by the brain. The primary competitors to the physicalist interpretations of the mind are idealism, substance dualism, and types of property dualism, and by some lights eliminative materialism and anomalous monism. There is a lengthy tradition in philosophy, religion, psychology, and cognitive science about what constitutes a mind and what are its distinguishing properties.



Cognition (/kɒɡˈnɪʃ(ə)n/) refers to "the mental action or process of acquiring knowledge and understanding through thought, experience, and the senses". It encompasses many aspects of intellectual functions and processes, such as attention, the formation of knowledge, memory and working memory, judgment and evaluation, reasoning and "computation", problem solving and decision making, and the comprehension and production of language. Cognitive processes use existing knowledge and generate new knowledge.




One open question regarding the nature of the mind is the mind–body problem, which investigates the relation of the mind to the physical brain and nervous system. Older viewpoints included dualism and idealism, which considered the mind somehow non-physical. Modern views often center around physicalism and functionalism, which hold that the mind is roughly identical with the brain or reducible to physical phenomena such as neuronal activity, though dualism and idealism continue to have many supporters. Another question concerns which types of beings are capable of having minds (New Scientist, 8 September 2018, p. 10): for example, whether mind is exclusive to humans, is possessed also by some or all animals or by all living things, whether it is a strictly definable characteristic at all, or whether mind can also be a property of some types of human-made machines.



Intelligence has been defined in many ways: the capacity for logic, understanding, self-awareness, learning, emotional knowledge, reasoning, planning, creativity, critical thinking, and problem-solving. More generally, it can be described as the ability to perceive or infer information, and to retain it as knowledge to be applied towards adaptive behaviors within an environment or context.

Intelligence is most often studied in humans but has also been observed in both non-human animals and in plants. Intelligence in machines is called artificial intelligence, which is commonly implemented in computer systems using programs and, sometimes, specialized hardware.



Humans (Homo sapiens) are a species of highly intelligent primates. They are the only extant members of the subtribe Hominina and—together with chimpanzees, gorillas, and orangutans—are part of the family Hominidae (the great apes, or hominids). Humans are terrestrial animals, characterized by their erect posture and bipedal locomotion; high manual dexterity and heavy tool use compared to other animals; open-ended and complex language use compared to other animal communications; larger, more complex brains than other primates; and highly advanced and organized societies.

Several early hominins used fire and occupied much of Eurasia. Early modern humans are thought to have diverged in Africa from an earlier hominin around 300,000 years ago, with the earliest fossil evidence of Homo sapiens also appearing around 300,000 years ago in Africa. Humans began to exhibit evidence of behavioral modernity at least by about 100,000–70,000 years ago (and possibly earlier). In several waves of migration, H. sapiens ventured out of Africa and populated most of the world. The spread of the large and increasing population of humans has profoundly affected the biosphere and millions of species worldwide. Advantages that explain this evolutionary success include a larger, well-developed brain, which enables advanced abstract reasoning, language, problem solving, sociality, and culture through social learning. Humans use tools more frequently and effectively than any other animal: they are the only extant species to build fires, cook food, clothe themselves, and create and use numerous other technologies and arts.

Humans uniquely use systems of symbolic communication as language and art to express themselves and exchange ideas and also organize themselves into purposeful groups. Humans create complex social structures composed of many cooperating and competing groups, from families and kinship networks to political states. Social interactions between humans have established an extremely wide variety of values, social norms, and rituals, which together undergird human society. Curiosity and the human desire to understand and influence the environment and to explain and manipulate phenomena (or events) have motivated humanity's development of science, philosophy, mythology, religion, and other fields of knowledge.

Though most of human existence has been sustained by hunting and gathering in band societies, many human societies transitioned to sedentary agriculture approximately 10,000 years ago, domesticating plants and animals, thus enabling the growth of civilization. These human societies subsequently expanded, establishing various forms of government, religion, and culture around the world, and unifying people within regions to form states and empires. The rapid advancement of scientific and medical understanding in the 19th and 20th centuries permitted the development of fuel-driven technologies and increased lifespans, causing the human population to rise exponentially. The global human population was estimated to be near 7.8 billion in 2019.





Behavioral modernity is a suite of behavioral and cognitive traits that distinguishes current Homo sapiens from other anatomically modern humans, hominins, and primates. Most scholars agree that modern human behavior can be characterized by abstract thinking, planning depth, symbolic behavior (e.g., art, ornamentation), music and dance, exploitation of large game, and blade technology, among others. Underlying these behaviors and technological innovations are cognitive and cultural foundations that have been documented experimentally and ethnographically by evolutionary and cultural anthropologists. These human universal patterns include cumulative cultural adaptation, social norms, language, and extensive help and cooperation beyond close kin.

Within the tradition of evolutionary anthropology and related disciplines, it has been argued that the development of these modern behavioral traits, in combination with the climatic conditions of the Last Glacial Period and Last Glacial Maximum causing population bottlenecks, contributed to the evolutionary success of Homo sapiens worldwide relative to Neanderthals, Denisovans, and other archaic humans.

Arising from differences in the archaeological record, debate continues as to whether anatomically modern humans were behaviorally modern as well. There are many theories on the evolution of behavioral modernity. These generally fall into two camps: gradualist and cognitive approaches. The Later Upper Paleolithic Model theorises that modern human behavior arose through cognitive, genetic changes in Africa abruptly around 40,000–50,000 years ago, around the time of the Out-of-Africa migration, prompting the movement of modern humans out of Africa and across the world. Other models focus on how modern human behavior may have arisen through gradual steps, with the archaeological signatures of such behavior appearing only through demographic or subsistence-based changes. Many cite evidence of behavioral modernity earlier (by at least 100,000–70,000 years ago and possibly earlier), namely in the African Middle Stone Age. Sally McBrearty and Alison S. Brooks are notable proponents of gradualism, challenging European-centric models by situating more change in the Middle Stone Age of African prehistory, though this version of the story is more difficult to develop in concrete terms due to a thinning fossil record as one goes further back in time.



Bipedalism is a form of terrestrial locomotion where an organism moves by means of its two rear limbs or legs. An animal or machine that usually moves in a bipedal manner is known as a biped /ˈbaɪpɛd/, meaning "two feet" (from the Latin bis for "double" and pes for "foot"). Types of bipedal movement include walking, running, or hopping.

Few modern species are habitual bipeds whose normal method of locomotion is two-legged. Within mammals, habitual bipedalism has evolved multiple times, with the macropods, kangaroo rats and mice, springhare, hopping mice, pangolins and hominin apes (australopithecines and humans) as well as various other extinct groups evolving the trait independently. In the Triassic period some groups of archosaurs (a group that includes crocodiles and dinosaurs) developed bipedalism; among the dinosaurs, all the early forms and many later groups were habitual or exclusive bipeds; the birds are members of a clade of exclusively bipedal dinosaurs, the theropods.

A larger number of modern species intermittently or briefly use a bipedal gait. Several lizard species move bipedally when running, usually to escape from threats. Many primate and bear species will adopt a bipedal gait in order to reach food or explore their environment, though there are a few cases where they walk on their hind-limbs only. Several arboreal primate species, such as gibbons and indriids, exclusively walk on two legs during the brief periods they spend on the ground. Many animals rear up on their hind legs whilst fighting or copulating. Some animals commonly stand on their hind-legs, in order to reach food, to keep watch, to threaten a competitor or predator, or to pose in courtship, but do not move bipedally.



Mammals (from Latin mamma "breast") are a group of vertebrate animals constituting the class Mammalia (/məˈmeɪliə/), and characterized by the presence of mammary glands which in females produce milk for feeding (nursing) their young, a neocortex (a region of the brain), fur or hair, and three middle ear bones. These characteristics distinguish them from reptiles and birds, from which they diverged in the late Carboniferous, approximately 300 million years ago. Around 6,400 extant species of mammals have been described. The largest orders are the rodents, bats and Eulipotyphla (hedgehogs, moles, shrews, and others). The next three are the Primates (apes including humans, monkeys, and others), the Artiodactyla (cetaceans and even-toed ungulates), and the Carnivora (cats, dogs, seals, and others).

In terms of cladistics, which reflects evolutionary history, mammals are the only living members of the Synapsida; this clade, together with Sauropsida (reptiles and birds), constitutes the larger Amniota clade. The early synapsid mammalian ancestors were sphenacodont pelycosaurs, a group that included the non-mammalian Dimetrodon. At the end of the Carboniferous period around 300 million years ago, this group diverged from the sauropsid line that led to today's reptiles and birds. The line following the stem group Sphenacodontia split into several diverse groups of non-mammalian synapsids—sometimes incorrectly referred to as mammal-like reptiles—before giving rise to Therapsida in the Early Permian period. The modern mammalian orders arose in the Paleogene and Neogene periods of the Cenozoic era, after the extinction of non-avian dinosaurs, and have been the dominant terrestrial animal group from 66 million years ago to the present.

The basic body type is quadruped, and most mammals use their four extremities for terrestrial locomotion; but in some, the extremities are adapted for life at sea, in the air, in trees, underground, or on two legs. Mammals range in size from the 30–40 mm (1.2–1.6 in) bumblebee bat to the 30 m (98 ft) blue whale—possibly the largest animal to have ever lived. Maximum lifespan varies from two years for the shrew to 211 years for the bowhead whale. All modern mammals give birth to live young, except the five species of monotremes, which are egg-laying mammals. The most species-rich group of mammals, the cohort called placentals, has a placenta, which enables the feeding of the fetus during gestation.

Most mammals are intelligent, with some possessing large brains, self-awareness, and tool use. Mammals can communicate and vocalize in several ways, including the production of ultrasound, scent-marking, alarm signals, singing, and echolocation. Mammals can organize themselves into fission-fusion societies, harems, and hierarchies—but can also be solitary and territorial. Most mammals are polygynous, but some can be monogamous or polyandrous.

Domestication of many types of mammals by humans played a major role in the Neolithic revolution, and resulted in farming replacing hunting and gathering as the primary source of food for humans. This led to a major restructuring of human societies from nomadic to sedentary, with more co-operation among larger and larger groups, and ultimately the development of the first civilizations. Domesticated mammals provided, and continue to provide, power for transport and agriculture, as well as food (meat and dairy products), fur, and leather. Mammals are also hunted and raced for sport, and are used as model organisms in science. Mammals have been depicted in art since Palaeolithic times, and appear in literature, film, mythology, and religion. Decline in numbers and extinction of many mammals is primarily driven by human poaching and habitat destruction, primarily deforestation.

Fine motor skill (or dexterity) is the coordination of small muscles in movements, usually involving the synchronisation of hands and fingers with the eyes. The complex levels of manual dexterity that humans exhibit can be attributed to and demonstrated in tasks controlled by the nervous system. Fine motor skills aid in the growth of intelligence and develop continuously throughout the stages of human development.

ERECT POSTURE

The evolution of human bipedalism, which began in primates about four million years ago, or as early as seven million years ago with Sahelanthropus, or about 12 million years ago with Danuvius guggenmosi, has led to morphological alterations to the human skeleton, including changes to the arrangement and size of the bones of the foot, hip size and shape, knee size, leg length, and the shape and orientation of the vertebral column. The evolutionary factors that produced these changes have been the subject of several theories.

WIKIPEDIA THE BRAINS

Subjectivity is a central philosophical concept, related to consciousness, agency, personhood, reality, and truth, which has been variously defined by sources. Three common definitions include that subjectivity is the quality or condition of:

   Something being a subject, narrowly meaning an individual who possesses conscious experiences, such as perspectives, feelings, beliefs, and desires.
   Something being a subject, broadly meaning an entity that has agency, meaning that it acts upon or wields power over some other entity (an object).
   Some information, idea, situation, or physical thing considered true only from the perspective of a subject or subjects.

These various definitions of subjectivity are sometimes joined together in philosophy. The term is most commonly used as an explanation for that which influences, informs, and biases people's judgments about truth or reality; it is the collection of the perceptions, experiences, expectations, and personal or cultural understanding of, and beliefs about, an external phenomenon, that are specific to a subject.

Subjectivity is contrasted to the philosophy of objectivity, which is described as a view of truth or reality that is free of any individual's biases, interpretations, feelings, and imaginings.



In many religious, philosophical, and mythological traditions, the soul is the incorporeal essence of a living being. Soul or psyche (Ancient Greek: ψυχή psykhḗ, from ψύχειν psýkhein, "to breathe") comprises the mental abilities of a living being: reason, character, feeling, consciousness, qualia, memory, perception, thinking, etc. Depending on the philosophical system, a soul can either be mortal or immortal.

Greek philosophers, such as Socrates, Plato, and Aristotle, understood that the soul (ψυχή psūchê) must have a logical faculty, the exercise of which was the most divine of human actions. At his defense trial, Socrates even summarized his teachings as nothing other than an exhortation for his fellow Athenians to excel in matters of the psyche since all bodily goods are dependent on such excellence (Apology 30a–b).

In Judaism and in some Christian denominations, only human beings have immortal souls (although immortality is disputed within Judaism and the concept of immortality may have been influenced by Plato). For example, the Catholic theologian Thomas Aquinas attributed "soul" (anima) to all organisms but argued that only human souls are immortal. Other religions (most notably Hinduism and Jainism) hold that all living things from the smallest bacterium to the largest of mammals are the souls themselves (Atman, jiva) and have their physical representative (the body) in the world. The actual self is the soul, while the body is only a mechanism to experience the karma of that life. Thus if we see a tiger then there is a self-conscious identity residing in it (the soul), and a physical representative (the whole body of the tiger, which is observable) in the world. Some teach that even non-biological entities (such as rivers and mountains) possess souls. This belief is called animism.



The cosmos (UK: /ˈkɒzmɒs/, US: /-moʊs/) is the Universe. Using the word cosmos rather than the word universe implies viewing the universe as a complex and orderly system or entity; the opposite of chaos. The cosmos, and our understanding of the reasons for its existence and significance, are studied in cosmology – a very broad discipline covering any scientific, religious, or philosophical contemplation of the cosmos and its nature, or reasons for existing. Religious and philosophical approaches may include in their concepts of the cosmos various spiritual entities or other matters deemed to exist outside our physical universe.

LARGER BRAINS

Encephalization quotient (EQ), encephalization level (EL), or just encephalization is a relative brain size measure that is defined as the ratio between observed and predicted brain mass for an animal of a given size, based on nonlinear regression on a range of reference species. It has been used as a proxy for intelligence and thus as a possible way of comparing the intelligences of different species. For this purpose it is a more refined measurement than the raw brain-to-body mass ratio, as it takes allometric effects into account. The relationship, expressed as a formula, has been developed for mammals and may not yield relevant results when applied outside this group.
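As a quantitative sketch, the EQ can be computed against Jerison's classic allometric baseline for mammals (expected brain mass ≈ 0.12 × body mass^(2/3), both in grams). The coefficient and exponent here are one common published choice among several regressions, so the numbers below are illustrative rather than definitive.

```python
def expected_brain_mass_g(body_mass_g: float) -> float:
    """Jerison-style mammalian baseline: E = 0.12 * P**(2/3), masses in grams."""
    return 0.12 * body_mass_g ** (2 / 3)

def encephalization_quotient(brain_mass_g: float, body_mass_g: float) -> float:
    """EQ = observed brain mass / brain mass predicted from body mass alone."""
    return brain_mass_g / expected_brain_mass_g(body_mass_g)

# A ~65 kg human with a ~1350 g brain scores an EQ of roughly 7 on this
# baseline, i.e. a brain about seven times heavier than body size predicts.
print(round(encephalization_quotient(1350, 65_000), 1))
```

On this baseline, a species with EQ 1.0 has exactly the brain mass the regression predicts for its body size; values above 1 indicate a relatively enlarged brain, which is what makes EQ more informative than the raw brain-to-body mass ratio.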



A hunter-gatherer is a human living in a society in which most or all food is obtained by foraging (collecting wild plants and pursuing wild animals). Hunter-gatherer societies stand in contrast to agricultural societies, which rely mainly on domesticated species, although the boundaries between the two are not distinct.

Hunting and gathering was humanity's first and most successful adaptation, occupying at least 90 percent of human history. Following the invention of agriculture, hunter-gatherers who did not change have been displaced or conquered by farming or pastoralist groups in most parts of the world. However, the division between the two is no longer presumed to be a fundamental marker in human history, and there is not necessarily a hierarchy which places agriculture and industry at the top as a goal to be reached.

Only a few contemporary societies are classified as hunter-gatherers, and many supplement their foraging activity with horticulture or pastoralism. Contrary to common misconception, hunter-gatherers are mostly well-fed, rather than starving.



A civilization (or civilisation) is any complex society characterized by urban development, social stratification, a form of government and symbolic systems of communication such as writing.

Civilizations are intimately associated with and often further defined by other socio-politico-economic characteristics, including centralization, the domestication of both humans and other organisms, specialization of labour, culturally ingrained ideologies of progress and supremacism, monumental architecture, taxation, societal dependence upon farming and expansionism.

Historically, civilization has often been understood as a larger and "more advanced" culture, in contrast to smaller, supposedly primitive cultures. In this broad sense, a civilization contrasts with non-centralized tribal societies, including the cultures of nomadic pastoralists, Neolithic societies or hunter-gatherers; however, it sometimes also contrasts with the cultures found within civilizations themselves. Civilizations are organized in densely populated settlements divided into hierarchical social classes with a ruling elite and subordinate urban and rural populations, which engage in intensive agriculture, mining, small-scale manufacture and trade. Civilization concentrates power, extending human control over the rest of nature, including over other human beings.

Civilization, as its etymology (below) suggests, is a concept originally linked to towns and cities. The earliest emergence of civilizations is generally associated with the final stages of the Neolithic Revolution, culminating in the relatively rapid process of urban revolution and state formation, a political development associated with the appearance of a governing elite.
WIKIPEDIA [DILIGENT] TOOL USE

A tool is an object used to extend the ability of an individual to modify features of the surrounding environment. Although many animals use simple tools, only human beings, whose use of stone tools dates back hundreds of millennia, have been observed using tools to make other tools. The set of tools required to perform different tasks that are part of the same activity is called gear or equipment.

While one may apply the term tool loosely to many things that are means to an end (e.g., a fork), strictly speaking an object is a tool only if, besides being constructed to be held, it is also made of a material that allows its user to apply to it various degrees of force. If repeated use wears part of the tool down (like a knife blade), it may be possible to restore it; if it wears the tool out or breaks it, the tool must be replaced. Thus tool falls under the taxonomic category implement, and is on the same taxonomic rank as instrument, utensil, device, or ware.

Stimulation is the encouragement of development or the cause of activity generally. For example, "The press provides stimulation of political discourse." An interesting or fun activity can be described as "stimulating", regardless of its physical effects on senses. Stimulate means to act as a stimulus to; stimulus means something that rouses the recipient to activity; stimuli is the plural of stimulus.

A particular use of the term is physiological stimulation, which refers to sensory excitation, the action of various agents or forms of energy (stimuli) on receptors that generate impulses that travel through nerves to the brain (afferents). There are sensory receptors on or near the surface of the body, such as photoreceptors in the retina of the eye, hair cells in the cochlea of the ear, touch receptors in the skin and chemical receptors in the mouth and nasal cavity. There are also sensory receptors in the muscles, joints, digestive tract, and membranes around organs such as the brain, the abdominal cavity, the bladder and the prostate (providing one source of sexual stimulation). Stimulation to the external or internal senses may evoke involuntary activity or guide intentions in action. Such emotional or motivating stimulation typically is also experienced subjectively (enters awareness, is in consciousness). Perception can be regarded as conceptualised stimulation, used in reasoning and intending, for example. When bodily stimulation is perceived it is traditionally called a sensation, such as a kind of touch or a taste or smell, or a painful or pleasurable sensation. This can be thought of as psychological stimulation, which is a stimulus affecting a person's thinking or feeling processes.



Technology ("science of craft", from Greek τέχνη, techne, "art, skill, cunning of hand"; and -λογία, -logia) is the sum of techniques, skills, methods, and processes used in the production of goods or services or in the accomplishment of objectives, such as scientific investigation. Technology can be the knowledge of techniques, processes, and the like, or it can be embedded in machines to allow for operation without detailed knowledge of their workings. Systems (e.g. machines) applying technology by taking an input, changing it according to the system's use, and then producing an outcome are referred to as technology systems or technological systems.

The simplest form of technology is the development and use of basic tools. The prehistoric discovery of how to control fire and the later Neolithic Revolution increased the available sources of food, and the invention of the wheel helped humans to travel in and control their environment. Developments in historic times, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale.

Technology has many effects. It has helped develop more advanced economies (including today's global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products known as pollution and deplete natural resources to the detriment of Earth's environment. Innovations have always influenced the values of a society and raised new questions in the ethics of technology. Examples include the rise of the notion of efficiency in terms of human productivity, and the challenges of bioethics.

Philosophical debates have arisen over the use of technology, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar reactionary movements criticize the pervasiveness of technology, arguing that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition.



The scientific method is an empirical method of acquiring knowledge that has characterized the development of science since at least the 17th century. It involves careful observation, applying rigorous skepticism about what is observed, given that cognitive assumptions can distort how one interprets the observation. It involves formulating hypotheses, via induction, based on such observations; experimental and measurement-based testing of deductions drawn from the hypotheses; and refinement (or elimination) of the hypotheses based on the experimental findings. These are principles of the scientific method, as distinguished from a definitive series of steps applicable to all scientific enterprises.

Though diverse models for the scientific method are available, there is in general a continuous process that includes observations about the natural world. People are naturally inquisitive, so they often come up with questions about things they see or hear, and they often develop ideas or hypotheses about why things are the way they are. The best hypotheses lead to predictions that can be tested in various ways. The most conclusive testing of hypotheses comes from reasoning based on carefully controlled experimental data. Depending on how well additional tests match the predictions, the original hypothesis may require refinement, alteration, expansion or even rejection. If a particular hypothesis becomes very well supported, a general theory may be developed.

Although procedures vary from one field of inquiry to another, the underlying process is frequently the same from one field to another. The process of the scientific method involves making conjectures (hypotheses), deriving predictions from them as logical consequences, and then carrying out experiments or empirical observations based on those predictions. A hypothesis is a conjecture, based on knowledge obtained while seeking answers to the question. The hypothesis might be very specific, or it might be broad. Scientists then test hypotheses by conducting experiments or studies. A scientific hypothesis must be falsifiable, implying that it is possible to identify a possible outcome of an experiment or observation that conflicts with predictions deduced from the hypothesis; otherwise, the hypothesis cannot be meaningfully tested.

The purpose of an experiment is to determine whether observations agree with or conflict with the predictions derived from a hypothesis. Experiments can take place anywhere from a garage to CERN's Large Hadron Collider. There are difficulties in a formulaic statement of method, however. Though the scientific method is often presented as a fixed sequence of steps, it represents rather a set of general principles. Not all steps take place in every scientific inquiry (nor to the same degree), and they are not always in the same order.



Observation inseparable from theory
When making observations, scientists look through telescopes, study images on electronic screens, record meter readings, and so on. Generally, on a basic level, they can agree on what they see, e.g., the thermometer shows 37.9 degrees C. But, if these scientists have different ideas about the theories that have been developed to explain these basic observations, they may disagree about what they are observing. For example, before Albert Einstein's general theory of relativity, observers would have likely interpreted an image of the Einstein cross as five different objects in space. In light of that theory, however, astronomers will tell you that there are actually only two objects, one in the center and four different images of a second object around the sides. Alternatively, if other scientists suspect that something is wrong with the telescope and only one object is actually being observed, they are operating under yet another theory. Observations that cannot be separated from theoretical interpretation are said to be theory-laden.

All observation involves both perception and cognition. That is, one does not make an observation passively, but rather is actively engaged in distinguishing the phenomenon being observed from surrounding sensory data. Therefore, observations are affected by one's underlying understanding of the way in which the world functions, and that understanding may influence what is perceived, noticed, or deemed worthy of consideration. In this sense, it can be argued that all observation is theory-laden.



The Internet (or internet) is the global system of interconnected computer networks that uses the Internet protocol suite (TCP/IP) to communicate between networks and devices. It is a network of networks that consists of private, public, academic, business, and government networks of local to global scope, linked by a broad array of electronic, wireless, and optical networking technologies. The Internet carries a vast range of information resources and services, such as the inter-linked hypertext documents and applications of the World Wide Web (WWW), electronic mail, telephony, and file sharing.

The origins of the Internet date back to the development of packet switching and research commissioned by the United States Department of Defense in the 1960s to enable time-sharing of computers. The primary precursor network, the ARPANET, initially served as a backbone for interconnection of regional academic and military networks in the 1970s. The funding of the National Science Foundation Network as a new backbone in the 1980s, as well as private funding for other commercial extensions, led to worldwide participation in the development of new networking technologies, and the merger of many networks. The linking of commercial networks and enterprises by the early 1990s marked the beginning of the transition to the modern Internet, and generated a sustained exponential growth as generations of institutional, personal, and mobile computers were connected to the network. Although the Internet was widely used by academia in the 1980s, commercialization incorporated its services and technologies into virtually every aspect of modern life.

Most traditional communication media, including telephony, radio, television, paper mail and newspapers are reshaped, redefined, or even bypassed by the Internet, giving birth to new services such as email, Internet telephony, Internet television, online music, digital newspapers, and video streaming websites. Newspaper, book, and other print publishing are adapting to website technology, or are reshaped into blogging, web feeds and online news aggregators. The Internet has enabled and accelerated new forms of personal interactions through instant messaging, Internet forums, and social networking. Online shopping has grown exponentially for major retailers, small businesses, and entrepreneurs, as it enables firms to extend their "brick and mortar" presence to serve a larger market or even sell goods and services entirely online. Business-to-business and financial services on the Internet affect supply chains across entire industries.

The Internet has no single centralized governance in either technological implementation or policies for access and usage; each constituent network sets its own policies. The overarching definitions of the two principal name spaces in the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. In November 2006, the Internet was included on USA Today's list of New Seven Wonders.
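The IP address space that ICANN coordinates is a finite, hierarchically partitioned namespace, and that partitioning can be explored directly with Python's standard ipaddress module. The addresses below are arbitrary illustrative examples, not addresses tied to any real host.

```python
import ipaddress

# An individual address and the network block it belongs to.
addr = ipaddress.ip_address("192.168.1.10")
net = ipaddress.ip_network("192.168.0.0/16")

print(addr in net)         # True: the address falls inside this /16 block
print(addr.is_private)     # True: 192.168.0.0/16 is reserved for private use (RFC 1918)
print(net.num_addresses)   # 65536: a /16 spans 2**16 addresses
```

DNS then maps human-readable names onto this numeric space, which is why the two namespaces are administered together as described above.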



In social science, a social relation or social interaction is any relationship between two or more individuals. Social relations derived from individual agency form the basis of social structure and the basic object for analysis by social scientists. Fundamental inquiries into the nature of social relations feature in the work of sociologists such as Max Weber in his theory of social action. Social relationships are composed of both positive (affiliative) and negative (agonistic) interactions, representing opposing effects.

Social relationships are a special case of social relations that can exist without any communication taking place between the actors involved. Categorizing social interactions enables observational and other social research, such as Gemeinschaft and Gesellschaft (lit. 'community and society'), collective consciousness, etc. However different schools and theories of sociology and other social sciences dispute the methods used for such investigations.



WIKIPEDIA HUMAN MIND

In sociology, social action, also known as Weberian social action, is an act which takes into account the actions and reactions of individuals (or 'agents'). According to Max Weber, "an Action is 'social' if the acting individual takes account of the behavior of others and is thereby oriented in its course".



An action is a concept that involves an agent performing something. In common speech, the term is often used interchangeably with the term "behaviour". However, in the behavioural sciences, the social sciences, and the philosophy of action, a distinction is made: behavior is automatic and reflexive activity, while action is an intentional, purposive, conscious and subjectively meaningful activity. Thus, things like running or throwing a ball are instances of actions; they involve intention, a goal, and a bodily movement guided by the agent. On the other hand, catching a cold is not considered an action because it is something which happens to a person, not something done by one.



Instinct or innate behavior is the inherent inclination of a living organism towards a particular complex behavior. The simplest example of an instinctive behavior is a fixed action pattern (FAP), in which a very short to medium length sequence of actions, without variation, are carried out in response to a corresponding clearly defined stimulus.

Any behavior is instinctive if it is performed without being based upon prior experience (that is, in the absence of learning), and is therefore an expression of innate biological factors. Sea turtles, newly hatched on a beach, will instinctively move toward the ocean. A marsupial climbs into its mother's pouch upon being born. Honeybees communicate by dancing in the direction of a food source without formal instruction. Other examples include animal fighting, animal courtship behavior, internal escape functions, and the building of nests. Though an instinct is defined by its invariant innate characteristics, details of its performance can be changed by experience; for example, a dog can improve its fighting skills by practice.

Instincts are inborn complex patterns of behavior that exist in most members of the species, and should be distinguished from reflexes, which are simple responses of an organism to a specific stimulus, such as the contraction of the pupil in response to bright light or the spasmodic movement of the lower leg when the knee is tapped. The absence of volitional capacity must not be confused with an inability to modify fixed action patterns. For example, people may be able to modify a stimulated fixed action pattern by consciously recognizing the point of its activation and simply stopping it, whereas animals without a sufficiently strong volitional capacity may not be able to disengage from their fixed action patterns, once activated.

Instinctual behaviour in humans has been studied, and is a controversial topic.



Life is a characteristic that distinguishes physical entities that have biological processes, such as signaling and self-sustaining processes, from those that do not, either because such functions have ceased (they have died), or because they never had such functions and are classified as inanimate. Various forms of life exist, such as plants, animals, fungi, protists, archaea, and bacteria. Biology is the science concerned with the study of life.

There is currently no consensus regarding the definition of life. One popular definition is that organisms are open systems that maintain homeostasis, are composed of cells, have a life cycle, undergo metabolism, can grow, adapt to their environment, respond to stimuli, reproduce and evolve. Other definitions sometimes include non-cellular life forms such as viruses and viroids.

Abiogenesis is the natural process of life arising from non-living matter, such as simple organic compounds. The prevailing scientific hypothesis is that the transition from non-living to living entities was not a single event, but a gradual process of increasing complexity. Life on Earth first appeared as early as 4.28 billion years ago, soon after ocean formation 4.41 billion years ago, and not long after the formation of the Earth 4.54 billion years ago. The earliest known life forms are microfossils of bacteria. Researchers generally think that current life on Earth descends from an RNA world, although RNA-based life may not have been the first life to have existed. The classic 1952 Miller–Urey experiment and similar research demonstrated that most amino acids, the chemical constituents of the proteins used in all living organisms, can be synthesized from inorganic compounds under conditions intended to replicate those of the early Earth. Complex organic molecules occur in the Solar System and in interstellar space, and these molecules may have provided starting material for the development of life on Earth.

Since its primordial beginnings, life on Earth has changed its environment on a geologic time scale, but it has also adapted to survive in most ecosystems and conditions. Some microorganisms, called extremophiles, thrive in physically or geochemically extreme environments that are detrimental to most other life on Earth. The cell is considered the structural and functional unit of life. There are two kinds of cells, prokaryotic and eukaryotic, both of which consist of cytoplasm enclosed within a membrane and contain many biomolecules such as proteins and nucleic acids. Cells reproduce through a process of cell division, in which the parent cell divides into two or more daughter cells.

In the past, there have been many attempts to define what is meant by "life" through obsolete concepts such as odic force, hylomorphism, spontaneous generation and vitalism, that have now been disproved by biological discoveries. Aristotle is considered to be the first person to classify organisms. Later, Carl Linnaeus introduced his system of binomial nomenclature for the classification of species. Eventually new groups and categories of life were discovered, such as cells and microorganisms, forcing dramatic revisions of the structure of relationships between living organisms. Though currently only known on Earth, life need not be restricted to it, and many scientists speculate about the existence of extraterrestrial life. Artificial life is a computer simulation or human-made reconstruction of any aspect of life, which is often used to examine systems related to natural life.

Death is the permanent termination of all biological processes which sustain an organism, and as such, is the end of its life. Extinction is the term describing the dying out of a group or taxon, usually a species. Fossils are the preserved remains or traces of organisms.



Property dualism describes a category of positions in the philosophy of mind which hold that, although the world is composed of just one kind of substance—the physical kind—there exist two distinct kinds of properties: physical properties and mental properties. In other words, it is the view that non-physical, mental properties (such as beliefs, desires and emotions) exist in, or naturally supervene upon, certain physical substances (namely brains).

Substance dualism, on the other hand, is the view that there exist in the universe two fundamentally different kinds of substance: physical (matter) and non-physical (mind or consciousness), and subsequently also two kinds of properties which adhere in those respective substances. Substance dualism is thus more susceptible to the mind–body problem. Both substance and property dualism are opposed to reductive physicalism.



A mental property or mind property is a property of a mind. The term is mostly used in philosophy of mind, without prejudice as to the ontological status of mental properties. Examples might include general properties, such as being able to think, understand or remember, or more specific states such as "having a thought about Paris". The term is often used in the context of the mind–body problem. For (non-eliminative) physicalists, mental properties are a kind of high-level property which can be understood in terms of fine-grained neural activity. Property dualists, on the other hand, claim that no such reductive explanation is possible. Eliminativists may reject the existence of mental properties, or at least of those corresponding to folk psychological categories such as thought and memory. Some philosophers seek to find a unifying characteristic for the generally accepted mental properties: a famous example is Franz Brentano's claim that all mental properties are characterised by intentionality or "aboutness".



Religion is a social-cultural system of designated behaviors and practices, morals, worldviews, texts, sanctified places, prophecies, ethics, or organizations, that relates humanity to supernatural, transcendental, or spiritual elements. However, there is no scholarly consensus over what precisely constitutes a religion.

Different religions may or may not contain various elements ranging from the divine, sacred things, faith, a supernatural being or supernatural beings or "some sort of ultimacy and transcendence that will provide norms and power for the rest of life". Religious practices may include rituals, sermons, commemoration or veneration (of deities and/or saints), sacrifices, festivals, feasts, trances, initiations, funerary services, matrimonial services, meditation, prayer, music, art, dance, public service, or other aspects of human culture. Religions have sacred histories and narratives, which may be preserved in sacred scriptures, and symbols and holy places, that aim mostly to give a meaning to life. Religions may contain symbolic stories, which are sometimes said by followers to be true, that have the side purpose of explaining the origin of life, the universe, and other things. Traditionally, faith, in addition to reason, has been considered a source of religious beliefs.

There are an estimated 10,000 distinct religions worldwide. About 84% of the world's population is affiliated with Christianity, Islam, Hinduism, Buddhism, or some form of folk religion. The religiously unaffiliated demographic includes those who do not identify with any particular religion, atheists, and agnostics. While the religiously unaffiliated have grown globally, many of the religiously unaffiliated still have various religious beliefs.

The study of religion encompasses a wide variety of academic disciplines, including theology, comparative religion and social scientific studies. Theories of religion offer various explanations for the origins and workings of religion, including the ontological foundations of religious being and belief.



Ethics involves systematizing, defending, and recommending concepts of right and wrong behavior. A central aspect of ethics is "the good life", the life worth living or life that is simply satisfying, which is held by many philosophers to be more important than traditional moral conduct.

Most religions have an ethical component, often derived from purported supernatural revelation or guidance. Some assert that religion is necessary to live ethically. Simon Blackburn states that there are those who "would say that we can only flourish under the umbrella of a strong social order, cemented by common adherence to a particular religious tradition".



Artificial intelligence (AI) is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans and animals. Leading AI textbooks define the field as the study of "intelligent agents": any device that perceives its environment and takes actions that maximize its chance of successfully achieving its goals. Colloquially, the term "artificial intelligence" is often used to describe machines (or computers) that mimic "cognitive" functions that humans associate with the human mind, such as "learning" and "problem solving".

As machines become increasingly capable, tasks considered to require "intelligence" are often removed from the definition of AI, a phenomenon known as the AI effect. A quip in Tesler's Theorem says "AI is whatever hasn't been done yet." For instance, optical character recognition is frequently excluded from things considered to be AI, having become a routine technology. Modern machine capabilities generally classified as AI include successfully understanding human speech, competing at the highest level in strategic game systems (such as chess and Go), autonomously operating cars, intelligent routing in content delivery networks, and military simulations.

Artificial intelligence was founded as an academic discipline in 1956, and in the years since has experienced several waves of optimism, followed by disappointment and the loss of funding (known as an "AI winter"), followed by new approaches, success and renewed funding. For most of its history, AI research has been divided into sub-fields that often fail to communicate with each other. These sub-fields are based on technical considerations, such as particular goals (e.g. "robotics" or "machine learning"), the use of particular tools ("logic" or artificial neural networks), or deep philosophical differences. Sub-fields have also been based on social factors (particular institutions or the work of particular researchers).

The traditional problems (or goals) of AI research include reasoning, knowledge representation, planning, learning, natural language processing, perception and the ability to move and manipulate objects. General intelligence is among the field's long-term goals. Approaches include statistical methods, computational intelligence, and traditional symbolic AI. Many tools are used in AI, including versions of search and mathematical optimization, artificial neural networks, and methods based on statistics, probability and economics. The AI field draws upon computer science, information engineering, mathematics, psychology, linguistics, philosophy, and many other fields.

The field was founded on the assumption that human intelligence "can be so precisely described that a machine can be made to simulate it". This raises philosophical arguments about the mind and the ethics of creating artificial beings endowed with human-like intelligence. These issues have been explored by myth, fiction and philosophy since antiquity. Some people also consider AI to be a danger to humanity if it progresses unabated. Others believe that AI, unlike previous technological revolutions, will create a risk of mass unemployment.

In the twenty-first century, AI techniques have experienced a resurgence following concurrent advances in computer power, large amounts of data, and theoretical understanding; and AI techniques have become an essential part of the technology industry, helping to solve many challenging problems in computer science, software engineering and operations research.



In artificial intelligence, an intelligent agent (IA) is an autonomous entity which acts upon an environment, directing its activity towards achieving goals (i.e. it is an agent), observing through sensors and acting through actuators (i.e. it is intelligent). Intelligent agents may also learn or use knowledge to achieve their goals. They may be very simple or very complex. A reflex machine, such as a thermostat, is considered an example of an intelligent agent.
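The thermostat example above can be sketched as a minimal reflex agent: a direct mapping from a percept to an action, with no internal model or learning. The setpoint, band, and action names here are illustrative assumptions, not part of any particular system.

```python
# A minimal sketch of a reflex agent: the thermostat example from the text.
# The thresholds and action names are illustrative assumptions.

def thermostat_agent(temperature, setpoint=20.0, band=1.0):
    """Map a percept (current temperature) directly to an action."""
    if temperature < setpoint - band:
        return "heat_on"
    if temperature > setpoint + band:
        return "heat_off"
    return "no_op"  # within the comfort band: do nothing
```

The agent keeps no state between calls; it reacts only to the current percept, which is what makes it a reflex machine rather than a planning agent.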

Intelligent agents are often described schematically as an abstract functional system similar to a computer program. Researchers such as Russell & Norvig (2003) consider goal-directed behavior to be the essence of intelligence; a normative agent can be labeled with a term borrowed from economics, "rational agent". In this rational-action paradigm, an AI possesses an internal "model" of its environment. This model encapsulates all the agent's beliefs about the world. The agent also has an "objective function" that encapsulates all the AI's goals. Such an agent is designed to create and execute whatever plan will, upon completion, maximize the expected value of the objective function. A reinforcement learning agent can have a "reward function" that allows the programmers to shape the AI's desired behavior, and an evolutionary algorithm's behavior is shaped by a "fitness function". Abstract descriptions of intelligent agents are sometimes called abstract intelligent agents (AIA) to distinguish them from their real-world implementations as computer systems, biological systems, or organizations. Some autonomous intelligent agents are designed to function in the absence of human intervention. As intelligent agents become more popular, there are increasing legal risks involved.
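The rational-agent paradigm described above (an internal model of the environment plus an objective function, with the action chosen to maximize expected value) can be sketched in a few lines. This is a minimal illustration under assumed interfaces, not an implementation from any textbook; the toy environment and all names are invented for the example.

```python
# A toy rational agent: it consults a model of the environment and an
# objective function, and picks the action whose predicted outcomes
# maximize expected value. All names here are illustrative assumptions.

def rational_agent(state, actions, model, objective):
    """Choose the action maximizing the expected objective under the model.

    model(state, action) -> list of (probability, next_state) pairs
    objective(state)     -> numeric value of being in that state
    """
    def expected_value(action):
        return sum(p * objective(s) for p, s in model(state, action))
    return max(actions, key=expected_value)

# Example: move along a number line toward position 3.
model = lambda s, a: [(1.0, s + a)]   # deterministic transitions
objective = lambda s: -abs(3 - s)     # closer to 3 is better
```

With this model and objective, `rational_agent(0, [-1, 1], model, objective)` selects the rightward step, since the agent's beliefs (the model) predict it brings the state closer to the goal encoded in the objective function.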

Intelligent agents in artificial intelligence are closely related to agents in economics, and versions of the intelligent agent paradigm are studied in cognitive science, ethics, the philosophy of practical reason, as well as in many interdisciplinary socio-cognitive modeling and computer social simulations.

Intelligent agents are also closely related to software agents (an autonomous computer program that carries out tasks on behalf of users). In computer science, an intelligent agent is a software agent that has some intelligence, for example, autonomous programs used for operator assistance or data mining (sometimes referred to as bots) are also called "intelligent agents".



Reinforcement learning (RL) is an area of machine learning concerned with how software agents ought to take actions in an environment in order to maximize the notion of cumulative reward. Reinforcement learning is one of three basic machine learning paradigms, alongside supervised learning and unsupervised learning.

Reinforcement learning differs from supervised learning in not needing labelled input/output pairs to be presented, and in not needing sub-optimal actions to be explicitly corrected. Instead, the focus is on finding a balance between exploration (of uncharted territory) and exploitation (of current knowledge).

The environment is typically stated in the form of a Markov decision process (MDP), because many reinforcement learning algorithms for this context use dynamic programming techniques.[2] The main difference between the classical dynamic programming methods and reinforcement learning algorithms is that the latter do not assume knowledge of an exact mathematical model of the MDP and they target large MDPs where exact methods become infeasible.
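A minimal tabular Q-learning sketch illustrates both points above: the agent never consults the MDP's transition model directly, learning instead from sampled experience, and it balances exploration against exploitation with an epsilon-greedy policy. The toy chain environment and all hyperparameters are illustrative assumptions, not from the source.

```python
import random

# Toy chain MDP: states 0..4; reaching state 4 ends the episode with reward 1.
N_STATES = 5
ACTIONS = [-1, +1]  # step left or right along the chain

def step(state, action):
    """Sampled environment transition; the agent never sees this model."""
    next_state = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if next_state == N_STATES - 1 else 0.0
    return next_state, reward, next_state == N_STATES - 1

def q_learning(episodes=500, alpha=0.5, gamma=0.9, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        state, done = 0, False
        while not done:
            # epsilon-greedy: explore with probability epsilon, else exploit
            # (random tie-break so identical values don't bias the walk)
            if rng.random() < epsilon:
                action = rng.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: (q[(state, a)], rng.random()))
            next_state, reward, done = step(state, action)
            best_next = max(q[(next_state, a)] for a in ACTIONS)
            # temporal-difference update toward the bootstrapped target
            q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
            state = next_state
    return q
```

After training, the greedy policy read off the Q-table walks straight to the terminal state, even though no exact model of the MDP was ever supplied, which is the contrast with classical dynamic programming drawn above.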



Intentionality is a philosophical concept defined as "the power of minds to be about, to represent, or to stand for, things, properties and states of affairs". The idea fell out of discussion with the end of the medieval scholastic period, but in recent times was resurrected by empirical psychologist Franz Brentano and later adopted by contemporary phenomenological philosopher Edmund Husserl. Today, intentionality is a live concern among philosophers of mind and language. The earliest theory of intentionality is associated with Anselm of Canterbury's ontological argument for the existence of God, and with his tenets distinguishing between objects that exist in the understanding and objects that exist in reality.



Contemporary philosophy is the present period in the history of Western philosophy beginning at the early 20th century with the increasing professionalization of the discipline and the rise of analytic and continental philosophy.

The phrase "contemporary philosophy" is a piece of technical terminology in philosophy that refers to a specific period in the history of Western philosophy (namely the philosophy of the 20th and 21st centuries). However, the phrase is often confused with modern philosophy (which refers to an earlier period in Western philosophy), postmodern philosophy (which refers to continental philosophers' criticisms of modern philosophy), and with a non-technical use of the phrase referring to any recent philosophic work.



Phenomenology (from Greek phainómenon "that which appears" and lógos "study") is the philosophical study of the structures of experience and consciousness. As a philosophical movement it was founded in the early years of the 20th century by Edmund Husserl and was later expanded upon by a circle of his followers at the universities of Göttingen and Munich in Germany. It then spread to France, the United States, and elsewhere, often in contexts far removed from Husserl's early work.

Phenomenology is not a unified movement; rather, different authors share a common family resemblance while also exhibiting many significant differences. Gabriella Farina states:

   A unique and final definition of phenomenology is dangerous and perhaps even paradoxical as it lacks a thematic focus. In fact, it is not a doctrine, nor a philosophical school, but rather a style of thought, a method, an open and ever-renewed experience having different results, and this may disorient anyone wishing to define the meaning of phenomenology.

Phenomenology, in Husserl's conception, is primarily concerned with the systematic reflection on and study of the structures of consciousness and the phenomena that appear in acts of consciousness. Phenomenology can be clearly differentiated from the Cartesian method of analysis which sees the world as objects, sets of objects, and objects acting and reacting upon one another.

Husserl's conception of phenomenology has been criticized and developed not only by himself but also by students and colleagues such as Edith Stein, Max Scheler, Roman Ingarden, and Dietrich von Hildebrand, by existentialists such as Nicolai Hartmann, Gabriel Marcel, Maurice Merleau-Ponty, and Jean-Paul Sartre, by hermeneutic philosophers such as Martin Heidegger, Hans-Georg Gadamer, and Paul Ricoeur, by later French philosophers such as Jean-Luc Marion, Michel Henry, Emmanuel Levinas, and Jacques Derrida, and by sociologists such as Alfred Schütz and Eric Voegelin.



The existence of God is a subject of debate in the philosophy of religion and popular culture.

A wide variety of arguments for and against the existence of God can be categorized as metaphysical, logical, empirical, subjective or scientific. In philosophical terms, the question of the existence of God involves the disciplines of epistemology (the nature and scope of knowledge) and ontology (study of the nature of being, existence, or reality) and the theory of value (since some definitions of God include "perfection").

The Western tradition of philosophical discussion of the existence of God began with Plato and Aristotle, who made arguments that would now be categorized as cosmological. Other arguments for the existence of God have been proposed by St. Anselm, who formulated the first ontological argument; Ibn Rushd (Averroes) and Thomas Aquinas, who presented their own versions of the cosmological argument (the kalam argument and the first way, respectively); René Descartes, who said that the existence of a benevolent God is logically necessary for the evidence of the senses to be meaningful. John Calvin argued for a sensus divinitatis, which gives each human a knowledge of God's existence. Atheists view arguments for the existence of God as insufficient, mistaken or outweighed by arguments against it, whereas some religions, such as Jainism, reject the possibility of a creator deity. Philosophers who have provided arguments against the existence of God include Friedrich Nietzsche and Bertrand Russell.



Anomalous monism is a philosophical thesis about the mind–body relationship. It was first proposed by Donald Davidson in his 1970 paper "Mental Events". The theory is twofold and states that mental events are identical with physical events, and that the mental is anomalous, i.e. under their mental descriptions, relationships between these mental events are not describable by strict physical laws. Hence, Davidson proposes an identity theory of mind without the reductive bridge laws associated with the type-identity theory. Since the publication of his paper, Davidson has refined his thesis and both critics and supporters of anomalous monism have come up with their own characterizations of the thesis, many of which appear to differ from Davidson's.



Animism (from Latin: anima, 'breath, spirit, life') is the belief that objects, places, and creatures all possess a distinct spiritual essence. Potentially, animism perceives all things—animals, plants, rocks, rivers, weather systems, human handiwork, and perhaps even words—as animated and alive. Animism is used in the anthropology of religion as a term for the belief system of many indigenous peoples, especially in contrast to the relatively more recent development of organised religions.

Although each culture has its own different mythologies and rituals, animism is said to describe the most common, foundational thread of indigenous peoples' "spiritual" or "supernatural" perspectives. The animistic perspective is so widely held and inherent to most indigenous peoples that they often do not even have a word in their languages that corresponds to "animism" (or even "religion"); the term is an anthropological construct.

Largely due to such ethnolinguistic and cultural discrepancies, opinion has differed on whether animism refers to an ancestral mode of experience common to indigenous peoples around the world, or to a full-fledged religion in its own right. The currently accepted definition of animism was only developed in the late 19th century (1871) by Sir Edward Tylor, who formulated it as "one of anthropology's earliest concepts, if not the first."

Animism encompasses the beliefs that all material phenomena have agency, that there exists no hard and fast distinction between the spiritual and physical (or material) world and that soul or spirit or sentience exists not only in humans, but also in other animals, plants, rocks, geographic features such as mountains or rivers or other entities of the natural environment: water sprites, vegetation deities, tree sprites, etc. Animism may further attribute a life force to abstract concepts such as words, true names or metaphors in mythology. Some members of the non-tribal world also consider themselves animists (such as author Daniel Quinn, sculptor Lawson Oyekan, and many contemporary Pagans).



Feeling is the nominalization of the verb to feel. Originally used to describe the physical sensation of touch through either experience or perception, the word is also used to describe other experiences, such as "a feeling of warmth" and of sentience in general. In Latin, sentire meant to feel, hear or smell. In psychology, the word is usually reserved for the conscious subjective experience of emotion. Phenomenology and heterophenomenology are philosophical approaches that provide some basis for knowledge of feelings. Many schools of psychotherapy depend on the therapist achieving some kind of understanding of the client's feelings, for which methodologies exist.

Perception of the physical world does not necessarily result in a universal reaction among receivers (see emotions), but varies depending upon one's tendency to handle the situation, how the situation relates to the receiver's past experiences, and any number of other factors. Feelings are also known as a state of consciousness, such as that resulting from emotions, sentiments or desires. Feelings are only felt and are abstract in nature. They cannot be touched.

People buy products in hopes that the product will make them feel a certain way: perhaps happy, excited or beautiful. Or, they find the product useful in some way, even indirectly such as to support a charity or for altruistic economic reasons. Some people buy beauty products in hopes of achieving a state of happiness or a sense of self beauty or as an act or expression of beauty. Past events are used in our lives to form schemas in our minds, and based on those past experiences, we expect our lives to follow a certain script. However, storytelling, commemoration and reservation of commemoration (the unwillingness to overtly impose remembrances), research and investigation, and many other activities can help settle uneasy feelings without "scripting", without the ambivalence that feeling can only be "handled" by proxy, which is not always true.

Social psychologist Daniel Gilbert, alongside other researchers, conducted a study on the influence of feelings on events. The results showed that when participants predicted a positive feeling for an event, the chances were higher that they wanted to relive the event. Predicted feelings, however, were either short-lived or did not correlate with what the participants expected.



Perception (from the Latin perceptio, meaning gathering or receiving) is the organization, identification, and interpretation of sensory information in order to represent and understand the presented information or environment.

All perception involves signals that go through the nervous system, which in turn result from physical or chemical stimulation of the sensory system. For example, vision involves light striking the retina of the eye; smell is mediated by odor molecules; and hearing involves pressure waves.

Perception is not only the passive receipt of these signals; it is also shaped by the recipient's learning, memory, expectation, and attention. Sensory input is a process that transforms this low-level information to higher-level information (e.g., extracts shapes for object recognition). The process that follows connects a person's concepts and expectations (or knowledge) with restorative and selective mechanisms (such as attention) that influence perception.

Perception depends on complex functions of the nervous system, but subjectively seems mostly effortless because this processing happens outside conscious awareness.

Since the rise of experimental psychology in the 19th century, psychology's understanding of perception has progressed by combining a variety of techniques. Psychophysics quantitatively describes the relationships between the physical qualities of the sensory input and perception. Sensory neuroscience studies the neural mechanisms underlying perception. Perceptual systems can also be studied computationally, in terms of the information they process. Perceptual issues in philosophy include the extent to which sensory qualities such as sound, smell or color exist in objective reality rather than in the mind of the perceiver.

Although the senses were traditionally viewed as passive receptors, the study of illusions and ambiguous images has demonstrated that the brain's perceptual systems actively and pre-consciously attempt to make sense of their input. There is still active debate about the extent to which perception is an active process of hypothesis testing, analogous to science, or whether realistic sensory information is rich enough to make this process unnecessary.

The perceptual systems of the brain enable individuals to see the world around them as stable, even though the sensory information is typically incomplete and rapidly varying. Human and animal brains are structured in a modular way, with different areas processing different kinds of sensory information. Some of these modules take the form of sensory maps, mapping some aspect of the world across part of the brain's surface. These different modules are interconnected and influence each other. For instance, taste is strongly influenced by smell.

"Percept" is also a term used by Deleuze and Guattari[8] to define perception independent from perceivers.



In philosophy of mind, panpsychism is the view that mind or a mindlike aspect is a fundamental and ubiquitous feature of reality. It is also described as a theory that "the mind is a fundamental feature of the world which exists throughout the universe." It is one of the oldest philosophical theories, and has been ascribed to philosophers including Thales, Plato, Spinoza, Leibniz, William James, Alfred North Whitehead, Bertrand Russell, and Galen Strawson. In the 19th century, panpsychism was the default philosophy of mind in Western thought, but it saw a decline in the mid-20th century with the rise of logical positivism. Recent interest in the hard problem of consciousness has revived interest in panpsychism.



A deity or god is a supernatural being considered divine or sacred. The Oxford Dictionary of English defines deity as "a god or goddess (in a polytheistic religion)", or anything revered as divine. C. Scott Littleton defines a deity as "a being with powers greater than those of ordinary humans, but who interacts with humans, positively or negatively, in ways that carry humans to new levels of consciousness, beyond the grounded preoccupations of ordinary life". A goddess is a female deity.

Religions can be categorized by how many deities they worship. Monotheistic religions accept only one deity (predominantly referred to as God), while polytheistic religions accept multiple deities. Henotheistic religions accept one supreme deity without denying other deities, considering them as aspects of the same divine principle; and nontheistic religions deny any supreme eternal creator deity but may accept a pantheon of deities which live, die and may be reborn like any other being.

Although most monotheistic religions traditionally envision their God as omnipotent, omnipresent, omniscient, omnibenevolent, and eternal, none of these qualities are essential to the definition of a "deity" and various cultures conceptualized their deities differently. Monotheistic religions typically refer to God in masculine terms, while other religions refer to their deities in a variety of ways – masculine, feminine, androgynous and without gender.

Historically, many ancient cultures – including the ancient Mesopotamians, Egyptians, Greeks, Romans, and Norsemen – personified natural phenomena, variously as either deliberate causes or effects. Some Avestan and Vedic deities were viewed as ethical concepts. In Indian religions, deities were envisioned as manifesting within the temple of every living being's body, as sensory organs and mind. Deities were envisioned as a form of existence (Saṃsāra) after rebirth, for human beings who gain merit through an ethical life, where they become guardian deities and live blissfully in heaven, but are also subject to death when their merit is lost.



In folk belief, spirit is the vital principle or animating force within all living things. As far back as 1628 and 1633 respectively, both William Harvey and René Descartes speculated that somewhere within the body, in a special locality, there was a 'vital spirit' or 'vital force' which animated the whole bodily frame, just as the engine in a factory moves the machinery in it. Spirit has frequently been conceived of as a supernatural being, or non-physical entity; for example, a demon, ghost, fairy, or angel. In ancient Islamic terminology, however, a spirit (rūḥ) applies only to pure spirits, but not to other invisible creatures, such as jinn, demons and angels.

Historically, spirit has been used to refer to a "subtle" as opposed to "gross" material substance, as put forth in the notable last paragraph of Sir Isaac Newton's Principia Mathematica. In English Bibles, "the Spirit" (with a capital "S"), specifically denotes the Holy Spirit.

The concepts of spirit and soul often overlap, and both are believed to survive bodily death in some religions, and "spirit" can also have the sense of ghost, i.e. a manifestation of the spirit of a deceased person. Spirit is also often used to refer to the consciousness or personality.



The afterlife (also referred to as life after death, the world to come, or reincarnation) is an existence in which, some believe, the essential part of an individual's identity or their stream of consciousness continues after the death of their physical body. According to various ideas about the afterlife, the essential aspect of the individual that lives on after death may be some partial element, or the entire soul or spirit, which carries with it and may confer personal identity or, on the contrary, nirvana. Belief in an afterlife is in contrast to the belief in oblivion after death.

In some views, this continued existence often takes place in a spiritual realm, and in other popular views, the individual may be reborn into this world and begin the life cycle over again, likely with no memory of what they have done in the past. In this latter view, such rebirths and deaths may take place over and over again continuously until the individual gains entry to a spiritual realm or otherworld. Major views on the afterlife derive from religion, esotericism and metaphysics.

Some belief systems, such as those in the Abrahamic tradition, hold that the dead go to a specific plane of existence after death, as determined by God, or other divine judgment, based on their actions or beliefs during life. In contrast, in systems of reincarnation, such as those in the Indian religions, the nature of the continued existence is determined directly by the actions of the individual in the ended life.



The stream of consciousness is a metaphor describing how thoughts seem to flow through the conscious mind. Research studies have shown that we only experience one mental event at a time as a fast-moving mind stream. William James, often considered to be the father of American psychology, first coined the phrase "stream of consciousness". The full range of thoughts—that one can be aware of—can form the content of this stream.

Literature broadly is any collection of written work, but it is also used more narrowly for writings specifically considered to be an art form, especially prose fiction, drama, and poetry. In recent centuries, the definition has expanded to include oral literature, much of which has been transcribed. Literature is a method of recording, preserving, and transmitting knowledge and entertainment.

Literature, as an art form, can also include works in various non-fiction genres, such as autobiography, diaries, memoir, letters, and the essay. Within its broad definition, literature includes non-fictional books, articles or other printed information on a particular subject.

Etymologically, the term derives from Latin literatura/litteratura "learning, a writing, grammar," originally "writing formed with letters," from litera/littera "letter". In spite of this, the term has also been applied to spoken or sung texts. Developments in print technology have allowed an ever-growing distribution and proliferation of written works, which now includes electronic literature.

Literature is classified according to whether it is poetry, prose or drama, and such works are categorized according to historical periods, or their adherence to certain aesthetic features, or genre.



Writing is a medium of human communication that involves the representation of a language with symbols. Writing systems are not themselves human languages (with the debatable exception of computer languages); they are means of rendering a language into a form that can be reconstructed by other humans separated by time and/or space. While not all languages utilize a writing system, those with systems of inscriptions can complement and extend capacities of spoken language by enabling the creation of durable forms of speech that can be transmitted across space (e.g., correspondence) and stored over time (e.g., libraries or other public records). It has also been observed that the activity of writing itself can have knowledge-transforming effects, since it allows humans to externalize their thinking in forms that are easier to reflect on and potentially rework. Writing relies on many of the same semantic structures as the speech it represents, such as lexicon and syntax, with the added dependency of a system of symbols to represent that language's phonology and morphology. The result of the activity of writing is called a text, and the interpreter or activator of this text is called a reader.

As human societies emerged, collective motivations for the development of writing were driven by pragmatic exigencies like keeping history, maintaining culture, codifying knowledge through curricula and lists of texts deemed to contain foundational knowledge (e.g., The Canon of Medicine) or to be artistically exceptional (e.g., a literary canon), organizing and governing societies through the formation of legal systems, census records, contracts, deeds of ownership, taxation, trade agreements, treaties, and so on. Amateur historians, including H.G. Wells, had speculated since the early 20th century on the likely correspondence between the emergence of systems of writing and the development of city-states into empires. As Charles Bazerman explains, the "marking of signs on stones, clay, paper, and now digital memories—each more portable and rapidly travelling than the previous—provided means for increasingly coordinated and extended action as well as memory across larger groups of people over time and space." For example, around the 4th millennium BC, the complexity of trade and administration in Mesopotamia outgrew human memory, and writing became a more dependable method of recording and presenting transactions in a permanent form. In both ancient Egypt and Mesoamerica, on the other hand, writing may have evolved through calendric and political necessities for recording historical and environmental events. Further innovations included more uniform, predictable, and widely dispersed legal systems, distribution and discussion of accessible versions of sacred texts, and the origins of modern practices of scientific inquiry and knowledge-consolidation, all largely reliant on portable and easily reproducible forms of inscribed language.  

Individual, as opposed to collective, motivations for writing include improvised additional capacity for the limitations of human memory (e.g., to-do lists, recipes, reminders, logbooks, maps, the proper sequence for a complicated task or important ritual), dissemination of ideas (as in an essay, monograph, broadside, petition, or manifesto), imaginative narratives and other forms of storytelling, personal or business correspondence, and lifewriting (e.g., a diary or journal).



Contemporary national legal systems are generally based on one of four basic systems (civil law, common law, statutory law, or religious law) or on combinations of these. However, the legal system of each country is shaped by its unique history and so incorporates individual variations. The science that studies law at the level of legal systems is called comparative law.

Both civil (also known as Roman) and common law systems can be considered the most widespread in the world: civil law because it is the most widespread by landmass and by population overall, and common law because it is employed by the greatest number of people compared to any single civil law system.



SACRED TEXT
Religious texts are texts related to a religious tradition. They differ from literary texts by being a compilation or discussion of beliefs, mythologies, ritual practices, commandments or laws, ethical conduct, and spiritual aspirations, and by creating or fostering a religious community. The relative authority of religious texts develops over time and is derived from their ratification, enforcement, and use across generations. Some religious texts are accepted or categorized as canonical, some non-canonical, and others extracanonical, semi-canonical, deutero-canonical, pre-canonical or post-canonical.

A scripture is a subset of religious texts considered to be "especially authoritative", revered and "holy writ", "sacred, canonical", or of "supreme authority, special status" to a religious community. The terms sacred text and religious text are not necessarily interchangeable: some religious texts are believed to be sacred because of the belief, in theistic religions such as the Abrahamic religions, that the text is divinely or supernaturally revealed or divinely inspired, while in non-theistic religions such as some Indian religions they are considered the central tenets of an eternal Dharma. Many religious texts, in contrast, are simply narratives or discussions pertaining to the general themes, interpretations, practices, or important figures of the specific religion. In some religions (Islam), the scripture of supreme authority is well established (the Quran). In others (Christianity), the canonical texts include a particular text (the Bible), but the precise canon is "an unsettled question", according to Eugene Nida. In yet others (Hinduism, Buddhism), there "has never been a definitive canon". While the term scripture is derived from the Latin scriptura, meaning "writing", most sacred scriptures of the world's major religions were originally part of their oral tradition and were "passed down through memorization from generation to generation until they were finally committed to writing", according to the Encyclopaedia Britannica.

Religious texts also serve ceremonial and liturgical roles, particularly in relation to sacred time, the liturgical year, divine efficacy, and holy service; in a more general sense, in their ritual performance.



Etymology (/ˌɛtɪˈmɒlədʒi/) is the study of the history of words. By extension, the phrase "the etymology of [a word]" means the origin of a particular word.

For languages with a long written history, etymologists make use of texts, and texts about the language, to gather knowledge about how words were used during earlier periods, how they developed in meaning and form, or when and how they entered the language. Etymologists also apply the methods of comparative linguistics to reconstruct information about forms that are too old for any direct information to be available.

By analyzing related languages with a technique known as the comparative method, linguists can make inferences about their shared parent language and its vocabulary. In this way, word roots in European languages, for example, can be traced all the way back to the origin of the Indo-European language family.

Even though etymological research originally grew from the philological tradition, much current etymological research is done on language families where little or no early documentation is available, such as Uralic and Austronesian.

The word etymology derives from the Greek word ἐτυμολογία (etumología), itself from ἔτυμον (étumon), meaning "true sense or sense of a truth", and the suffix -logia, denoting "the study of".

The term etymon refers to a word or morpheme (e.g., stem or root) from which a later word or morpheme derives. For example, the Latin word candidus, which means "white", is the etymon of English candid. Relationships are often less transparent, however. English place names such as Winchester, Gloucester, Tadcaster share in different modern forms a suffixed etymon that was once meaningful, Latin castrum 'fort'.



An ideology (/ˌʌɪdɪˈɒlədʒi/) is a set of beliefs or philosophies attributed to a person or group of persons, especially as held for reasons that are not purely epistemic, in which "practical elements are as prominent as theoretical ones." The term was formerly applied primarily to economic, political, or religious theories and policies; in a tradition going back to Karl Marx and Friedrich Engels, more recent use treats it as mainly condemnatory.

The term was coined by Antoine Destutt de Tracy, a French Enlightenment aristocrat and philosopher, who conceived it in 1796 as the "science of ideas" to develop a rational system of ideas to oppose the irrational impulses of the mob. In political science, the term is used in a descriptive sense to refer to political belief systems.



Democracy (Greek: δημοκρατία, dēmokratiā, from dēmos 'people' and kratos 'rule') is a form of government in which the people have the authority to choose their governing legislation. Who people are and how authority is shared among them are core issues for democratic theory, development and constitution. Cornerstones include freedom of assembly and speech, inclusiveness and equality, membership, consent, voting, right to life and minority rights.

Generally, the two types of democracy are direct and representative. In a direct democracy, the people directly deliberate and decide on legislation. In a representative democracy, the people elect representatives to deliberate and decide on legislation, such as in parliamentary or presidential democracy. Liquid democracy combines elements of these two basic types.

The prevalent method of day-to-day decision-making in democracies is majority rule, though other approaches such as supermajority and consensus have been equally integral to democracies. They serve the crucial purposes of inclusiveness and broader legitimacy on sensitive issues, counterbalancing majoritarianism, and therefore mostly take precedence at the constitutional level.

In the common variant of liberal democracy, the powers of the majority are exercised within the framework of a representative democracy, but the constitution limits the majority and protects the minority, usually through the enjoyment by all of certain individual rights, e.g. freedom of speech, or freedom of association. Besides these general types of democracy, there have been a wealth of further types.

Democracy makes all forces struggle repeatedly to realize their interests and devolves power from groups of people to sets of rules. Western democracy, as distinct from that which existed in pre-modern societies, is generally considered to have originated in city-states such as Classical Athens and the Roman Republic, where various schemes and degrees of enfranchisement of the free male population were observed before the form disappeared in the West at the beginning of late antiquity. The English word dates back to the 16th century, from the older Middle French and Middle Latin equivalents.

According to American political scientist Larry Diamond, democracy consists of four key elements: a political system for choosing and replacing the government through free and fair elections; the active participation of the people, as citizens, in politics and civic life; protection of the human rights of all citizens; and a rule of law, in which the laws and procedures apply equally to all citizens. Todd Landman, nevertheless, draws our attention to the fact that democracy and human rights are two different concepts and that "there must be greater specificity in the conceptualisation and operationalisation of democracy and human rights".

The term appeared in the 5th century BC to denote the political systems then existing in Greek city-states, notably Athens, to mean "rule of the people", in contrast to aristocracy (ἀριστοκρατία, aristokratía), meaning "rule of an elite". While theoretically, these definitions are in opposition, in practice the distinction has been blurred historically. The political system of Classical Athens, for example, granted democratic citizenship to free men and excluded slaves and women from political participation. In virtually all democratic governments throughout ancient and modern history, democratic citizenship consisted of an elite class, until full enfranchisement was won for all adult citizens in most modern democracies through the suffrage movements of the 19th and 20th centuries.

Democracy contrasts with forms of government where power is either held by an individual, as in an absolute monarchy, or where power is held by a small number of individuals, as in an oligarchy. Nevertheless, these oppositions, inherited from Greek philosophy, are now ambiguous because contemporary governments have mixed democratic, oligarchic and monarchic elements. Karl Popper defined democracy in contrast to dictatorship or tyranny, thus focusing on opportunities for the people to control their leaders and to oust them without the need for a revolution.



Classical antiquity (also the classical era, classical period or classical age) is the period of cultural history between the 8th century BC and the 6th century AD centered on the Mediterranean Sea, comprising the interlocking civilizations of ancient Greece and ancient Rome known as the Greco-Roman world. It is the period in which both Greek and Roman societies flourished and wielded great influence throughout much of Europe, Northern Africa, and West Asia.

Conventionally, it is taken to begin with the earliest-recorded Epic Greek poetry of Homer (8th–7th century BC), and continues through the emergence of Christianity and the fall of the Western Roman Empire (5th century AD). It ends with the decline of classical culture during Late Antiquity (250–750), a period overlapping with the Early Middle Ages (600–1000). Such a wide span of history and territory covers many disparate cultures and periods. Classical antiquity may also refer to an idealized vision among later people of what was, in Edgar Allan Poe's words, "the glory that was Greece, and the grandeur that was Rome".

The culture of the ancient Greeks, together with some influences from the ancient Near East, was the basis of art, philosophy, society, and education, until the Roman imperial period. The Romans preserved, imitated, and spread this culture over Europe, until they themselves were able to compete with it, and the classical world began to speak Latin as well as Greek. This Greco-Roman cultural foundation has been immensely influential on the language, politics, law, educational systems, philosophy, science, warfare, poetry, historiography, ethics, rhetoric, art and architecture of the modern world. Surviving fragments of classical culture led to a revival beginning in the 14th century which later came to be known as the Renaissance, and various neo-classical revivals occurred in the 18th and 19th centuries.



Antisemitism (also spelled anti-semitism or anti-Semitism) is hostility to, prejudice, or discrimination against Jews. A person who holds such positions is called an antisemite. Antisemitism is generally considered to be a form of racism.

Antisemitism may be manifested in many ways, ranging from expressions of hatred of or discrimination against individual Jews to organized pogroms by mobs or police forces, or even military attacks on entire Jewish communities. Although the term did not come into common usage until the 19th century, it is also applied to previous and later anti-Jewish incidents. Notable instances of persecution include the Rhineland massacres preceding the First Crusade in 1096, the Edict of Expulsion from England in 1290, the 1348–1351 persecution of Jews during the Black Death, the massacres of Spanish Jews in 1391, the persecutions of the Spanish Inquisition, the expulsion from Spain in 1492, the Cossack massacres in Ukraine from 1648 to 1657, various anti-Jewish pogroms in the Russian Empire between 1821 and 1906, the 1894–1906 Dreyfus affair in France, the Holocaust in German-occupied Europe during World War II, Soviet anti-Jewish policies, and Arab and Muslim involvement in the Jewish exodus from Arab and Muslim countries.

The root word Semite gives the false impression that antisemitism is directed against all Semitic people, e.g., including Arabs, Assyrians and Arameans. The compound word Antisemitismus ('antisemitism') was first used in print in Germany in 1879 as a scientific-sounding term for Judenhass ('Jew-hatred'), and this has been its common use since then.



Morality (from Latin: moralitas, lit. 'manner, character, proper behavior') is the differentiation of intentions, decisions and actions between those that are distinguished as proper and those that are improper. Morality can be a body of standards or principles derived from a code of conduct from a particular philosophy, religion or culture, or it can derive from a standard that a person believes should be universal. Morality may also be specifically synonymous with "goodness" or "rightness".

Moral philosophy includes meta-ethics, which studies abstract issues such as moral ontology and moral epistemology, and normative ethics, which studies more concrete systems of moral decision-making such as deontological ethics and consequentialism. An example of normative ethical philosophy is the Golden Rule, which states that: "One should treat others as one would like others to treat oneself."

Immorality is the active opposition to morality (i.e. opposition to that which is good or right), while amorality is variously defined as an unawareness of, indifference toward, or disbelief in any particular set of moral standards or principles.



Solidarity is an awareness of shared interests, objectives, standards, and sympathies creating a psychological sense of unity of groups or classes. It refers to the ties in a society that bind people together as one. The term is generally employed in sociology and the other social sciences as well as in philosophy and bioethics. It is also a significant concept in Catholic social teaching; therefore it is a core concept in Christian democratic political ideology.

What forms the basis of solidarity and how it's implemented varies between societies. In developing societies it may be mainly based on kinship and shared values while more developed societies accumulate various theories as to what contributes to a sense of solidarity, or rather, social cohesion.

Solidarity is also one of six principles of the Charter of Fundamental Rights of the European Union and December 20 of each year is International Human Solidarity Day recognized as an international observance. Concepts of solidarity are mentioned in the Universal Declaration on Bioethics and Human Rights, but not defined clearly. As biotechnology and biomedical enhancement research and production increase, the need for distinct definition of solidarity within healthcare system frameworks is important.



Literacy is popularly understood as an ability to read, write and use numeracy in at least one method of writing, an understanding reflected by mainstream dictionary and handbook definitions. Starting in the 1980s, however, literacy researchers have maintained that defining literacy as an ability apart from any actual event of reading and writing ignores the complex ways reading and writing always happen in a specific context and in tandem with the values associated with that context. The view that literacy always involves social and cultural elements is reflected in UNESCO's stipulation that literacy is an "ability to identify, understand, interpret, create, communicate and compute, using printed and written materials associated with varying contexts." Modern attention to literacy as a "context-dependent assemblage of social practices" reflects the understanding that individuals' reading and writing practices develop and change over the lifespan as their cultural, political, and historical contexts change. For example, in Scotland, literacy has been defined as: "The ability to read, write and use numeracy, to handle information, to express ideas and opinions, to make decisions and solve problems, as family members, workers, citizens and lifelong learners."

Such expanded definitions have altered long-standing "rule of thumb" measures of literacy, e.g., the ability to read the newspaper, in part because the increasing involvement of computers and other digital technologies in communication necessitates additional skills (e.g. interfacing with web browsers and word processing programs; organizing and altering the configuration of files, etc.). By extension, the expansion of these necessary skill-sets became known, variously, as computer literacy, information literacy, and technological literacy. Elsewhere definitions of literacy extend the original notion of "acquired ability" into concepts like "arts literacy," visual literacy (the ability to understand visual forms of communication such as body language, pictures, maps, and video), statistical literacy, critical literacy, media literacy, ecological literacy, disaster literacy, and health literacy.



International relations (IR) or international affairs (IA)—commonly also referred to as international studies (IS), global studies (GS), or global affairs (GA)—is the study of politics, economics and law on a global level. Depending on the academic institution, it is either a field of political science, an interdisciplinary academic field similar to global studies, or an independent academic discipline that examines social science and humanities in an international context.

In all cases, international relations is concerned with the relationships between political entities (polities) such as sovereign states, inter-governmental organizations (IGOs), international non-governmental organizations (INGOs), other non-governmental organizations (NGOs), and multinational corporations (MNCs), and the wider world-systems produced by this interaction. International relations is an academic and a public policy field, and so can be positive and normative, because it analyses and formulates the foreign policy of a given state.

As a political activity, international relations dates from at least the time of Greek historian Thucydides (c. 460–395 BC), but emerged as a discrete academic field within political science in the early 20th century. In practice, international relations and affairs forms a separate academic program or field from political science, and the courses taught therein are highly interdisciplinary.

For example, international relations draws from the fields of politics, economics, international law, communication studies, history, demography, geography, sociology, anthropology, criminology and psychology. The scope of international relations encompasses issues such as globalization, diplomatic relations, state sovereignty, international security, ecological sustainability, nuclear proliferation, nationalism, economic development, global finance, terrorism, and human rights.



The term public intellectual describes the intellectual participating in the public-affairs discourse of society, in addition to an academic career. Regardless of the academic field or the professional expertise, the public intellectual addresses and responds to the normative problems of society, and, as such, is expected to be an impartial critic who can "rise above the partial preoccupation of one's own profession—and engage with the global issues of truth, judgment, and taste of the time". In Representations of the Intellectual (1994), Edward Saïd said that the "true intellectual is, therefore, always an outsider, living in self-imposed exile, and on the margins of society".

An intellectual usually is associated with an ideology or with a philosophy, e.g. the Third Way centrism of Anthony Giddens in the Labour Government of Tony Blair. The Czech intellectual Václav Havel said that politics and intellectuals can be linked, but that moral responsibility for the intellectual's ideas, even when advocated by a politician, remains with the intellectual. Therefore, Havel held, it is best to avoid utopian intellectuals who offer "universal insights" to resolve the problems of political economy with public policies that might harm, and that have harmed, civil society; that intellectuals should be mindful of the social and cultural ties created with their words, insights, and ideas; and that they should be heard as social critics of politics and power.



Academic background
In journalism, the term intellectual usually connotes "a university academic" of the humanities—especially a philosopher—who addresses important social and political matters of the day. Hence, such an academic functions as a public intellectual who explains the theoretic bases of said problems and communicates possible answers to the policy makers and executive leaders of society. The sociologist Frank Furedi said that "Intellectuals are not defined according to the jobs they do, but [by] the manner in which they act, the way they see themselves, and the [social and political] values that they uphold." Public intellectuals usually arise from the educated élite of a society, although the North American usage of the term "intellectual" includes university academics. The difference between "intellectual" and "academic" is participation in the realm of public affairs.



CRITICISM
In "An Interview with Milton Friedman" (1974), the American economist Milton Friedman said that businessmen and intellectuals are the enemies of capitalism: the intellectuals because most believed in socialism, and the businessmen because they expected economic privileges:

   The two, chief enemies of the free society or free enterprise are intellectuals, on the one hand, and businessmen, on the other, for opposite reasons. Every intellectual believes in freedom for himself, but he's opposed to freedom for others. [...] He thinks [...] [that] there ought to be a central planning board that will establish social priorities. [...] The businessmen are just the opposite—every businessman is in favor of freedom for everybody else, but, when it comes to himself that's a different question. He's always "the special case". He ought to get special privileges from the government, a tariff, this, that, and the other thing.

In "The Intellectuals and Socialism" (1949), the British economist Friedrich Hayek said that "journalists, teachers, ministers, lecturers, publicists, radio commentators, writers of fiction, cartoonists, and artists" are the intellectual social class whose function is to communicate the complex and specialized knowledge of the scientist to the general public. In the 20th century, he argued, the intellectuals were attracted to socialism and to social democracy because the socialists offered "broad visions; the spacious comprehension of the social order, as a whole, which a planned system promises", and such broad-vision philosophies "succeeded in inspiring the imagination of the intellectuals" to change and improve their societies.

According to Hayek, intellectuals disproportionately support socialism for idealistic and utopian reasons that cannot be realized in practical terms. Nonetheless, Albert Einstein argued in the article "Why Socialism?" (1949) that the economy of the world is not private property because it is a "planetary community of production and consumption". In American society, the intellectual status class are demographically characterized as people who hold liberal-to-leftist political perspectives about guns-or-butter fiscal policy.


In "The Heartless Lovers of Humankind" (1987), the journalist and popular historian Paul Johnson said:

   It is not the formulation of ideas, however misguided, but the desire to impose them on others that is the deadly sin of the intellectuals. That is why they so incline, by temperament, to the Left. For capitalism merely occurs, if no one does anything to stop it. It is socialism that has to be constructed, and, as a rule, forcibly imposed, thus providing a far bigger role for intellectuals in its genesis. The progressive intellectual habitually entertains Walter Mitty visions of exercising power.

The public- and private-knowledge dichotomy originated in Ancient Greece, from Socrates's rejection of the Sophist concept that the pursuit of knowledge (truth) is a "public market of ideas", open to all men of the city, not only to philosophers. In contradiction to the Sophists' public market of knowledge, Socrates proposed a knowledge monopoly for and by the philosophers. Thus, "those who sought a more penetrating and rigorous intellectual life rejected, and withdrew from, the general culture of the city, in order to embrace a new model of professionalism", namely the private market of ideas.


Addressing the societal place, roles, and functions of intellectuals in 19th-century American society, the Congregational theologian Edwards Amasa Park said: "We do wrong to our own minds, when we carry our scientific difficulties down to the arena of popular dissension". For the stability of society (social, economic, and political), he held it necessary "to separate the serious, technical role of professionals from their responsibility [for] supplying usable philosophies for the general public". Thus operated Socrates's cultural dichotomy of public knowledge and private knowledge, of "civic culture" and "professional culture", the social constructs that describe and establish the intellectual sphere of life as separate from the civic sphere of life.

British philosopher Bertrand Russell once said, "I have never called myself an intellectual, and no one has dared to call me one in my presence. I think an intellectual may be defined as a person who pretends to have more intellect than he has, and I hope this definition does not fit me."

Richard Hofstadter quipped that, "an intellectual is a person who likes to turn easy answers into harder questions."



INTELLIGENTSIA
The British historian Norman Stone said that the intellectual social class misunderstands the reality of society and so is doomed to the errors of logical fallacy, ideological stupidity, and poor planning hampered by ideology. In her memoirs, the Conservative politician Margaret Thatcher said that the anti-monarchical French Revolution (1789–1799) was "a utopian attempt to overthrow a traditional order [...] in the name of abstract ideas, formulated by vain intellectuals". Yet, as Prime Minister she asked Britain's academics to help her government resolve the social problems of British society, whilst she retained the populist opinion of "The Intellectual" as being a man of un-British character, a thinker, not a doer. Thatcher's anti-intellectualist perspective was shared by the mass media, especially The Spectator and The Sunday Telegraph newspapers, whose reportage documented a "lack of intellectuals" in Britain.

In his essay "Why do intellectuals oppose capitalism?" (1998), the American libertarian philosopher Robert Nozick of the Cato Institute argued that intellectuals become embittered leftists because their academic skills, much rewarded at school and at university, are undervalued and underpaid in the capitalist market economy. Thus, the intellectuals turned against capitalism—despite enjoying a more economically and financially comfortable life in a capitalist society than they might enjoy in either socialism or communism.

In post-communist Europe, social attitudes toward the intelligentsia became anti-intellectual. In the Netherlands, the word intellectual negatively connotes an overeducated person with "unrealistic visions of the world". In Hungary, the intellectual is perceived as an "egghead", a person who is "too clever" for the good of society. In the Czech Republic, the intellectual is a cerebral person, aloof from reality. Such derogatory connotations of intellectual are not definitive, because in the "case of English usage, positive, neutral, and pejorative uses can easily coexist". An example is Václav Havel, who "to many outside observers, [became] a favoured instance of The Intellectual as National Icon" in the early history of the post-Communist Czech Republic.

In his book Intellectuals and Society (2010), the economist Thomas Sowell said that, lacking disincentives in professional life, the intellectual (a producer of knowledge, not material goods) tends to speak outside his or her area of expertise and expects social and professional benefits from the halo effect derived from possessing professional expertise. In relation to other professions, the public intellectual is socially detached from the negative and unintended consequences of public policy derived from his or her ideas. For example, the philosopher and mathematician Bertrand Russell (1872–1970) advised the British government against national rearmament in the years before World War I (1914–1918) while the German Empire prepared for war. Yet the post-war intellectual reputation of Russell remained almost immaculate, and his opinions were respected by the general public because of the halo effect.



In philosophy, a formal fallacy, deductive fallacy, logical fallacy or non sequitur (Latin for "it does not follow") is a pattern of reasoning rendered invalid by a flaw in its logical structure that can neatly be expressed in a standard logic system, for example propositional logic. It is defined as a deductive argument that is invalid. The argument itself could have true premises, but still have a false conclusion. Thus, a formal fallacy is a fallacy where deduction goes wrong, and is no longer a logical process. This may not affect the truth of the conclusion, since validity and truth are separate in formal logic.

While a logical argument is a non sequitur if, and only if, it is invalid, the term "non sequitur" typically refers to those types of invalid arguments which do not constitute formal fallacies covered by particular terms (e.g. affirming the consequent). In other words, in practice, "non sequitur" refers to an unnamed formal fallacy.
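The distinction between valid and invalid forms can be checked mechanically: a two-variable propositional argument is invalid exactly when some truth assignment makes every premise true and the conclusion false. As a minimal sketch (the helper names `implies` and `counterexamples` are invented for this illustration, not taken from any source), the following Python contrasts the formal fallacy of affirming the consequent with the valid form modus ponens:

```python
from itertools import product

def implies(p, q):
    # Material conditional: p -> q is false only when p is true and q is false.
    return (not p) or q

def counterexamples(premises, conclusion):
    """Return truth assignments (P, Q) where every premise holds but the conclusion fails."""
    return [
        (p, q)
        for p, q in product([True, False], repeat=2)
        if all(f(p, q) for f in premises) and not conclusion(p, q)
    ]

# Affirming the consequent: from (P -> Q) and Q, infer P. A formal fallacy.
invalid = counterexamples(
    premises=[lambda p, q: implies(p, q), lambda p, q: q],
    conclusion=lambda p, q: p,
)

# Modus ponens: from (P -> Q) and P, infer Q. A valid form.
valid = counterexamples(
    premises=[lambda p, q: implies(p, q), lambda p, q: p],
    conclusion=lambda p, q: q,
)

print(invalid)  # [(False, True)] — premises true, conclusion false: the form is invalid
print(valid)    # [] — no counterexample exists: the form is valid
```

Note that the counterexample assignment (P false, Q true) matches the definition above: the premises of the fallacious argument can be true while its conclusion is false, which is precisely what validity rules out.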

A special case is a mathematical fallacy, an intentionally invalid mathematical proof, often with the error subtle and somehow concealed. Mathematical fallacies are typically crafted and exhibited for educational purposes, usually taking the form of spurious proofs of obvious contradictions.

A formal fallacy is contrasted with an informal fallacy, which may have a valid logical form and yet be unsound because one or more premises are false.



Calligraphy (from Greek: καλλιγραφία) is a visual art related to writing. It is the design and execution of lettering with a broad-tipped instrument, brush, or other writing instrument. A contemporary calligraphic practice can be defined as "the art of giving form to signs in an expressive, harmonious, and skillful manner".

Modern calligraphy ranges from functional inscriptions and designs to fine-art pieces where the letters may or may not be readable. Classical calligraphy differs from typography and non-classical hand-lettering, though a calligrapher may practice both.

Calligraphy continues to flourish in the forms of wedding invitations and event invitations, font design and typography, original hand-lettered logo design, religious art, announcements, graphic design and commissioned calligraphic art, cut stone inscriptions, and memorial documents. It is also used for props and moving images for film and television, and also for testimonials, birth and death certificates, maps, and other written works.



Incivility is a general term for social behaviour lacking in civility or good manners, on a scale from rudeness or lack of respect for elders, to vandalism and hooliganism, through public drunkenness and threatening behaviour. The word "incivility" is derived from the Latin incivilis, meaning "not of a citizen".

The distinction between plain rudeness and perceived incivility as a threat depends on some notion of civility as structural to society; treating incivility as anything more ominous than bad manners therefore depends on appeal to notions such as its antagonism to the complex concepts of civic virtue or civil society. Incivility has become a contemporary political issue in a number of countries.



The public sphere (German Öffentlichkeit) is an area in social life where individuals can come together to freely discuss and identify societal problems, and through that discussion influence political action. Such a discussion is called public debate and is defined as the expression of views on matters that are of concern to the public—often, but not always, with opposing or diverging views being expressed by participants in the discussion. Public debate takes place mostly through the mass media, but also at meetings or through social media, academic publications and government policy documents. The term was originally coined by German philosopher Jürgen Habermas who defined the public sphere as "made up of private people gathered together as a public and articulating the needs of society with the state". Communication scholar Gerard A. Hauser defines it as "a discursive space in which individuals and groups associate to discuss matters of mutual interest and, where possible, to reach a common judgment about them". The public sphere can be seen as "a theater in modern societies in which political participation is enacted through the medium of talk" and "a realm of social life in which public opinion can be formed".

Describing the emergence of the public sphere in the 18th century, Jürgen Habermas noted that the public realm, or sphere, originally was "coextensive with public authority", while "the private sphere comprised civil society in the narrower sense, that is to say, the realm of commodity exchange and of social labor". Whereas the "sphere of public authority" dealt with the state, or realm of the police, and the ruling class, or the feudal authorities (church, princes and nobility) the "authentic 'public sphere'", in a political sense, arose at that time from within the private realm, specifically, in connection with literary activities, the world of letters. This new public sphere spanned the public and the private realms, and "through the vehicle of public opinion it put the state in touch with the needs of society". "This area is conceptually distinct from the state: it [is] a site for the production and circulation of discourses that can in principle be critical of the state."[10] The public sphere "is also distinct from the official economy; it is not an arena of market relations but rather one of the discursive relations, a theater for debating and deliberating rather than for buying and selling". These distinctions between "state apparatuses, economic markets, and democratic associations...are essential to democratic theory". The people themselves came to see the public sphere as a regulatory institution against the authority of the state. The study of the public sphere centers on the idea of participatory democracy, and how public opinion becomes political action.

Public sphere theory holds that the government's laws and policies should be steered by the public sphere and that the only legitimate governments are those that listen to it. "Democratic governance rests on the capacity of and opportunity for citizens to engage in enlightened debate". Much of the debate over the public sphere concerns its basic theoretical structure, how information is deliberated within it, and what influence it has over society.



A demagogue /ˈdɛməɡɒɡ/ (from Greek δημαγωγός, a popular leader, a leader of a mob, from δῆμος, people, populace, the commons + ἀγωγός leading, leader) or rabble-rouser is a leader who gains popularity in a democracy by exploiting emotions, prejudice, and ignorance to arouse the common people against elites, whipping up the passions of the crowd and shutting down reasoned deliberation. Demagogues overturn established norms of political conduct, or promise or threaten to do so.

Historian Reinhard Luthin defined demagogue thus: "What is a demagogue? He is a politician skilled in oratory, flattery and invective; evasive in discussing vital issues; promising everything to everybody; appealing to the passions rather than the reason of the public; and arousing racial, religious, and class prejudices—a man whose lust for power without recourse to principle leads him to seek to become a master of the masses. He has for centuries practiced his profession of 'man of the people'. He is a product of a political tradition nearly as old as western civilization itself."

Demagogues have appeared in democracies since ancient Athens. They exploit a fundamental weakness in democracy: because ultimate power is held by the people, it is possible for the people to give that power to someone who appeals to the lowest common denominator of a large segment of the population. Demagogues usually advocate immediate, forceful action to address a crisis while accusing moderate and thoughtful opponents of weakness or disloyalty. If elected to high executive office, demagogues typically unravel constitutional limits on executive power and attempt to convert their democracy to dictatorship.



Psychological manipulation is a type of social influence that aims to change the behavior or perception of others through indirect, deceptive, or underhanded tactics. By advancing the interests of the manipulator, often at another's expense, such methods could be considered exploitative and devious.

Social influence is not necessarily negative. For example, people such as friends, family members and doctors can try to persuade a person to change clearly unhelpful habits and behaviors. Social influence is generally perceived to be harmless when it respects the right of the influenced to accept or reject it, and is not unduly coercive. Depending on the context and motivations, however, social influence may constitute underhanded manipulation.



In sociology, the post-industrial society is the stage of society's development when the service sector generates more wealth than the manufacturing sector of the economy.

The term was originated by Alain Touraine and is closely related to similar sociological theoretical concepts such as post-Fordism, information society, knowledge economy, post-industrial economy, liquid modernity, and network society. They all can be used in economics or social science disciplines as a general theoretical backdrop in research design.

As the term has been used, a few common themes, including the ones below, have begun to emerge.

   The economy undergoes a transition from the production of goods to the provision of services.
   Knowledge becomes a valued form of capital; see Human capital.
   Producing ideas is the main way to grow the economy.
   Through processes of globalization and automation, the value and importance to the economy of blue-collar, unionized work, including manual labor (e.g., assembly-line work) decline, and those of professional workers (e.g., scientists, creative-industry professionals, and IT professionals) grow in value and prevalence.
   Behavioral and information sciences and technologies are developed and implemented. (e.g., behavioral economics, information architecture, cybernetics, game theory and information theory.)



The reputation of a social entity (a person, a social group, an organization, or a place) is an opinion about that entity, typically as a result of social evaluation on a set of criteria, such as behavior or performance.

Reputation is known to be a ubiquitous, spontaneous, and highly efficient mechanism of social control in natural societies. It is a subject of study in social, management, and technological sciences. Its influence ranges from competitive settings, like markets, to cooperative ones, like firms, organizations, institutions and communities. Furthermore, reputation acts on different levels of agency, individual and supra-individual. At the supra-individual level, it concerns groups, communities, collectives and abstract social entities (such as firms, corporations, organizations, countries, cultures and even civilizations). It affects phenomena of different scales, from everyday life to relationships between nations. Reputation is a fundamental instrument of social order, based upon distributed, spontaneous social control.

The concept of reputation is considered important in business, politics, education, online communities, and many other fields, and it may be considered as a reflection of that social entity's identity.



A race is a grouping of humans based on shared physical or social qualities into categories generally viewed as distinct by society. The term was first used to refer to speakers of a common language and then to denote national affiliations. By the 17th century the term began to refer to physical (phenotypical) traits. Modern scholarship regards race as a social construct, an identity which is assigned based on rules made by society. While partially based on physical similarities within groups, race does not have an inherent physical or biological meaning.

Social conceptions and groupings of races have varied over time, often involving folk taxonomies that define essential types of individuals based on perceived traits. Today, scientists consider such biological essentialism obsolete, and generally discourage racial explanations for collective differentiation in both physical and behavioral traits.

Even though there is a broad scientific agreement that essentialist and typological conceptualizations of race are untenable, scientists around the world continue to conceptualize race in widely differing ways. While some researchers continue to use the concept of race to make distinctions among fuzzy sets of traits or observable differences in behavior, others in the scientific community suggest that the idea of race is inherently naive or simplistic. Still others argue that, among humans, race has no taxonomic significance because all living humans belong to the same subspecies, Homo sapiens sapiens.

Since the second half of the 20th century, the association of race with the discredited theories of scientific racism has contributed to race becoming increasingly seen as a largely pseudoscientific system of classification. Although still used in general contexts, race has often been replaced by less ambiguous and loaded terms: populations, people(s), ethnic groups, or communities, depending on context.



An ethnic group or ethnicity is a named social category of people who identify with each other on the basis of shared attributes that distinguish them from other groups such as a common set of traditions, ancestry, language, history, society, culture, nation, religion, or social treatment within their residing area. Ethnicity is sometimes used interchangeably with the term nation, particularly in cases of ethnic nationalism, and is separate from, but related to the concept of races.

Ethnicity can be an inherited status or based on the society within which one lives. Membership of an ethnic group tends to be defined by a shared cultural heritage, ancestry, origin myth, history, homeland, language or dialect, symbolic systems such as religion, mythology and ritual, cuisine, dressing style, art or physical appearance. Ethnic groups often continue to speak related languages and share a similar gene pool.

By way of language shift, acculturation, adoption and religious conversion, individuals or groups may over time shift from one ethnic group to another. Ethnic groups may be subdivided into subgroups or tribes, which over time may become separate ethnic groups themselves due to endogamy or physical isolation from the parent group. Conversely, formerly separate ethnicities can merge to form a pan-ethnicity and may eventually merge into one single ethnicity. Whether through division or amalgamation, the formation of a separate ethnic identity is referred to as ethnogenesis.



Honour (British English) or honor (American English; see spelling differences) is the idea of a bond between an individual and a society as a quality of a person that is both of social teaching and of personal ethos, that manifests itself as a code of conduct, and has various elements such as valour, chivalry, honesty, and compassion. It is an abstract concept entailing a perceived quality of worthiness and respectability that affects both the social standing and the self-evaluation of an individual or institution such as a family, school, regiment or nation. Accordingly, individuals (or institutions) are assigned worth and stature based on the harmony of their actions with a specific code of honour, and the moral code of the society at large.

Samuel Johnson, in his A Dictionary of the English Language (1755), defined honour as having several senses, the first of which was "nobility of soul, magnanimity, and a scorn of meanness". This sort of honour derives from the perceived virtuous conduct and personal integrity of the person endowed with it. On the other hand, Johnson also defined honour in relationship to "reputation" and "fame"; to "privileges of rank or birth", and as "respect" of the kind which "places an individual socially and determines his right to precedence". This sort of honour is often not so much a function of moral or ethical excellence, as it is a consequence of power. Finally, with respect to sexuality, honour has traditionally been associated with (or identical to) "chastity" or "virginity", or in case of married men and women, "fidelity". Some have argued that honour should be seen more as a rhetoric, or set of possible actions, than as a code.



Paternalism is action that limits a person's or group's liberty or autonomy and is intended to promote their own good. Paternalism can also imply that the behavior is against or regardless of the will of a person, or that the behavior expresses an attitude of superiority. Paternalism, paternalistic and paternalist have all been used as pejoratives.

The word paternalism is from the Latin pater "father" via the adjective paternus "fatherly", which in Medieval Latin became paternalis. Some such as John Stuart Mill think paternalism to be appropriate towards children, saying: "It is, perhaps, hardly necessary to say that this doctrine is meant to apply only to human beings in the maturity of their faculties. We are not speaking of children, or of young persons below the age which the law may fix as that of manhood or womanhood." Paternalism towards adults is sometimes thought of as treating them as if they were children.


In philosophy, physicalism is the metaphysical thesis that "everything is physical", that there is "nothing over and above" the physical, or that everything supervenes on the physical.[2] Physicalism is a form of ontological monism—a "one substance" view of the nature of reality as opposed to a "two-substance" (dualism) or "many-substance" (pluralism) view. Both the definition of "physical" and the meaning of physicalism have been debated.

Physicalism is closely related to materialism. Physicalism grew out of materialism with advancements of the physical sciences in explaining observed phenomena. The terms are often used interchangeably, although they are sometimes distinguished, for example on the basis of physics describing more than just matter (including energy and physical law).

According to a 2009 survey, physicalism is the majority view among philosophers, but there remains significant opposition to it. Neuroplasticity is one piece of evidence that has been used in support of a non-physicalist view. The philosophical zombie argument is another attempt to challenge physicalism.



GLOSSARY OF ENGINEERING
Absolute electrode potential
   In electrochemistry, according to an IUPAC definition, the electrode potential of a metal measured with respect to a universal reference system (without any additional metal–solution interface).
Absolute pressure
   Is zero-referenced against a perfect vacuum, using an absolute scale, so it is equal to gauge pressure plus atmospheric pressure.
Absolute zero
   Is the lower limit of the thermodynamic temperature scale, a state at which the enthalpy and entropy of a cooled ideal gas reach their minimum value, taken as 0. Absolute zero is the point at which the fundamental particles of nature have minimal vibrational motion, retaining only quantum mechanical, zero-point energy-induced particle motion. The theoretical temperature is determined by extrapolating the ideal gas law; by international agreement, absolute zero is taken as −273.15° on the Celsius scale (International System of Units), which equals −459.67° on the Fahrenheit scale (United States customary units or Imperial units). The corresponding Kelvin and Rankine temperature scales set their zero points at absolute zero by definition.
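The scale conversions stated above can be verified directly; a minimal sketch in Python (the function names are illustrative):

```python
def celsius_to_kelvin(t_c):
    # The Kelvin scale sets its zero point at absolute zero: K = °C + 273.15
    return t_c + 273.15

def celsius_to_fahrenheit(t_c):
    # °F = °C * 9/5 + 32
    return t_c * 9 / 5 + 32

absolute_zero_c = -273.15
print(celsius_to_kelvin(absolute_zero_c))      # 0.0 (kelvin)
print(celsius_to_fahrenheit(absolute_zero_c))  # ≈ -459.67 (degrees Fahrenheit)
```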
Absorbance
   Absorbance or decadic absorbance is the common logarithm of the ratio of incident to transmitted radiant power through a material, and spectral absorbance or spectral decadic absorbance is the common logarithm of the ratio of incident to transmitted spectral radiant power through a material.
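In symbols, A = log10(Φ_incident / Φ_transmitted); a minimal sketch with hypothetical values:

```python
import math

def absorbance(incident_power, transmitted_power):
    # Decadic absorbance: common logarithm of the ratio of incident to
    # transmitted radiant power through a material.
    return math.log10(incident_power / transmitted_power)

# A sample transmitting 10% of the incident light has absorbance 1:
print(absorbance(100.0, 10.0))  # 1.0
```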
AC power
   Electric power delivered by alternating current; common household power is AC.
Acceleration
   The rate at which the velocity of a body changes with time, and the direction in which that change is acting.
Acid
   A molecule or ion capable of donating a hydron (proton or hydrogen ion H+), or, alternatively, capable of forming a covalent bond with an electron pair (a Lewis acid).
Acid-base reaction
   A chemical reaction that occurs between an acid and a base, which can be used to determine pH.
Acid strength
   In strong acids, most of the molecules give up a hydrogen ion and become ionized.
Acoustics
   The scientific study of sound.
Activated sludge
   A type of wastewater treatment process for treating sewage or industrial wastewaters using aeration and a biological floc composed of bacteria and protozoa.
Activated sludge model
   A generic name for a group of mathematical methods to model activated sludge systems.
Active transport
   In cellular biology, active transport is the movement of molecules across a membrane from a region of their lower concentration to a region of their higher concentration—against the concentration gradient. Active transport requires cellular energy to achieve this movement. There are two types of active transport: primary active transport that uses ATP, and secondary active transport that uses an electrochemical gradient. An example of active transport in human physiology is the uptake of glucose in the intestines.
Actuator
   A device that accepts two inputs (a control signal and an energy source) and outputs kinetic energy in the form of physical movement (linear, rotary, or oscillatory). The control signal specifies which motion should be taken. The energy source is typically an electric current, hydraulic pressure, or pneumatic pressure. An actuator can be the final element of a control loop.
Adenosine triphosphate
   A complex organic chemical that provides energy to drive many processes in living cells, e.g. muscle contraction, nerve impulse propagation, chemical synthesis. Found in all forms of life, ATP is often referred to as the "molecular unit of currency" of intracellular energy transfer.
Adhesion
   The tendency of dissimilar particles or surfaces to cling to one another (cohesion refers to the tendency of similar or identical particles/surfaces to cling to one another).
Adiabatic process
   A process in which no heat energy is exchanged with the surroundings.
Adiabatic wall
   A barrier through which heat energy cannot pass.
Aerobic digestion
   A process in sewage treatment designed to reduce the volume of sewage sludge and make it suitable for subsequent use.
Aerodynamics
   The study of the motion of air, particularly its interaction with a solid object, such as an airplane wing. It is a sub-field of fluid dynamics and gas dynamics, and many aspects of aerodynamics theory are common to these fields.
Aerospace engineering
   Is the primary field of engineering concerned with the development of aircraft and spacecraft. It has two major and overlapping branches: aeronautical engineering and astronautical engineering. Avionics engineering is similar, but deals with the electronics side of aerospace engineering.
Afocal system
   An optical system that produces no net convergence or divergence of the beam, i.e. has an infinite effective focal length.
Agricultural engineering
   The profession of designing machinery, processes, and systems for use in agriculture.
Albedo
   A measure of the fraction of light reflected from an astronomical body or other object.
Alkane
   An alkane, or paraffin (a historical name that also has other meanings), is an acyclic saturated hydrocarbon. In other words, an alkane consists of hydrogen and carbon atoms arranged in a tree structure in which all the carbon–carbon bonds are single.
Alkene
   An unsaturated hydrocarbon that contains at least one carbon–carbon double bond. The words alkene and olefin are often used interchangeably.
Alkyne
   Is an unsaturated hydrocarbon containing at least one carbon–carbon triple bond. The simplest acyclic alkynes, with only one triple bond and no other functional groups, form a homologous series with the general chemical formula CnH2n−2.
Alloy
   Is a combination of metals, or of a metal and another element. Alloys are defined by a metallic bonding character.
Alpha particle
   Alpha particles consist of two protons and two neutrons bound together into a particle identical to a helium-4 nucleus. They are generally produced in the process of alpha decay, but may also be produced in other ways. Alpha particles are named after the first letter in the Greek alphabet, α.
Alternating current
   Electrical current that regularly reverses direction.
Alternative hypothesis
   In statistical hypothesis testing, the alternative hypothesis (or maintained hypothesis or research hypothesis) and the null hypothesis are the two rival hypotheses compared by a statistical hypothesis test. In the domain of science, two rival hypotheses can be compared by explanatory power and predictive power.
Ammeter
   An instrument that measures current.
Amino acids
   Are organic compounds containing amine (-NH2) and carboxyl (-COOH) functional groups, along with a side chain (R group) specific to each amino acid. The key elements of an amino acid are carbon (C), hydrogen (H), oxygen (O), and nitrogen (N), although other elements are found in the side chains of certain amino acids. About 500 naturally occurring amino acids are known (though only 20 appear in the genetic code) and can be classified in many ways.
Amorphous solid
   An amorphous (from the Greek a, without, morphé, shape, form) or non-crystalline solid is a solid that lacks the long-range order that is characteristic of a crystal.
Ampere
   The SI unit of current flow, one coulomb per second.
Amphoterism
   In chemistry, an amphoteric compound is a molecule or ion that can react both as an acid and as a base. Many metals (such as copper, zinc, tin, lead, aluminium, and beryllium) form amphoteric oxides or hydroxides. Amphoterism depends on the oxidation states of the oxide. Al2O3 is an example of an amphoteric oxide.
Amplifier
   A device that replicates a signal with increased power.
Amplitude
   The amplitude of a periodic variable is a measure of its change over a single period (such as time or spatial period). There are various definitions of amplitude, which are all functions of the magnitude of the difference between the variable's extreme values. In older texts the phase is sometimes called the amplitude.
Anaerobic digestion
   Is a collection of processes by which microorganisms break down biodegradable material in the absence of oxygen. The process is used for industrial or domestic purposes to manage waste or to produce fuels. Much of the fermentation used industrially to produce food and drink products, as well as home fermentation, uses anaerobic digestion.
Angular acceleration
   Is the rate of change of angular velocity. In three dimensions, it is a pseudovector. In SI units, it is measured in radians per second squared (rad/s2), and is usually denoted by the Greek letter alpha (α).
Angular momentum
   In physics, angular momentum (rarely, moment of momentum or rotational momentum) is the rotational equivalent of linear momentum. It is an important quantity in physics because it is a conserved quantity—the total angular momentum of a system remains constant unless acted on by an external torque.
Angular velocity
   In physics, the angular velocity of a particle is the rate at which it rotates around a chosen center point: that is, the time rate of change of its angular displacement relative to the origin (in layman's terms: how quickly an object goes around something over a period of time, e.g. how fast the Earth orbits the Sun). It is measured in angle per unit time, radians per second in SI units, and is usually represented by the symbol omega (ω, sometimes Ω). By convention, positive angular velocity indicates counter-clockwise rotation, while negative is clockwise.
Anion
   Is an ion with more electrons than protons, giving it a net negative charge (since electrons are negatively charged and protons are positively charged).
Annealing (metallurgy)
   A heat treatment process that relieves internal stresses.
Annihilation
   In particle physics, annihilation is the process that occurs when a subatomic particle collides with its respective antiparticle to produce other particles, such as an electron colliding with a positron to produce two photons. The total energy and momentum of the initial pair are conserved in the process and distributed among a set of other particles in the final state. Antiparticles have exactly opposite additive quantum numbers from particles, so the sums of all quantum numbers of such an original pair are zero. Hence, any set of particles may be produced whose total quantum numbers are also zero as long as conservation of energy and conservation of momentum are obeyed.
Anode
   The electrode at which current enters a device such as an electrochemical cell or vacuum tube.
ANSI
   The American National Standards Institute is a private non-profit organization that oversees the development of voluntary consensus standards for products, services, processes, systems, and personnel in the United States. The organization also coordinates U.S. standards with international standards so that American products can be used worldwide.
Anti-gravity
   Anti-gravity (also known as non-gravitational field) is a theory of creating a place or object that is free from the force of gravity. It does not refer to the lack of weight under gravity experienced in free fall or orbit, or to balancing the force of gravity with some other force, such as electromagnetism or aerodynamic lift.
Applied engineering
   Is the field concerned with the application of management, design, and technical skills for the design and integration of systems, the execution of new product designs, the improvement of manufacturing processes, and the management and direction of physical and/or technical functions of a firm or organization. Applied-engineering degreed programs typically include instruction in basic engineering principles, project management, industrial processes, production and operations management, systems integration and control, quality control, and statistics.
Applied mathematics
   Mathematics used for solutions of practical problems, as opposed to pure mathematics.
Arc length
   Determining the length of an irregular arc segment is also called rectification of a curve. Historically, many methods were used for specific curves. The advent of infinitesimal calculus led to a general formula that provides closed-form solutions in some cases.
Archimedes' principle
   Archimedes' principle states that the upward buoyant force that is exerted on a body immersed in a fluid, whether fully or partially submerged, is equal to the weight of the fluid that the body displaces and acts in the upward direction at the center of mass of the displaced fluid. Archimedes' principle is a law of physics fundamental to fluid mechanics. It was formulated by Archimedes of Syracuse.
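The principle reduces to F_b = ρ_fluid · V_displaced · g; a minimal numeric sketch (the values are hypothetical):

```python
def buoyant_force(fluid_density, displaced_volume, g=9.81):
    # Archimedes' principle: F_b = rho * V * g, in newtons.
    return fluid_density * displaced_volume * g

# One litre (0.001 m^3) fully submerged in water (~1000 kg/m^3):
print(buoyant_force(1000.0, 0.001))  # ≈ 9.81 N
```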
Area moment of inertia
   The 2nd moment of area, also known as moment of inertia of plane area, area moment of inertia, or second area moment, is a geometrical property of an area which reflects how its points are distributed with regard to an arbitrary axis. The second moment of area is typically denoted with either an I, for an axis that lies in the plane, or a J, for an axis perpendicular to the plane. In both cases, it is calculated with a multiple integral over the object in question. Its dimension is L (length) to the fourth power. Its unit of dimension, when working with the International System of Units, is meters to the fourth power, m4.
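For the common case of a solid rectangle of width b and height h, the integral evaluates to I = b·h³/12 about the horizontal centroidal axis; a small sketch (the dimensions are hypothetical):

```python
def rect_second_moment(b, h):
    # Second moment of area of a b x h rectangle about its horizontal
    # centroidal axis: I = b * h^3 / 12, in units of length^4.
    return b * h ** 3 / 12

# A 0.1 m wide, 0.2 m tall cross-section:
print(rect_second_moment(0.1, 0.2))  # ≈ 6.67e-05 m^4
```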
Arithmetic mean
   In mathematics and statistics, the arithmetic mean or simply the mean or average when the context is clear, is the sum of a collection of numbers divided by the number of numbers in the collection.
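The definition translates directly to code; a minimal sketch:

```python
def arithmetic_mean(values):
    # Sum of the collection divided by the number of values in it.
    return sum(values) / len(values)

print(arithmetic_mean([2, 4, 9]))  # 5.0
```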
Arithmetic progression
   In mathematics, an arithmetic progression (AP) or arithmetic sequence is a sequence of numbers such that the difference between the consecutive terms is constant. Difference here means the second minus the first. For instance, the sequence 5, 7, 9, 11, 13, 15, . . . is an arithmetic progression with common difference of 2.
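The nth term follows directly from the definition, a_n = a_1 + (n − 1)·d; a minimal sketch reproducing the example sequence:

```python
def ap_term(a1, d, n):
    # nth term of an arithmetic progression (1-indexed):
    # a_n = a_1 + (n - 1) * d
    return a1 + (n - 1) * d

# The sequence 5, 7, 9, 11, 13, 15 with common difference 2:
print([ap_term(5, 2, n) for n in range(1, 7)])  # [5, 7, 9, 11, 13, 15]
```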
Aromatic hydrocarbon
   An aromatic hydrocarbon or arene (or sometimes aryl hydrocarbon) is a hydrocarbon with sigma bonds and delocalized pi electrons between carbon atoms forming a circle. In contrast, aliphatic hydrocarbons lack this delocalization. The term "aromatic" was assigned before the physical mechanism determining aromaticity was discovered; the term was coined as such simply because many of the compounds have a sweet or pleasant odour. The configuration of six carbon atoms in aromatic compounds is known as a benzene ring, after the simplest possible such hydrocarbon, benzene. Aromatic hydrocarbons can be monocyclic (MAH) or polycyclic (PAH).
Arrhenius equation
   The Arrhenius equation is a formula for the temperature dependence of reaction rates. The equation was proposed by Svante Arrhenius in 1889, based on the work of Dutch chemist Jacobus Henricus van 't Hoff, who had noted in 1884 that van 't Hoff's equation for the temperature dependence of equilibrium constants suggests such a formula for the rates of both forward and reverse reactions. The equation is widely used to determine the rates of chemical reactions and to calculate activation energies. Arrhenius provided a physical justification and interpretation for the formula; currently, it is best seen as an empirical relationship. It can be used to model the temperature variation of diffusion coefficients, population of crystal vacancies, creep rates, and many other thermally induced processes and reactions. The Eyring equation, developed in 1935, also expresses the relationship between rate and energy.
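The equation's usual form is k = A·exp(-Ea / (R·T)); a minimal sketch, where the pre-exponential factor A and activation energy Ea below are made-up illustrative values, not data from the source:

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

# Arrhenius equation: rate constant k = A * exp(-Ea / (R * T)).
def rate_constant(A, Ea, T):
    return A * math.exp(-Ea / (R * T))

A, Ea = 1.0e13, 80_000.0          # 1/s and J/mol (hypothetical values)
k_300 = rate_constant(A, Ea, 300.0)
k_310 = rate_constant(A, Ea, 310.0)
assert k_310 > k_300              # for Ea > 0, the rate rises with temperature
```

The exponential dependence on 1/T is what makes even a 10 K increase raise the rate constant substantially.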
Artificial intelligence
   The intelligence exhibited by machines, and the branch of computer science that aims to create it.
Assembly language
   A computer programming language where most statements correspond to one or a few machine op-codes.
Atomic orbital
   In atomic theory and quantum mechanics, an atomic orbital is a mathematical function that describes the wave-like behavior of either one electron or a pair of electrons in an atom. This function can be used to calculate the probability of finding any electron of an atom in any specific region around the atom's nucleus. The term atomic orbital may also refer to the physical region or space where the electron can be calculated to be present, as defined by the particular mathematical form of the orbital.
Atomic packing factor
   The fraction of the volume in a crystal structure that is occupied by its constituent atoms.
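As a worked example under the standard hard-sphere model (a common textbook assumption, not stated in the source), the packing factor of a face-centred cubic cell follows from its geometry:

```python
import math

# Atomic packing factor: fraction of unit-cell volume occupied by atoms,
# modelled as touching hard spheres. For FCC: 4 atoms per cell and cube
# edge a = 2 * sqrt(2) * r.
def apf_fcc(r=1.0):
    a = 2 * math.sqrt(2) * r               # cube edge in terms of atomic radius
    atom_volume = 4 * (4 / 3) * math.pi * r**3   # 4 atoms per FCC cell
    return atom_volume / a**3

# Closed form: pi / (3 * sqrt(2)) ≈ 0.74, independent of r.
assert abs(apf_fcc() - math.pi / (3 * math.sqrt(2))) < 1e-12
```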
Audio frequency
   An audio frequency (abbreviation: AF) or audible frequency is characterized as a periodic vibration whose frequency is audible to the average human. The SI unit of audio frequency is the hertz (Hz). It is the property of sound that most determines pitch.
Austenitization
   Austenitization means to heat the iron, iron-based metal, or steel to a temperature at which it changes crystal structure from ferrite to austenite. The more open structure of the austenite is then able to absorb carbon from the iron-carbides in carbon steel. An incomplete initial austenitization can leave undissolved carbides in the matrix. For some irons, iron-based metals, and steels, the presence of carbides may occur during the austenitization step. The term commonly used for this is two-phase austenitization.
Automation
   Automation is the technology by which a process or procedure is performed with minimal human assistance. Automation or automatic control is the use of various control systems for operating equipment such as machinery, processes in factories, boilers and heat-treating ovens, switching on telephone networks, and steering and stabilization of ships, aircraft, and other applications and vehicles with minimal or reduced human intervention. Some processes have been completely automated.
Autonomous vehicle
   A vehicle capable of driving from one point to another without input from a human operator.
Azimuthal quantum number
   The azimuthal quantum number is a quantum number for an atomic orbital that determines its orbital angular momentum and describes the shape of the orbital. It is the second of a set of quantum numbers which describe the unique quantum state of an electron (the others being the principal quantum number, the magnetic quantum number, and the spin quantum number). It is also known as the orbital angular momentum quantum number, orbital quantum number, or second quantum number, and, following spectroscopic notation, is symbolized as ℓ.



Applied science is the use of existing scientific knowledge for practical goals, such as technology or inventions.

Within natural science, disciplines that are basic science develop basic information to explain and perhaps predict phenomena in the natural world. Applied science is the use of scientific processes and knowledge as the means to achieve a particular practical or useful result. This includes a broad range of applied science related fields, including engineering and medicine.

Applied science can also apply formal science, such as statistics and probability theory, as in epidemiology. Genetic epidemiology is an applied science applying both biological and statistical methods.



Natural science is a branch of science concerned with the description, prediction, and understanding of natural phenomena, based on empirical evidence from observation and experimentation. Mechanisms such as peer review and repeatability of findings are used to try to ensure the validity of scientific advances.

Natural science can be divided into two main branches: life science and physical science. Life science is alternatively known as biology, and physical science is subdivided into branches: physics, chemistry, astronomy and Earth science. These branches of natural science may be further divided into more specialized branches (also known as fields). As empirical sciences, natural sciences use tools from the formal sciences, such as mathematics and logic, converting information about nature into measurements which can be explained as clear statements of the "laws of nature".

Modern natural science succeeded more classical approaches to natural philosophy, usually traced to Taoist traditions in Asia and, in the Occident, to ancient Greece. Galileo, Descartes, Bacon, and Newton debated the benefits of using approaches which were more mathematical and more experimental in a methodical way. Still, philosophical perspectives, conjectures, and presuppositions, often overlooked, remain necessary in natural science. Systematic data collection, including discovery science, succeeded natural history, which emerged in the 16th century by describing and classifying plants, animals, minerals, and so on. Today, "natural history" suggests observational descriptions aimed at popular audiences.



Physics (from Ancient Greek: φυσική (ἐπιστήμη), romanized: physikḗ (epistḗmē), lit. 'knowledge of nature', from φύσις phýsis 'nature') is the natural science that studies matter, its motion and behavior through space and time, and the related entities of energy and force. Physics is one of the most fundamental scientific disciplines, and its main goal is to understand how the universe behaves.

Physics is one of the oldest academic disciplines and, through its inclusion of astronomy, perhaps the oldest. Over much of the past two millennia, physics, chemistry, biology, and certain branches of mathematics were a part of natural philosophy, but during the Scientific Revolution in the 17th century these natural sciences emerged as unique research endeavors in their own right. Physics intersects with many interdisciplinary areas of research, such as biophysics and quantum chemistry, and the boundaries of physics are not rigidly defined. New ideas in physics often explain the fundamental mechanisms studied by other sciences and suggest new avenues of research in academic disciplines such as mathematics and philosophy.

Advances in physics often enable advances in new technologies. For example, advances in the understanding of electromagnetism, solid-state physics, and nuclear physics led directly to the development of new products that have dramatically transformed modern-day society, such as television, computers, domestic appliances, and nuclear weapons; advances in thermodynamics led to the development of industrialization; and advances in mechanics inspired the development of calculus.



Formal science is a branch of science studying formal language disciplines concerned with formal systems, such as logic, mathematics, statistics, theoretical computer science, artificial intelligence, information theory, game theory, systems theory, decision theory, and theoretical linguistics. Whereas the natural sciences and social sciences seek to characterize physical systems and social systems, respectively, using empirical methods, the formal sciences are language tools concerned with characterizing abstract structures described by symbolic systems. The formal sciences aid the natural and social sciences by providing information about the structures the latter use to describe the world, and what inferences may be made about them.



Genetic epidemiology is the study of the role of genetic factors in determining health and disease in families and in populations, and the interplay of such genetic factors with environmental factors. Genetic epidemiology seeks to derive a statistical and quantitative analysis of how genetics work in large groups.




Population genetics is a subfield of genetics that deals with genetic differences within and between populations, and is a part of evolutionary biology. Studies in this branch of biology examine such phenomena as adaptation, speciation, and population structure.

Population genetics was a vital ingredient in the emergence of the modern evolutionary synthesis. Its primary founders were Sewall Wright, J. B. S. Haldane and Ronald Fisher, who also laid the foundations for the related discipline of quantitative genetics. Traditionally a highly mathematical discipline, modern population genetics encompasses theoretical, lab, and field work. Population genetic models are used both for statistical inference from DNA sequence data and for proof/disproof of concept.

What sets population genetics apart today from newer, more phenotypic approaches to modelling evolution, such as evolutionary game theory and adaptive dynamics, is its emphasis on such genetic phenomena as dominance, epistasis, the degree to which genetic recombination breaks up linkage disequilibrium, and the random phenomena of mutation and genetic drift. This makes it appropriate for comparison to population genomics data.



Quantitative genetics deals with phenotypes that vary continuously (in characters such as height or mass)—as opposed to discretely identifiable phenotypes and gene-products (such as eye-colour, or the presence of a particular biochemical).

Both branches use the frequencies of different alleles of a gene in breeding populations (gamodemes), and combine them with concepts from simple Mendelian inheritance to analyze inheritance patterns across generations and descendant lines. While population genetics can focus on particular genes and their subsequent metabolic products, quantitative genetics focuses more on the outward phenotypes, and makes summaries only of the underlying genetics.

Due to the continuous distribution of phenotypic values, quantitative genetics must employ many other statistical methods (such as the effect size, the mean and the variance) to link phenotypes (attributes) to genotypes. Some phenotypes may be analyzed either as discrete categories or as continuous phenotypes, depending on the definition of cut-off points, or on the metric used to quantify them. Mendel himself had to discuss this matter in his famous paper, especially with respect to his peas attribute tall/dwarf, which actually was "length of stem". Analysis of quantitative trait loci, or QTL, is a more recent addition to quantitative genetics, linking it more directly to molecular genetics.
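The use of allele frequencies mentioned above can be illustrated with the Hardy-Weinberg principle, a standard population-genetics result (it is not stated in the source): with allele frequencies p and q = 1 - p in a large random-mating population, the expected genotype frequencies are p², 2pq, and q².

```python
# Hardy-Weinberg sketch: expected genotype frequencies for a two-allele gene
# given the frequency p of one allele in a random-mating population.
def genotype_frequencies(p):
    q = 1 - p
    return p * p, 2 * p * q, q * q   # homozygous AA, heterozygous Aa, homozygous aa

aa, ab, bb = genotype_frequencies(0.7)
assert abs(aa + ab + bb - 1.0) < 1e-12   # the three classes exhaust the population
assert abs(ab - 0.42) < 1e-12            # 2 * 0.7 * 0.3
```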



An allele (UK: /ˈæliːl/, /əˈliːl/; US: /əˈliːl/; modern formation from Greek ἄλλος állos, "other")[1][2][3] is one of two, or more, forms of a given gene.[4] E.g. the ABO blood grouping is controlled by the ABO gene which has six common alleles. Nearly every living human's phenotype for the ABO gene is some combination of just these six alleles.[5][6] An allele is one of two, or more, versions of the same gene at the same place on a chromosome. It can also refer to different sequence variations for a several-hundred base-pair or more region of the genome that codes for a protein. Alleles can come in different extremes of size. At the lowest possible size an allele can be a single nucleotide polymorphism (SNP).[7] At the higher end, it can be up to several thousand base-pairs long.[8][9] Most alleles result in little or no observable change in the function of the protein the gene codes for.

However, sometimes, different alleles can result in different observable phenotypic traits, such as different pigmentation. A notable example of this is Gregor Mendel's discovery that the white and purple flower colors in pea plants were the result of "pure line" traits, that is a single gene with two alleles.

All multicellular organisms have two sets of chromosomes at some point in their life cycle; that is, they are diploid. In this case, the chromosomes can be paired. Each chromosome in the pair contains the same genes in the same order, and place, along the length of the chromosome. For a given gene, if the two chromosomes contain the same allele, they, and the organism, are homozygous with respect to that gene. If the alleles are different, they, and the organism, are heterozygous with respect to that gene.



ENGINEERING BRANCHES
Engineering is the discipline and profession that applies scientific theories, mathematical methods, and empirical evidence to design, create, and analyze technological solutions cognizant of safety, human factors, physical laws, regulations, practicality, and cost. In the contemporary era, engineering is generally considered to consist of the major primary branches of chemical engineering, civil engineering, electrical engineering, and mechanical engineering. There are numerous other engineering subdisciplines and interdisciplinary subjects that may or may not be part of these major engineering branches.



Mathematics (from Greek: μάθημα, máthēma, 'knowledge, study, learning') includes the study of such topics as quantity (number theory), structure (algebra), space (geometry), and change (mathematical analysis). It has no generally accepted definition.

Mathematicians seek and use patterns to formulate new conjectures; they resolve the truth or falsity of such by mathematical proof. When mathematical structures are good models of real phenomena, mathematical reasoning can be used to provide insight or predictions about nature. Through the use of abstraction and logic, mathematics developed from counting, calculation, measurement, and the systematic study of the shapes and motions of physical objects. Practical mathematics has been a human activity from as far back as written records exist. The research required to solve mathematical problems can take years or even centuries of sustained inquiry.

Rigorous arguments first appeared in Greek mathematics, most notably in Euclid's Elements. Since the pioneering work of Giuseppe Peano (1858–1932), David Hilbert (1862–1943), and others on axiomatic systems in the late 19th century, it has become customary to view mathematical research as establishing truth by rigorous deduction from appropriately chosen axioms and definitions. Mathematics developed at a relatively slow pace until the Renaissance, when mathematical innovations interacting with new scientific discoveries led to a rapid increase in the rate of mathematical discovery that has continued to the present day.

Mathematics is essential in many fields, including natural science, engineering, medicine, finance, and the social sciences. Applied mathematics has led to entirely new mathematical disciplines, such as statistics and game theory. Mathematicians engage in pure mathematics (mathematics for its own sake) without having any application in mind, but practical applications for what began as pure mathematics are often discovered later.



Mental set was first articulated by Abraham Luchins in the 1940s and demonstrated in his well-known water jug experiments. In these experiments, participants were asked to fill one jug with a specific amount of water using only other jugs (typically three) with different maximum capacities as tools. After Luchins gave his participants a set of water jug problems that could all be solved by employing a single technique, he would then give them a problem that could either be solved using that same technique or a novel and simpler method. Luchins discovered that his participants tended to use the technique they had become accustomed to despite the possibility of using a simpler alternative.

Thus mental set describes one's inclination to attempt to solve problems in a way that has proved successful in previous experiences. However, as Luchins' work revealed, methods that have worked in the past may not be adequate or optimal for certain new but similar problems. Therefore, it is often necessary for people to move beyond their mental sets in order to find solutions. This was again demonstrated in Norman Maier's 1931 experiment, which challenged participants to solve a problem by using a household object (pliers) in an unconventional manner. Maier observed that participants were often unable to view the object in a way that strayed from its typical use, a phenomenon regarded as a particular form of mental set (more specifically known as functional fixedness, which is the topic of the following section).

When people cling rigidly to their mental sets, they are said to be experiencing fixation, a seeming obsession or preoccupation with attempted strategies that are repeatedly unsuccessful.
In the late 1990s, researcher Jennifer Wiley worked to reveal that expertise can work to create a mental set in people considered to be experts in their fields, and she gained evidence that the mental set created by expertise could lead to the development of fixation.



In linguistics, a participle (PTCP) is a form of nonfinite verb that comprises perfective or continuative grammatical aspects in numerous tenses. A participle also may function as an adjective or an adverb. For example, in "boiled potato", boiled is the past participle of the verb boil, adjectivally modifying the noun potato; in "ran us ragged," ragged is the past participle of the verb rag, adverbially qualifying the verb ran.



Deception is an act or statement which misleads, hides the truth, or promotes a belief, concept, or idea that is not true. It is often done for personal gain or advantage. Deception can involve dissimulation, propaganda and sleight of hand as well as distraction, camouflage or concealment. There is also self-deception, as in bad faith. It can also be called, with varying subjective implications, beguilement, deceit, bluff, mystification, ruse, or subterfuge.

Deception is a major relational transgression that often leads to feelings of betrayal and distrust between relational partners. Deception violates relational rules and is considered to be a negative violation of expectations. Most people expect friends, relational partners, and even strangers to be truthful most of the time. If people expected most conversations to be untruthful, talking and communicating with others would require distraction and misdirection to acquire reliable information. A significant amount of deception occurs between some romantic and relational partners.

Deceit and dishonesty can also form grounds for civil litigation in tort, or contract law (where it is known as misrepresentation or fraudulent misrepresentation if deliberate), or give rise to criminal prosecution for fraud. It also forms a vital part of psychological warfare in denial and deception.



Computational geometry is a branch of computer science devoted to the study of algorithms which can be stated in terms of geometry. Some purely geometrical problems arise out of the study of computational geometric algorithms, and such problems are also considered to be part of computational geometry. While modern computational geometry is a recent development, it is one of the oldest fields of computing with a history stretching back to antiquity.

Computational complexity is central to computational geometry, with great practical significance if algorithms are used on very large datasets containing tens or hundreds of millions of points. For such sets, the difference between O(n2) and O(n log n) may be the difference between days and seconds of computation.
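The "days versus seconds" remark can be made concrete with rough operation counts (an illustrative back-of-the-envelope sketch, not a benchmark from the source):

```python
import math

# At n = 10^8 points, an O(n^2) algorithm performs on the order of 10^16
# operations, while an O(n log n) algorithm performs on the order of 10^9 --
# a gap of roughly six orders of magnitude.
n = 10**8
quadratic = n * n                  # ~1e16 operations
quasilinear = n * math.log2(n)     # ~2.7e9 operations
assert quadratic / quasilinear > 1e6
```

At a billion operations per second, that is roughly three seconds versus several months.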

The main impetus for the development of computational geometry as a discipline was progress in computer graphics and computer-aided design and manufacturing (CAD/CAM), but many problems in computational geometry are classical in nature, and may come from mathematical visualization.

Other important applications of computational geometry include robotics (motion planning and visibility problems), geographic information systems (GIS) (geometrical location and search, route planning), integrated circuit design (IC geometry design and verification), computer-aided engineering (CAE) (mesh generation), computer vision (3D reconstruction).

The main branches of computational geometry are:

   Combinatorial computational geometry, also called algorithmic geometry, which deals with geometric objects as discrete entities. A groundlaying book in the subject by Preparata and Shamos dates the first use of the term "computational geometry" in this sense by 1975.
   Numerical computational geometry, also called machine geometry, computer-aided geometric design (CAGD), or geometric modeling, which deals primarily with representing real-world objects in forms suitable for computer computations in CAD/CAM systems. This branch may be seen as a further development of descriptive geometry and is often considered a branch of computer graphics or CAD. The term "computational geometry" in this meaning has been in use since 1971.



ALGORITHMICS
Algorithmics is the systematic study of the design and analysis of algorithms. It is fundamental and one of the oldest fields of computer science. It includes algorithm design, the art of building a procedure that can efficiently solve a specific problem or a class of problems; algorithmic complexity theory, the study of estimating the hardness of problems by studying the properties of the algorithms that solve them; and algorithm analysis, the science of quantifying the resources, such as time and memory space, that an algorithm needs to solve a problem.



Humans interact with computers in many ways; the interface between humans and computers is crucial to facilitating this interaction. Desktop applications, internet browsers, handheld computers, ERP systems, and computer kiosks make use of the prevalent graphical user interfaces (GUI) of today. Voice user interfaces (VUI) are used for speech recognition and synthesizing systems, and the emerging multimodal and gestural user interfaces allow humans to engage with embodied character agents in a way that cannot be achieved with other interface paradigms. Growth in the human–computer interaction field has been in the quality of interaction and in its branching into different research areas. Instead of designing regular interfaces, the different research branches have focused on multimodality rather than unimodality, on intelligent adaptive interfaces rather than command/action-based ones, and on active rather than passive interfaces.

The Association for Computing Machinery (ACM) defines human-computer interaction as "a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them". An important facet of HCI is user satisfaction (or simply End User Computing Satisfaction). "Because human–computer interaction studies a human and a machine in communication, it draws from supporting knowledge on both the machine and the human side. On the machine side, techniques in computer graphics, operating systems, programming languages, and development environments are relevant. On the human side, communication theory, graphic and industrial design disciplines, linguistics, social sciences, cognitive psychology, social psychology, and human factors such as computer user satisfaction are relevant. And, of course, engineering and design methods are relevant." Due to the multidisciplinary nature of HCI, people with different backgrounds contribute to its success. HCI is also sometimes termed human–machine interaction (HMI), man-machine interaction (MMI) or computer-human interaction (CHI).

Poorly designed human-machine interfaces can lead to many unexpected problems. A classic example is the Three Mile Island accident, a nuclear meltdown accident, where investigations concluded that the design of the human-machine interface was at least partly responsible for the disaster. Similarly, accidents in aviation have resulted from manufacturers' decisions to use non-standard flight instruments or throttle quadrant layouts: even though the new designs were proposed to be superior in basic human-machine interaction, pilots had already internalized the "standard" layout, and thus the conceptually good idea actually had undesirable results.



In computer engineering, computer architecture is a set of rules and methods that describe the functionality, organization, and implementation of computer systems. Some definitions of architecture define it as describing the capabilities and programming model of a computer but not a particular implementation. In other definitions computer architecture involves instruction set architecture design, microarchitecture design, logic design, and implementation.



In theoretical computer science and mathematics, the theory of computation is the branch that deals with what problems can be solved on a model of computation, using an algorithm, how efficiently they can be solved or to what degree (e.g., approximate solutions versus precise ones). The field is divided into three major branches: automata theory and formal languages, computability theory, and computational complexity theory, which are linked by the question: "What are the fundamental capabilities and limitations of computers?".

In order to perform a rigorous study of computation, computer scientists work with a mathematical abstraction of computers called a model of computation. There are several models in use, but the most commonly examined is the Turing machine. Computer scientists study the Turing machine because it is simple to formulate, can be analyzed and used to prove results, and because it represents what many consider the most powerful possible "reasonable" model of computation (see Church–Turing thesis). It might seem that the potentially infinite memory capacity is an unrealizable attribute, but any decidable problem solved by a Turing machine will always require only a finite amount of memory. So in principle, any problem that can be solved (decided) by a Turing machine can be solved by a computer that has a finite amount of memory.
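The Turing-machine abstraction described above can be sketched as a finite transition table plus an unbounded tape. The example machine below (its states and transition table are illustrative inventions, not from the source) flips every bit on its input and halts on the first blank:

```python
# Minimal Turing-machine sketch: a finite control stepping over an unbounded
# tape, represented here as a dict from cell index to symbol.
def run_tm(transitions, tape_str, state='q0', blank='_'):
    tape = dict(enumerate(tape_str))
    head = 0
    while state != 'halt':
        symbol = tape.get(head, blank)
        state, write, move = transitions[(state, symbol)]   # (next, symbol, L/R)
        tape[head] = write
        head += 1 if move == 'R' else -1
    cells = range(min(tape), max(tape) + 1)
    return ''.join(tape.get(i, blank) for i in cells).strip(blank)

# Hypothetical machine: flip 0 <-> 1 moving right, halt on blank.
flipper = {
    ('q0', '0'): ('q0', '1', 'R'),
    ('q0', '1'): ('q0', '0', 'R'),
    ('q0', '_'): ('halt', '_', 'R'),
}
assert run_tm(flipper, '1011') == '0100'
```

Note that although the tape is conceptually infinite, this run only ever touches finitely many cells, matching the point above about decidable problems needing only finite memory.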



A theory is a set of formulas, often assumed to be closed under logical consequence. Decidability for a theory concerns whether there is an effective procedure that decides whether the formula is a member of the theory or not, given an arbitrary formula in the signature of the theory. The problem of decidability arises naturally when a theory is defined as the set of logical consequences of a fixed set of axioms.

There are several basic results about decidability of theories. Every (non-paraconsistent) inconsistent theory is decidable, as every formula in the signature of the theory will be a logical consequence of, and thus a member of, the theory. Every complete recursively enumerable first-order theory is decidable. An extension of a decidable theory may not be decidable. For example, there are undecidable theories in propositional logic, although the set of validities (the smallest theory) is decidable.
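The decidability of the set of propositional validities mentioned above has a very direct effective procedure: enumerate the finitely many truth assignments and check the formula under each. A minimal sketch, with formulas represented as Python boolean functions (an encoding chosen here for illustration):

```python
from itertools import product

# A propositional formula over n variables is valid iff it is true under
# every one of the 2^n truth assignments -- a finite, hence decidable, check.
def is_valid(formula, arity):
    return all(formula(*vals) for vals in product([False, True], repeat=arity))

# p -> p, written as (not p) or p, is a validity; p or q is not.
assert is_valid(lambda p: (not p) or p, 1)
assert not is_valid(lambda p, q: p or q, 2)
```

The check is exponential in the number of variables, but it always terminates, which is all decidability requires.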

A consistent theory that has the property that every consistent extension is undecidable is said to be essentially undecidable. In fact, every consistent extension will be essentially undecidable. The theory of fields is undecidable but not essentially undecidable. Robinson arithmetic is known to be essentially undecidable, and thus every consistent theory that includes or interprets Robinson arithmetic is also (essentially) undecidable.

Examples of decidable first-order theories include the theory of real closed fields, and Presburger arithmetic, while the theory of groups and Robinson arithmetic are examples of undecidable theories.



Automata theory is the study of abstract machines and automata, as well as the computational problems that can be solved using them. It is a theory in theoretical computer science. The word automata (the plural of automaton) comes from the Greek word αὐτόματα, which means "self-making".

A finite-state machine is a well-known type of automaton. Such an automaton consists of states (often drawn as circles) and transitions (drawn as arrows). As the automaton sees a symbol of input, it makes a transition (or jump) to another state, according to its transition function, which takes the current state and the recent symbol as its inputs.
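The state-and-transition description above maps directly onto a lookup table; a minimal sketch, where the example machine (tracking whether an even number of 1s has been seen) is an illustrative invention, not from the source:

```python
# Finite-state machine sketch: the transition function is a dict from
# (current state, input symbol) to the next state.
def run_fsm(transitions, start, inputs):
    state = start
    for symbol in inputs:
        state = transitions[(state, symbol)]   # the "jump" described above
    return state

# Hypothetical parity machine over the alphabet {'0', '1'}.
parity = {
    ('even', '0'): 'even', ('even', '1'): 'odd',
    ('odd', '0'): 'odd',   ('odd', '1'): 'even',
}
assert run_fsm(parity, 'even', '1101') == 'odd'   # three 1s seen
```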

Automata theory is closely related to formal language theory. An automaton is a finite representation of a formal language that may be an infinite set. Automata are often classified by the class of formal languages they can recognize, typically illustrated by the Chomsky hierarchy, which describes the relations between various languages and kinds of formalized logics.

Automata play a major role in theory of computation, compiler construction, artificial intelligence, parsing and formal verification.



Theoretical computer science (TCS) is a subset of general computer science and mathematics that focuses on more mathematical topics of computing, and includes the theory of computation.

It is difficult to circumscribe the theoretical areas precisely. The ACM's Special Interest Group on Algorithms and Computation Theory (SIGACT) provides the following description:

   TCS covers a wide variety of topics including algorithms, data structures, computational complexity, parallel and distributed computation, probabilistic computation, quantum computation, automata theory, information theory, cryptography, program semantics and verification, machine learning, computational biology, computational economics, computational geometry, and computational number theory and algebra. Work in this field is often distinguished by its emphasis on mathematical technique and rigor.



In theoretical computer science, a transition system is a concept used in the study of computation. It is used to describe the potential behavior of discrete systems. It consists of states and transitions between states, which may be labeled with labels chosen from a set; the same label may appear on more than one transition. If the label set is a singleton, the system is essentially unlabeled, and a simpler definition that omits the labels is possible.

Transition systems coincide mathematically with abstract rewriting systems (as explained further in this article) and directed graphs. They differ from finite state automata in several ways:

   The set of states is not necessarily finite, or even countable.
   The set of transitions is not necessarily finite, or even countable.
   No "start" state or "final" states are given.

Transition systems can be represented as directed graphs.
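A labelled transition system can be sketched as a set of (state, label, state) triples; the directed-graph view above then makes reachability a plain graph search. The example system below is an illustrative invention, not from the source:

```python
# Transition system as a set of (source, label, target) triples. Note there is
# no distinguished start or final state, unlike a finite automaton; reachability
# is computed from whichever state we choose to start the search at.
def reachable(transitions, start):
    seen, frontier = {start}, [start]
    while frontier:
        s = frontier.pop()
        for (src, _label, dst) in transitions:
            if src == s and dst not in seen:
                seen.add(dst)
                frontier.append(dst)
    return seen

ts = {('a', 'x', 'b'), ('b', 'y', 'c'), ('d', 'x', 'a')}
assert reachable(ts, 'a') == {'a', 'b', 'c'}
```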



A complex system is a system composed of many components which may interact with each other. Examples of complex systems are Earth's global climate, organisms, the human brain, infrastructure such as power grids and transportation or communication systems, social and economic organizations (like cities), an ecosystem, a living cell, and ultimately the entire universe.

Complex systems are systems whose behavior is intrinsically difficult to model due to the dependencies, competitions, relationships, or other types of interactions between their parts or between a given system and its environment. Systems that are "complex" have distinct properties that arise from these relationships, such as nonlinearity, emergence, spontaneous order, adaptation, and feedback loops, among others. Because such systems appear in a wide variety of fields, the commonalities among them have become the topic of their own independent area of research. In many cases, it is useful to represent such a system as a network where the nodes represent the components and the links represent their interactions.



In mathematics and science, a nonlinear system is a system in which the change of the output is not proportional to the change of the input. Nonlinear problems are of interest to engineers, biologists, physicists, mathematicians, and many other scientists because most systems are inherently nonlinear in nature. Nonlinear dynamical systems, describing changes in variables over time, may appear chaotic, unpredictable, or counterintuitive, contrasting with much simpler linear systems.

Typically, the behavior of a nonlinear system is described in mathematics by a nonlinear system of equations, which is a set of simultaneous equations in which the unknowns (or the unknown functions in the case of differential equations) appear as variables of a polynomial of degree higher than one or in the argument of a function which is not a polynomial of degree one. In other words, in a nonlinear system of equations, the equation(s) to be solved cannot be written as a linear combination of the unknown variables or functions that appear in them. Systems can be defined as nonlinear, regardless of whether known linear functions appear in the equations. In particular, a differential equation is linear if it is linear in terms of the unknown function and its derivatives, even if nonlinear in terms of the other variables appearing in it.

As nonlinear dynamical equations are difficult to solve, nonlinear systems are commonly approximated by linear equations (linearization). This works well up to some accuracy and some range for the input values, but some interesting phenomena such as solitons, chaos, and singularities are hidden by linearization. It follows that some aspects of the dynamic behavior of a nonlinear system can appear to be counterintuitive, unpredictable or even chaotic. Although such chaotic behavior may resemble random behavior, it is in fact not random. For example, some aspects of the weather are seen to be chaotic, where simple changes in one part of the system produce complex effects throughout. This nonlinearity is one of the reasons why accurate long-term forecasts are impossible with current technology.
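
A conventional illustration of this sensitivity (not taken from the text) is the logistic map with parameter r = 4, a standard example of deterministic chaos: two nearly identical starting values diverge rapidly under iteration.

```python
# Sensitivity to initial conditions in the logistic map x_{n+1} = r*x*(1 - x),
# a standard example of deterministic (not random) chaotic behavior at r = 4.

def logistic(x, r=4.0, steps=50):
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = logistic(0.3)
b = logistic(0.3000001)  # a perturbation of about one part in three million
print(abs(a - b))        # after 50 steps the tiny difference has been amplified
```

The dynamics are fully deterministic, yet the outcome is practically unpredictable from imprecise initial data, mirroring the weather-forecasting point above.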

Some authors use the term nonlinear science for the study of nonlinear systems. This term is disputed by others:

   Using a term like nonlinear science is like referring to the bulk of zoology as the study of non-elephant animals.
   — Stanislaw Ulam



Spontaneous order, also named self-organization in the hard sciences, is the spontaneous emergence of order out of seeming chaos. It is a process in social networks including economics, though the term "self-organization" is more often used for physical changes and biological processes, while "spontaneous order" is typically used to describe the emergence of various kinds of social orders from a combination of self-interested individuals who are not intentionally trying to create order through planning. The evolution of life on Earth, language, crystal structure, the Internet and a free market economy have all been proposed as examples of systems which evolved through spontaneous order.

Spontaneous orders are to be distinguished from organizations. Spontaneous orders are distinguished by being scale-free networks, while organizations are hierarchical networks. Organizations can be, and often are, part of spontaneous social orders, but the reverse is not true. Moreover, while organizations are created and controlled by humans, spontaneous orders are created, controlled, and controllable by no one. In economics and the social sciences, spontaneous order is defined as "the result of human actions, not of human design".

Spontaneous order is an equilibrium behavior between self-interested individuals, which is most likely to evolve and survive, obeying the natural selection process "survival of the likeliest".



The universe (Latin: universus) is all of space and time and their contents, including planets, stars, galaxies, and all other forms of matter and energy. While the spatial size of the entire universe is unknown, it is possible to measure the size of the observable universe, which is currently estimated to be 93 billion light-years in diameter. In various multiverse hypotheses, a universe is one of many causally disconnected constituent parts of a larger multiverse, which itself comprises all of space and time and its contents.

The earliest cosmological models of the universe were developed by ancient Greek and Indian philosophers and were geocentric, placing Earth at the center. Over the centuries, more precise astronomical observations led Nicolaus Copernicus to develop the heliocentric model with the Sun at the center of the Solar System. In developing the law of universal gravitation, Isaac Newton built upon Copernicus' work as well as Johannes Kepler's laws of planetary motion and observations by Tycho Brahe.

Further observational improvements led to the realization that the Sun is one of hundreds of billions of stars in the Milky Way, which is one of at least two trillion galaxies in the universe. Many of the stars in our galaxy have planets. At the largest scale, galaxies are distributed uniformly and the same in all directions, meaning that the universe has neither an edge nor a center. At smaller scales, galaxies are distributed in clusters and superclusters which form immense filaments and voids in space, creating a vast foam-like structure. Discoveries in the early 20th century have suggested that the universe had a beginning and that space has been expanding since then, and is currently still expanding at an increasing rate.

The Big Bang theory is the prevailing cosmological description of the development of the universe. According to this theory's estimates, space and time emerged together 13.799±0.021 billion years ago and the energy and matter initially present have become less dense as the universe expanded. After an initial accelerated expansion called the inflationary epoch at around 10⁻³² seconds, and the separation of the four known fundamental forces, the universe gradually cooled and continued to expand, allowing the first subatomic particles and simple atoms to form. Dark matter gradually gathered, forming a foam-like structure of filaments and voids under the influence of gravity. Giant clouds of hydrogen and helium were gradually drawn to the places where dark matter was most dense, forming the first galaxies, stars, and everything else seen today. It is possible to see objects that are now further away than 13.799 billion light-years because space itself has expanded, and it is still expanding today. This means that objects which are now up to 46.5 billion light-years away can still be seen in their distant past, because in the past, when their light was emitted, they were much closer to Earth.

From studying the movement of galaxies, it has been discovered that the universe contains much more matter than is accounted for by visible objects: stars, galaxies, nebulas and interstellar gas. This unseen matter is known as dark matter ("dark" meaning that there is a wide range of strong indirect evidence that it exists, but it has not yet been detected directly). The ΛCDM model is the most widely accepted model of our universe. It suggests that about 69.2%±1.2% [2015] of the mass and energy in the universe is a cosmological constant (or, in extensions to ΛCDM, other forms of dark energy, such as a scalar field) which is responsible for the current expansion of space, and about 25.8%±1.1% [2015] is dark matter. Ordinary ('baryonic') matter is therefore only 4.84%±0.1% [2015] of the physical universe. Stars, planets, and visible gas clouds only form about 6% of ordinary matter, or about 0.29% of the entire universe.
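
The percentages above fit together arithmetically; a quick check (values copied from the paragraph):

```python
# Sanity check of the ΛCDM composition figures quoted above (2015 values):
# the three components sum to roughly 100%, and stars and visible gas at
# about 6% of ordinary matter give the quoted ~0.29% of the entire universe.

dark_energy = 69.2   # cosmological constant / dark energy, percent
dark_matter = 25.8   # dark matter, percent
baryonic = 4.84      # ordinary ('baryonic') matter, percent

total = dark_energy + dark_matter + baryonic
print(round(total, 2))            # 99.84, consistent with 100% within the quoted errors
print(round(baryonic * 0.06, 2))  # 0.29, the "about 0.29% of the entire universe" figure
```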

There are many competing hypotheses about the ultimate fate of the universe and about what, if anything, preceded the Big Bang, while other physicists and philosophers refuse to speculate, doubting that information about prior states will ever be accessible. Some physicists have suggested various multiverse hypotheses, in which our universe might be one among many universes that likewise exist.



A complex adaptive system is a system that is complex in that it is a dynamic network of interactions, but the behavior of the ensemble may not be predictable according to the behavior of the components. It is adaptive in that the individual and collective behavior mutate and self-organize corresponding to the change-initiating micro-event or collection of events. It is a "complex macroscopic collection" of relatively "similar and partially connected micro-structures" formed in order to adapt to the changing environment and increase their survivability as a macro-structure. The Complex Adaptive Systems approach builds on replicator dynamics.

The study of complex adaptive systems, a subset of nonlinear dynamical systems, is an interdisciplinary field that attempts to blend insights from the natural and social sciences to develop system-level models and insights that allow for heterogeneous agents, phase transitions, and emergent behavior.

The id, ego, and super-ego are a set of three concepts in psychoanalytic theory describing distinct, interacting agents in the psychic apparatus (defined in Sigmund Freud's structural model of the psyche). The three agents are theoretical constructs that describe the activities and interactions of the mental life of a person. In the ego psychology model of the psyche, the id is the set of uncoordinated instinctual desires; the super-ego plays the critical and moralizing role; and the ego is the organized, realistic agent that mediates between the instinctual desires of the id and the critical super-ego; Freud explained that:

   The functional importance of the ego is manifested in the fact that, normally, control over the approaches to motility devolves upon it. Thus, in its relation to the id, [the ego] is like a man on horseback, who has to hold in check the superior strength of the horse; with this difference, that the rider tries to do so with his own strength, while the ego uses borrowed forces. The analogy may be carried a little further. Often, a rider, if he is not to be parted from his horse, is obliged to guide [the horse] where it wants to go; so, in the same way, the ego is in the habit of transforming the id's will into action, as if it were its own.

The existence of the super-ego is observable in how people can view themselves as guilty and bad, shameful and weak, and feel compelled to do certain things. In The Ego and the Id (1923), Freud presents "the general character of harshness and cruelty exhibited by the [ego] ideal — its dictatorial Thou shalt"; thus, in the psychology of the ego, Freud hypothesized different levels of ego ideal or superego development with greater ideals:

   . . . nor must it be forgotten that a child has a different estimate of his parents at different periods of his life. At the time at which the Oedipus complex gives place to the super-ego they are something quite magnificent; but later, they lose much of this. Identifications then come about with these later parents as well, and indeed they regularly make important contributions to the formation of character; but in that case they only affect the ego, they no longer influence the super-ego, which has been determined by the earliest parental images.
   — New Introductory Lectures on Psychoanalysis, p. 64.

The earlier in the child's development, the greater the estimate of parental power; thus, when the child is in rivalry with the parental imago, the child then feels the dictatorial Thou shalt, which is the manifest power that the imago represents on four levels: (i) the auto-erotic, (ii) the narcissistic, (iii) the anal, and (iv) the phallic. Those different levels of mental development, and their relations to parental imagos, correspond to specific id forms of aggression and affection; thus aggressive and destructive desires animate the myths in the fantasies and repressions of patients, in all cultures. In response to the unstructured ambiguity and conflicting uses of the term "the unconscious mind", Freud introduced the structured model of ego psychology (id, ego, super-ego) in the essay Beyond the Pleasure Principle (1920) and elaborated, refined, and made that model formal in the essay The Ego and the Id.



In psychology, intellectualization is a defense mechanism by which reasoning is used to block confrontation with an unconscious conflict and its associated emotional stress – where thinking is used to avoid feeling. It involves removing one's self, emotionally, from a stressful event. Intellectualization may accompany, but is different from, rationalization, the pseudo-rational justification of irrational acts.

Intellectualization is one of Freud's original defense mechanisms. Freud believed that memories have both conscious and unconscious aspects, and that intellectualization allows for the conscious analysis of an event in a way that does not provoke anxiety.



In psychoanalytic theory, a defence mechanism is an unconscious psychological mechanism that reduces anxiety arising from unacceptable or potentially harmful stimuli.

Defence mechanisms may result in healthy or unhealthy consequences depending on the circumstances and frequency with which the mechanism is used. Defence mechanisms (German: Abwehrmechanismen) are psychological strategies brought into play by the unconscious mind to manipulate, deny, or distort reality in order to defend against feelings of anxiety and unacceptable impulses and to maintain one's self-schema or other schemas. These processes that manipulate, deny, or distort reality may include the following: repression, or the burying of a painful feeling or thought from one's awareness even though it may resurface in a symbolic form; identification, incorporating an object or thought into oneself; and rationalization, the justification of one's behaviour and motivations by substituting "good" acceptable reasons for the actual motivations. In psychoanalytic theory, repression is considered the basis for other defence mechanisms.

Healthy people normally use different defence mechanisms throughout life. A defence mechanism becomes pathological only when its persistent use leads to maladaptive behaviour such that the physical or mental health of the individual is adversely affected. Among the purposes of ego defence mechanisms is to protect the mind/self/ego from anxiety or social sanctions or to provide a refuge from a situation with which one cannot currently cope.

One resource used to evaluate these mechanisms is the Defense Style Questionnaire (DSQ-40).



Psychoanalytic theory is the theory of personality organization and the dynamics of personality development that guides psychoanalysis, a clinical method for treating psychopathology. First laid out by Sigmund Freud in the late 19th century, psychoanalytic theory has undergone many refinements since his work. Psychoanalytic theory came to full prominence in the last third of the twentieth century as part of the flow of critical discourse regarding psychological treatments after the 1960s, long after Freud's death in 1939. Freud had ceased his analysis of the brain and his physiological studies and shifted his focus to the study of the mind and the related psychological attributes making up the mind, and on treatment using free association and the phenomena of transference. His study emphasized the recognition of childhood events that could influence the mental functioning of adults. His examination of the genetic and then the developmental aspects gave the psychoanalytic theory its characteristics. Starting with his publication of The Interpretation of Dreams in 1899, his theories began to gain prominence.



The unconscious mind (or the unconscious) consists of the processes in the mind which occur automatically and are not available to introspection and include thought processes, memories, interests and motivations.

Even though these processes exist well under the surface of conscious awareness, they are theorized to exert an effect on behavior. The term was coined by the 18th-century German Romantic philosopher Friedrich Schelling and later introduced into English by the poet and essayist Samuel Taylor Coleridge.

Empirical evidence suggests that unconscious phenomena include repressed feelings, automatic skills, subliminal perceptions, and automatic reactions, and possibly also complexes, hidden phobias and desires.

The concept was popularized by the Austrian neurologist and psychoanalyst Sigmund Freud. In psychoanalytic theory, unconscious processes are understood to be directly represented in dreams, as well as in slips of the tongue and jokes.

Thus the unconscious mind can be seen as the source of dreams and automatic thoughts (those that appear without any apparent cause), the repository of forgotten memories (that may still be accessible to consciousness at some later time), and the locus of implicit knowledge (the things that we have learned so well that we do them without thinking).

It has been argued that consciousness is influenced by other parts of the mind. These include unconsciousness as a personal habit, being unaware and intuition. Phenomena related to semi-consciousness include awakening, implicit memory, subliminal messages, trances, hypnagogia and hypnosis. While sleep, sleepwalking, dreaming, delirium and comas may signal the presence of unconscious processes, these processes are seen as symptoms rather than the unconscious mind itself.

Some critics have doubted the existence of the unconscious.



The self-schema refers to a long lasting and stable set of memories that summarize a person's beliefs, experiences and generalizations about the self, in specific behavioral domains. A person may have a self-schema based on any aspect of himself or herself as a person, including physical characteristics, personality traits and interests, as long as they consider that aspect of their self important to their own self-definition.

For example, someone will have a self-schema of extroversion if they think of themselves as extroverted and also believe that their extroversion is central to who they are. Their self-schema for extroversion may include general self-categorizations ("I am sociable."), beliefs about how they would act in certain situations ("At a party I would talk to lots of people") and also memories of specific past events ("On my first day at university I made lots of new friends").



Identification is a psychological process whereby the individual assimilates an aspect, property, or attribute of the other and is transformed wholly or partially by the model that other provides. It is by means of a series of identifications that the personality is constituted and specified. The roots of the concept can be found in Freud's writings. The three most prominent concepts of identification as described by Freud are: primary identification, narcissistic (secondary) identification and partial (secondary) identification.

While "in the psychoanalytic literature there is agreement that the core meaning of identification is simple – to be like or to become like another", it has also been adjudged '"the most perplexing clinical/theoretical area" in psychoanalysis'.



IN THE DEFENSE HIERARCHY
Vaillant divided defense mechanisms into a hierarchy of defenses ranging from immature through neurotic to healthy defenses, and placed intellectualization – imagining an act of violence without feeling the accompanying emotions, for example – in the mid-range, neurotic defenses. Like rationalization, intellectualization can thus provide a bridge between immature and mature mechanisms both in the process of growing up and in adult life.

Winnicott, however, considered that erratic childhood care could lead to over-dependence on intellectuality as a substitute for mothering; and saw over-preoccupation with knowledge as an emotional impoverishment aimed at self-mothering via the mind. Julia Kristeva similarly described a process whereby "symbolicity itself is cathected...Since it is not sex-oriented, it denies the question of sexual difference".

One answer to such over-intellectualization may be the sense of humour, what Richard Hofstadter called the necessary quality of playfulness – Freud himself saying that "Humour can be regarded as the highest of these defensive processes"!



Intuition is the ability to acquire knowledge without recourse to conscious reasoning. Different fields use the word "intuition" in very different ways, including but not limited to: direct access to unconscious knowledge; unconscious cognition; inner sensing; inner insight to unconscious pattern-recognition; and the ability to understand something instinctively, without any need for conscious reasoning.

The word intuition comes from the Latin verb intueri, translated as "consider", or from the late Middle English word intuit, "to contemplate".



ANCIENT MORAL INTELLECTUALISM
The Greek philosopher Socrates (c. 470–399 BC) proposed that intellectualism allows that “one will do what is right or best just as soon as one truly understands what is right or best”; that virtue is a purely intellectual matter, because virtue and knowledge are related qualities that a person accrues, possesses, and improves by dedication to reason. So defined, Socratic intellectualism became an important philosophic component of Stoicism. The problematic consequences of this perspective are the “Socratic paradoxes”: that there is no weakness of will, so that no one knowingly does, or seeks to do, evil (moral wrong); that anyone who does, or seeks to do, moral wrong does so involuntarily; and that virtue is knowledge, so that there are not many virtues, but all virtues are one.

Contemporary philosophers disagree that Socrates’s conceptions of the knowledge of truth, and of ethical conduct, can be equated with modern, post–Cartesian conceptions of knowledge and of rational intellectualism. Indeed, Michel Foucault demonstrated, with detailed historical study, that in Classical Antiquity (800 BC – AD 1000), “knowing the truth” is akin to “spiritual knowledge”, in the contemporary understanding of the concept; hence, without exclusively concerning the rational intellect, spiritual knowledge is integral to the broader principle of “caring for the self”.

Typically, such care of the self involved specific ascetic exercises meant to ensure that knowledge of truth was not only memorized, but learned, and then integrated into the self, in the course of transforming oneself into a good person. Therefore, to understand truth meant “intellectual knowledge” requiring one’s integration to the (universal) truth, and authentically living it in one’s speech, heart, and conduct. Achieving that difficult task required continual care of the self, but also meant being someone who embodies truth, and so can readily practice the Classical-era rhetorical device of parrhesia: “to speak candidly, and to ask forgiveness for so speaking”; and, by extension, practice the moral obligation to speak the truth for the common good, even at personal risk. This ancient, Socratic moral philosophic perspective contradicts the contemporary understanding of truth and knowledge as rational undertakings.



In rhetoric, parrhesia is a figure of speech described as: "to speak candidly or to ask forgiveness for so speaking". This Ancient Greek word has three different forms, as related by Michel Foucault. Parrhesia is a noun, meaning "free speech". Parrhesiazomai is a verb, meaning "to use parrhesia". Parrhesiastes is a noun, meaning one who uses parrhesia, for example "one who speaks the truth to power".



Utilitarianism is a family of normative ethical theories that prescribe actions that maximize happiness and well-being for all affected individuals. Although different varieties of utilitarianism admit different characterizations, the basic idea behind all of them is to in some sense maximize utility, which is often defined in terms of well-being or related concepts. For instance, Jeremy Bentham, the founder of utilitarianism, described utility as "that property in any object, whereby it tends to produce benefit, advantage, pleasure, good, or happiness...[or] to prevent the happening of mischief, pain, evil, or unhappiness to the party whose interest is considered."

Utilitarianism is a version of consequentialism, which states that the consequences of any action are the only standard of right and wrong. Unlike other forms of consequentialism, such as egoism and altruism, utilitarianism considers the interests of all humans equally. Proponents of utilitarianism have disagreed on a number of points, such as whether actions should be chosen based on their likely results (act utilitarianism), or whether agents should conform to rules that maximize utility (rule utilitarianism). There is also disagreement as to whether total (total utilitarianism), average (average utilitarianism) or minimum utility should be maximized.

Though the seeds of the theory can be found in the hedonists Aristippus and Epicurus, who viewed happiness as the only good, and in the work of the medieval Indian philosopher Śāntideva, the tradition of utilitarianism properly began with Bentham, and has included John Stuart Mill, Henry Sidgwick, R. M. Hare, and Peter Singer. The concept has been applied towards social welfare economics, the crisis of global poverty, the ethics of raising animals for food, and the importance of avoiding existential risks to humanity.



A global catastrophic risk is a hypothetical future event which could damage human well-being on a global scale, even endangering or destroying modern civilization. An event that could cause human extinction or permanently and drastically curtail humanity's potential is known as an existential risk.

Potential global catastrophic risks include anthropogenic risks, caused by humans (technology, governance, climate change), and non-anthropogenic or external risks. Examples of technology risks are hostile artificial intelligence and destructive biotechnology or nanotechnology. Insufficient or malign global governance creates risks in the social and political domain, such as a global war, including nuclear holocaust, bioterrorism using genetically modified organisms, cyberterrorism destroying critical infrastructure like the electrical grid; or the failure to manage a natural pandemic. Problems and risks in the domain of earth system governance include global warming, environmental degradation, including extinction of species, famine as a result of non-equitable resource distribution, human overpopulation, crop failures and non-sustainable agriculture.

Examples of non-anthropogenic risks are an asteroid impact event, a supervolcanic eruption, a lethal gamma-ray burst, a geomagnetic storm destroying electronic equipment, natural long-term climate change, hostile extraterrestrial life, or the predictable Sun transforming into a red giant star engulfing the Earth.



Metaphysicians investigate questions about the ways the world could have been. David Lewis, in On the Plurality of Worlds, endorsed a view called concrete modal realism, according to which facts about how things could have been are made true by other concrete worlds in which things are different. Other philosophers, including Gottfried Leibniz, have dealt with the idea of possible worlds as well. A necessary fact is true across all possible worlds. A possible fact is true in some possible world, even if not in the actual world. For example, it is possible that cats could have had two tails, or that any particular apple could have not existed. By contrast, certain propositions seem necessarily true, such as analytic propositions, e.g., "All bachelors are unmarried." The view that any analytic truth is necessary is not universally held among philosophers. A less controversial view is that self-identity is necessary, as it seems fundamentally incoherent to claim that any x is not identical to itself; this is known as the law of identity, a putative "first principle". Similarly, Aristotle describes the principle of non-contradiction:

   It is impossible that the same quality should both belong and not belong to the same thing ... This is the most certain of all principles ... Wherefore they who demonstrate refer to this as an ultimate opinion. For it is by nature the source of all the other axioms.



Noogenesis is the emergence and evolution of intelligence.




Mentalism or sanism describes discrimination and oppression against a mental trait or condition a person has, or is judged to have. This discrimination may or may not be characterized in terms of mental disorder or cognitive impairment. The discrimination is based on numerous factors such as stereotypes about neurodivergence, for example Asperger syndrome, learning disorders, ADHD, bipolar disorder, schizophrenia, and personality disorders, specific behavioral phenomena such as stuttering and tics, or intellectual disability.

Like other forms of discrimination such as sexism and racism, mentalism involves multiple intersecting forms of oppression, complex social inequalities and imbalances of power. It can result in covert discrimination by multiple, small insults and indignities. It is characterized by judgments of another person's perceived mental health status. These judgments are followed by actions such as blatant, overt discrimination which may include refusal of service, or the denial of human rights. Mentalism impacts how individuals are treated by the general public, by mental health professionals, and by institutions, including the legal system. The negative attitudes involved may also be internalized.

The terms mentalism, from "mental", and sanism, from "sane", have become established in some contexts, though concepts such as social stigma, and in some cases ableism, may be used in similar but not identical ways.

While mentalism and sanism are used interchangeably, sanism is becoming predominant in certain circles, such as academia, among those who identify as mad and mad advocates, and in a socio-political context where sanism is gaining ground as a movement. The movement against sanism is an act of resistance among those who identify as mad, consumer survivors, and mental health advocates. In academia, evidence of this movement can be found in the number of recent publications about sanism and social work practice.

Mentalism tends to refer to discrimination based on mental disability, as distinguished from ableism, which refers to discrimination based on physical disability.



Brainstorming is a group creativity technique by which efforts are made to find a conclusion for a specific problem by gathering a list of ideas spontaneously contributed by its members.

In other words, brainstorming is a situation where a group of people meet to generate new ideas and solutions around a specific domain of interest by removing inhibitions. People are able to think more freely and they suggest as many spontaneous new ideas as possible. All the ideas are noted down without criticism and after the brainstorming session the ideas are evaluated. The term was popularized by Alex Faickney Osborn in the 1953 book Applied Imagination.



Semantics (from Ancient Greek: σημαντικός sēmantikós, "significant") is the study of meaning. The term can be used to refer to subfields of several distinct disciplines including linguistics, philosophy, and computer science.



In logic, especially as applied in mathematics, concept A is a special case or specialization of concept B precisely if every instance of A is also an instance of B but not vice versa, or equivalently, if B is a generalization of A. A limiting case is a type of special case which is arrived at by taking some aspect of the concept to the extreme of what is permitted in the general case. A degenerate case is a special case which is in some way qualitatively different from almost all of the cases allowed.

Special case examples include the following:

   All squares are rectangles (but not all rectangles are squares); therefore the square is a special case of the rectangle.
   Fermat's Last Theorem, that a^n + b^n = c^n has no solutions in positive integers with n > 2, is a special case of Beal's conjecture, that a^x + b^y = c^z has no primitive solutions in positive integers with x, y, and z all greater than 2; specifically, it is the case x = y = z.
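The special-case relationship in the second example can be illustrated by brute force over small integers: any solution of Fermat's equation with exponent n would also be a solution of Beal's equation with x = y = z = n. A minimal sketch, with illustrative helper names (not from any library):

```python
def fermat_solutions(limit, n):
    """Solutions of a^n + b^n = c^n with 1 <= a, b, c <= limit."""
    return [(a, b, c)
            for a in range(1, limit + 1)
            for b in range(1, limit + 1)
            for c in range(1, limit + 1)
            if a**n + b**n == c**n]

def beal_solutions(limit, exps):
    """Solutions of a^x + b^y = c^z for a given exponent triple (x, y, z)."""
    x, y, z = exps
    return [(a, b, c)
            for a in range(1, limit + 1)
            for b in range(1, limit + 1)
            for c in range(1, limit + 1)
            if a**x + b**y == c**z]

# Fermat's equation with n = 3 is Beal's equation specialized to (3, 3, 3):
# every instance of the former is an instance of the latter.
assert fermat_solutions(20, 3) == beal_solutions(20, (3, 3, 3))
```

With exponent 2 (outside the hypotheses of both conjectures), `beal_solutions` instead finds the familiar Pythagorean triples, which shows why the restriction to exponents greater than 2 matters.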



Hyponyms and hypernyms

Hyponymy shows the relationship between a generic term (hypernym) and a specific instance of it (hyponym). A hyponym is a word or phrase whose semantic field is more specific than that of its hypernym. The semantic field of a hypernym, also known as a superordinate, is broader than that of a hyponym. One approach to the relationship between hyponyms and hypernyms is to view a hypernym as consisting of its hyponyms. This, however, becomes more difficult with abstract words such as imagine, understand and knowledge. While hyponymy is typically applied to nouns, it can also apply to other parts of speech. As with nouns, hypernyms among verbs are words that refer to a broad category of actions. For example, verbs such as stare, gaze, view and peer can be considered hyponyms of the verb look, which is their hypernym.

Hypernyms and hyponyms are asymmetric. Hyponymy can be tested by substituting X and Y in the sentence ‘X is a kind of Y’ and determining if it makes sense. For example, ‘A screwdriver is a kind of tool’ makes sense, but not ‘A tool is a kind of screwdriver’.

Strictly speaking, the meaning relation between hyponyms and hypernyms applies to lexical items of the same word class (or parts of speech), and holds between senses rather than words. For instance, the word screwdriver used in the previous example refers to the tool for turning a screw, and not to the drink made with vodka and orange juice.

Hyponymy is a transitive relation: if X is a hyponym of Y, and Y is a hyponym of Z, then X is a hyponym of Z. For example, violet is a hyponym of purple and purple is a hyponym of color; therefore violet is a hyponym of color. A word can be both a hypernym and a hyponym: for example, purple is a hyponym of color but is itself a hypernym of the broad spectrum of shades of purple between crimson and violet.
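The transitivity and asymmetry of hyponymy can be sketched with a toy lexicon. The dictionary below is illustrative (each word is given a single direct hypernym, a simplifying assumption), not a real lexical database:

```python
# Direct hyponym -> hypernym edges; chains encode indirect hyponymy.
HYPERNYM_OF = {
    "violet": "purple",
    "crimson": "purple",
    "purple": "color",
    "screwdriver": "tool",
    "hammer": "tool",
}

def is_hyponym_of(x, y):
    """True if x is a direct or indirect hyponym of y (transitive closure)."""
    while x in HYPERNYM_OF:
        x = HYPERNYM_OF[x]
        if x == y:
            return True
    return False

assert is_hyponym_of("violet", "purple")     # direct: 'violet is a kind of purple'
assert is_hyponym_of("violet", "color")      # indirect, via transitivity
assert not is_hyponym_of("color", "violet")  # the relation is asymmetric
```

Note that "purple" here is both a hyponym (of "color") and a hypernym (of "violet" and "crimson"), mirroring the example in the text.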

The hierarchical structure of semantic fields can be mostly seen in hyponymy. They could be observed from top to bottom, where the higher level is more general and the lower level is more specific. For example, living things will be the highest level followed by plants and animals, and the lowest level may comprise dog, cat and wolf.

Under the relations of hyponymy and incompatibility, taxonomic hierarchical structures can also be formed. Such a taxonomy consists of two relations: the first is exemplified by 'An X is a Y' (simple hyponymy), while the second is 'An X is a kind/type of Y'. The second relation is said to be more discriminating and can be classified more specifically under the concept of taxonomy.

Co-hyponyms

If the hypernym Z consists of hyponyms X and Y, X and Y are identified as co-hyponyms. Co-hyponyms are labelled as such when separate hyponyms share the same hypernym but are not hyponyms of one another, unless they happen to be synonymous. For example, screwdriver, scissors, knife, and hammer are all co-hyponyms of one another and hyponyms of tool, but not hyponyms of one another: *‘A hammer is a type of knife’ is false.

Co-hyponyms are often but not always related to one another by the relation of incompatibility. For example, apple, peach and plum are co-hyponyms of fruit. However, an apple is not a peach, which is also not a plum. Thus, they are incompatible. Nevertheless, co-hyponyms are not necessarily incompatible in all senses. A queen and mother are both hyponyms of woman but there is nothing preventing the queen from being a mother. This shows that compatibility may be relevant.
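Co-hyponymy, as described above, can also be sketched against a toy lexicon (again an illustrative dictionary, with one direct hypernym per word as a simplifying assumption):

```python
# Direct hyponym -> hypernym edges for a small fruit example.
HYPERNYM_OF = {
    "apple": "fruit",
    "peach": "fruit",
    "plum": "fruit",
    "fruit": "food",
}

def are_co_hyponyms(x, y):
    """True if x and y are distinct words sharing the same direct hypernym."""
    return (x != y
            and x in HYPERNYM_OF
            and y in HYPERNYM_OF
            and HYPERNYM_OF[x] == HYPERNYM_OF[y])

assert are_co_hyponyms("apple", "peach")     # both direct hyponyms of "fruit"
assert not are_co_hyponyms("apple", "food")  # "food" is a hypernym, not a sibling
```

This captures only the structural side of co-hyponymy; whether two co-hyponyms are incompatible (apple vs. peach) or compatible (queen vs. mother) is a matter of sense, not of the hierarchy alone.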



In the philosophy of science, falsifiability or refutability is the capacity for a statement, theory or hypothesis to be contradicted by evidence. For example, the statement "All swans are white" is falsifiable because one can observe that black swans exist.

Falsifiability was introduced by the philosopher of science Karl Popper in his book Logik der Forschung (1934, revised and translated into English in 1959 as The Logic of Scientific Discovery). He proposed it as the cornerstone of a solution to both the problem of induction and the problem of demarcation.

Popper argued for falsifiability and contrasted it with the intuitively similar concept of verifiability. Whereas verifying the claim "All swans are white" would require assessment of all swans, which is not possible, the single observation of a black swan is sufficient to falsify it.

As a key notion in the separation of science from non-science and pseudo-science, falsifiability has featured prominently in many scientific controversies and applications, even being used as legal precedent.



Dialectical logic is the system of laws of thought, developed within the Hegelian and Marxist traditions, which seeks to supplement or replace the laws of formal logic. The precise nature of the relation between dialectical and formal logic was hotly debated within the Soviet Union and China.

Contrasting with the abstract formalism of traditional logic, dialectical logic in the Marxist sense was developed as the logic of motion and change and used to examine concrete forms. Its proponents claim it is a materialist approach to logic, drawing on the objective, material world.

Stalin argued in his "Marxism and Problems of Linguistics" that there was no class content to formal logic and that it was an acceptable neutral science. This led to the insistence that there were not two logics, but only formal logic. The analogy used was the relation of elementary and higher mathematics. Dialectical logic was hence concerned with a different area of study from that of formal logic.

The main consensus among dialecticians is that dialectics do not violate the law of contradiction of formal logic, although attempts have been made to create a paraconsistent logic.

Some Soviet philosophers argued that the materialist dialectic could be seen in the mathematical logic of Bertrand Russell; however, this was criticized by Deborin and the Deborinists as panlogicism.

Evald Ilyenkov held that logic was not a formal science but a reflection of scientific praxis and that the rules of logic are not independent of the content. He followed Hegel in insisting that formal logic had been sublated, arguing that logic needed to be a unity of form and content and to state actual truths about the objective world. Ilyenkov used Das Kapital to illustrate the constant flux of A and B and the vanity of holding strictly to either A or -A, due to the inherent logical contradiction of self-development.

During the Sino-Soviet split, dialectical logic was used in China as a symbol of Marxism–Leninism against the Soviet rehabilitation of formal logic.



Mathematical logic is a subfield of mathematics exploring the applications of formal logic to mathematics. It bears close connections to metamathematics, the foundations of mathematics, and theoretical computer science. The unifying themes in mathematical logic include the study of the expressive power of formal systems and the deductive power of formal proof systems.

Mathematical logic is often divided into the fields of set theory, model theory, recursion theory, and proof theory. These areas share basic results on logic, particularly first-order logic, and definability. In computer science (particularly in the ACM Classification) mathematical logic encompasses additional topics not detailed in this article; see Logic in computer science for those.

Since its inception, mathematical logic has both contributed to, and has been motivated by, the study of foundations of mathematics. This study began in the late 19th century with the development of axiomatic frameworks for geometry, arithmetic, and analysis. In the early 20th century it was shaped by David Hilbert's program to prove the consistency of foundational theories. Results of Kurt Gödel, Gerhard Gentzen, and others provided partial resolution to the program, and clarified the issues involved in proving consistency. Work in set theory showed that almost all ordinary mathematics can be formalized in terms of sets, although there are some theorems that cannot be proven in common axiom systems for set theory. Contemporary work in the foundations of mathematics often focuses on establishing which parts of mathematics can be formalized in particular formal systems (as in reverse mathematics) rather than trying to find theories in which all of mathematics can be developed.



Dialectic or dialectics (Greek: διαλεκτική, dialektikḗ; related to dialogue), also known as the dialectical method, is at base a discourse between two or more people holding different points of view about a subject but wishing to establish the truth through reasoned methods of argumentation. Dialectic resembles debate, but the concept excludes subjective elements such as emotional appeal and the modern pejorative sense of rhetoric. Dialectic may thus be contrasted with both the eristic, which refers to argument that aims to successfully dispute another's argument rather than to search for truth, and the didactic method, wherein one side of the conversation teaches the other. Dialectic is alternatively known as minor logic, as opposed to major logic or critique.

Within Hegelianism, the word dialectic has the specialised meaning of a contradiction between ideas that serves as the determining factor in their relationship. Dialectic comprises three stages of development: first, the thesis, a statement of an idea; second, the antithesis, a reaction that contradicts or negates the thesis; and third, the synthesis, a statement through which the differences between the two points are resolved. Dialectical materialism, a theory or set of theories produced mainly by Karl Marx and Friedrich Engels, adapted the Hegelian dialectic into arguments regarding traditional materialism.

Dialectic tends to imply a process of evolution and so does not naturally fit within formal logic (see Logic and dialectic). This process is particularly marked in Hegelian dialectic, and even more so in Marxist dialectic, which may rely on the evolution of ideas over longer time periods in the real world; dialectical logic attempts to address this.



"Marxism and Problems of Linguistics" (Russian: Марксизм и вопросы языкознания) is an article written by Joseph Stalin, most of which was first published on June 20, 1950 in the newspaper "Pravda" (the "answers" attached at the end came later, in July and August), and was in the same year published as a pamphlet in large numbers.

The article appeared in the context of the last wave of publications by Marrists attacking the "old" linguistics, which had started in Pravda on May 9. Yet, instead of supporting Marrism, Stalin turned the campaign around completely, decisively ending Marrism's acceptability in Soviet science. The "discussion" in the paper lingered a little while longer but brought little that was new, given the impossibility of arguing with Stalin. The next year, the Academy of Sciences published a hardbound volume of commentaries named "Session of the Department of Social Sciences of the Academy of Sciences of the U.S.S.R. Devoted to the Anniversary of the Publication of the Ingenious Work by J. V. Stalin "Marxism and Problems of Linguistics"", with a print run of 10,000.



In logic, the law of non-contradiction (LNC) (also known as the law of contradiction, principle of non-contradiction (PNC), or the principle of contradiction) states that contradictory propositions cannot both be true in the same sense at the same time; e.g., the two propositions "A is B" and "A is not B" are mutually exclusive. Formally this is expressed as the tautology ¬(p ∧ ¬p).

One reason to have this law is the principle of explosion, which states that anything follows from a contradiction. The law is employed in a reductio ad absurdum proof.

To express the fact that the law is tenseless and to avoid equivocation, sometimes the law is amended to say "contradictory propositions cannot both be true 'at the same time and in the same sense'".

It is one of the so-called three laws of thought, along with its complement, the law of excluded middle, and the law of identity. The law of non-contradiction is logically equivalent to the law of excluded middle by De Morgan's laws. However, no system of logic is built on just these laws, and none of these laws provides inference rules, such as modus ponens or De Morgan's laws.
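Both the tautology and its De Morgan equivalence to excluded middle can be verified mechanically by enumerating the two truth values of p, a plain truth-table check not tied to any particular proof system:

```python
# Truth-table check: the law of non-contradiction ¬(p ∧ ¬p) is a tautology,
# and by De Morgan's laws it is equivalent to the law of excluded middle p ∨ ¬p.
for p in (True, False):
    lnc = not (p and not p)   # law of non-contradiction
    lem = p or not p          # law of excluded middle
    assert lnc                # true for every p: a tautology
    assert lnc == lem         # equivalent, by De Morgan's laws
```

De Morgan's law ¬(p ∧ q) ≡ ¬p ∨ ¬q, applied with q = ¬p and double negation, turns ¬(p ∧ ¬p) directly into ¬p ∨ p, which is exactly what the loop confirms.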

The law of non-contradiction and the law of excluded middle create a dichotomy in "logical space", wherein the two parts are "mutually exclusive" and "jointly exhaustive". The law of non-contradiction is merely an expression of the mutually exclusive aspect of that dichotomy, and the law of excluded middle, an expression of its jointly exhaustive aspect.



A paraconsistent logic is an attempt at a logical system to deal with contradictions in a discriminating way. Alternatively, paraconsistent logic is the subfield of logic that is concerned with studying and developing paraconsistent (or "inconsistency-tolerant") systems of logic.

Inconsistency-tolerant logics have been discussed since at least 1910 (and arguably much earlier, for example in the writings of Aristotle); however, the term paraconsistent ("beside the consistent") was not coined until 1976, by the Peruvian philosopher Francisco Miró Quesada Cantuarias.



Aufheben or Aufhebung is a German word with several seemingly contradictory meanings, including "to lift up", "to abolish", "cancel" or "suspend", or "to sublate". The term has also been defined as "abolish", "preserve", and "transcend". In philosophy, aufheben is used by Hegel to explain what happens when an Abstract and a Negative interact, and in this sense is translated mainly as "sublate".
 