
Understanding the Tacit

My chum Steve Turner has a new book out. It holds much Oakeshott interest, and, as many will know, Steve has been a longstanding Oakeshott commentator. For me, one of his key articles is “Tradition and Cognitive Science: Oakeshott’s Undoing of the Kantian Mind”, a piece I reference quite regularly. OK, so the book is ridiculously expensive – just put in a request to your library. Here is an excerpt: “Tacitness in Practice Theory: Practices Then and Now.”



EPISTEME 9.4 now available

This marks the first year we have published on a quarterly cycle, and, compared with most journals, we are up to date with no backlog: contents and abstracts

EVIDENCE AND INTUITION
Yuri Cath

Many philosophers accept a view – what I will call the intuition picture – according to which intuitions are crucial evidence in philosophy. Recently, Williamson (2004, 2007: ch. 1) has argued that such views are best abandoned because they lead to a psychologistic conception of philosophical evidence that encourages scepticism about the armchair judgements relied upon in philosophy. In this paper I respond to this criticism by showing how the intuition picture can be formulated in such a way that: (i) it is consistent with a wide range of views about not only philosophical evidence but also the nature of evidence in general, including Williamson’s famous view that E = K; (ii) it can maintain the central claims about the nature and role of intuitions in philosophy made by proponents of the intuition picture; (iii) it does not collapse into Williamson’s own deflationary view of the nature and role of intuitions in philosophy; and (iv) it does not lead to scepticism.

REGULARITY REFORMULATED
Weng Hong Tang

This paper focuses on the view that rationality requires that our credences be regular. I go through different formulations of the requirement and show that they face several problems. I then formulate a version of the requirement that solves most, if not all, of these problems. I conclude by showing that an argument thought to support the requirement as traditionally formulated actually does not; if anything, the argument, slightly modified, supports my version of the requirement.

THREE FORMS OF INTERNALISM AND THE NEW EVIL DEMON PROBLEM
Andrew Moon

The new evil demon problem is often considered to be a serious obstacle for externalist theories of epistemic justification. In this paper, I aim to show that the new evil demon problem (‘NEDP’) also afflicts the two most prominent forms of internalism: moderate internalism and historical internalism. Since virtually all internalists accept at least one of these two forms, it follows that virtually all internalists face the NEDP. My secondary thesis is that many epistemologists – including both internalists and externalists – face a dilemma. The only form of internalism that is immune to the NEDP, strong internalism, is a very radical and revisionary view – a large number of epistemologists would have to significantly revise their views about justification in order to accept it. Hence, either epistemologists must accept a theory that is susceptible to the NEDP or accept a very radical and revisionary view.

JUSTIFICATION AS ‘WOULD-BE’ KNOWLEDGE
Aidan McGlynn

In light of the failure of attempts to analyse knowledge as a species of justified belief, a number of epistemologists have suggested that we should instead understand justification in terms of knowledge. This paper focuses on accounts of justification as a kind of ‘would-be’ knowledge. According to such accounts a belief is justified just in case any failure to know is due to uncooperative external circumstances. I argue against two recent accounts of this sort due to Alexander Bird and Martin Smith. A further aim is to defend a more traditional conception, according to which justification is a matter of sufficiently high evidential likelihood. In particular, I suggest that this conception of justification offers a plausible account of lottery cases: cases in which one believes a true proposition – for example that one’s lottery ticket will lose – on the basis of probabilistic evidence.

EVIDENCE OF EVIDENCE AND TESTIMONIAL REDUCTIONISM
William D. Rowley

An objection to reductionism in the epistemology of testimony that is often repeated but rarely defended in detail is that there is not enough positive evidence to provide the non-testimonial, positive reasons reductionism requires. Thus, on pain of testimonial skepticism, reductionism must be rejected. Call this argument the ‘Not Enough Evidence Objection’ (or ‘NEEO’). I will defend reductionism about testimonial evidence against the NEEO by arguing that we typically have non-testimonial positive reasons in the form of evidence about our testifier’s evidence. With a higher-level evidence principle borrowed from recent work on the epistemology of disagreement, I argue that, granting some plausible assumptions about conversational norms, the NEEO is unsound.

THE DANGERS OF USING SAFETY TO EXPLAIN TRANSMISSION FAILURE: A REPLY TO MARTIN SMITH
Chris Tucker

Many epistemologists hold that the Zebra Deduction (the animals are zebras, so they aren’t cleverly disguised mules) fails to transmit knowledge to its conclusion, but there is little agreement concerning why it has this defect. A natural idea is, roughly, that it fails to transmit because it fails to improve the safety of its conclusion. In his ‘Transmission Failure Explained’, Martin Smith defends a transmission principle which is supposed to underwrite this natural idea. There are two problems with Smith’s account. First, Smith’s argument for his transmission principle relies on a dubious premise (§1). Second, even if his transmission principle is true, Smith shows neither that it prevents the Zebra Deduction from transmitting knowledge to its conclusion, nor that it secures the natural idea (§2). I suspect that the failures of Smith’s account will be instructive for anyone who wants to connect transmission failure with a failure to enhance the safety, reliability or probability of one’s conclusion.


What to Believe Now: Applying Epistemology to Contemporary Issues

Yet another strong Wiley title. David Coady also did a fine job of guest editing EPISTEME for a themed issue on Conspiracy Theories (aside from Harry Frankfurt’s little book, where else in mainstream academia would a title feature the word “shit” so prominently? – see Pete Mandik’s paper).


Symposium on Pragmatic Encroachment

Two free discussion papers from EPISTEME 9:1

EMPIRICAL TESTS OF INTEREST-RELATIVE INVARIANTISM
Chandra Sekhar Sripada and Jason Stanley

According to Interest-Relative Invariantism (IRI), whether an agent knows that p, or possesses other sorts of epistemic properties or relations, is in part determined by the practical costs of being wrong about p. Recent studies in experimental philosophy have tested the claims of IRI. After critically discussing prior studies, we present the results of our own experiments that provide strong support for IRI. We discuss our results in light of complementary findings by other theorists, and address the challenge posed by a leading intellectualist alternative to our view.

PRAGMATIC ENCROACHMENT: IT’S NOT JUST ABOUT KNOWLEDGE
Jeremy Fantl and Matthew McGrath

There is pragmatic encroachment on some epistemic status just in case whether a proposition has that status for a subject depends not only on the subject’s epistemic position with respect to the proposition, but also on features of the subject’s non-epistemic, practical environment. Discussions of pragmatic encroachment usually focus on knowledge. Here we argue that, barring infallibilism, there is pragmatic encroachment on what is arguably a more fundamental epistemic status – the status a proposition has when it is warranted enough to be a reason one has for believing other things.


Science, the Market and Iterative Knowledge

The second paper co-authored with Dave Hardwick has now been published in Studies in Emergent Order:

Abstract: In a recent paper (Hardwick & Marsh, in press) we examine the recent tensions between two broadly successful spontaneous orders, namely the Market and Science. We argued for an epistemic pluralism, the view that freedom and liberty (indeed the very concepts of liberalism and civil society) exist at the nexus of a manifold of spontaneous forces, and that no single epistemic system should dominate. We also briefly introduced the concept of “iterative” knowledge to characterize the essentially dynamic nature of scientific knowledge. Herein lies a tension. The Market (and perhaps the prevailing culture at large) sees scientific knowledge in cumulative terms, that is, progressing to a conclusion in a linear fashion. This relatively static understanding of medical science as it relates to pharmaceutical studies can have a corrosive effect on the practice of medicine and ultimately, we believe, on the proper functioning of the market itself. In this paper we examine this tension in much closer detail by focusing upon the demands of the market, specifically the pharmaceutical industry, and the science upon which it is based. In other words, we expound upon a clash of epistemic values – one (science) that sees knowledge as essentially iterative (dynamic yet tentative) and the other (the Market) that harvests conclusive scientific knowledge (ostensibly as a fixed and firm commodity) functional to its own interests. Clinical trials that are sharply focused with precisely determined deliverables often manifest this tension in the sharpest relief. As a means of recovering drug development and testing costs, conclusive assessment is required to avoid creating serious financial problems for the companies themselves, not to mention issues in the public interest.


Cognitive Opening and Closing: Toward an Exploration of the Mental World of Entrepreneurship

Here is Thierry Aimar’s intro to his paper for Hayek in Mind.

Contemporary analysis usually divides games of chance into three dimensions. In Machina and Schmeidler’s (1992) terms, this division can be illustrated with the example of an urn containing 90 balls of different colors, from which an agent pulls a ball whose color he must guess ex ante to achieve a predetermined gain. If the agent knows that the number of red, white, and black balls is the same (30), he finds himself in a situation of risk: He knows the possible consequences and the probability distributions, that is, he has a one-in-three chance of getting a ball of any particular color. However, if he knows that these balls are red, white, and black, but in indefinite proportions, he is confronted with a situation of uncertainty: The consequences are known, but the probability distributions are not. Again, if the agent knows there are 90 balls of different colors in the urn but does not know how many of these colors there are, he is in a state of incomplete information: The agent is unable to define the list of possible outcomes (a situation of ambiguity) and can expect some surprises identifiable ex ante, as the states of nature are identifiable. An extra dimension may be added to this distinction: If the agent has himself placed 30 red balls in the urn, but does not know what other elements of indefinite character and number there are in it, nor the structure of gains or losses associated with various results, then we can consider that the agent is in a position of ignorance. Not only is he unable to define the list of consequences of the game, but he also does not know the distribution of events. The agent is able to define what he knows, but unlike the three previous cases, he cannot determine the scope and nature of what he does not know. The surprise is necessarily unexpected in the sense that the agent is unable to identify ex ante the possible states of nature.
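The urn taxonomy above lends itself to a quick simulation. Here is a minimal Python sketch (the function and color names are illustrative, not from the text) of the first case, risk: the composition of the urn is known, so the agent can compute the odds exactly, and a Monte Carlo estimate should recover the one-in-three figure.

```python
import random

def estimate_guess_probability(counts, guess, trials=100_000):
    """Estimate the chance that guessing `guess` matches a random draw
    from an urn described by `counts` (a color -> number-of-balls map)."""
    urn = [color for color, n in counts.items() for _ in range(n)]
    hits = sum(random.choice(urn) == guess for _ in range(trials))
    return hits / trials

random.seed(42)  # fixed seed so the estimate is reproducible

# Risk: composition is known (30 red, 30 white, 30 black),
# so the true probability of any color is exactly 30/90 = 1/3.
risk_urn = {"red": 30, "white": 30, "black": 30}
p = estimate_guess_probability(risk_urn, "red")
```

In the uncertainty, ambiguity, and ignorance cases no such `counts` map is available to the agent, which is precisely the point of the taxonomy: the simulation is only writable when the probability distribution is known.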
It is in this latter perspective that Kirzner (1973, 1979, 1982) argues that market actors face a phenomenon of ‘‘genuine ignorance,’’ reflecting their inability to know all the opportunities for exchange or profit available in an economy. At any point in time, each individual perceives only fragmentary aspects of the social reality in which he participates, and not its other facets. Each exchange is made in ignorance of other exchanges performed at the same time; thus, there is no common knowledge of prices and no actor can perceive the whole. In a monetary economy, the consequences of these independent exchanges are mutually dependent. The implications of this genuine ignorance for the coordination of activities are thus considerable. Returning to Machina and Schmeidler’s (1992) urn: once the consequences of a draw for each individual depend on the (unknown) number of elements (of unknown character) deposited in the urn by an (unknown) number of (unknown) people, the ability of such a game to produce an equilibrium is at least questionable. The stakes of this phenomenon of ignorance compel us to identify its sources. These are not to be found in any complexity of information, nor in the cost of its acquisition or its treatment (deliberation) from a perspective of bounded rationality. They come from a more fundamental phenomenon of dynamic subjectivism. According to authors such as Kirzner (1973, 1979, 1997) and Lachmann (1977, 1986), agents’ preferences, endowments, knowledge, and strategies should be understood as personal and unique. Therefore, each individual is a priori ignorant of how others evaluate goods and services. Economic analysis cannot therefore assume a perfect, or even sufficient, knowledge among actors to coordinate their activities. The diversity of actors’ preferences, interpretations, and expectations would certainly not be a problem if they were constants.
A process of trial and error would lead to new learning, opening onto a price structure that would allow coordination. But this is in fact not the case, because individual preferences change continuously, according to an endogenous process ultimately explained by internal self-ignorance (Aimar, 2008a). As Hayek (1951a, 1951b) explained, the actor can only partially perceive the existing opportunities for satisfaction, for reasons related to the organization of the human brain and the tacit character of knowledge. Because his conscious choices are ignorant of a portion of his subjectivity, he makes mistakes, expressed as disappointments of satisfaction. He undergoes a de facto internal discoordination, forcing him to change his representations to make his beliefs conform to the reality of his interior environment. But changing choices transforms his internal environment and de facto creates new unknown areas. The mind, constantly evading the consciousness’s desire to fully absorb it, makes the process of self-discovery never-ending. Thus, market discoordination, the result of genuine ignorance, is ultimately but an internal discoordination, the consequence of a phenomenon of self-ignorance. It was around this phenomenon of genuine ignorance and its perverse effects on coordination between individuals that Kirzner introduced the theme of entrepreneurship. The entrepreneurial function, driven by the incentive of profit, is to discover unperceived opportunities. Mobilizing qualities of alertness, reflected in cognitive openness, it reveals previously hidden information. As his discoveries are translated into new monetary transactions, the entrepreneur socializes his knowledge and contributes to pulling market activities toward coordination. He goes beyond reducing ignorance; he transforms ignorance into uncertainty.
But according to Kirzner, a parallel mission of the entrepreneur is to organize already discovered opportunities in the form of firms’ production plans, in order to protect them from the risk of obsolescence resulting from the volatility of data. In a dynamic world, discovery and exploitation of opportunity are then the two faces of entrepreneurship. The author argues that these two dimensions may be contradictory in the entrepreneurial mind. As much as discovery implies a cognitive opening to the outside, all exploitation of discovered opportunities is accompanied by elements of mental rigidity. These take the form of cognitive closure, thus opposing the entrepreneur’s perception of new opportunities. The aim of this contribution is to illuminate the structure of this contradiction through economic analysis, to provide the means to verify it through experimental economics, and to consider its extensions in terms of neuroeconomics. Our plan is this: After explaining the basics of the theory of entrepreneurship and the elements that determine its duality, we will define the bases for an experimental protocol likely to support our thesis of an opposition, in the cognitive field, between the relative strengths of discovery and the exploitation of opportunities in the entrepreneurial mind. The last section concludes.


Collective Intelligence 2012

Just under a week until the CI2012 shindig – as it so happens I’m busy co-writing a paper and co-editing a themed issue of Cognitive Systems Research on a species of CI – surprise, surprise “stigmergy.”


Kuhn’s Evolutionary Social Epistemology

Here’s a review of K. Brad Wray’s Kuhn’s Evolutionary Social Epistemology. (Wray, by the way, has been a strong contributor to EPISTEME). It’s also worth checking out Alexander Bird’s entry on Kuhn for Stanford Encyclopedia of Philosophy. Kuhn is one of those thinkers whose work has been tarnished by academics who need an off-the-peg philosophical outlook to paper over the lack of a critical philosophical culture.

Thomas Kuhn’s work occupies a strange place in the history of philosophy. With over one million copies sold, Kuhn’s Structure of Scientific Revolutions (1962) is probably the most popular academic philosophy book of the twentieth century. Yet, despite its intuitive appeal, Kuhn’s work has been received very critically by philosophers themselves. Almost fifty years later, Brad Wray wants to move past the popular negative reading of Kuhn and searches for positive insights in his work.


Knowledge Has Always Been Networked

Here is a rather scathing review of David Weinberger’s Too Big To Know: Rethinking Knowledge Now that the Facts Aren’t the Facts, Experts Are Everywhere, and the Smartest Person in the Room Is the Room.

The renaissance of Marshall McLuhan in the era of the Web is disappointing for a number of reasons, not the least of which is its rather dull obviousness. There is little surprise that the quotable, evidence-free, technology-obsessed Canadian English professor would thrive in a technology-obsessed era where pithy quotes about the deep meaning of digital devices too often stand in for evidence. McLuhan, of course, was the master theorist of the medium; beyond the over-used “medium is the message,” McLuhan’s major insight was to argue that socio-technological systems — such as the media — operate on a grand scale, largely independent of the day-to-day interest us mere mortals might have in their actual content. McLuhan’s primary flaw, on the other hand, was to decouple this understanding of socio-technical systems from any relationship to economics, politics, or society. As leading communications theorist James Carey put it, “McLuhan sees the principal effect [of communication technology] as impacting sensory organization and thought. McLuhan has much to say about perception and thought but little to say about institutions.”

German philosopher Martin Heidegger is less quoted in Silicon Valley than Marshall McLuhan, and not just because he was a Nazi. McLuhan and Heidegger are equally poor writers, but whereas McLuhan’s inscrutable prose has led to him being more read than he ought to be, unintelligibility has had the opposite outcome for Heidegger. A dazzlingly complex philosopher — probably the greatest of the 20th century — the most important aspect of Heidegger’s thought for our purposes is his understanding that human beings (or rather “Dasein,” “being-in-the-world”) are always thrown into a particular context, existing within already existing language structures and pre-determined meanings. In other words, the world is like the web, and we, Dasein, live inside the links.