Rank #1: "After Philosophy": Introduction (part 1)
The first e* post of the new academic year is a first in another sense. Previously, all my postings here have been research lectures about my own work. This post is a recording of a lecture I gave on October 17th, 2006 as part of a Theoretical Philosophy course on the pioneering Consciousness Studies Program at the University of Skövde, Sweden. That is, it is a teaching lecture (one I have been giving for a few years), aimed at third-year undergraduate students in a course primarily on Modern European (read "Continental") Philosophy. As such, it is not primarily my own work. However, given my rather skewed and limited knowledge of this area, proper scholars of this kind of philosophy will probably see more of me in this lecture than they see of the work of Derrida, Foucault, Gadamer, Habermas, Ricoeur, etc.
The lecture is almost entirely based on the Introduction chapter of After Philosophy: End or Transformation?, edited by Kenneth Baynes, James Bohman, and Thomas McCarthy, and so they deserve credit for most of the ideas presented. My contributions consist primarily of giving examples, plus an extended, perhaps laboured, Bernstein-influenced musicological metaphor that can be summarized in the slogan: "Kant is the Mahler of Philosophy".
This lecture makes poor use of the PodSlide format, going through only 6 slides in 40 minutes. It is actually only the first part of the lecture; part two, which is shorter, will be posted soon.
Nov 13 2013
Rank #2: Finding aesthetic pleasure on the subjective edge of chaos: A proposal for robotic creativity
In the talk, I give the nine axioms that constitute my approach to creating systems that exhibit creativity. The axioms are:
- Axiom 1: If you make your robot pleasure-seeking, and make creativity pleasurable, you'll make your robot creative
- Axiom 2: To be a good creator, it helps to be an appreciator
- Axiom 3: Let the robot experience output in the real world, as we do
- Axiom 4: We won’t like what it likes unless it likes what we like
- Axiom 5: An important motivator is the approval or attention of others
- Axiom 6: Sometimes it is better not to pursue novelty directly, but rather something that is correlated with it
- Axiom 7: Let dynamics play a role in appreciation
- Axiom 8: Patterns in one's own states can be the objects of appreciation
- Axiom 9: The best way to make outputs in the real world is to be embodied in the real world
Nov 13 2013
Rank #3: In defense of transparent computationalism
Abstract of the original 1999 paper:
"A distinction is made between two senses of the claim “cognition is computation”. One sense, the opaque reading, takes computation to be whatever is described by our current computational theory and claims that cognition is best understood in terms of that theory. The transparent reading, which has its primary allegiance to the phenomenon of computation, rather than to any particular theory of it, is the claim that the best account of cognition will be given by whatever theory turns out to be the best account of the phenomenon of computation. The distinction is clarified and defended against charges of circularity and changing the subject. Several well-known objections to computationalism are then reviewed, and for each the question of whether the transparent reading of the computationalist claim can provide a response is considered."
I added to this by claiming that Gödel-style arguments don't show AI is impossible, but rather that the Church-Turing thesis is false. I rejected currently fashionable notions of pan-computationalism in favour of a view that makes having semantic properties essential to computation. I also argued that even if computationalism turns out to be false, it might still be possible for an artificial computational system to have a mind by virtue, at least in part, of the program it is running, since programming a computer not only changes it functionally, but also physically.
Nov 13 2013
Rank #4: Counterfactual computational vehicles of consciousness
In a recent paper (Bishop 2002), Bishop argues against computational explanations of consciousness by confronting them with a dilemma. On a non-causal or weakly causal construal of computation, familiar arguments from Putnam and Searle reveal computation to be too observer-relative to be able to underwrite any law-involving explanation of consciousness. On a (much more plausible) strongly causal notion of computation (Chalmers 1994; Chalmers 1996; Chrisley 1994), computational explanations must advert to non-actual, counterfactual states and state transitions. Working this fact into versions of Fading Qualia and Suddenly Disappearing Qualia arguments, Bishop concludes that strongly causal computationalism cannot be physicalist, in that it maintains that two states may differ only in their non-physical (i.e. counterfactual) properties and yet be phenomenally distinct. I rebut this argument by embracing the second horn, and denying that appeal to non-actual or counterfactual properties is at odds with physicalism; indeed, it is the lifeblood of normal, physical, causal explanation. I further cast doubt on the argument by showing that it is too strong; if it were correct, computational states could not explain anything at all, not even computational phenomena, let alone conscious experience. I show how computational states that differ in their counterfactual properties must, contra Bishop's characterization, differ with respect to some of their actual properties. However, I note that inter-dependencies between current experience and computational state may only be explicable by reference to explicit, counterfactual states rather than the occurrent physical states which realize those dispositional properties.
This is shown to cohere with at least one understanding (Chrisley 2004) of the sensorimotor contingency theory of perceptual experience (O'Regan and Noe 2001), in which expectation is understood as a disposition to produce a computational state corresponding to the sensation one would have if one were to perform a particular action. I conclude by sketching some implications for the search for correlates of experience. The considerations arising out of Bishop's argument show that if computationalism is true, then the search for correlates will fail if it only considers occurrent, non-dispositionally construed physical states at a time to be the possible correlates of the experience being had at that time.
- Bishop, J. M. (2002) "Dancing With Pixies", in Preston, J. and Bishop, J. M. (eds), Views into the Chinese Room, pp. 360-379. Oxford University Press.
- Chalmers, D. J. (1994) "On Implementing a Computation", Minds and Machines 4, pp. 391-402.
- Chalmers, D. J. (1996) "Does a Rock Implement Every Finite-State Automaton?", Synthese 108, pp. 309-333.
- Chrisley, R. (1994) "Why Everything Doesn't Realize Every Computation", Minds and Machines 4:4, pp. 403-420.
- Chrisley, R. (2004) "Perceptual Experience as the Mastery of Sensorimotor Representational Contingencies", abstract in Proceedings of Towards a Science of Consciousness 2004, p. 119.
- O'Regan, K. and Noe, A. (2001) "A Sensorimotor Account of Vision and Visual Consciousness", Behavioral and Brain Sciences 24(5).
Nov 13 2013
Rank #5: Evolving concepts of creativity: A mirror, a tightrope and an inkblot
My main point is that creativity can be seen as the result of maintaining a fruitful tension between:
- Self and environment
- The intuitive and the conceptual
- "Blue-sky" and applied
- Novel and familiar
- Chaos and order
- Disconnection and engagement.
Nov 13 2013
Rank #6: Machine models of consciousness: An ASSC tutorial (part 1)
Some general information about the tutorial can be found at http://www.assc10.org.uk/workshops.html#A1.
- PodSlides: iPod-ready video (.mp4; 40.5 MB; 47 min 25 sec)
- Audio (.mp3; 21.7 MB; 47 min 26 sec)
- PowerPoint file (.ppt; 280 KB)
- "Virtual Machines and Consciousness" (.pdf; 232 KB), a paper by Aaron Sloman and myself from the Journal of Consciousness Studies, that is the basis of the second part of my presentation.
Nov 13 2013
Rank #7: Epistemic blindspot sets: A resolution of Sorensen's strengthened paradox of the surprise examination
Unfortunately, the audio recording levels were too high, so there is a lot of distortion; you may find the podcast too irritating to be listenable. Also, instead of a PowerPoint file of slides, there is a two-page PDF handout.
- Sorensen, R. A. (1986) "A Strengthened Prediction Paradox", Philosophical Quarterly 36, pp. 504-513.
- Sorensen, R. A. (1984) "Conditional Blindspots and the Knowledge Squeeze: A Solution to the Prediction Paradox", Australasian Journal of Philosophy 62, pp. 126-135.
Nov 13 2013
Rank #8: Painting an experience? How aesthetics might assist a neuroscience of sensory experience
- Embodied creativity
- Enactive models of experience
- Synthetic phenomenology
- Interactive empiricism
- Art works/installations
- PodSlides: iPod-ready video (.mp4; 27 MB; 33 min 55 sec)
- Audio (.mp3; 15.6 MB; 33 min 51 sec)
- PowerPoint file (.pptx; 1.5 MB)
Nov 12 2013
Rank #9: Sensory Augmentation, Synthetic Phenomenology and Interactive Empiricism
On Thursday the 26th and Friday the 27th of March, 2009, the e-sense project hosted the Key Issues in Sensory Augmentation Workshop at the University of Sussex. I was invited to speak at the workshop; my position statement (included below) serves as a good (if long) summary of my talk.
- PodSlides: iPod-ready video (.mp4; 19.3 MB; 26 min 25 sec)
- Audio (.mp3; 11.7 MB; 25 min 21 sec)
- PowerPoint file (.ppt; 696 KB)
- Position statement (.pdf; 78 KB)
How can empirical experiments with sensory augmentation devices be used to further philosophical and psychological enquiry into cognition and perception?
The use of sensory augmentation devices can play a crucial role in overcoming conceptual roadblocks in philosophy of mind, especially concerning our understanding of conscious experience and perception. The reciprocal design/use cycle of such devices might facilitate the kind of conceptual advance that is necessary for progress toward a scientific account of consciousness, a kind of advance that is not possible to induce, it is argued, through traditional discursive, rhetorical and argumentative means.
It is proposed that a philosopher's experience of using sensory augmentation devices can play a critical role in the development of their concepts of experience (Chrisley, Froese & Spiers 2008). The role of such experiences is not the same as the role of, say, experimental observation in standard views of empirical science. On the orthodox view, an experiment is designed to test a (propositionally stated) hypothesis. The experiences that constitute the observational component of the experiment relate in a pre-determined, conceptually well-defined way to the hypothesis being tested. This is strikingly different from the role of experience emphasized by interactive empiricism (Chrisley 2010a; Chrisley 2008), in which the experiences transform the conceptual repertoire of the philosopher, rather than merely providing evidence for or against an empirical, non-philosophical proposition composed of previously possessed concepts.
A means of evaluation is needed to test the effectiveness of the device with respect to the goals of interactive empiricism and conceptual change. Experimental philosophy (Nichols 2004) looks at the way in which subjects' philosophical views (usually conceived as something like degree of belief in a proposition) change as various contingencies related to the proposition change (e.g., how does the way one describes an ethical dilemma change subjects' moral judgements of the various actions in that situation? cf., e.g., Knobe 2005). One could apply this technique directly, by empirically investigating how use of sensory augmentation devices affects subjects' degree of belief in propositions concerning the nature of perceptual experience. However, it would be more in keeping with the insights of interactive empiricism if such experiments measured behaviour other than verbal assent to or dissent from propositions, such as reaction times and errors in classification behaviour. This might allow one to detect changes in subjects' conceptions of the domain that are not reportable or detectable by more propositional, self-reflective means.
Are there rigorous techniques that can characterise the subjective experience of using sensory augmentation technology?
Synthetic phenomenology is 1) any attempt to characterize the phenomenal states possessed, or modelled by, an artefact (such as a robot); or 2) any attempt to use an artefact to help specify phenomenal states (independently of whether such states are possessed by a naturally conscious being or an artefact) (Chrisley 2009; Chrisley 2010b; Chrisley 2008). Although "that" clauses, such as “Bob believes that the dog is running”, work for specifying the content of linguistically and conceptually structured mental states (such as those involved in explicit reasoning, logical thought, etc.), there is reason to believe that some aspects of mentality (e.g., some aspects of visual experience) have content that is not conceptually structured. Insofar as language carries only conceptual content, “that” clauses will not be able to specify the non-conceptual content of experience. An alternative means, such as synthetic phenomenology, is needed.
Earlier (Chrisley 1995), I had suggested that we might use the states of a robotic model of consciousness to act as specifications of the contents of the modelled experiences. This idea has been developed for the case of specifying the non-conceptual content of visual experiences in the SEER-3 project (Chrisley and Parthemore 2007a; Chrisley and Parthemore 2007b). Specifications using SEER-3 rely on a discriminative theory of visual experience based on the notion of enactive expectations (expectations the robot has to receive a particular input were it to move in a particular way). Depictions of the changing expectational state of the robot can be constructed in real time, depictions that require the viewer themselves to deploy sensorimotor skills of the very kind that the theory takes to be essential to individuating the specified content. Thus, the viewer comes to know the discriminating characteristics of the content in an intuitive way (in contrast to, say, reading a list of formal statements, each referring to one of the millions of expectations the robotic system has).
Just as SEER-3 models, and permits the specification of, experiences in a modality we naturally possess (vision), so might other robotic systems, equipped with sensors that do not correspond to anything in the natural human sensory repertoire, model and permit the specification of other experiential states. As with the case of visual experience, specification cannot consist in a mere recording or snapshot of the sensor state at any moment, nor even in a sequence of such snapshots. Rather, the specification must be dynamically generated in response to the specification consumer’s probing of the environment (virtual or real), with the sensor values being altered in a way that compensates for both the subjectivity of the experience being specified, and that of the recipient herself.
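The core data structure behind an enactive expectation can be sketched in code. The following toy model is entirely hypothetical (it is not the SEER-3 implementation, and the action and sensation names are invented for illustration); it shows only the abstract idea that an expectational state is a mapping from possible actions to the sensations the agent would expect were it to perform them, so that two states can agree in an occurrent snapshot yet differ in their counterfactual profile.

```python
# Hypothetical sketch of an enactive-expectation state: a mapping from
# actions the agent could take to the sensory input it expects to
# receive were it to take them. Not actual SEER-3 code.

from dataclasses import dataclass, field


@dataclass
class EnactiveExpectation:
    """Expectations conditional on action: action -> predicted sensation."""
    predictions: dict = field(default_factory=dict)

    def expect(self, action: str, sensation: str) -> None:
        """Record that performing `action` is expected to yield `sensation`."""
        self.predictions[action] = sensation

    def probe(self, action: str):
        """A specification consumer probes by proposing an action; the
        state answers with the sensation the agent would expect."""
        return self.predictions.get(action)


# Two states that agree on one probe can still differ in their wider
# counterfactual (expectational) profile:
state_a = EnactiveExpectation()
state_a.expect("saccade_left", "red")
state_a.expect("saccade_right", "grey")

state_b = EnactiveExpectation()
state_b.expect("saccade_left", "grey")
state_b.expect("saccade_right", "grey")

# Same answer to one probe, different answers to another: the states are
# individuated by how they would respond, not by any single snapshot.
assert state_a.probe("saccade_right") == state_b.probe("saccade_right")
assert state_a.probe("saccade_left") != state_b.probe("saccade_left")
```

The point of the sketch is only that what distinguishes the two states is dispositional: no single occurrent probe result separates them, but their full action-conditional profiles do.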
- Chrisley, R. (2010a, in press) "Interactive empiricism: the philosopher in the machine", in McCarthy, N. (ed.), Philosophy of Engineering: Proceedings of a Series of Seminars held at The Royal Academy of Engineering. London: Royal Academy of Engineering. http://www.sussex.ac.uk/Users/ronc/papers/interactive-empiricism.pdf
- Chrisley, R. (2010b, in preparation) "Synthetic phenomenology". Scholarpedia. http://www.scholarpedia.org/article/Synthetic_phenomenology.
- Chrisley, R. (2009) "Synthetic Phenomenology", International Journal of Machine Consciousness 1:1. http://www.sussex.ac.uk/Users/ronc/papers/synthetic-phenomenology-ijmc.pdf
- Chrisley, R. (2008) "Philosophical foundations of artificial consciousness". Artificial Intelligence In Medicine 44:119-137. doi:10.1016/j.artmed.2008.07.011; http://www.sussex.ac.uk/Users/ronc/papers/phil-founds-artificial-consciousness.pdf
- Chrisley, R., Froese, T., & Spiers, A. (2008) "Engineering conceptual change: The Enactive Torch". Abstract of talk given November 11th, 2008, at the Royal Academy of Engineering as part of the 2008 Workshop on Philosophy and Engineering. http://www.sussex.ac.uk/Users/ronc/e-asterisk/WPE2008-Chrisley.pdf
- Chrisley, R. and Parthemore, J. (2007a) "Robotic specification of the non-conceptual content of visual experience". In Proceedings of the AAAI Fall Symposium on "Consciousness and Artificial Intelligence: Theoretical foundations and current approaches". AAAI Press. http://www.consciousness.it/CAI/online_papers/Chrisley.pdf
- Chrisley, R. and Parthemore, J. (2007b) "Synthetic phenomenology: Exploiting embodiment to specify the non-conceptual content of visual experience". Journal of Consciousness Studies 14 pp. 44-58. http://www.sussex.ac.uk/Users/ronc/papers/ChrisleyandParthemore-SyntheticPhenomenology.pdf
- Chrisley, R. (1995) "Taking Embodiment Seriously: Non-conceptual Content and Robotics," in Ford, K., Glymour, C. and Hayes, P. (eds.) Android Epistemology. Cambridge: AAAI/MIT Press, pp 141-166. http://www.sussex.ac.uk/Users/ronc/papers/ae-embodiment.pdf
- Knobe, J. (2005). "Theory of Mind and Moral Cognition: Exploring the Connections", Trends in Cognitive Sciences 9, pp 357-359.
- Nichols, S. (2004). "Folk concepts and intuitions: From philosophy to cognitive science", Trends in Cognitive Sciences 8:11, pp 514-518.
Nov 12 2013
Rank #10: Concepts and Proto-Concepts in Cognitive Science (part 2)
Nov 12 2013