Monday, September 19, I was in Delft for a meeting with researchers from Delft University of Technology (TU Delft) and two Australian researchers. Present were:
Seumas Miller – Australian National University – Centre for Applied Philosophy and Public Ethics; privacy issues
Cathy Flick – Charles Sturt University – ICT industry, codes of ethics, policies, trust, privacy
Jeroen van den Hoven – TU Delft; professor of Ethics of Technology; privacy issues, moral identity
Alper Cugun – MSc student TU Delft; social issues, online communities
Noëmi Manders – PhD researcher TU Delft; ethical aspects of identity management
Bibi van den Berg – PhD EUR; ambient intelligence
Jeroen Timmermans – PhD EUR; playful identities
Michiel de Lange – PhD EUR; playful identities
Some interesting topics were discussed; the full notes from the meeting follow below:
// Jeroen vd Hoven:
Privacy issues & moral identity: the right to write your own biography (Isaiah Berlin).
With more and more data, identities are being constructed about people on the basis of database information.
Popper: 3 worlds
1. someone interrupting and interfering in your autonomous thoughts
2. increasing your self-awareness by ‘looking at you’ – new perspective
3. information gets stored in databases > who will see it? Quasi-independent information objects lead a life of their own.
Institutions can constrain information (physical, deontic, epistemic) – moral justification:
– preventing harm (cf. the Nazis in the Netherlands using lists of Jews)
– transparency – personal data has become commodified and is being sold
– information must remain in its original context; it can be used in discriminatory ways if it leaks into other domains.
– One can only write one's own moral biography. Other institutions/persons shouldn't have access to information that could picture your Self. [One cannot know what it is like to be you; persons change and develop their personalities.] Privileged access only for the subject.
We expect modesty from others in judging us and claiming "I know you".
Q: Is a breach of privacy justified when you suspect someone of doing something evil and you are aiming for the truth, but it is difficult to get at?
Q: What latitude should people have to keep aspects of their life private?
> people need a ‘time-out’ from morality to experiment with it and develop their own moral identity.
[individualist perception of identity; atomistic vs. collective image]
> We reveal more of ourselves to invite others to assist us in identity creation.
[presupposing an autonomous kernel that knows itself].
[the individual that constructs itself can do so in language that is not at all useful for others to comprehend – Wittgenstein]
Q: How can you describe the boundaries of what others may know about you (just like ‘social space’)? A: One can discern information as ‘sensitive’ when it can be used to describe your moral identity. Q: But what is ‘sensitive information’? It differs from person to person. Stereotypes can quickly emerge.
ambient intelligence: personalised space.
Q: to what extent can you expect a technology to ‘know you’ and adapt to your personality? Is the Self becoming a set of preferences?
concept of ‘strangeness’ of places: in a globalised world there are a lot of places that are familiar: the same everywhere. No room anymore for discomfort (‘ongemak’) and the unexpected, unmediated experience. Sometimes persons are formed by uncontrollable events.
Q: what if there are many different people with all their own preferences present in one place?
Q: what does collective experience mean when all places are personalised?
Q: when does it turn off? how does it respond to your changing preferences?
Q: Responsibility for experiences is being handed over to technologies. Who is responsible for the quality of these experiences?
Trusted computing: a chip on the motherboard, booting in ‘trusted mode’. Trusted modes are becoming ubiquitous: everyone needs to have a ‘trusted computer’. Large corporations are developing this (a conceptual sketch of the boot measurement idea follows the list below). Possible problems:
– TC could be used to enforce DRM
– Technical specs change often
– MS has left the open standards group (TCG) that was originally set up
– it can also be used for ‘trusted communication’ between terrorists; MS has not fully incorporated it in Vista.
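To make the ‘trusted mode’ idea a bit more concrete, here is a minimal, hypothetical sketch of a boot measurement chain, the kind of mechanism the chip on the motherboard supports. The function name, stages and data are mine and purely illustrative; a real TPM performs the ‘extend’ step in hardware and the actual TCG specification differs in detail.

import hashlib

def extend(register: bytes, component: bytes) -> bytes:
    # Fold a measurement (hash) of the next boot component into the register.
    measurement = hashlib.sha256(component).digest()
    return hashlib.sha256(register + measurement).digest()

# Start from an all-zero register and measure each stage of the boot chain.
pcr = bytes(32)
for stage in (b"firmware image", b"bootloader image", b"os kernel image"):
    pcr = extend(pcr, stage)

# A remote party comparing this value against an expected one decides whether
# to treat the machine as having booted in 'trusted mode'.
print(pcr.hex())

Because each value depends on every previous one, changing a single component changes the final value; that is what makes the ‘trusted’ judgement possible, and also what allows the same mechanism to be used to enforce DRM.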
// Seumas Miller
individual action – collective action (behaviour/cognitive). How can this be morally significant?
– assertions you don’t have the right to have
joint procedural mechanism: not one person, not a collective, but through adding/combining different databases to create profiles > who is responsible? (see the sketch below)
Q: Should we make a distinction between responsibility and accountability?
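As a small illustration of the joint procedural mechanism mentioned above: in the hypothetical sketch below, neither database contains a profile on its own, but combining them produces one that no single data holder ever asserted. All names and records are invented for illustration.

medical_db = {"alice": {"condition": "diabetes"}}
retail_db = {"alice": {"purchases": ["sugar-free sweets", "pen needles"]}}

def build_profile(person: str) -> dict:
    # Merge the records about one person from independently collected databases.
    profile = {}
    for db in (medical_db, retail_db):
        profile.update(db.get(person, {}))
    return profile

# The resulting profile exists only because the databases were combined;
# no single holder ever asserted it, which is exactly what raises the
# question of who is responsible (or accountable) for it.
print(build_profile("alice"))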