04/24/2013

Tamara Kayali on Bioethics

Tamara Kayali completed her PhD at Cambridge University in 2011. Her PhD dissertation focused on issues of control, responsibility and the self in depression and used qualitative interviews with women to explore this topic. She completed a Bachelor's in Biotechnology from the Australian National University before studying Bioethics in her Honours year at the Unit […]

00:00:00.000
[MUSIC]
00:00:06.400
This is KZSU Stanford.
00:00:08.440
Welcome to Entitled Opinions.
00:00:12.160
My name is Robert Harrison.
00:00:14.360
We're coming to you from the Stanford campus.
00:00:16.880
[MUSIC]
00:00:29.840
During our long hiatus, I received an email from one of our listeners who wrote me the following.
00:00:41.520
I have been listening to your show for quite some time now.
00:00:45.440
I had been trying for a long time to find a good philosophical podcast,
00:00:50.640
an interesting phenomenon in itself, but had perpetually met with failure until,
00:00:57.120
love at first listen, I found Entitled Opinions.
00:01:01.440
That's one way of defining Entitled Opinions: a show for those who otherwise meet with perpetual failure
00:01:09.040
when they search for a good philosophical podcast.
00:01:13.040
We cover a lot more than philosophy around here, as la brigata knows well.
00:01:18.480
But whether the topic is literature, biology, cosmology, psychiatry,
00:01:24.480
religion, evolution, or corporations, entitled opinions brings an insurgent,
00:01:32.560
extracurricular philosophical reflection to bear on it.
00:01:37.600
Trying to make it real compared to what?
00:01:41.200
That's our spirit.
00:01:42.960
They can eat their dinner on Philosophy Talk, eat their pork and beans.
00:01:48.960
On Entitled Opinions, we eat more chicken than any man ever seen.
00:01:53.840
That's right, the band is at hand, the priest is at the feast, amor is at the door,
00:02:00.880
what gets put on in the east gets taken off here in the west.
00:02:04.880
So you just get here and we'll take care of the rest.
00:02:07.680
[MUSIC]
00:02:47.680
My guest today turns out to be the person who wrote me the email in question.
00:02:53.600
But before we engage her on the topic of bioethics, I would like to take a moment to clarify
00:02:59.600
something I mentioned toward the end of the recent show we aired on Charles Darwin.
00:03:05.520
And which I brought up again during our show with photographer Lena Herzog.
00:03:10.560
It has to do with what I called nature's essayistic manner of going about the business of natural
00:03:17.280
selection. A number of you were intrigued enough to ask me to clarify and elaborate exactly what I
00:03:23.600
meant when I declared that if I were God and my intention were to create the most diversified,
00:03:31.360
complex, wondrous, and beautiful biosphere possible, I could think of no better mechanism to put
00:03:38.320
in place than the creative, albeit cruel process of natural selection.
00:03:46.400
I define that mechanism as essayistic in the sense that at any given moment in the development of
00:03:52.240
an individual species or biogroup, there exists an enormous overload of pullulating genetic
00:04:00.800
possibilities and random mutations, the vast majority of which are not selected successfully,
00:04:08.320
but which nevertheless continue to get generated, discarded, and recombined in the great
00:04:15.440
cauldron of the evolutionary process. Since only a tiny fraction of the proliferating possibilities gets
00:04:22.880
realized, most of nature remains in a state of sheer potential, a potential for life rather than
00:04:30.640
life itself. And if we could do an inventory of this near life, I don't know what else to call it,
00:04:37.920
we would be astonished at how ingenious creative evolution actually is when it comes to monitoring
00:04:44.880
the fierce struggles that take place on the boundary lines between possibility and actuality.
00:04:51.600
The genius of nature is that it really does not know what it is doing when it evolves its species.
00:04:59.840
Not if we conceive of knowledge the way the Western tradition has tended to conceive of it.
00:05:06.240
Aristotle, with his unparalleled common sense, declared that a craftsman must have foreknowledge
00:05:14.400
of what he intends to build if he hopes to bring it into being. Thus, the carpenter must
00:05:21.200
know the form of a table, must have in his mind a concept of table, before he can go about making one
00:05:29.520
out of the wood in his workshop. A house does not come into being by chance but by design.
00:05:39.440
The ontotheological idea of God in the Western tradition is largely that of the ultimate craftsman
00:05:45.760
in whose divine mind the design of creation pre-exists the act of creation.
00:05:52.080
God produces the world through his foreknowledge of its species and his power to actualize them into
00:06:00.000
being from out of the chaos of undifferentiated matter. Grasping the essence and totality of the
00:06:08.000
whole beforehand, God assigns its parts accordingly. This is more or less what Leibniz meant when he declared
00:06:16.400
that we live in the best of all possible worlds, namely that God through his omniscience and goodness
00:06:23.760
actualized le meilleur des mondes, the best of possible worlds, and that he really had no choice
00:06:31.600
in the matter. Nature does something similar but in a very different way, namely without necessity
00:06:40.240
and without foreknowledge. What became clear with Darwin is that nature does not operate like a craftsman
00:06:46.880
or what amounts to the same like the ontotheological God of creation. For nature does not know what
00:06:54.160
it is doing, does not foresee its creative evolution. Instead it has a surcharge of vital options that
00:07:02.960
it casts into the arena of life and its surrounding ring of near life allowing those options to
00:07:10.720
strive for self-realization or self-perpetuation by whatever means available. In other words,
00:07:19.920
Nature assays. It attempts, experiments, tries and ventures the new, the variant, the possible and the
00:07:28.640
unprecedented. Nature is, through and through, possibilitarian, forever incubating, combining, and
00:07:36.720
assaying the range of possibilities. In the Western tradition nature has often been compared to a book.
00:07:46.080
But if nature is like a book, that book is more of a collection of essays than a systematic
00:07:52.000
treatise or a traditional novel, with this difference: its essays are writing themselves without
00:08:00.000
understanding the sentences out of which they are composed. In the realm of nature, making comes first
00:08:07.920
and knowledge, if it comes about at all, comes later. That's why here I would call on the authority
00:08:17.760
of one of the great trustees who sits on the board of Entitled Opinions, Giambattista Vico. Vico wrote a
00:08:26.480
century before Darwin, yet most of the modern science that came after him confirmed Vico's
00:08:32.560
epistemological principle that we only know what we ourselves make. Verum et factum convertuntur:
00:08:41.680
the true and the made are convertible. First we make, then we come to know. Again, that is how
00:08:50.320
natural selection operates. First it makes through trial and error, first it assays, and when one
00:08:57.360
of its essays gets selected, it keeps a genetic code of its own blueprint for further replication.
00:09:03.360
This is what I meant when I spoke about nature's essayistic élan vital.
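What Harrison describes here, generating far more variants than will ever survive, selecting a tiny fraction, and replicating the selected blueprints with fresh variation, is structurally the loop that evolutionary algorithms borrow from natural selection. A minimal sketch of that loop, offered only as an illustration; the population size, mutation rate, and above all the explicit fitness function are invented scaffolding, not anything in the monologue:

```python
import random

# Toy illustration of the "essayistic" loop described above: generate
# an overload of variants, select a tiny fraction, replicate the
# selected blueprints with fresh mutations, discard the rest.

GENOME_LEN = 20   # hypothetical genome size
POP_SIZE = 200    # hypothetical population of "essays"
KEEP = 10         # only a tiny fraction gets selected

def fitness(genome):
    # Placeholder environment that rewards 1-bits. This is the
    # disanalogy: a programmer writes the target down in advance.
    return sum(genome)

def mutate(genome, rate=0.05):
    # Random variation: each bit flips with small probability.
    return [g ^ 1 if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(50):
    population.sort(key=fitness, reverse=True)      # assay the essays
    survivors = population[:KEEP]                   # most are discarded
    population = [mutate(random.choice(survivors))  # selected blueprints
                  for _ in range(POP_SIZE)]         # replicate, varied

print(max(fitness(g) for g in population))  # approaches GENOME_LEN
```

The fitness function is where the analogy breaks, in exactly the sense Harrison insists on: the programmer has foreknowledge of what counts as success, whereas nature, on his telling, does not know what it is doing.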
00:09:09.280
Now on to our show. The guest who joins me on Entitled Opinions today is Tamara Kayali. She is a postdoctoral
00:09:19.040
fellow in the Novel Tech Ethics team at Dalhousie University up in Canada.
00:09:25.440
She recently completed her PhD in the Department of Social Sciences at Cambridge University
00:09:30.800
with a dissertation on issues of control, responsibility, and the self in depression,
00:09:36.720
using qualitative interviews with women to explore her topic.
00:09:40.720
Tamara has lectured in bioethics at Sydney University and the Australian National University.
00:09:48.080
She has a variety of research interests but is primarily interested in neuroethics
00:09:53.520
and the ethics of reproduction. For example, questions such as what policies schools and universities
00:10:00.160
should adopt on the use of cognitive enhancers to improve exam performance as well as issues related
00:10:07.040
to the genetic engineering of children and embryo selection for non-medical reasons.
00:10:11.280
Tamara joins me today to talk about bioethics. We've never done a show on bioethics before,
00:10:19.040
and I for one am looking forward to our conversation, but I can't resist mentioning one other
00:10:24.080
anecdote. In her email to me, Tamara informed me that after finishing her dissertation, she took a
00:10:29.280
part-time job in one of the college libraries at Cambridge and that it was tedious work,
00:10:34.880
work that involved scanning books into the computer system, but she could listen to her iPod
00:10:40.720
while she worked. And I'm quoting her again: I wanted to listen to something intellectual
00:10:45.760
and stumbled upon Entitled Opinions and actually looked forward to work because of it.
00:10:49.920
When I listen to it, I feel that I'm tuning into an underworld of kindred spirits.
00:10:56.400
So here you are. Welcome to the underworld. Thanks very much. It feels unreal to actually be on
00:11:02.800
the other side. Having already listened to a number of shows, Tamara, you know by now that I am
00:11:08.000
extremely wary of what Martin Heidegger called "technicity," above all when it comes
00:11:14.640
to forms of biotechnology. I mean things like genetic engineering of crops, which is a widespread
00:11:20.480
practice these days, and one that raises for me a host of social, political, ecological,
00:11:26.560
and ethical questions. I also mean things like the reproductive technologies that are in place
00:11:33.600
today, the genetic engineering of children, cognitive enhancement, and embryo selection,
00:11:38.320
but before we get into some of the controversial issues, could I ask you to please say a word about
00:11:44.400
your discipline? Namely, bioethics. What exactly is bioethics in your view?
00:11:48.880
So I understand bioethics to mean any ethical issues concerning biology,
00:11:58.800
biological research and applications of biology as well as medicine,
00:12:04.560
being the clinical practice, medical research, and medical technologies.
00:12:13.520
So this is very wide, and if you think about it, actually, it's also very interdisciplinary,
00:12:20.080
so it involves not just philosophers, but also sociologists, lawyers, medical practitioners,
00:12:29.280
anthropologists, all of these sorts of disciplines can weigh in on bioethics. So for me,
00:12:39.520
it's a wide definition. It can encapsulate practices such as abortion, euthanasia. These are
00:12:48.080
quite old practices, and it can go all the way to IVF, that's in vitro fertilization,
00:12:59.680
pre-implantation genetic diagnosis, which is embryo selection, basically,
00:13:08.640
and perhaps in the future genetic engineering of children, as we already said, genetic engineering of
00:13:15.200
crops, and also neuroscientific advances. So you're describing here a range of biotechnologies,
00:13:21.840
yes. About which bioethics has something to say. Do you understand bioethics to be offering,
00:13:30.880
or should it be offering a set of prescriptions or moral mandates, ethical imperatives,
00:13:37.840
to regulate the development and dissemination of biotechnologies, or do you understand bioethics to
00:13:49.360
be something that does not presume to aim at matters of principle, general, let's say,
00:13:57.120
universal principles, but applies to local practices in their particular circumstances.
00:14:05.760
And as you were mentioning, you're interested in the social context, as well as natural context,
00:14:12.880
and the repercussions that certain biotechnologies will have in particular environments.
00:14:19.760
So... Yeah, I think it depends what kind of a bioethicist you are.
00:14:25.040
If you're taking the case-based approach, which I tend to take, you will look at
00:14:35.200
different biotechnologies on a case-by-case basis, and often look at your intuitions and apply
00:14:45.760
ethical principles out from there, expanding out from what your particular intuitions are.
00:14:53.200
I don't actually want to pin myself down to that particular approach, but
00:14:59.520
I do often feel inclined towards it. The other approach is a principlist approach, which, as you said,
00:15:07.200
works from particular principles and flows out from that to apply those principles to
00:15:14.000
different ethical problems and issues. And ideally bioethics is seeking answers, seeking
00:15:24.080
to arrive at the most ethical solutions to problems that perplex us, and to controversial
00:15:32.160
technologies that people are worried about and what we should do with them.
00:15:36.320
Well, let's take a few cases, if you don't mind. You talked about abortion and euthanasia as being
00:15:48.000
classical. They are not directly associated with biotechnology, but they have a lot to do with
00:15:54.080
bioethics if you want to look at it that way. So let's take some traditional classical instances
00:15:59.920
before we talk about the, for me, much more sinister cases of what I guess we're now
00:16:06.880
calling novel tech. So abortion, for example. How do you approach an issue like that?
00:16:16.240
So for me, I'll say upfront that I take a somewhat different perspective on abortion from the way it's
00:16:24.960
often debated in bioethics. So most people will be familiar with the debates on the moral status of
00:16:33.040
the embryo and the fetus, and the rights of the mother versus the rights of the fetus,
00:16:40.800
and various very good philosophical arguments have been proposed about it.
00:16:47.440
I will say upfront that I am pro-choice. I come from a feminist perspective, but at the same time,
00:16:57.680
I have read some, I've done a bit of research on the sociological, the social context of abortion.
00:17:08.880
Mainly, what fueled this actually was questions around, well, since contraception is
00:17:16.080
so widely available. Of course, it depends on which country you're in, but in many countries like
00:17:26.400
the UK and Canada and Australia, which I'm most familiar with, it's relatively easy to
00:17:34.400
get a hold of contraception, it's quite affordable, and yet why are there still so many
00:17:40.000
unplanned pregnancies to begin with? And the answers that I found to that shocked me quite a bit,
00:17:47.920
and what I found was that if you look at the demographics of abortion, what are the kinds of
00:17:57.520
people that are requesting it? There was, I'll take as an example, one quite large study that was done
00:18:04.720
in Australia in 1996: close to 15,000 women aged between 18 and 23 were surveyed.
00:18:13.840
And the women who reported teenage terminations were more likely to be in a de facto relationship,
00:18:27.040
less well educated, and to have no private health insurance. And these are things that you may well predict;
00:18:34.800
when I ask people what would you predict. They often come up with these things, but what I didn't
00:18:41.120
expect was that they were also more likely to have been a victim of partner violence.
00:18:47.760
And the shocking stat on this was that partner violence is the strongest predictive factor
00:18:56.800
of pregnancy termination among young Australian women. And women who had ever experienced partner
00:19:03.920
violence were more than twice as likely to have had an abortion than those who had not experienced
00:19:11.120
partner violence before. And that's actually probably an underestimation. So when I put that in the
00:19:21.280
context of the debate, it looked to me quite shocking because I hadn't envisaged that partner
00:19:28.320
violence would be such a strong predictive factor of abortion. And it figures so little in
00:19:36.560
the debates. So it started to become clear to me that maybe abortion is a little bit like a band-aid
00:19:45.840
solution. Yes, we don't want women who have been the victims
00:19:56.320
of partner violence to also then be forced to have a baby that they don't want.
00:20:02.080
But what is the problem that we're trying to solve? What's at issue here? If a big
00:20:11.680
fueling factor is coercion and violence, that's fueling the need for this technology, if you will,
00:20:19.920
then we're not really solving the root of the problem. And there's a more social problem that
00:20:26.880
maybe we need to approach with different techniques at the same time.
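The "more than twice as likely" figure Kayali cites above is the kind of result survey epidemiology usually reports as an odds ratio from a two-by-two table. A minimal sketch of the arithmetic; the counts below are invented for illustration and are not the numbers from the Australian study she mentions:

```python
# Hypothetical 2x2 table (invented counts, NOT from the cited study):
#                          abortion   no abortion
#   partner violence          120         880
#   no partner violence       300        4700
a, b = 120, 880
c, d = 300, 4700

odds_exposed = a / b      # odds of abortion given partner violence
odds_unexposed = c / d    # odds of abortion without partner violence
odds_ratio = odds_exposed / odds_unexposed

print(f"odds ratio = {odds_ratio:.2f}")  # ~2.14, i.e. "twice as likely"
```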
00:20:37.680
Well, I don't understand why those who are in abusive relationships are less prone to take
00:20:44.000
birth control than those who are not. I mean, if you say contraception is readily
00:20:49.680
available, why are they getting more pregnant in abusive relationships and hence are having more
00:20:57.760
abortions? But these are sociological, statistical issues that I guess would be
00:21:04.720
interesting to pursue if we're interested in the sociology of the bioethical issues here in this case.
00:21:12.480
And I appreciate the fact that you want to deal with the particularities of cases. And in this case,
00:21:17.520
you're bringing forward empirical evidence that actually you're right. It's not part of the usual
00:21:22.800
discourse when you figure out that there's some other things going on in the case of people who
00:21:31.440
do opt for these abortions. But that, again, is not a matter of principle. This is a matter of
00:21:40.560
looking at the embeddedness of a certain practice of, let's say, abortion or euthanasia within
00:21:47.040
the larger social context. The listeners of this program, the regular listeners,
00:21:54.400
know that I have some particular set of concerns that are less sociologically oriented and
00:22:02.160
what can I say? Existentially oriented, politically oriented as well, and certainly philosophical in
00:22:09.200
nature. And I guess I'm interested in pushing you a little bit in that direction. And so,
00:22:16.720
do you, would you agree that there is a qualitative difference between what you call these
00:22:25.120
classical, bioethical and biotechnical, technological issues like abortion and euthanasia,
00:22:33.040
which you don't really need any biotechnology for, they've been practiced for millennia, it's not
00:22:38.800
something that arises necessarily with our own modernity. There are more refined ways to achieve
00:22:45.200
these results if you like. But that there's a qualitative difference between those and certain
00:22:51.280
novel technologies that are being experimented with as we speak and which seem destined to develop
00:23:00.160
into ever more sophisticated forms of things like cognitive enhancement of
00:23:06.960
offspring's abilities, you know, or cloning, genetic engineering, and so forth.
00:23:17.120
Am I being naive to think that we're talking about a new sort of game here?
00:23:24.720
Related to the old, for sure, but with a new set of rules, and that there's something far more
00:23:30.400
worrisome going on? I think you're right in thinking that there is something on a different
00:23:37.680
scale here. I think what we're trying to achieve is very similar to what we've always been trying
00:23:44.720
to achieve. But novel biotech, I think, ups the ante, because splicing fish genes into a
00:23:56.240
tomato, well, no amount of selective breeding is going to achieve that. So that is something unprecedented
00:24:02.880
that we couldn't have achieved otherwise. Novel technologies to do with neuroscience
00:24:12.320
are really moving quickly. They're fascinating, deep brain stimulation, and as we're going to talk about
00:24:23.280
later on, fMRI scanning and computer technologies, these are fields which are proposing to do things
00:24:31.840
that really we only dreamed about before or were in sci-fi novels. So I think, yeah, we have to
00:24:40.560
recognize what is similar. So when it comes to, for example, genetic engineering of children,
00:24:48.880
which we can't currently do, but there's a lot of debate about what we might do in the future.
00:24:54.240
And I think it is very worthwhile recognizing where there are similarities between
00:25:02.800
trying to engineer children, if you will, environmentally, say by using private school tuition,
00:25:11.600
special classes, tennis lessons, music lessons, for example, which seem to have the same aim
00:25:19.520
as, say, engineering a special musical or sporty gene.
00:25:25.600
Right. That's the issue, isn't it? Because I agree, environmental engineering, you could say,
00:25:31.600
is on the same order as genetic engineering because it brings about a similar set of results,
00:25:36.080
if you're one of these ambitious parents who will do anything to get their child into Harvard
00:25:44.640
or Stanford or what have you. And they, for me, are the problem, not just the parents, but those who
00:25:52.320
will use any means necessary to achieve their own
00:25:57.920
venal objectives. But here, I'm thinking now of Francis Fukuyama's book, Our Posthuman Future,
00:26:07.120
where he speaks about genetic engineering and cloning, but he says that the real problem facing us
00:26:12.640
is psychopharmacology. Because, and I think if I remember correctly, he discusses the example of
00:26:21.680
Ritalin, which is the drug that is prescribed massively for boys who have a kind of
00:26:30.080
rambunctious behavior and attention deficit disorder, another kind of fabricated medical condition
00:26:37.600
that benefits the pharmaceutical industries that promote the drug. And that even the parents who
00:26:44.400
are aware that they might be using a drug in order to suppress perfectly natural, healthy
00:26:52.400
behavior in their children will do so nonetheless because it makes their own lives easier.
00:27:00.400
Now, you can imagine, if they're willing to overprescribe Ritalin to their kids, how
00:27:07.760
much more ready will parents be to prescribe things that are not just going to make their lives
00:27:13.520
easier, but are promoted as going to make the prospects for their children's
00:27:18.160
future brighter and are going to make them more athletic and more intelligent and more beautiful.
00:27:23.280
And there we're already starting to see a genetically engineered... so what really worries me is the
00:27:31.040
consensus in the population at large. What happens in the laboratories is one thing. And the
00:27:39.280
scientists who are trying to make a name for themselves by advancing as far as they can
00:27:45.840
biotechnological advances, that's one thing. We could deal with them as a different kind of
00:27:51.600
pathology. The pathology that worries me, and I think if I understand Fukuyama correctly, it
00:27:57.280
worries someone like Fukuyama, let's say: there is such a readiness on the part of this citizenry as a whole
00:28:05.440
to wholly embrace, in a mindless way, biotechnologies whose consequences for human society as a whole,
00:28:14.000
and for our whole political system, cannot possibly be foreseen, controlled,
00:28:19.600
or predicted at the moment in which one is adopting these things. So
00:28:27.920
So, I know that you've worked on cognitive enhancement through psychopharmaceuticals. Do you have a
00:28:37.440
bioethical position or stance on this issue? Yeah, well, I would actually agree with you
00:28:45.600
somewhat on your assessment of Ritalin. No doubt there is probably a small percentage of children,
00:28:56.800
maybe, that have some kind of a real problem of attention deficit hyperactivity disorder.
00:29:06.080
But the stats on it show that countries like the US far outstrip any other country when it comes to
00:29:17.440
the consumption of Ritalin, the prescription of Ritalin for children. And it does seem like there's
00:29:26.480
a number of factors that converge in favor of this. So of course the pharmaceutical companies
00:29:33.760
have an interest in selling their drug. Teachers have an interest in keeping their children under
00:29:39.840
control and don't like dealing with problem children or children that are, as you said, too rambunctious,
00:29:47.360
and parents don't like feeling that maybe they're at fault, maybe they're not disciplining
00:29:53.120
their children enough. So it's easier to say, well, maybe my child has a problem that just needs to
00:29:59.600
be medicated. And so it's a number of factors all rolled into one. And of course the child can also
00:30:09.520
buy into this. I think part of the problem with this over-medicalization of perhaps
00:30:19.520
behaviors that are quite normal and natural is what happens to the child's feelings of responsibility.
00:30:26.400
When they grow up, how long are they going to feel dependent on this medication in order to function?
00:30:33.440
And these kinds of issues are issues that we find occur also in depression.
00:30:39.920
These issues of the self, of control and responsibility, that people grapple with when they take
00:30:53.120
psychopharmaceuticals, and it's not even for enhancement. What we're talking about here is
00:30:59.920
ostensibly treatment of depression and of ADHD. So what happens when we take these medications and
00:31:11.360
there seems to be nothing wrong with us? What we're trying to do is, for example,
00:31:16.160
there's a drug called Modafinil, which is supposed to be prescribed for people who suffer from
00:31:23.760
narcolepsy, oversleeping during the day, sleeping too much. But it can be taken by, say, truck drivers
00:31:31.200
who need to stay awake, nurses on night shifts, or even students. There might be students in
00:31:38.480
Stanford right now who are taking Modafinil to stay awake and alert and perform better at exams
00:31:44.960
or in their school performance. So if we start to feel dependent on these kinds of drugs,
00:31:54.080
I think we need to think about what's going to happen.
00:32:00.640
Well, that's great. Tamara, we can go on thinking about it and having hypotheticals.
00:32:08.480
Do you have a stance on, so, well, now you've raised the question in a different way,
00:32:16.160
if parents are so ready, a certain kind of bourgeois parent, I mean, bourgeois in the
00:32:24.400
Nietzschean sense of the last man type, to risk their children's health as well as
00:32:33.840
psychic health by overprescription of Ritalin. And I think there's hardly
00:32:39.280
an intelligent parent that doesn't realize that there is a lot of risk, at least some risk involved.
00:32:44.560
If they're willing to take that risk, what will cause them to hesitate if given the option to remove
00:32:50.800
in the future the gay gene from the embryo of an unborn child? Or an alcoholic
00:33:03.680
gene? So already we're talking about this absolute sovereignty of the parent over the biological,
00:33:12.960
social, psychological destiny of a child, and that you're removing from all future generations
00:33:19.760
a genetic legacy that otherwise would have been active in the future progeny of that unborn child.
00:33:28.560
So the difference between psycho-pharmaceuticals and this intervention in the genetic makeup is that
00:33:39.120
there's something irreversible in the decision to remove if they were to find the gene for
00:33:47.440
homosexuality, or the gene for, call it what you will, any kind of deviant behavior. This sort of,
00:33:54.560
what Hannah Arendt calls acting into nature. The difference between action in the
00:34:01.120
world, in the human world to which action properly belongs, I'm entirely with Hannah Arendt on that,
00:34:08.080
and acting into nature, where human action has no business extending itself, is that
00:34:13.920
in the human world what she calls promise and forgiveness are always open, whereas the world of
00:34:23.600
nature is unforgiving, and you cannot make a promise, as human beings we cannot promise that something is
00:34:30.800
going to be fine before we act into nature and find that it's not fine after all. We cannot
00:34:38.400
hold that promise, we cannot honor that promise, nor is nature forgiving of these kinds of
00:34:44.960
mistakes. In fact, it can lead to disasters. This is a bioethical issue that worries me a great deal
00:34:52.880
because we seem to be very close, as we speak, to these kinds of decisions being handed over to
00:35:02.560
individuals who have shown on record not to have much compunction when it comes to engineering
00:35:10.880
their own children's social outcome and their chances for success.
00:35:18.640
Yeah, I actually find myself agreeing with you here, in the sense that yes, these things are
00:35:27.120
potentially irreversible, so we better darn well know what we're doing when we do them.
00:35:34.400
My worry is not so much acting into nature just because it's nature and nature has rules
00:35:46.000
that we shouldn't be playing around with, or playing God with. And if I can fight one philosopher with another,
00:35:54.720
I might throw Sartre at Arendt, and Sartre, that is, Jean-Paul Sartre, says that we have to recognize that
00:36:06.400
we have a fear of freedom, of our own freedom. Now, that doesn't give us carte blanche to do anything
00:36:14.800
that we want. I think we should recognize when we feel worried or scared about doing something and
00:36:25.200
try to tease out, and this I think is part of the job of a bioethicist, is to tease out exactly
00:36:32.320
why we're worried and the risks associated with it, with what it is that we're proposing to do,
00:36:43.520
but also not be afraid to question our current practices. So if we find that, well, let's say, for example,
00:36:53.920
genetic engineering of our children to become musical geniuses has the same intention behind it as
00:37:04.560
making and forcing them to play piano for hours on end. If we feel that those two practices
00:37:13.520
have a lot in common, I don't think we should be afraid to question our current practices then.
00:37:19.280
And in that case, it might be that new biotechnologies that come up, when we debate them,
00:37:27.760
perhaps shine a new light on current practices that we weren't questioning,
00:37:35.600
such that we might say, well, maybe we need to rethink what our intentions are behind having
00:37:45.120
children in the first place, and why do we want to have particular children? What is
00:37:50.160
fueling this need? And that is something I'm really interested in.
00:37:55.840
Well, I can appreciate that. I find that it's eminently reasonable. And yet, I also do not believe
00:38:06.000
that reason has a lot of compelling power when it comes to regulating or warning people
00:38:16.080
let alone the industries that promote these practices about the potential danger. So you speak
00:38:22.640
about Jean-Paul Sartre and throw him at Hannah Arendt. Sartre, yes, he did say that we have a fear of our own
00:38:30.400
freedom. Sartre was the one who resisted any attempt at the naturalization of the human. He thought
00:38:38.560
that the human was an exception to the order of nature precisely because of this unconditional freedom
00:38:45.360
of the self, which is not reducible to any of its natural determinations. He called for radical self-
00:38:55.040
responsibility, that we have to take responsibility for it all, that we are the authors of our own lives,
00:39:00.880
and that we are the authors of our world. We have the world we deserve, he said.
00:39:05.440
Therefore, I agree that from a Sartrean perspective, what he would call for is a radical assumption
00:39:13.680
of responsibility for our actions in this sphere. And that means that we have to take full cognizance
00:39:22.400
of what it is we are bringing about. Now, if you can assure me that it is at all possible
00:39:28.480
at the present moment to take full cognizance of what we are bringing about, then I will be persuaded
00:39:36.960
that we can take full Sartrean responsibility for what we are doing.
00:39:42.480
I happen to believe that there is such a huge mismatch between what biotechnology has released
00:39:53.360
into the world in terms of possibilities for the alteration of natural processes and the alteration
00:40:02.960
of the human species itself, such a disconnect between that, and the absolute crude
00:40:09.760
primitiveness of our own consciousness and awareness that we are still the same extremely fallible
00:40:18.560
if not wretched and depraved creatures that we always were, that if you hand over to our own
00:40:25.360
desires and decisions questions about the fate and use of these biotechnologies, I fear,
00:40:32.720
in fact, I'm almost certain that we're in for a heap of trouble because I think human motivation
00:40:39.040
is mostly profane and doesn't think long term; its objectives have proven to be extremely self-centered
00:40:52.480
and narrow, and so forth. So I just worry about handing over to individuals' desires these kinds
00:41:01.280
of decisions because I don't think most people and even governments, if you want to go that far,
00:41:08.400
have shown that they're willing to step up to the responsibilities of this unconditional freedom
00:41:16.720
that Sartre was talking about, especially in this area. I agree with you, I think Sartre
00:41:25.120
places a lot of emphasis on the responsibility that we have to take. I think there is a real
00:41:34.800
emphasis within bioethics on individual choice. That seems to reign supreme, at least it's the most
00:41:45.760
popular view that bioethicists take, that if anything we should err on the side of allowing
00:41:53.040
parents to decide what choices they make with technology, if this were to become available,
00:42:00.480
they should be able to decide under the banner of reproductive choice, reproductive freedom,
00:42:06.960
what children, how many children to have, if they want children and within that, what kinds of
00:42:17.200
children to have. I'm skeptical of this though, not just because I wonder whether parents are really
00:42:27.360
ready to take responsibility for the consequences, but also whether society is really equipped for
00:42:38.560
the consequences, and I think we need to look more not just at individual practices or likely
00:42:46.400
practices that individual parents are going to take, but the likely effects that these practices are going
00:42:54.080
to have for society, for example, if genetic engineering of children were to become available,
00:43:02.080
the way that society's structured right now means it will probably be quite expensive,
00:43:08.480
probably only available to the rich, and for that reason could just amplify the advantages that
00:43:15.920
the rich could have, and widen the gap even more between the rich and the poor. There could be a genetic
00:43:23.600
arms race in the sense that even if you don't particularly want to engineer your child,
00:43:29.680
because Joe Blow, next door, and everyone else is doing it, you feel your children might be at a
00:43:34.640
disadvantage if you don't do it. So I think-
00:43:39.280
Well, that's, I agree, but then again, I don't want to insist too much on Fukuyama, I'm not a
00:43:45.520
Fukuyama-ite, as it were. However, he, being a political scientist, is interested in the political
00:43:51.680
response to the dangers of post-humanist biotechnologies, and he thinks it's necessary to have
00:44:00.080
governing bodies at the national and international level that would regulate and draw lines beyond
00:44:07.440
which you cannot go. So I'm thinking, for example, by analogy with the sports world, about all this
00:44:13.600
kind of doping that we never hear the end of, these things, I mean, Lance Armstrong,
00:44:18.800
recently being stripped of all his titles, and then you read about the various kinds of
00:44:25.760
enhancements, physical enhancements that certain drugs and techniques could provide. But if
00:44:33.360
everyone in a particular sport, we saw that in baseball, if everyone in a particular sport is
00:44:38.240
abusing steroids or certain kinds of enhancers, then if you want to play that game,
00:44:46.960
you're going to have to start using those enhancers yourself. Otherwise, you're just not going to be
00:44:54.880
able to play the game. And if you have what you call a genetic arms race, and if you have the
00:44:59.520
freedom where some people are allowed to engage in it, then it's going to force everyone else.
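The arms-race logic traced here, in sport as in enhancement, has a dominant-strategy structure: whatever everyone else does, enhancing looks individually better, so all enhance and all bear the cost, which is the usual game-theoretic case for a collectively enforced rule. A small sketch with invented payoffs, purely to make that structure explicit:

```python
# Hypothetical payoffs (invented numbers) for one athlete or parent,
# given what the others do. Higher = better outcome for me.
payoff = {
    ("abstain", "abstain"): 3,  # level field, no cost or risk
    ("enhance", "abstain"): 4,  # relative advantage
    ("abstain", "enhance"): 0,  # left behind: the feared disadvantage
    ("enhance", "enhance"): 1,  # level field again, but everyone pays
}

for others in ("abstain", "enhance"):
    best = max(("abstain", "enhance"), key=lambda me: payoff[(me, others)])
    print(f"if others {others}, my best reply is to {best}")
# Both lines say "enhance": a dominant strategy, hence the race, and
# hence the appeal of a rule that binds everyone at once.
```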
00:45:06.640
That's why I'm with Fukuyama when he says that what we need is to have proscriptions against
00:45:13.360
the adoption of certain technologies, just the way they do it in sports. So anabolic steroids
00:45:20.000
are not permissible, or cognitive enhancement at the genetic level or even through
00:45:25.360
psychopharmaceuticals. Yes, these kinds of regulation are absolutely necessary
00:45:33.200
at the collective level. Everyone has to play by the same set of rules. Otherwise,
00:45:38.400
yeah, we're going to turn into a race of immortals and mortals, the damned and the saved,
00:45:44.880
all belonging to the same species. If the human species remains a human species, because that's
00:45:53.200
something that Hannah Arendt foresaw already 50 years ago. I have a quote here from her. She was reflecting
00:46:03.120
on contemporary science's acting into nature. And she was thinking not so much about
00:46:09.200
biotechnology as, for example, nuclear fission and fusion, which are events that in
00:46:15.200
nature take place only in remote stars and collapsing stars, but we have found a way on earth to
00:46:21.760
produce something that would never ever take place on earth through this acting into nature. And what
00:46:26.800
does it give us? It gives us the atomic bomb. Welcome to the new world as such. She says, quote, "All
00:46:35.600
our pride in what we can do will disappear into some kind of mutation of the human race."
00:46:42.240
As if she were sensing that a post-human future was at hand, and let me quote her again,
00:46:49.920
"The stature of man would not simply be lowered by the unearthly nature of contemporary science,
00:46:56.560
but have been destroyed altogether." So here is one of the dangers is not modifications within
00:47:07.280
the human realm, but such a transmutation, such a mutation of human nature itself that we can no longer
00:47:16.720
call ourselves human, at least not the way we've been human since our early prehistory.
00:47:22.720
Yeah, actually, I found an anthropologist who
00:47:36.000
I think would agree with your sentiments here. Her name is Amber Case,
00:47:47.200
an anthropologist who is interested in cyborgs. And she believes we're actually already cyborgs,
00:47:54.640
because if you take the definition of cyborg to be an organism, the quote is, "To which
00:48:02.560
exogenous components have been added for the purpose of adapting to new environments."
00:48:08.800
So we can look at, say, spacesuits as being an example of one of those or
00:48:15.440
deep-sea divers who wear special suits to adapt to that environment. But-
00:48:21.760
The iPhone.
00:48:23.600
Exactly. Well, she raises the iPhone and mobile phones and laptops as examples of
00:48:30.960
technology that is an extension of who we are, but less in the physical sense and more in the
00:48:38.000
mental sense, that it's an extension of our mental self now, not just our physical self, which is
00:48:46.320
challenging and changing the nature of humans and what we look like.
00:48:55.440
And she argues that online personality is like a second self, so your webpage, your Facebook,
00:49:05.840
is a second self-presentation. So in that sense, I might agree with Hannah Arendt. I'm not
00:49:15.840
skeptical of technology per se, just simply because it's fiddling with nature. But I do worry when
00:49:26.320
we up the ante with doing things that are irreversible and doing things that
00:49:36.080
we might be doing prematurely before we know the full implications. And if we're still looking narrowly,
00:49:43.600
if we're not looking at the wider implications, not just for other human beings, but for
00:49:50.640
other animal species and for the biosphere.
00:49:54.400
Well, can you say something about this new
00:50:02.080
technology that you were telling me about before coming on air, of mind reading,
00:50:08.000
what do you call it? There's an acronym for it. I don't remember what it is.
00:50:12.800
Actually, I'm not sure if there is an acronym for it. It uses fMRI imaging of the brain and
00:50:26.000
along with computers. And it aims to read your mind.
00:50:32.320
The thing that they've managed to do so far, they've done a couple of things. One of them is they've
00:50:42.000
got a bunch of volunteers, the scientists, and get them to watch a Hollywood trailer,
00:50:50.720
a movie, Hollywood movie trailer, and match up the images that they're seeing with the fMRI
00:51:00.400
images of their brains as they're seeing each image. And then they take away the Hollywood trailer
00:51:06.800
and they tell the participants, "Okay, now I want you to visualize some of the images that you just
00:51:13.600
saw." And what the scientists are able to do is just by looking at the fMRI scan of the brain,
00:51:22.880
they can tell what image that person is thinking of. So far, they can only do it for images that
00:51:32.880
the person's already seen and the computer has been able to match it up with. But of course,
00:51:38.080
they're looking into extending this to be able to just tell what images you're thinking of.
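The experiment described here, plausibly the fMRI movie-reconstruction work associated with the Gallant lab at Berkeley, pairs recorded brain responses with the images that evoked them and then identifies a new or imagined image by its best-matching stored pattern. A toy nearest-neighbor sketch of that matching step, using random stand-in data; real decoding pipelines fit regression models over tens of thousands of voxels, so this illustrates only the matching logic, not the actual method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "library": 50 stored (label, voxel pattern) pairs recorded
# while a subject watched known clips. Real scans have far more voxels.
n_images, n_voxels = 50, 200
library = rng.normal(size=(n_images, n_voxels))
labels = [f"frame_{i}" for i in range(n_images)]

def decode(new_scan):
    # Match an unlabeled scan (e.g. recorded while the subject merely
    # visualizes an image) against the library by cosine similarity.
    lib = library / np.linalg.norm(library, axis=1, keepdims=True)
    v = new_scan / np.linalg.norm(new_scan)
    return labels[int(np.argmax(lib @ v))]

# Simulate "visualizing" image 7: its stored pattern plus noise.
scan = library[7] + 0.3 * rng.normal(size=n_voxels)
print(decode(scan))  # -> "frame_7", matching the imagined image
```

As in the transcript, such a decoder can only name images it has already seen paired with brain data; generalizing beyond the stored library is the harder, open step.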
00:51:44.080
Well, you see for someone like me, this is horrifying to an unspeakable extent because
00:51:49.840
on the one hand, there's nature, which one acts into, and you can genetically modify crops and
00:51:56.240
so on. Then there's biotechnology that has to do with the body. And let's say enhancement or
00:52:03.440
even musical talents. But then as you said, now we're talking about what we are going to soon be
00:52:08.960
capable of doing when it comes to our minds and our mental life, and that you have these prostheses
00:52:15.920
which have to do with our mental selves, not our physical selves, the iPhone, the computer.
00:52:20.800
And when you talk about the cyborg, if you're getting to that point where now you can collapse
00:52:26.560
the boundary between a person's inner thoughts and the neighbor's inner thoughts, that is the
00:52:33.280
Borg collective, where there is no more self. As Hannah Arendt would say, the thinking self,
00:52:42.320
which is in a dialogue with itself in solitude, it's my dialogue with myself that takes place,
00:52:49.360
I don't want to say inside the brain, it's not, it's a mental space. But that space of solitude,
00:52:56.240
which is impenetrable by the other, requires that I meet my neighbor, my fellow citizen in a space
00:53:08.000
in the world that comes between men, as she says. And I do it through language, speech, deed,
00:53:16.080
action, and this is the world of appearance. If we are going to collapse that space between
00:53:23.520
minds and have a Borg collective, then we are synthetic and organic;
00:53:33.200
the synthetic is not just attached to the organic, it becomes the actual destiny of the organic.
00:53:40.800
And for those of us who are committed to preserving what is human about the human species,
00:53:48.080
that is horrifying. Yeah, actually, that seems to resonate a lot with what the
00:53:55.120
anthropologist Amber Case is saying, that when we are needing to be constantly connected to people
00:54:04.320
on our mobile phones, and then we feel lost when we lose our mobile phone. And we feel this
00:54:11.520
constant need to be texting, interacting on the internet, and there is no space for self-reflection.
00:54:19.600
We seem to be increasingly afraid of just sitting there and thinking and being alone with our thoughts.
00:54:28.160
And the substitution of electronic communication for person-to-person interaction
00:54:39.440
might have some really strange effects on getting to know your neighbor, as well as getting to know
00:54:45.120
yourself. So I do think that if we lose, or have less time for, self-reflection, we have less time
00:54:56.080
for long-term planning, for daydreaming. Daydreaming, I think, is important for long-term planning and for our idealism.
00:55:05.920
Well, again, what terrifies me the most is not so much the technology. The existence of the
00:55:12.080
technology terrifies me to the extent that I'm sure there's a host of people, my neighbor or
00:55:18.400
my fellow citizen who are enthusiastic about the prospect of this kind of mind-reading technology,
00:55:24.320
and who would be more than happy to be able to download directly into their brains things from
00:55:29.920
the internet so that they no longer have the mediation of the computer. And as you were
00:55:33.360
telling me, if you could download a Spanish language program before you go to Spain, it's just right there,
00:55:39.200
the amount of people who would willingly and enthusiastically embrace it, I think that's where the
00:55:44.640
devil lies, again, as I said at the beginning of the program:
00:55:49.600
it's the consumer. And the consumer's demand, if we allow the consumer's demand to dictate
00:56:02.480
the policies regarding these novel biotechnologies, then I think we're in a heap of trouble.
00:56:08.880
Yeah, exactly. I think the problem is that these are potentially really powerful tools that could be
00:56:17.760
used for good as well as bad. This mind-reading technology could potentially solve a lot of
00:56:26.800
issues. There are people, for example, who are completely paralyzed, but by the use of mind-reading
00:56:34.720
might be able to move robotic instruments to help them. But at the same time, you can obviously
00:56:43.440
envisage how it could be used for bad. And if we were, like I said, in the future, this hasn't been
00:56:48.080
developed yet, but if we were able to download software onto our brains just like we can onto
00:56:54.960
our computers, if this were to replace education, I think that would be very dangerous and scary.
00:57:03.200
We can see, obviously, the advantages of this kind of technology, but I think in the context of a
00:57:08.480
consumerist culture that has really overtaken the world, we need to look at how it is most likely
00:57:16.240
going to be used, not just how in principle it could be used. Well, it sounds to me like you are in the
00:57:22.880
most essential domain that is out there today, certainly in our academic worlds, which is
00:57:30.640
bioethics, because it's a whole new world that we're entering. And we didn't even talk about
00:57:36.560
the way bioethics was originally a concept about medicine and the biosphere as such, and not
00:57:44.000
only its relation to animals, but to plant life and to all of life. So, yeah, sure, that was
00:57:50.800
one of the conceptions of it back in 1970 by a biochemist by the name of Potter, but it's since been
00:58:00.880
left aside, and I think it would be quite advantageous if we were to revisit that definition again.
00:58:07.760
Right. So, I am delighted that I was able to provoke some entitled opinions out of you on this
00:58:15.680
topic, because I appreciate the way that you measure your judgments, because it will require a
00:58:26.880
discriminating form of judgment, and a case-by-case sort of study, looking into the details. I would
00:58:34.320
probably be very bad in a kind of policymaking body because my extremism would probably
00:58:41.680
cause most people to just kind of discount me as not being within the ballpark on these things, but
00:58:48.960
but it's so much fun to be extreme. Well, that's me. We can do it on the radio here on Entitled
00:58:55.200
Opinions. So, Tamara Kayali, thanks again for coming through to talk to Entitled Opinions. You are
00:59:01.520
here from Dalhousie in Canada, and the next time you come through the Bay Area, let me know, because
00:59:07.200
we have a lot to continue on this discussion. And let me remind our listeners we've been speaking with
00:59:12.720
Tamara Kayali from Dalhousie, Ph.D. from Cambridge, bioethicist, neuroethicist, and I am Robert Harrison for
00:59:20.560
Entitled Opinions. Take care.
00:59:50.400
[MUSIC]