Where are the “Ten Commandments” for Technology?
As I finish up my book on innovation, one of the most intriguing chapters I wrote was about the growing ethical issues being spawned by innovation and technology: food/biofuel tradeoffs, god-like powers from nanotechnology and genome technologies, the thin line between real and virtual worlds, and on and on. Complex issues, few simple answers.
Yet, in tech we don’t really have a good forum to discuss these issues. Go to most hospitals and they can quickly convene an ethics committee. This often includes a doctor, nurse, social worker, attorney, chaplain, medical ethics professional, and a member of the community. The committee is available to a doctor, or to someone close to the patient, to consult on an ever-evolving set of modern issues (yes, technology-spawned ones) around life extension and genetics.
In technology we don’t discuss these issues much. Bill Joy prominently wrote “Why the Future Doesn’t Need Us” in Wired magazine in 2000 about risks from nanotech, robotics and genetic engineering. Yes, a decade ago. Few tech execs take such public positions. For the most part, discussion of cyber-ethics has stayed in academia.
One of the professors I interviewed was Herman Tavani at Rivier College. He told me that between the second edition of his book on technology ethics in 2007 and the third edition this year, a whole laundry list of new, technology-spawned ethical issues had sprung up. And given how rapidly “cybertechnology is converging with biotechnology and nanotechnology,” the issues are going to proliferate even more rapidly.
The other thing I learned from the conversations with him and other professors is how few schools teach cyber-ethics. When I got my MBA, I had a class on business ethics. Back then, there were few tech issues; we explored issues around the morality of doing business with a then-apartheid South Africa. You would think every MBA and MIS degree today would include a course on cyber-ethics.
So against this background, the recent Google/China flare-up makes for an interesting conversation. Google’s CEO Eric Schmidt was asked by Fareed Zakaria of Newsweek whether his fiduciary responsibility to shareholders was not to maximize profits.
Eric’s response was:
“When we filed for our IPO, we attached to the document a statement about how we wanted to run our business. We said we were going to be different. We said that we were going to be motivated by concerns that were not always or strictly business ones.”
Many complimented Eric and asked what took him so long. Others said Eric would never have said it if Google were a dominant player in China; it is a distant second to Baidu there. Steve Ballmer of Microsoft scolded him and called Google’s reaction “an irrational business decision.” John Chambers of Cisco called it “natural give and take” – i.e., a negotiation. But a virtual “ethics committee,” like at a hospital, came together to discuss various angles.
We need much more of that. We also need guiding principles. My friend Brian Sommer provided a perspective for that chapter: “For ethical innovation and technology, we need to find a higher-level set of principles … (like the) Ten Commandments. Here’s a list of acceptable behaviors that’s in three major religions and has survived thousands of years.”
Interestingly, Steve invoked Saudi Arabia in his comments, and John made his comments in Saudi Arabia. The reality is that states, particularly autocratic ones, are increasingly raising ethical issues with their growing tech savviness and the resultant surveillance and censorship. They are also making it tougher for our tech vendors to live up to “Don’t be evil” promises.
We are giving governments access to all kinds of powerful technology without burdening them with the obligations of “fair and balanced use.” Who is going to call them out when they bully others into not questioning their actions?
And why just blame the autocratic states? Did we in the US have a real, meaningful discussion about whether the drones we are using in war violate Asimov’s First Law, “A robot may not injure a human being,” and his Second, “A robot must obey orders given it by human beings except where such orders would conflict with the First Law”?
Yes, we are all in this to make money. But from the smallest person in the industry to mighty states, we need a better compass. And we need to encourage everyone to have their moment of conscience and convene the “virtual ethics committee” around them. These are complex issues that none of us, smart as we all are in tech, should try to resolve by ourselves.
January 25, 2010 in Industry Commentary | Permalink