Author and entrepreneur Jerry Kaplan offers a crash course in computational ethics: the idea that robots and machines will require programming to make them cognizant of morals, decorum, manners, and other social nuances. Jerry Kaplan's latest book is "Humans Need Not Apply: A Guide to Wealth and Work in the Age of Artificial Intelligence".
Read more at BigThink.com:
Follow Big Think here:
YouTube:
Facebook:
Twitter:
Transcript - As machines become increasingly autonomous, by which I mean they can sense their environment and make decisions about what to do or not do, those decisions are of course based on their programming and their experience, but we don't have as direct control over what they do as we do with the kinds of technology we have today. Now there are a couple of very interesting consequences of that. One of them is that they're going to be faced with having to make ethical decisions. What I'll call "ethics junior" is just making socially appropriate decisions. So we're taking machines and putting them in situations where they're around people. And something that we take for granted, and that seems so natural to us but that machines do not take for granted and do not find natural, is the normal set of social courtesies and conventions we operate by in dealing with other people. You don't want a delivery robot running down the sidewalk so that everybody has to get out of the way; it has to be able to move through a crowd in a socially appropriate way. Or take your autonomous car. There are lots of very interesting ethical conundrums that come up, but a lot of them are just social. Say it pulls up to a crosswalk. Should you cross? Should you wait? How is it going to signal you? Right now the social convention is that you make eye contact with the driver, and that tells you whether to cross. (Read full transcript here: )
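To make the "ethics junior" idea concrete, here is a minimal, hypothetical sketch of how a rule layer for the crosswalk scenario might look. Everything in it, the `CrosswalkScene` fields, the action names, and the ordering of the rules, is invented for illustration; it is not from Kaplan's book or any real autonomous-vehicle system. The point is only that social conventions (yielding, signaling intent in place of eye contact) have to be written down as explicit rules the machine can apply:

```python
# Hypothetical "ethics junior" rule layer for a car at a crosswalk.
# All names and thresholds are illustrative assumptions, not a real API.

from dataclasses import dataclass


@dataclass
class CrosswalkScene:
    pedestrian_waiting: bool       # someone is standing at the crosswalk
    pedestrian_in_crosswalk: bool  # someone is already crossing
    car_speed_mps: float           # current speed, meters per second


def choose_action(scene: CrosswalkScene) -> str:
    """Pick a socially appropriate action using simple, ordered rules.

    Hard safety rules come first; courtesy conventions (a visible
    "go ahead" signal standing in for the driver's eye contact the
    transcript mentions) come after.
    """
    if scene.pedestrian_in_crosswalk:
        return "stop"                      # safety: never enter an occupied crosswalk
    if scene.pedestrian_waiting and scene.car_speed_mps < 2.0:
        return "yield_and_signal_go"       # courtesy: slow enough to wave them across
    if scene.pedestrian_waiting:
        return "slow_and_prepare_to_stop"  # too fast to politely wave them through
    return "proceed"
```

For example, `choose_action(CrosswalkScene(True, False, 1.0))` returns `"yield_and_signal_go"`. Even this toy version shows why the problem is hard: the rules above encode one locale's conventions, and real pedestrians negotiate crossings with far subtler cues than three booleans and a speed.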