Nearly every college graduate has taken the class.
In fact, if you graduated with a degree in Computer Science, Information Systems or one of the many derivatives, you’ve likely even had a class on ‘Computer Ethics’.
But when was the last time, in the context of working professionally, that you or your colleagues challenged yourselves to…
Examine the ethical element of the work being conducted?
How many organizations openly address ethics when composing either the requirements or the solution for those requirements?
To what extent do you consider the customer’s needs when selecting the development environment in which you’ll deliver the solution? Should you?
As working software engineering professionals and business owners, questions like these are becoming a larger part of how we view the evolution of the profession. Further, how do digital product teams deal with these dilemmas as they arise? When does “the craft” take on a larger set of responsibilities?
As you can see, feature thinks hard about the ‘Ethics of Computing’ and debates it often, both internally and externally.
Our CEO, Michael Potts, was invited to the 2015 3.0 Leaders Conference in Sarasota to share some thought-provoking insight on Ethics. In his talk, Mike attempts to bring to light a very-difficult-to-define conversation, and proposes some simple thought exercises which may help software teams better protect themselves, their employers, AND their clients.
My [Michael Potts’] first brush with Ethics came as a Lead Software Engineer for Arthur Andersen here in Sarasota about 15 years ago. As you can imagine, as a software engineer in the aftermath of things that I didn’t quite understand at the time, I began doing some research on the topic.
How could I have protected myself? What does this mean for me? Email was a large contributing factor to that case, and I didn’t understand that. I thought that email was a communication much like a conversation. So, as a software engineer, I came across some research published by a gentleman named James Moor, of Dartmouth. ‘What Is Computer Ethics?’ was the title of that paper.
And in that paper he [Mr. Moor] talked about two things that computers were going to change forever in society. The first was ‘Logical Malleability’: the fact that the computer was malleable in just about every capacity. Meaning, it did what it was told. There were no checks, no balances. A computer simply did what it was told. Yet, we give it a persona. We sometimes pretend, you know, very J.A.R.V.I.S.-like, HAL 9000, that they have a living presence that they don’t actually have.
The other thing he [Mr. Moor] talked about was this thing he called ‘Invisible Abuse’. Because software is compiled, other than the [software] engineer, most people don’t understand what happens inside of a computer. That doesn’t mean that the computer isn’t doing something. So now, as a software engineer, I find myself at the keyboard compiling the source code that most people will never see. When you add that to the ramifications of the decisions made at a company like Arthur Andersen, I realized what software engineers had on their shoulders… a single engineer could bring an entire organization down.
Good or evil. It really depended on the person. So how do you teach right and wrong? How do you teach software engineers?
Fast forward… feature. I’m in the business of software today. I work with CIOs, much in the fashion James [Moor] was talking about. And, we’re trying to hire and train young software engineers. I can teach them to code. I can teach them to compile. I can even teach them to speak like a human being and not like a computer when in the room with other intelligent people.
But what I can’t teach them to do is to make decisions about right and wrong in the context of Ethics. Because there is no such thing in the computer world, and if you haven’t noticed, the younger generation… the more time they spend with computers, the less English they seem to understand. So we have this really fuzzy line on what’s right and wrong with the newer software engineers.
We think it’s great that people are learning to code. But we do think that the profession needs to help them understand that just because you now understand Ruby, or Java, or PHP to some extent… you begin on Friday, that doesn’t mean that by Monday you can go out and charge $175 an hour and convince some business to risk its future on a weekend crash course in computer programming.
From our perspective, there’s a great deal of theory. Programming is 1 of 14 critical skills as a software engineer. When you pair that with the fact that universities have a regulated curriculum, they’re really not diving deep into the examination of ethics. Applied ethics? We’re fortunate if [software] engineers have even heard of the concept, much less taken a deep dive into computer ethics… the responsibilities of owning someone else’s source code.
Source code, by the way, is one of those things that can be copied quickly. If you’re using certain tools, they even have a philosophy that every developer has a copy of the IP (intellectual property). That’s a very interesting dilemma when something that’s ethereal, or vaporware, now physically has a presence on 60 computers in the company. How do you control the software engineer, the safety of those assets? The number of laptops that get lifted from airports, that disappear, that get taken from coffee shops… These are all things that we don’t seem to talk about until they happen. And then, what do we do? We fire the software engineer. The software engineer who wasn’t trained, who didn’t even understand that this was a ‘thing’.
So, we focused on those 2 areas within what we’ll call professional ethics, in the software space.
The education of the executive. Some of these are Wharton School graduates. You know, very smart people. But what they’re not teaching at Wharton is deep dives into computer ethics. They’re teaching them business theory, business model theory. So, that’s been a large focus for us, and I have to say that’s kept us fairly busy because… where are the rules? There are no rule systems for the executives. In many cases, it has to do with the context of the business.
If you’re in healthcare, life-saving equipment… there’s a different set of ethics than we’ve got in an e-commerce shop. Which, by the way, if you’re in e-commerce, those rules are changing almost monthly at this point.
And then, on the litigation side of it, the states of North Carolina and Texas have actually passed legislation that allows them to pursue criminal liability against firms like mine if they do in fact create a piece of software which is breached.
Think: Target. Think: Delta. Think: BlueCross.
In those cases, had my firm written that software, they could’ve come after me and my partners, criminally. I could actually go to prison for that. Well, that’s not something I would want. (chuckles)
You know, not going to prison is now dependent on the college graduates that we just hired, who don’t understand ethics and are probably 8 years away from being able to have that conversation.
And I challenge anybody in this room… If you want to have a very interesting experience, in your next business meeting, simply say the words. Don’t accuse anybody, but simply say the words, “I’m not sure that’s an ethical approach.”
Your relationship with your colleagues will change forever. There’s something about the word ‘ethics’ that really, really, really scares people.
So it’s one of those things that you can’t really get a fair examination of, because nobody really knows what it means.