The phrase "skills in reflective analysis for recognizing and navigating ethical challenges" is annoying to read: big words, clumsy organization. Simplify and clarify.
My bigger problem with this is that, like so many encouragements for continuing education, it feels pro-teacher and anti-student. The real reason for continuing education is that the field changes continually, not that anyone needs to experience college over and over again. Adding a phrase to the effect that "The rapidly evolving field makes keeping up to date a continuing necessity" may make the section feel less like an advertisement for educators.
A related complaint is that since the code of ethics is being used to force practitioners to support academics, there should also be an ethical demand that educators support practitioners. I will elaborate on this point elsewhere.
Apologies for this, but due to the 22 edits on the first day, I was unable to put the following text where it belongs, so I am putting it here to get it in before the deadline.
"the code is designed to support all computing professionals." It is bad news when sentence 2 of an ethics document is a bald-faced lie. Section 4.2 makes it clear that the code can be used to kick people out of the ACM. This code is used to benefit "good" people who are our friends and to punish "bad" people who are mean or make us look bad. When you punish someone, you are not "supporting" them. Please be honest. The last sentence in the preamble also makes it clear that "support" is the wrong word.
"the public good is always a primary consideration" The definition of "public" is slippery. For US spy agencies, the American good is the primary goal. For Russian spy agencies, the Russian good is the primary goal. Is it possible for anyone who works for one of those agencies to follow the code? Microsoft employees primarily work for the benefit of Microsoft shareholders. Apple employees primarily work for the benefit of Apple shareholders. This is the law. Is it possible for anyone who works at one of those companies to follow the code? A different question is the scale of "public": does it have to be everyone on earth, or can it be a subset?
I would delete the outline in the second paragraph to make this code shorter. The code is way too long and way too boring. While I have now read the whole code through at least 5 times, I have yet to do so in one sitting. And, I can take this kind of pain like few others. It is boring, boring, boring, boring. This means that nobody will ever read it, unless a professor makes them do so in class. Bad sign. Very bad sign.
"volunteer professional capacity" what does this mean? I have no idea. Reads like an oxymoron. Delete if possible.
"Each principle is supplemented by guidelines" Many are, but not all. Be honest.
"these extraordinary ethical responsibilities" Why extraordinary? Why are they not ordinary, i.e., what everyone should do anyway? Probably delete the word.
"and a particular principle may" clumsy grammar, hard to read
"these kinds of conflicts are best answered by thoughtful consideration" This isn't true. In my comments, I have found a number of questions that I cannot answer and that might be best answered by someone who knows more about ethics than I do. The sentence is just wrong.
Move the last paragraph to be the first paragraph in 1.2. The paragraph is predominantly about "harm."
Encouraging "volunteer or pro bono" work seems wrong. I actually believe professionals should do that. But, it implies that something is wrong for someone to work hard for Microsoft or Google or their own startup. I don't understand how this advocacy is "ethics." This should be moved to a section on hopes or ideals.
Add "the" to "minimize the negative consequences."
The sentence "when the interests of multiple groups conflict, the needs of the least advantaged should …" needs work. Apple is renowned for expensive hardware. Are Apple employees unethical, because poor people who cannot afford that hardware need those resources, too? Are members of espionage agencies unethical, because it is easiest to spy on the poor? I feel this sentence needs clarification at the very least.
The word "stakeholder" in the first sentence is very slippery. It seems to include the victims of a virus, who might suffer damage. But does it apply to the store clerks who lose their jobs to Amazon and other companies who are moving commerce online? If so, are the programmers at Amazon responsible to "undo or mitigate the harm" for those clerks who lose their jobs? Or is Amazon? Another example: Microsoft worked very hard to take the market away from WordPerfect, Lotus 1-2-3, and others. Are Microsoft programmers supposed to "undo or mitigate the harm" to the programmers at the competitors who are put out of business? Are IT workers who make reports and put middle managers out of work just as bad? Should IT workers "undo or mitigate the harm" to unemployed managers? If not, then why should a hacker who steals money be sanctioned? After all, a hacker who steals $1000 has much less of an impact on lives than Amazon programmers who destroy careers.
The third paragraph has several problems.
** The list "best practices for design, development, testing" is more a practitioner and manager practice than an explanation of "harm." This paragraph may belong in section 2.1 or perhaps 5.1 "Shared Programmer and Manager Practices."
** The phrase "should follow generally accepted best practices" is troublesome. Back before say 2005, Agile was not one of those practices. This phrase encourages conservatism and implies that there is nothing important left to learn in the field. This point actually contradicts other points below that encourage continuing education, because if we already know all the best practices, there really is no need for continuing education. All new ideas and all research and all innovations start out as "not best practices" and ethics must allow change, unless "ethics" is merely a tool to kill change.
** The sentence "the consequences of emergent systems and data aggregation should be carefully analyzed" is very, very troublesome. It contradicts itself. Emergent systems are ones that cannot be analyzed, and must be incrementally developed and understood, unless you can predict the future. Only the non-emergent, predictable, plannable systems can be analyzed. This sentence could be interpreted as "in all those cases where you must use Agile, use the Waterfall instead." This would use an ethics document to make a political point that doesn't even work. This sentence could be expanded to explain emergent and non-emergent systems and when and where each kind of process works. It is probably easier to delete the sentence.
The first sentence of the last paragraph should read something like "Computing professionals have additional obligations …" They have this obligation at work as well as when they volunteer, and so on. Why limit it?
Another problem with the section is that many issues are organizational, not professional. No individual could be held responsible for bugs in a large app like Office or Linux, because such systems are so huge. The bugs must be fixed by the organization. Individuals work through organizations, and the code of ethics does not really show that it understands the role of companies and open-source organizations in the economy. Organizations can do many things that are beyond the ability of individuals.
In the first paragraph, move references to ACM to 4.2. Or possibly add section "1.8 ACM Obligations" as a separate moral principle.
The sentence "Systems whose risks are unpredictable …" is problematic. Unless you can predict the future, all systems have unpredictable risks. If you require "predictable risks," then no app should ever be deployed. Risk is a huge field that cannot be reduced to 2 paragraphs. I know of no tool or app that cannot be abused.
What does the word "special" mean in "special trust" and "special responsibilities"? Removing "special" still makes sense, and I don't know what it refers to.
The second paragraph is mostly cross references, and should be deleted.
The paragraph that starts "the major underlying principle …" is about something more general. Probably it should be moved to somewhere in section 1.
This section seems oriented toward consultants and in particular DOD consultants. I don't have any suggestions but it feels odd.
- This is not an ethical principle, but a commandment for people to act. Is everyone expected to join social media and correct everyone's misstatements? Is this a commandment for social activism? Seems doomed in a "post truth" world. The word "imperative" takes it beyond ethics. I suggest deleting this section, or moving it to an ideals and hopes section 5.4.
- Many of the leaders in companies are business people or managers or others who are not obligated to follow these principles. This feels like a gigantic hole in section 3. Don't know what to do about it.
The discussion of "general good" should apply to both practitioners and managers. Possibly move to a section on shared responsibilities. Possibly move to General moral principles.
The list of tasks may be done by either practitioners or leaders. Possibly move to shared responsibilities.
- This point seems to encourage the creation of more bureaucracy, which in my experience is a bad idea. Perhaps backing off to find the underlying principle would help, but I cannot think of a way to do so. I don't like this point, but don't know what to recommend.
- This point is hard. Many management monitoring and reporting tools seem to remove the dignity of users, but computing professionals are paid to write those reports. Worthy of discussion of how to live up to it.
- I would split the last sentence into a separate paragraph.
- The first half could be made less educator-centric by referring to the rapidly evolving field.
- The second half is an excellent point. I would extend it to state that psychologists and sociologists now know that all humans are subject to cognitive biases and that they all use simple models to cope with their complex realities. There are no easy answers.
- In many places, there are cross references like in 1.2 "consider principle 3.7" All of these should be deleted. They are wordy and annoying. They add noise. Ideally everything should be linked. These cross references perhaps belong in a separate rationale document. Perhaps every responsibility should be linked to a general moral principle that it derives from.
* Perhaps my biggest complaint is that all ACM members are not equal under ethics rules. There are lots of rules for practitioners, about 2/3 as many for managers, and none (NONE) for academics. From the 1970s through the 2000s, Edsger Dijkstra was known for his acerbic wit, and he frequently wrote and said things that seemed to bash programmers. Many other academics (who lacked his wit) also followed his example and bashed programmers. For example, the SIGSOFT Risks column commonly blames programmers for things that are very complicated, such as security and privacy problems. More recently, an entire field has emerged to deal with those issues, showing just how hard they are.
- Under this code of ethics, academics can bash away, believing "I am just blowing the whistle" or "I am revealing a problem" or whatever, with no consequences. But, if practitioners are saddled with section 2, then academics should be saddled with something as well. Otherwise, this code is merely another skirmish of class warfare by the elite.
Waffling between Idealism and Pragmatism
* Some principles start with a statement of idealism and then state how it can be worked around, if necessary, for example whistle-blowing. Some principles are pure idealism, for example professional review. It seems that someone should look at the waffling and make it consistent throughout.
The Roles of Corporations and Organizations
* Large projects are beyond the ability of individuals to complete. They are built by corporations. So, either the corporation fixes the bugs or nobody does. This code focuses on the individual programmer or manager and on society, but does not seem aware of the intermediate roles.