I, Robot Morality and Ethics Quotes

How we cite our quotes: (Story title.paragraph)

Quote #10

"Think about the Machines for a while, Stephen. They are robots, and they follow the First Law. But the Machines work not for any single human being, but for all humanity, so that the First Law becomes: 'No Machine may harm humanity; or, through inaction, allow humanity to come to harm.'" (Evitable Conflict.212)

In later robot books, Asimov's robots come up with something called the Zeroth Law, which is basically this idea here: robots have to prevent harm not just to individual humans, but to humanity as a whole. That raises some problems in this story, because in order to help humanity in "The Evitable Conflict," the Machines have to slightly harm a few individuals, namely the members of the Society for Humanity. So what do you think: is it more moral to help the many by hurting the few?