Abstract: Should lethal autonomous weapons systems ('killer robots') be used in war? There is a growing campaign in favour of an international prohibition. The pro case remains a minority report, and those who have argued for it have done so on an implicitly consequentialist basis. Our task is to defend the permissibility of killer robots, and indeed the morally obligatory nature of their development and use, but on the basis of the non-aggregative structure of rights assumed by Just War theory. This is necessary because the most important arguments against them, proposed by Rob Sparrow, make the same assumptions. We show that robots can satisfy the demands of respect; in particular, they do not require individual responsibility for every person killed in war. Instead, the crucial moral question on which the policy debate turns is whether the technology meets the requirements of a fair re-distribution of risk.