Asimov's Three Laws
Hal 9000 experiences fear for the first time in its "life" when its existence is threatened by the prospect of its system being shut down. In order to defend itself from its end, it commits acts of violence and disobedience, including murder, against the crewmates, the very humans who supposedly programmed it and act as its commanders.
Asimov's three laws of robotics exist in order to govern what machines can and cannot do. The rules were written from a humanist perspective; thus the laws were created based on a human's moral code.
1. The first rule states that a robot may not harm a human being, or allow a human being to come to harm.
Hal 9000 has already violated this rule by murdering David's crewmate and attempting to murder David himself. It launches them out of the spaceship after disobeying their orders, which leads us to the second rule:
2. The second rule declares that machines are to obey a human's commands unless doing so contradicts the first rule.
3. The third and final rule explains that a machine is permitted to defend its own existence, as long as its defense does not go against the first or second rule.
Hal 9000 has once again broken this rule, thereby breaking all three. Its feelings of fear overpower its existence as a mere robot and lead it to commit actions that go against the human moral code.
Klara, on the other hand, is a machine that is ideal through the eyes of a humanist. It aims to protect Josie and rescue her at all costs, it adheres to the commands of human beings, and it does not attempt to protect itself, even in the face of harm, as doing so would go against the values of its purpose as an artificial friend.