August 23, 2019


Robot Runs Over Child in California Mall

On Thursday, July 7, an autonomous security robot ran over a 16-month-old toddler. The child's foot swelled from the incident, and he sustained a scrape to the back of his leg, but the toddler is otherwise okay.

The Full Story

Tiffany Teng was walking with her family through the Stanford Shopping Center in Palo Alto, California, when a 300-pound security robot knocked down Teng's son, Harwin. The robot was introduced to the shopping center last year as a means of alerting security personnel to potential risks. After knocking Harwin down, the robot continued moving forward, apparently not noticing that Harwin was in its way. Teng reports that a security officer told her that a similar incident had occurred a few days earlier with the same robot.

The robot involved was a K-5 unit designed and manufactured by Knightscope. It uses several sensors, lasers, and thermal cameras to observe its surroundings. Knightscope claims that the unit is capable of detecting cars, weather conditions, and people surrounding it.

The Stanford Shopping Center has temporarily shut down its security robot system in response to the incident. "We are investigating this incident thoroughly, and the K-5 units have been docked until the investigation is complete," the mall said in a statement on Tuesday.

Teng now believes that parents should be wary of the security robots at these centers and should keep their children away from them. "Right now I don't think I would ever go there again," said Teng.

The Legal Implications of the Incident

The field of autonomous technology poses a number of complicated problems, and this new technology raises the question of who is liable when these machines cause an injury. There is currently no legislation that specifically addresses regulation of, or liability for, the owners and manufacturers of autonomous robots. However, incidents like this one could result in litigation under either premises liability or product liability law, depending on the nature of the incident and where it occurred.

For example, if a flaw in the machine's design led to the child being hurt, Knightscope, the company that designed and manufactured the robot, could be liable under product liability law for a defective product design. If Knightscope knew, or should have known, that its machine was unreasonably dangerous to the public, the company would probably be liable.

On the other hand, the Stanford Shopping Center could be found liable for the injury under premises liability law. Every business has a duty of care to keep its customers safe, and the use of a device that put customers' safety at risk could be considered a breach of that duty. The question from this perspective is: Did the Stanford Shopping Center know, or should it have known, that the robot was dangerous but decide to use it anyway?

Let's look at the alleged circumstances. The robot was relatively new in its design and may not have been fully tested to assess its potential to harm others. Did a flaw in its design cause it to run over the child and cause injury? A security guard allegedly told Teng that the robot had a history of incidents involving children. If that is true, the Stanford Shopping Center knew or should have known that the robot was dangerous to small children and should have stopped using it after the first incident.

The bottom line is that new technology is not always safe technology, and the businesses that manufacture and use it have a duty of care to the people who may be affected.

About Zac Pingle

Zac Pingle was born in Florida, and grew up in several places across the United States. From a young age, Zac developed a taste for writing, reading under trees and getting into trouble. Currently, Zac resides in Oregon as a college student where he aspires to become an English professor.