The prototype combines multiple displays, cameras, and projectors, with no need for an additional PC or app. When consumers stand in front of it, sensors automatically calculate their body measurements and stature. The surrounding environment is also scanned and captured.
The mirror then shows the consumer's reflection wearing the garments they are considering purchasing.
Presumably, consumers can switch between garments using hand gestures or voice commands. Amazon promises realistic imagery that shows consumers directly how clothes would "fall" in reality.
"When the user looks in the mirror, he sees the reflections of the previously displayed objects as well as the images created by the software, as if they were all part of one reflected scene," says the patent text on how the mirror works.