DANIEL SMALLEY’S QUEST: RECREATE THE HOLODECK AND PRINCESS LEIA

By Debra Kaufman

September 19, 2021

Reading Time: 5 Minutes

As a child, Dr. Daniel Smalley got hooked on holography after reading an entry on it in his middle school’s World Book encyclopedia. His interest soon became an obsession; he studied John Iovine’s “Homemade Holograms” and tried to build his own holography setup out of a laser tube, pieces of wood, and the 12-volt emission cord he “borrowed” from his father’s car. In high school he began a correspondence with MIT Media Lab grad student Elroy Pearson, who came from the same Utah county. After his first semester at Brigham Young University, Smalley was invited to intern at the MIT Media Lab, and the obsession became a career choice.

As a student at MIT, Smalley spent one lunch with Pearson on a whimsical experiment: could they create holographic video with what was in their lunch sack? With a plastic fork and some foil, they realized they could create ripples that would do the trick. “It ended up changing both of our paths,” says Smalley, whose Ph.D. work focused on creating a low-cost holographic monitor.

Dr. Daniel Smalley, Image courtesy BYU.

Now, Smalley is an associate professor of electrical and computer engineering at BYU, where he also heads the university’s Electro Holography Group. Founded in 2013, the Group has two goals, says Smalley: to create the holodeck from Star Trek and to reproduce the holographic projector Princess Leia used to send a message in Star Wars. The Group engages in a variety of collaborations and is home to a rotating cast of graduate and undergraduate students.

Regarding the challenges of creating the Star Trek holodeck, Smalley notes, “You need a device that can create lots of lines per second and make them really small, which is why micro-displays are typically used.” He points to a key to improving the optical chips: the current chip measures 10mm x 15mm, but the light comes from its 1mm edge. “We’re trying to get the light to come from the faces, not the edge,” he explains. “The other challenge is vertical parallax. We can see around a holographic teacup but not inside the teacup [from above], and that disturbs your perception of 3D.” Smalley’s team has access to a semiconductor fab, which lets them keep optimizing the chips, an ongoing project.

Optical Trap Display (OTD) image, in a long-throw projection geometry. The color, vertically-rastered OTD image is projected through a circular aperture to form a Princess-Leia-like image atop a 3D printed table. Image courtesy BYU.

“For the longest time, I thought the Princess Leia project was impossible,” continues Smalley. “Then I saw Iron Man, and the one scene that bothered me is when Tony Stark has a 3D image of his gauntlet arm piece and then he puts his arm into it. I couldn’t think of a way that holography could reproduce that scene.” With holography, he explains, you must be able to draw a straight line from the eye, through the image point, and back to a surface that creates the rays; those rays focus in space and then travel back to the eye. “With a holographic display, you always have to be gazing at a screen,” he says. To solve the problem, Smalley tapped BYU physics professor Justin Peatross, who had been working with laser optical trapping of particles in the air. “He showed me what he was doing, and it was clearly the solution I was looking for,” says Smalley.
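
That line-of-sight constraint is easy to sketch in code. The snippet below is a minimal illustration, not the Group’s software: it assumes a flat holographic screen in the z = 0 plane and made-up eye and image-point positions, and simply checks whether the ray from the eye through the image point traces back to the emitting surface.

```python
import numpy as np

def visible_through_screen(eye, point, screen_w=0.10, screen_h=0.10):
    """Return True if the ray from the eye through an image point lands on the
    screen, assumed here to be the rectangle |x| <= w/2, |y| <= h/2 in the
    z = 0 plane. For a holographic display, the point is only visible when
    this holds: the light forming it must originate on the screen surface."""
    eye, point = np.asarray(eye, float), np.asarray(point, float)
    direction = point - eye
    if abs(direction[2]) < 1e-12:      # ray never reaches the screen plane
        return False
    t = -eye[2] / direction[2]         # ray parameter where z = 0
    if t <= 0:                         # screen plane lies behind the eye
        return False
    hit = eye + t * direction
    return abs(hit[0]) <= screen_w / 2 and abs(hit[1]) <= screen_h / 2

# An image point floating 10 cm in front of a 10 cm x 10 cm screen is visible
# from straight on...
print(visible_through_screen(eye=[0.0, 0.0, 0.5], point=[0.02, 0.0, 0.1]))   # True
# ...but from far off to the side, the ray through the same point no longer
# traces back to the screen, so no hologram can place light there.
print(visible_through_screen(eye=[0.6, 0.0, 0.15], point=[0.02, 0.0, 0.1]))  # False
```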

The phenomenon is photophoresis, a critical component of optical trapping for opaque particles. It works in conjunction with spherical lenses whose imperfections introduce aberrations in the laser light; the aberrations create dark regions in the beam, and the uneven heating of microscopic particles traps them there. “Then you illuminate the particles while they’re trapped, which creates a glowing dot you can move around. Much like how you can write your name with a sparkler, you can move the point fast and create a line or wire frame or a surface, as long as you move it with a refresh rate of at least 10 times per second,” he says. “We can move this beam around with computer-controlled mirrors and light up the particle with RGB lasers to trace out an image.”
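
To put rough numbers on the sparkler analogy, here is a back-of-envelope sketch (illustrative only, not the lab’s control code): given a wireframe path and the minimum 10-times-per-second refresh Smalley cites, it computes how much path one trapped particle must cover per frame and the average speed it must sustain. The 1 cm square path is a made-up example.

```python
import numpy as np

def scan_requirements(path_points, refresh_hz=10.0):
    """Given a closed wireframe path (N x 3 array of points, in meters) and a
    refresh rate, return the total path length per frame and the average speed
    the trapped particle must sustain so the whole figure is redrawn at least
    `refresh_hz` times per second (the persistence-of-vision floor Smalley cites)."""
    pts = np.asarray(path_points, float)
    segments = np.diff(np.vstack([pts, pts[:1]]), axis=0)   # close the loop
    path_length = np.linalg.norm(segments, axis=1).sum()    # meters per frame
    speed = path_length * refresh_hz                        # meters per second
    return path_length, speed

# Illustrative example: a 1 cm square wireframe, like the centimeter-scale
# images the lab draws today.
square = 0.01 * np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]])
length, speed = scan_requirements(square, refresh_hz=10.0)
print(f"path per frame: {length*100:.1f} cm, required particle speed: {speed*100:.0f} cm/s")
# -> path per frame: 4.0 cm, required particle speed: 40 cm/s
```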

Optical Trap Display vector image, triangular prism, long-exposure, rotated view. This image breaks the boundary of the display aperture (not possible with holograms). Image courtesy BYU.

The Electro Holography Group’s work made its biggest splash when it recently demonstrated a tiny version of a battle between the starship Enterprise and a Klingon battle cruiser, complete with photon torpedo strikes. “What you’re seeing in the scenes we create is real: There is nothing computer-generated about them,” Smalley explains. “If you look at them from any angle, you will see them existing in that space.”

The optical trap method is still in its early stages. “Currently, we take a single particle and move it through a simple path,” says Smalley. “We can only make images that are one centimeter squared. In active research now, we’re working on being able to have many particles trapped in a single area and be able to move them up and down.” He estimates that pushing the technology to a much larger 100 cubic centimeters will result in an image not unlike Leia’s message in Star Wars. “This is a 10-to-15-year vision,” he says. “Instead of using trapped particles on all sides, perhaps we can trap them in a straw and have them move inside. It’s possible we’ll be able to create images two meters in length.” He estimates that the work needed to get the Princess Leia project to that point will take 12,000 human hours.
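
A little arithmetic shows why the scale-up is a long-term project. The sketch below combines the numbers quoted here (the 10-times-per-second refresh floor, centimeter-scale images today, a 100-cubic-centimeter goal, 12,000 human hours) with openly hypothetical assumptions about scan speed, particle count, and team size.

```python
# Back-of-envelope scaling using the figures quoted in the article plus
# openly hypothetical assumptions (scan speed, particle count, team size).
REFRESH_HZ = 10            # minimum refresh rate Smalley cites
ASSUMED_SCAN_SPEED = 0.5   # m/s; placeholder single-particle speed, not a measured value

# Drawable wireframe length per refresh for one particle:
path_per_frame_cm = ASSUMED_SCAN_SPEED / REFRESH_HZ * 100
print(f"one particle traces about {path_per_frame_cm:.0f} cm of wireframe per frame")

# Moving from today's centimeter-scale images toward a 100-cubic-centimeter
# volume means far more line work per frame, so either many simultaneously
# trapped particles or much faster scanning is required (the multi-particle
# trapping now under research).
assumed_particles = 20     # hypothetical simultaneous particle count
print(f"{assumed_particles} particles -> roughly "
      f"{assumed_particles * path_per_frame_cm:.0f} cm of drawable path per frame")

# The 12,000 human-hours estimate, spread across a hypothetical team:
team_size, hours_per_week, weeks_per_year = 5, 20, 48
years = 12_000 / (team_size * hours_per_week * weeks_per_year)
print(f"about {years:.1f} calendar years at {team_size} people x {hours_per_week} h/week")
```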

Optical Trap Display (OTD) image, sliced-volume, long-exposure viewed from the front. The image uses back-face culling and shares space with a 3D-printed open ball. Image courtesy BYU.

Smalley emphasizes that his lab is working on a volumetric display, not a hologram. “A volumetric display is where the image point is co-located with a physical scatterer,” he says. “When you’re looking at a volumetric image, you’re actually looking at a physical object, not an optical phenomenon.”

A holographic display, he explains, is generated by a diffraction pattern: “There’s a surface from which light is emitted that focuses and diverges to create a wavefront of light. There is no co-location of physical objects. You’re looking at the wavefront.” In practical terms, he continues, “the potential for realism in a hologram is extremely high. … It can get to the point that when I looked at a hologram I had made two years earlier, I couldn’t tell if it was real or a hologram. If you put a hologram on the wall, you can see an entirely realistic other room behind the brick wall that doesn’t exist.”
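
The distinction can be summed up with the textbook holography relations below; this is the generic formulation, not anything specific to the Group’s chips. A hologram records the interference of an object wave O and a reference wave R, and re-illuminating that recorded pattern reconstructs the object wavefront the viewer sees, whereas a volumetric image is simply light scattered from an illuminated particle sitting at the image point.

```latex
% Intensity recorded on the holographic surface: the interference of the
% object wave O(x, y) with the reference wave R(x, y).
I(x, y) = |O + R|^2 = |O|^2 + |R|^2 + O R^* + O^* R

% Re-illuminating the recorded pattern with R yields, among other terms, one
% proportional to the original object wavefront O; that wavefront is what the
% viewer's eye focuses on, with no physical scatterer at the image point.
R \, I(x, y) = R \left( |O|^2 + |R|^2 \right) + |R|^2 O + R^2 O^*
```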

Volumetric displays are fundamentally incapable of creating those kinds of images, he says, but he adds that they have many interesting practical applications of their own. He imagines a digital watch that lets the user project imagery from the small watch face to a bigger display.

Telepresence could also become a reality with volumetric displays, he says. “If you created an image of yourself and did it volumetrically, you would be physically present in two places at the same time,” he explains. “That projected head could turn and look at different people in the world. [The technology] takes digital information and makes it physical in the world around us.”