If you still remember Sony Corp. as the company that defined the cutting edge of technology in the 1990s, its chief executive officer has some good news for you.

The past two decades have seen the company drift further into the entertainment business and away from the hard edge of new tech, but CEO Kenichiro Yoshida is keen to rebalance that equation, sending out the message that Sony is once again a place where engineers can dream big.

Sony held its first-ever research-and-development briefing two weeks ago, showcasing projects that most people wouldn’t expect from the maker of PlayStation consoles and Spider-Man movies: from a lightweight motion-capture system to a remotely operated robotic arm capable of handling objects just millimeters in size. After a six-year absence, Sony also plans a return to the CEATEC consumer electronics show next month with a medical-tech exhibit.

“Technology is the thing that unites and animates Sony’s various businesses,” Yoshida said. “Sony’s purpose is to fill the world with wonder through the power of technology and creativity.”

A number of exhibits at Sony’s R&D showcase seemed on the brink of becoming real products, while others were manifestly long-term investments, explorations of how technology might develop in the future.

Sony is developing a technology that can map an individual’s unique ear shape with just a smartphone photo, which can then be used to program headphones with 360-degree audio.

In another demo, engineers showed how six matchbox-sized wearable devices can convert a person’s entire body into a game controller. Sony’s setup uses two off-the-shelf sensor types found in every smartphone — an accelerometer and a gyroscope — to track acceleration and rotation. The trick is to use deep-learning networks to extrapolate the positions of knees and elbows for full-skeleton tracking. The company expects applications will range from gaming and movies to health care and augmented reality.
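Sony has not published its model, so the following is only an illustrative sketch of the shape of the problem: a small network maps readings from six IMUs (each reporting three acceleration and three rotation channels) to 3-D positions for a full set of skeleton joints, including ones no sensor sits on. The joint count, layer sizes, and random weights are all hypothetical stand-ins for a trained model.

```python
import numpy as np

N_SENSORS = 6    # matchbox-sized wearables on the body
CHANNELS = 6     # 3 accelerometer + 3 gyroscope axes per sensor
N_JOINTS = 17    # hypothetical full-skeleton joint count
HIDDEN = 64      # hypothetical hidden-layer width

rng = np.random.default_rng(0)

# Stand-in for a trained network: one hidden layer with random weights.
W1 = rng.standard_normal((N_SENSORS * CHANNELS, HIDDEN)) * 0.1
W2 = rng.standard_normal((HIDDEN, N_JOINTS * 3)) * 0.1

def estimate_skeleton(imu_frame: np.ndarray) -> np.ndarray:
    """Map one frame of IMU readings (6 sensors x 6 channels) to
    3-D positions for every joint, letting the learned mapping fill
    in unmeasured joints such as knees and elbows."""
    x = imu_frame.reshape(-1)             # flatten to a 36-vector
    h = np.tanh(x @ W1)                   # hidden nonlinearity
    return (h @ W2).reshape(N_JOINTS, 3)  # one (x, y, z) per joint

frame = rng.standard_normal((N_SENSORS, CHANNELS))  # one sensor frame
skeleton = estimate_skeleton(frame)
print(skeleton.shape)  # (17, 3)
```

In a real system the weights would be trained against motion-capture ground truth, and a recurrent or temporal model would likely smooth estimates across frames; the point here is only that 36 sensor channels can, in principle, be regressed to a full skeleton.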

Sony also showed off real-time ray-tracing graphics, which at 4K resolution require tracking 597,196,800 ray trajectories for each frame of video. Sony’s distinctive twist is a neural-network engine it has developed to extract textures from nearby objects and render them on screen instantly.
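The article’s trajectory count is consistent with 4K UHD (3,840 × 2,160 pixels) at 72 rays traced per pixel — the per-pixel figure is an inference from the arithmetic, not something Sony stated:

```python
# 4K UHD frame dimensions
width, height = 3840, 2160
pixels = width * height                          # 8,294,400 pixels

# Back out the implied rays per pixel from the article's total.
rays_per_pixel = 597_196_800 // pixels
print(rays_per_pixel)                 # 72
print(pixels * rays_per_pixel)        # 597196800 — matches the article
```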

“When it comes to research, we have been guided by where we believe the world is heading and the things we want to build, not by Sony’s immediate needs,” said Hiroaki Kitano, director of Sony’s Computer Science Laboratories, in an interview.

Created in 1988 and modeled after Xerox Corp.’s Palo Alto Research Center and AT&T Inc.’s Bell Labs, the CSL is Sony’s tech innovation incubator. It survived even the worst of the company’s budget cuts by keeping its small staff focused on high-impact, long-term projects, and now the company’s CEO is leaning on the group to help it steer a new course.

CSL’s most visible contribution to date is probably the development of the operating system for Sony’s Aibo robotic dog, but almost all of the tech giant’s businesses have benefited from CSL research, Kitano said.

Other projects currently underway include a robotic leg prosthesis, AI-assisted music composition software and a portable 360-degree video system that allows remote users to “jack in” in real time. — Bloomberg