
A Sophisticated Machine Vision System for Processing AI Data

“Intel RealSense acts as the eyes of the system, feeding real-world data to the AI brain that powers the MR experience.”

Today’s immersive mixed reality (MR) environments allow people to work, play, and socialize together in 3D spaces. Unfortunately, traditional MR applications require users to wear bulky headgear to enable these immersive experiences. ArchiFiction has revolutionized this expansive market with a “naked-eye” 3D spatial system that delivers a stunning audio-visual experience without requiring people to cover their faces with electronics.

The foundation of ArchiFiction’s unique 3D world is a human skeletal model and a spatial human-posture recognition algorithm, integrated with a sophisticated machine vision system. Intel RealSense D455 depth cameras capture the physical world in 3D, providing the raw data for the AI algorithms that ArchiFiction uses to create its immersive environments. Data from the machine vision system identifies objects and people and measures distances.

In essence, Intel RealSense acts as the eyes of the system, feeding real-world data to the AI brain that powers the MR experience. From architectural modeling to industrial simulation, the Integrated Intel Vision Stack lays the groundwork for a wide range of AI applications in retail, education, manufacturing, commercial entertainment, and other industries.

Architectural scale awareness allows people to walk through virtual homes, buildings, and cityscapes.

The Right Equipment for Demanding Machine Vision Tasks

The n’Space system uses two types of camera sensors from Intel RealSense: an RGB visible-light sensor and an infrared point-cloud (depth) sensor. The RGB sensor recognizes human movements and postures and identifies specific users. The infrared sensor measures the relative distances between key points on the human body and in the surrounding space.
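The distance-measurement step described above can be sketched with the standard pinhole camera model, which is how depth-camera SDKs such as librealsense map a pixel plus a depth reading to a 3D point. The intrinsic values and skeletal keypoints below are purely illustrative assumptions, not ArchiFiction’s actual calibration or keypoint data:

```python
import math

def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Map a pixel (u, v) with a depth reading (meters) to a 3D point
    using the pinhole camera model. Intrinsics (fx, fy, cx, cy) come
    from the depth camera's calibration."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

def point_distance(p, q):
    """Euclidean distance between two 3D points, in meters."""
    return math.dist(p, q)

# Illustrative intrinsics (assumed values, not real D455 calibration)
FX, FY, CX, CY = 640.0, 640.0, 640.0, 360.0

# Two hypothetical skeletal keypoints: pixel coordinates plus the
# depth (in meters) read from the infrared sensor at each pixel,
# e.g. a shoulder and a wrist found by the posture model
shoulder = deproject(600, 300, 2.0, FX, FY, CX, CY)
wrist = deproject(700, 420, 1.8, FX, FY, CX, CY)

print(f"shoulder-to-wrist distance: {point_distance(shoulder, wrist):.3f} m")
```

In a live system these per-pixel depths would come from the infrared stream and the keypoints from the posture model; the geometry of recovering real-world distances is the same.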

The result is a deeply immersive experience where the 3D environment reacts to the viewer’s every move. Photorealistic, three-dimensional images appear before them as they interact with virtual projections, immersing them in highly realistic environments for scenario simulations.

Read the complete case study about ArchiFiction’s AI-driven 3D worlds.
