It is possible to “see” the Matrix source code overlaid on our reality thanks to the Apple Vision Pro, and it tells us a lot about the device's technology and its future.

Perhaps the moment we are living through is very similar to the arrival of the first iPhone. I'm referring to the launch of the Apple Vision Pro (for now only in the US), which is driving the emergence of new apps that take advantage of the hardware capabilities of Apple's new device.

One of the most curious, and most talked-about, is Magic Room: a native visionOS app that makes visible the dynamic mesh that visionOS builds from the Apple Vision Pro's LiDAR sensor. Essentially, it is a “mapping” of reality using a polygonal mesh made up of thousands of points sampled by this type of sensor.

This polygonal mesh makes it possible to detect the depth, position and size of every object in real time (hundreds of thousands of samples are produced per second), allowing visionOS to know where each object is in relation to the user.
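To get a sense of how an app can read this mesh, here is a minimal sketch using visionOS's ARKit scene-reconstruction API, which delivers the polygon mesh as a stream of MeshAnchor updates. This is an assumption about how an app like Magic Room could consume it; its actual implementation is not public.

```swift
import ARKit

// Minimal sketch: subscribe to the polygonal mesh that visionOS reconstructs from its sensors.
// Requires an immersive space and world-sensing permission. Assumption about how an app
// like Magic Room could read the mesh, not its actual implementation.
func trackSceneMesh() async throws {
    let session = ARKitSession()
    let sceneReconstruction = SceneReconstructionProvider()
    try await session.run([sceneReconstruction])

    // Each MeshAnchor carries one chunk of the polygon mesh plus its position in space.
    for await update in sceneReconstruction.anchorUpdates {
        let anchor = update.anchor
        print("Mesh chunk \(anchor.id): \(anchor.geometry.vertices.count) vertices, event: \(update.event)")
    }
}
```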


The LiDAR technology in the Apple Vision Pro


LiDAR technology has been present in iPhones since the iPhone 12 Pro, and it allows the device to “read” reality in order to make decisions about depth of field in photography, mixed reality applications and any other feature that requires this kind of distance calculation. On iPhones it is mainly used to improve photos taken in low-light conditions, since this type of sensor does not depend on ambient light.

LiDAR is an active remote sensing system: it emits infrared laser pulses and measures the time each one takes to return to the sensor for every point it maps. It does this in real time, at around 120,000 samples per second, so as we move through a real environment with the Apple Vision Pro, the system can detect our surroundings and their dimensions.
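To make the “return time” idea concrete: a time-of-flight sensor derives distance from how long a light pulse takes to bounce back. Below is a simplified sketch of that calculation; the real sensor pipeline is of course far more involved.

```swift
import Foundation

// Simplified time-of-flight calculation: distance from the round-trip time of a light pulse.
// Illustrative only; real LiDAR processing is far more sophisticated.
let speedOfLight = 299_792_458.0 // meters per second

func distance(forRoundTripTime seconds: Double) -> Double {
    // The pulse travels to the object and back, so halve the total path.
    return speedOfLight * seconds / 2.0
}

// Example: a return time of about 13.3 nanoseconds corresponds to roughly 2 meters.
print(String(format: "%.2f m", distance(forRoundTripTime: 13.3e-9))) // ≈ 1.99 m
```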


That is exactly what Magic Room does: “reveal this magic,” as its creators put it. When the application starts, this network of points, connected into polygons just as visionOS recognizes it through LiDAR, suddenly appears before us.

(Image captured with Apple Vision Pro) Top right, a photo of the balcony in daylight; the capture shows the interpretation done in the dark using the Apple Vision Pro's LiDAR.

I tried the example you see in the photo above on my balcony. The reference photo was taken during the day, but I ran the test with the app at night, so you can see that even without light the LiDAR works perfectly, able to “trace” reality with amazing detail and speed.

Not only is it intriguing (and amazing when experienced through Apple's headset), it is also an interesting way to check the device's power to interpret reality. In the app you can also adjust how fast the mesh is revealed, along with a few other special tricks.

(Photo taken with Apple Vision Pro) The room containing the washing machine can be seen perfectly through the wall, thanks to prior sampling.

The thing that surprises you the first time you try it is the impression that we can “see through things.” Not really: Magic Room maps reality constantly, even when the mesh visualization is not active. If we have walked through an environment before, the spatial point data is stored.


At home, for example, it happens to me with the room where the washing machine is: the app can draw it behind the wall using the information it already collected in previous samples, rendered from our current point of view. This interpretation of the polygonal mesh almost fools us into thinking we can “see through the wall.”
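As a hedged sketch of how that persistence might work in code: an app could simply keep every mesh chunk visionOS delivers in a dictionary keyed by anchor ID, so geometry scanned on an earlier walk-through (like the laundry room) can still be drawn even when a wall currently occludes it. The MeshCache type below is hypothetical, not Magic Room's actual code.

```swift
import ARKit

// Hypothetical cache: keep every mesh chunk visionOS has delivered so far, so geometry
// sampled on an earlier walk-through can still be rendered even if it is now occluded.
final class MeshCache {
    private var anchors: [UUID: MeshAnchor] = [:]

    func consume(_ provider: SceneReconstructionProvider) async {
        for await update in provider.anchorUpdates {
            switch update.event {
            case .added, .updated:
                anchors[update.anchor.id] = update.anchor   // remember or refresh this chunk
            case .removed:
                anchors.removeValue(forKey: update.anchor.id)
            }
        }
    }

    /// Every chunk sampled so far, whether or not it is currently in view.
    var allKnownChunks: [MeshAnchor] { Array(anchors.values) }
}
```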

Matrix in your living room


One of the most unusual and exciting uses is to activate the “digital rain” visualization, where we see our house magically become an interpretation of Matrix code, moving and flowing around us. As we move around the room, new parts are revealed in real time, without any kind of latency or delay.


You can also change the font size of the flowing code, and the change is applied immediately in the experience. There is also the option to blur the real environment or blend it with the interactive experience to give it a more cinematic feel.
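For hiding or blending the real environment, one plausible mechanism (an assumption on my part, not a description of Magic Room's code) is visionOS's immersion styles, where .mixed keeps the passthrough video visible and .full replaces it entirely:

```swift
import SwiftUI

// Sketch: letting the user blend the real environment with the experience or hide it,
// via visionOS immersion styles. Hypothetical app structure, not Magic Room's code.
@main
struct DigitalRainApp: App {
    @State private var style: ImmersionStyle = .mixed

    var body: some Scene {
        ImmersiveSpace(id: "digitalRain") {
            // A RealityView with the falling-code content would go here.
        }
        .immersionStyle(selection: $style, in: .mixed, .progressive, .full)
    }
}
```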

In the photos and videos you can watch, you can get a sense of it being present not only in front of you, but also all around you. It is a strangely overwhelming feeling, one that also conveys the technical capabilities of the Apple Vision Pro, and shows that games like the wonderful Super Fruit Ninja really do use this “mapping” of the room to know where the furniture in our living room is, so that the fruit bumps, splashes or slides (for example) along the back of our sofa. The future of these applications, more complete, immersive and consistent with the reality around the user, is just around the corner.


In Applesfera | Apple Vision Pro, first impressions: after trying it for 24 hours I can say it's the coolest thing Apple has done in recent years
