Mark Zuckerberg reveals the details of building the next generation of screens for virtual reality and augmented reality
Cairo – Samia Sayed – Meta founder and CEO Mark Zuckerberg has revealed what it will take to build the next generation of virtual and augmented reality displays: 3D screens that approach the fidelity and realism of the real world, which requires solving several fundamental challenges.
“These problems are very exciting because they are all about how we perceive things physically, how our eyes process visual cues, and how our brain interprets them to build a model of the world,” he said.
Screens that match the full capability of human vision will enable some very important things, he said. The first is a genuine sense of presence, the feeling of being with someone or somewhere as if you were actually there. Given Meta’s interest in helping people connect, it is clear why that matters. “One day I was testing some of our work on realistic avatars in a mixed reality experience. You can see the room around you, and everything looks the same as it would if you took the glasses off, except that there is someone with you: you can walk around, watch them move, and feel as though they are really there. Imagine if that person were a family member who lives far away, someone you are collaborating with on a project, or even an artist you love. Imagine what it would feel like to actually be there, physically together.”
Zuckerberg added: “Current virtual reality systems can already give you the feeling of being somewhere else, and it is hard to describe how profound that is; it is something you need to experience for yourself. But we still have a long way to go in displays and graphics before we reach visual realism. The reason is that the human visual system is deeply ingrained: simply seeing an image that looks realistic is not enough to create that sense of presence, because you also need all the other visual cues.”
He continued: “It is a more complex problem than displaying a realistic-looking image on a computer or TV screen. You need stereoscopic displays to create 3D images, and you need to be able to focus your eyes on objects at different depths. That is different from a traditional screen, where you focus at a single distance: wherever you hold your phone or wherever your monitor sits.”
“You need a display that covers a much wider angle of your field of view than conventional screens,” he said. “Achieving retina-like resolution across that entire field of view requires far more pixels than a conventional screen.”
He emphasized that these displays need to approximate the brightness and dynamic range of the real world, which requires roughly ten times the brightness of today’s HDTVs, along with realistic, low-latency motion tracking, so that when you turn your head everything feels correctly positioned in the simulated world around you.
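For a sense of scale, the “ten times” figure can be sketched with a quick calculation. The 1,000-nit baseline below is an assumed typical peak for an HDR TV, not a figure from the article:

```python
# Back-of-the-envelope brightness comparison. The HDTV baseline is an
# illustrative assumption; the 10x factor comes from the article.
HDTV_HDR_PEAK_NITS = 1_000   # assumed peak luminance of a typical HDR TV
TARGET_FACTOR = 10           # "ten times more brightness" per the article

target_nits = HDTV_HDR_PEAK_NITS * TARGET_FACTOR
print(f"Target: ~{target_nits:,} nits")
# For scale: a sunlit outdoor scene can exceed 10,000 nits, while most
# current VR headset displays peak at only a few hundred.
```

The point of the arithmetic is simply that the gap between headset displays and real-world luminance is more than an order of magnitude, not a matter of incremental tuning.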
“To drive all of those pixels, you need to build a graphics pipeline that gets the most out of the CPU and GPU, and that we can integrate into a headset without it overheating, quickly draining your battery, or putting too much heat against your head,” he said. “And of course you need to combine all of that into a device that is comfortable to wear.”
“If any of these things is done incorrectly, it breaks the sense of immersion; you actually notice it far more than you would on today’s 2D screens,” he added. “In other words, we still have a lot of work to do.”
The founder of Meta explained: “Building these devices is a combined effort spanning not only displays but also recent work on software, silicon, sensors, and other hardware that must work together seamlessly. Today, though, we will focus on the display system alone. It is the last link in the chain: the components that take the rendered graphics and convert them into photons your eyes can see. That is obviously an important step, and working out what it would take to do it as perfectly as possible inspired what we internally call the visual Turing test.”
The main issue, he said, is that your eyes try to focus but cannot, because the display sits at a fixed distance. “As we worked on this, we concentrated our research on two primary areas that we believe offer the best chance of progress at this point.”
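The fixed-focus problem described above can be quantified with basic optics: the eye’s focusing effort is measured in dioptres, the reciprocal of distance in metres. The 2 m fixed focal distance below is an assumed typical headset value, not from the article:

```python
# Sketch of the fixed-focus problem: a conventional headset display sits
# at one optical distance, so the eye's focus (accommodation) cue
# disagrees with its convergence cue for objects at other virtual depths.
# The 2 m fixed focal distance is an illustrative assumption.

def dioptres(distance_m: float) -> float:
    """Optical power needed to focus at a given distance (1/metres)."""
    return 1.0 / distance_m

FIXED_FOCUS_M = 2.0  # assumed fixed focal distance of the display

for object_m in (0.5, 2.0, 10.0):
    mismatch = abs(dioptres(object_m) - dioptres(FIXED_FOCUS_M))
    print(f"virtual object at {object_m:>4} m -> focus mismatch {mismatch:.2f} D")
```

The mismatch is largest for close virtual objects, which is why near interactions are where fixed-focus headsets feel least natural.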
He added that the challenge the team set itself was to figure out what it takes to build a headset with retinal resolution, which means targeting about 60 pixels per degree, several times what is currently available. The display systems research team had to be very creative to achieve this.
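The 60-pixels-per-degree target translates into a striking pixel count once combined with a wide field of view. The field-of-view figures below are assumed typical VR values, not from the article:

```python
# Rough pixel-count estimate for a retina-resolution VR display.
# Field-of-view values are illustrative assumptions.
PPD_TARGET = 60   # pixels per degree, the "retinal" target from the article
FOV_H_DEG = 110   # assumed horizontal field of view per eye
FOV_V_DEG = 100   # assumed vertical field of view per eye

pixels_h = PPD_TARGET * FOV_H_DEG
pixels_v = PPD_TARGET * FOV_V_DEG
total_per_eye = pixels_h * pixels_v

print(f"{pixels_h} x {pixels_v} = {total_per_eye / 1e6:.1f} MP per eye")
# For comparison, an entire 4K TV panel is 3840 x 2160, about 8.3 MP.
```

Under these assumptions a single eye needs several times the pixels of a whole 4K television, which is why the article treats both the panels and the graphics pipeline as open problems.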
He pointed to a prototype called Butterscotch, which is sharp enough to read the 20/20 line on a VR eye chart, the kind you would use in a vision test to find out whether you need glasses. These are prototypes, custom rigs built in the lab, not products you can buy. “But when I tried it, it was an absolutely amazing experience, and you can see the image with incredible sharpness.”
Zuckerberg called this a huge improvement, and noted that eye tracking is an underappreciated technology for virtual and augmented reality: it is how the system knows what you are focusing on, how it corrects visual distortions, and how it decides which parts of the image should get more resources. That last part is crucial, because the heat and power envelopes of small headsets are too restrictive to always run the whole system at full tilt. If you render only the region you are looking at in full detail, with the periphery at lower resolution, much as the human visual system works, that is a significant improvement. This leads to another major limitation of display technology: while resolution, variable focus, and distortion correction all contribute meaningfully to realism, the most tangible of them all is high dynamic range, or HDR.
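The gaze-contingent rendering idea described above (commonly known as foveated rendering) can be sketched with a simple zone model. The zone radii and resolution factors below are illustrative assumptions, not Meta's actual parameters:

```python
# Minimal sketch of foveated rendering: full resolution only near the
# tracked gaze point, reduced resolution in the periphery.
# Zone radii and scale factors are illustrative assumptions.

def shading_rate(eccentricity_deg: float) -> float:
    """Fraction of full linear resolution to render at a given angular
    distance (degrees) from the gaze point."""
    if eccentricity_deg <= 5:      # foveal zone: full detail
        return 1.0
    elif eccentricity_deg <= 20:   # mid-periphery: half linear, 1/4 the pixels
        return 0.5
    else:                          # far periphery: quarter linear, 1/16 the pixels
        return 0.25

for ecc in (2, 10, 40):
    print(f"{ecc:>2} deg from gaze -> {shading_rate(ecc):.2f}x linear resolution")
```

Because pixel cost scales with the square of linear resolution, even this crude three-zone scheme spends a small fraction of the full-resolution budget on the periphery, which is exactly the power and heat saving the passage describes.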
Essentially, HDR is about the overall brightness and contrast of the display: when the lighting is bright, colors look vivid and shadows look dark, so scenes appear natural. The problem is that the brightness of today’s displays lags far behind what the eye sees in the real world.
“We need to reach significantly higher brightness levels than what we call HDR on traditional screens today, and the challenge, of course, is that it has to be battery-powered and comfortable to wear,” Zuckerberg added. “To find the best path toward that, our display systems research team built this prototype, or really part of a prototype, by putting a very bright lamp behind the LCD panels. It is very heavy, so we added these handles. But as far as we know, it is the first HDR virtual reality system, and we call it Starburst internally.”
“To be clear, this first generation is largely impractical as something that could be brought to market as a product, but we are using it for testing and further study so we can understand what the experience could feel like. The goal of all this work is to identify technical paths that let us improve purposefully, in ways that start to deliver the visual realism we need. If we can make enough progress on retina-like resolution, build systems with proper focal depth, reduce optical distortion, and dramatically increase brightness, we have a real chance of creating displays that capture the full beauty and complexity of physical environments. But it will take a number of iterations of each of these technologies to get there, and then we have to combine them all, taking big steps forward while working out how to package all of these different technologies into smaller, lighter, and ultimately more affordable glasses.”
“We have a couple more things to show that demonstrate we are really moving in that direction, taking everything we have learned from our research and trying to fold it into compact glasses that get us reasonably close to optical realism. All of these are still just prototypes, but they are meaningful steps toward technology that could one day power groundbreaking products. The first is an experimental device that combines some of our latest optics research into practical glasses unlike anything out there now. We call it Holocake 2: the thinnest and lightest VR glasses we have designed, and it can play any existing PC VR title.”
In most VR headsets, the lenses are thick and have to sit a few inches from the display so that you can focus properly and the light is directed into your eyes, which is what gives headsets their bulky front. Holocake 2 uses two new techniques to overcome that.
Mark Zuckerberg said: “The first is that instead of sending light through conventional lenses, we send it through holographic optics. Holograms are essentially recordings of what happens when light strikes an object, and a holographic optic is far flatter than the lens it represents, yet it affects incoming light in the same way, so it is a very elegant trick.” He added: “The second new technique uses polarization-based reflection to shrink the distance between the display and the eye. Instead of traveling from the panel through a lens and then to the eye, the light is polarized so that it bounces back and forth between reflective surfaces several times. It covers the same total optical distance, but in a much thinner package. The result is a prototype thinner and lighter than any other configuration, but as with any tightly integrated system, there are trade-offs and problems.”
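The folded optical path described above can be illustrated with a small calculation. The specific path length and the three-pass fold are assumptions for illustration, not Holocake 2’s actual design figures:

```python
# Illustration of a folded ("pancake"-style) optical path: polarized
# light bounces between reflective surfaces, so a long optical path
# fits inside a thin cavity. All numbers are illustrative assumptions.
REQUIRED_PATH_MM = 45   # assumed optical path length the design needs
PASSES = 3              # assumed traversals of the cavity (two reflections)

cavity_thickness = REQUIRED_PATH_MM / PASSES
print(f"Cavity thickness: {cavity_thickness:.0f} mm instead of {REQUIRED_PATH_MM} mm")
```

Under these assumptions, a 15 mm gap delivers the same 45 mm of optical travel, which is the mechanism behind the quoted claim that the light “can travel the same total distance” in a much thinner device.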
“The problem is settling on the right light source,” said Michael Abrash, Chief Scientist of Meta’s Reality Labs. “Holocake needs specialized lasers, very different from the LEDs used in today’s VR headsets. Lasers are no longer exotic, but they are not yet found in many consumer products with the performance, size, and price we need for consumer VR glasses. We would need to do a lot of engineering to get a consumer-ready laser that meets our specifications: safe, low-cost, efficient, and able to fit in a slim VR headset. The search for suitable laser sources continues, but if it proves tractable, there will be a clear path to VR displays that look like a pair of sunglasses.”