READ MORE: Who’s going to make the metaverse? (RedShark News)
Astronomers tell us that our universe is expanding. Astrophysicists theorize that there are an infinite number of stars in the universe, and that there may be multiple universes, perhaps connected via wormholes, those folds in time and space. Scientists would dismiss any notion of a creator guiding all of this, while the question of what exactly the universe is expanding into remains opaque.
All this by way of introduction to another #metaverse story:
“It’s easy to talk glibly about how the metaverse is an interconnected nexus of 3D worlds without asking some pretty fundamental questions,” muses David Shapton at RedShark News. “One of these is ‘who’s going to make the metaverse?’ ”
Shapton thinks that a true metaverse — that is, one with no boundaries — is a thing of infinite complexity, not just in a physical sense but in the sense of interactions and outcomes. So the problem boils down to how we design something infinitely complex with finite tools.
Perhaps evolution can provide the answer? In contrast to an intelligent “watchmaker” creating the complexity of living beings in one masterplan, evolutionary theory argues that only by natural selection — the blind watchmaker — has life on earth reached its current state.
“Evolution leads to absolutely staggering complexity and results in seemingly impossibly complicated biological machines,” says Shapton. “So maybe that’s what we need for the metaverse: evolution.”
Evolution of the metaverse could, of course, be sped up by the trial and error of artificially intelligent computers, but Shapton doesn’t think this will work.
“Natural evolution went through a phase like that. It was called the Cambrian period, and it was characterized by an explosion of weird and sometimes wonderful life forms. Unfortunately, few of their descendants exist today because they were so odd and, well, speculative, that they just weren’t destined for this world. Because of the trend towards oddness in the short term, we can’t rely on artificial evolution to give us a convincing metaverse. Instead, we need more organization and purpose than that.”
That implies human input — and some level of supervision. None of which solves the problem of where all the detail in this all-embracing, immersive virtual world will come from.
“Every object in the metaverse will have to contain and share its own data. That’s data about the physics of the object — texture, softness, rigidity, elasticity: any number of physical characteristics that will need to be able to interact with other things.”
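As a thought experiment, here is a minimal sketch of what such a self-describing object might look like. The schema is entirely hypothetical (no such standard exists yet), and every property name below is invented for illustration:

```typescript
// Hypothetical sketch of a self-describing metaverse object.
// No standard schema exists; every name here is illustrative.
interface PhysicalProperties {
  textureUri: string;  // where other engines can fetch the surface texture
  softness: number;    // 0 = rigid, 1 = fully deformable
  rigidity: number;    // resistance to bending, on an arbitrary 0..1 scale
  elasticity: number;  // coefficient of restitution, 0..1
  massKg: number;      // needed for any collision response
}

interface MetaverseObject {
  id: string;
  physics: PhysicalProperties;
  // The object exposes its own data so that any engine, anywhere in the
  // metaverse, can resolve an interaction without central coordination.
  describe(): PhysicalProperties;
}

// Example: a rubber ball that any compliant engine could simulate.
const ballPhysics: PhysicalProperties = {
  textureUri: "https://example.com/textures/rubber.png",
  softness: 0.4,
  rigidity: 0.3,
  elasticity: 0.9,
  massKg: 0.06,
};

const ball: MetaverseObject = {
  id: "object:rubber-ball-001",
  physics: ballPhysics,
  describe: () => ballPhysics,
};
```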
He looks to games for an answer: “What if we can distil the essence of a 3D world to a set of procedural rules?” Shapton ponders. “Like the rules for designing a city? You could teach a generative metaverse program what a city is ‘like.’ What a forest is ‘like’ or what an alien planet is ‘like.’ If we can distil that ‘essence of the experience’ into a set of rules, we can generate fully authentic experiences.”
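Purely as illustration, a toy version of that idea might encode a handful of rules about what a city is “like” and generate a plausible street grid from a seed. Every rule and constant below is invented; a real system would be vastly richer:

```typescript
// Toy sketch of generating a city from procedural rules. All rules and
// numbers are invented for illustration.
type Block = "road" | "park" | "lowRise" | "highRise";

// Deterministic pseudo-random generator (mulberry32), so the same seed
// always yields the same city.
function mulberry32(seed: number): () => number {
  return () => {
    seed |= 0; seed = (seed + 0x6d2b79f5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

function generateCity(size: number, seed: number): Block[][] {
  const rand = mulberry32(seed);
  const city: Block[][] = [];
  for (let y = 0; y < size; y++) {
    const row: Block[] = [];
    for (let x = 0; x < size; x++) {
      // Rule 1: a street grid every fourth block in each direction.
      if (x % 4 === 0 || y % 4 === 0) { row.push("road"); continue; }
      // Rule 2: towers cluster toward the center, like a downtown.
      const center = size / 2;
      const dist = Math.hypot(x - center, y - center) / center;
      if (rand() > dist) { row.push("highRise"); continue; }
      // Rule 3: sprinkle parks through the remaining low-rise sprawl.
      row.push(rand() < 0.1 ? "park" : "lowRise");
    }
    city.push(row);
  }
  return city;
}

// Same seed, same city; a different seed gives a different city that
// still obeys the same "essence of a city" rules.
const gotham = generateCity(16, 42);
console.log(gotham.map(r => r.map(b => b[0]).join("")).join("\n"));
```

The deterministic seed is the interesting part: the same seed always regenerates the same city, so a generated world can be shared among visitors rather than stored or hand-built.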
I’ve no doubt there are teams of programmers on this very case at MIT, elsewhere in academia, and in the labs at Meta and Microsoft.
Incidentally, this goal, or something like it, has been envisaged by cinematographer Greig Fraser as the ultimate in virtual production. Fraser is the leading cinematographer engaged in virtual production, having established the template with Jon Favreau for The Mandalorian and used volume stages most recently on The Batman.
The current state of the art in virtual production means that filmmakers need to specify in advance where the camera will be looking in a volume. It is more time- and cost-efficient to build only the specific digital assets that will be shot in the games engine than to create a full-scale, photoreal digital construct of the entire virtual world. But that is where virtual production is heading, Fraser predicts.
“In theory you could build an entire world for your film in the games engine much like Fortnite,” he says. “For example, if we did that for Gotham City, it would allow a director to choose anywhere in that city they wanted to shoot on any given day. You might decide to shoot on the corner of first and second street. Or high up on the Empire State. You can change the light, change the props and shoot. That’s what the future could be once the processing speeds up.”
A good analogy (though one I am sure can easily be picked apart) is playing any computer game. The world you are playing in may be photoreal and interactive, but you don’t have to stray far from the main map for your character to become stuck, to be reset, or to find itself unable to proceed in the glitchy edges of the virtual design. Any true metaverse needs to counter that. It needs to be, at the very least, self-healing, creating the world that surrounds the participant on the fly.
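Games already gesture at this: titles like Minecraft generate terrain in chunks around the player on demand, so there is never an edge to fall off. A hedged sketch of that pattern, with arbitrary chunk size, radius and seeding, might look like this:

```typescript
// Sketch of on-demand world generation around a participant.
// Chunk size, view radius and seeding scheme are all placeholders.
const CHUNK_SIZE = 64;   // world units per chunk side
const VIEW_RADIUS = 2;   // chunks kept live around the participant

type Chunk = { seed: number }; // stand-in for real geometry and objects

const loaded = new Map<string, Chunk>();

function generateChunk(cx: number, cy: number): Chunk {
  // Seed derived from coordinates, so revisiting a chunk regenerates it
  // identically: the world "heals" instead of glitching at its edges.
  return { seed: (cx * 73856093) ^ (cy * 19349663) };
}

// Called whenever the participant moves: materialize chunks entering
// range, evict chunks leaving it.
function updateWorld(px: number, py: number): void {
  const cx = Math.floor(px / CHUNK_SIZE);
  const cy = Math.floor(py / CHUNK_SIZE);
  const wanted = new Set<string>();
  for (let dy = -VIEW_RADIUS; dy <= VIEW_RADIUS; dy++) {
    for (let dx = -VIEW_RADIUS; dx <= VIEW_RADIUS; dx++) {
      const key = `${cx + dx},${cy + dy}`;
      wanted.add(key);
      if (!loaded.has(key)) loaded.set(key, generateChunk(cx + dx, cy + dy));
    }
  }
  for (const key of loaded.keys()) {
    if (!wanted.has(key)) loaded.delete(key); // free memory behind the player
  }
}
```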
Taken this way, perhaps the answer is a form of intelligent coding, such as that now being introduced to VR experiences. Sometimes called tiled encoding, the technique renders only the portion of the digital asset that is in view at any given moment.
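In rough outline, and with placeholder numbers throughout, such viewport-adaptive tiling divides the 360-degree scene into tiles and renders at full quality only those inside the viewer’s field of view:

```typescript
// Toy version of viewport-adaptive ("tiled") rendering. Tile counts and
// the field-of-view threshold are illustrative, not from any real system.
interface Tile { yawDeg: number; pitchDeg: number; } // tile center direction

// Great-circle angle between two view directions, in degrees.
function angularDistance(aYaw: number, aPitch: number, bYaw: number, bPitch: number): number {
  const toRad = Math.PI / 180;
  const cos =
    Math.sin(aPitch * toRad) * Math.sin(bPitch * toRad) +
    Math.cos(aPitch * toRad) * Math.cos(bPitch * toRad) * Math.cos((aYaw - bYaw) * toRad);
  return Math.acos(Math.min(1, Math.max(-1, cos))) / toRad;
}

// Return only the tiles to render at full quality for the current gaze.
function visibleTiles(tiles: Tile[], gazeYaw: number, gazePitch: number, fovDeg = 110): Tile[] {
  return tiles.filter(
    t => angularDistance(t.yawDeg, t.pitchDeg, gazeYaw, gazePitch) <= fovDeg / 2
  );
}

// An 8 x 4 grid of tiles covering the full sphere.
const tiles: Tile[] = [];
for (let yaw = -180; yaw < 180; yaw += 45) {
  for (let pitch = -67.5; pitch <= 67.5; pitch += 45) {
    tiles.push({ yawDeg: yaw + 22.5, pitchDeg: pitch });
  }
}
console.log(`Rendering ${visibleTiles(tiles, 0, 0).length} of ${tiles.length} tiles`);
```

Everything outside the gaze cone can be skipped or served at low resolution, which is where the saving in compute and bandwidth comes from.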
Shapton says, “What we’ll see next, and we’ll definitely need this, is a full-blown AI engine at the heart of metaverse creation and visualization. One that understands what a multiplicity of human experiences of the world are ‘like.’ ”