Originally posted at http://www.wired.com/gamelife/2012/05/ff_unreal4/?pid=2555
The Imagination Engine: Why Next-Gen Video Games Will Rock Your World
UE4 represents nothing less than the foundation for the next decade of gaming. It may make Microsoft and Sony rethink how much horsepower they’ll need for their new hardware. It will streamline game development, allowing studios to do in 12 months what can take two years or more today. And most important, it will make the videogames that have defined the past decade look like puppet shows.
Will that be enough? Today’s videogame industry generates about $65 billion a year in revenue, and the vast majority of that comes from premium titles that can cost upwards of $150 million to produce (and have the potential to rake in hundreds of millions of dollars on release day alone). But paradigms are shifting: Cheaply developed mobile titles and an unforgiving economy have cast doubt on the future of the blockbuster game. Why go big and risky when you can be safe and profitable? Unreal 4 is Epic’s answer to that question. With it, the company is staking its existence on a bold prediction: that the future of the industry depends on ever-more realistic visual spectacle.
For all their profits, many videogame companies have never bothered to upgrade their offices to the bucolic campuses and Infinite Loops of the personal-computing world. Epic’s headquarters, outside of Raleigh, North Carolina, are housed in a squat concrete building off a winding drive in an industrial park. Still, there are plenty of perks inside—a gym, a quiet room for stressed developers, and an amply stocked kitchen. There’s also a massive motion-capture studio for translating live action into animation and a traditional art studio where game artists can keep their nondigital skills limber. For the requisite rejuvenile whimsy, there’s a tubular slide that takes people down from the second floor to the common lounge. (When it was first installed, it shot people out so quickly that Epic had to adjust the angle for safety—there’s no tweaking our planet’s physics engine.)
It’s late February, a week before the Game Developers Conference in San Francisco, where Epic will be unveiling UE4 for the first time outside of the office. Reps from Microsoft, Sony, Nvidia, and the most influential game developers in the industry will be seeing the demo behind closed doors; the NDA-only affair will be Epic’s first and best chance to convince them that the future of gaming is unlike anything they previously imagined. “There is a huge responsibility on the shoulders of our engine team and our studio to drag this industry into the next generation,” says Cliff Bleszinski, Epic’s design director. “It is up to Epic, and Tim Sweeney in particular, to motivate Sony and Microsoft not to phone in what these next consoles are going to be. It needs to be a quantum leap. They need to damn near render Avatar in real time, because I want it and gamers want it—even if they don’t know they want it.”
While “damn near render Avatar in real time” likely isn’t up on a whiteboard in the office, it’s the kind of rapid-fire hyperbole that has made Bleszinski the face of Epic to many gamers. For his part, though, Sweeney is a bit more diplomatic. “We’re much more in sync with the console makers than any other developer is,” he says. “That means we can give detailed recommendations with a complete understanding of what is going to be commercially possible.” In other words, Epic has seen the specs of proposed new consoles and is actively lobbying for them to be more powerful. It could be a bad sign for the industry if new, relatively underpowered consoles make an appearance at this year’s E3 consumer show (as is popularly rumored about Sony’s PS3 successor, the alleged specs of which leaked in April).
As early as last March, Epic was making the case for more power with a demo screened at the 2011 GDC. Called Samaritan and built in Unreal Engine 3 with a new set of specialized plug-ins, the video showcased the rendering power of current high-end hardware, displaying an impressive array of effects, like realistic clothing, lifelike lighting, and highly detailed facial expressions. It took three high-end graphics cards to handle the demand, but it grabbed people’s attention. “We used it as an opportunity to make a point to the developers,” Sweeney says. “‘We want 10 times more power; here’s what we can do with it.’”
And that was merely for a souped-up version of Unreal 3. For Unreal 4, yet another quantum leap in hardware has to happen. Creating a game that operates on a level of fidelity comparable to human vision, Sweeney says, will require hardware at least 2,000 times as powerful as today’s highest-end graphics processors. That kind of super-hi-def experience may be only two or three console generations away, but it hinges on manufacturers moving toward the power levels Sweeney is looking for today. He needs the next generation of consoles to be good.
In a scant three months of production, a team of 14 engineers has fashioned a video demo to show off the new engine, and it acts essentially as a full-featured, if small, top-of-the-line game—the first title of the next generation. “I had sleepless nights over this damn thing in the beginning, but I think we got the disasters out of the way,” says art director Chris Perna, the man responsible for the look and feel of the demo. Lead artist Wyeth Johnson adds, “In the time I have been here, we have never not pulled it off.” Johnson, a six-year veteran of Epic, is referring not only to the company’s ability to deliver on tight deadlines but also its track record of wowing the skeptics.
Like so many games, the demo begins with what’s known as a cinematic, a noninteractive scene meant to wow players with all the punch of a blockbuster movie trailer. In this case, it’s as if H. R. Giger and George R. R. Martin took peyote together. And had a baby. And that baby had a fever dream. But it’s not just empty spectacle—it’s a crystal ball. Every pixel is spent on visual effects that are impossible in today’s games because of hardware limitations. But those limitations could be overcome: In an impressive departure from the usual practice of such demos, this one is running on a single consumer-level graphics card—Nvidia’s new Kepler GTX 680.
Here’s what the Unreal guys are hoping will singe the eyeballs of executives, hardware engineers, and game developers when they see it at GDC: A heavily armored demon knight sits frozen to his throne in a ruined mountain fortress. As he awakens, lava begins to flow around him and flames engulf the world. A magma vent spews a column of smoke and smoldering embers. He stands, sending up showers of sparks that dance, fall out of focus, and fade into ash. The knight hefts a massive hammer that glows with an inner fire. As he stalks down an empty corridor, a deep rumble sounds and masonry falls from the ceiling—this is no mountain but a volcano on the verge of eruption. When the knight steps outside, we see a range of snow-capped peaks in the far-off distance, rendered in stunning clarity. Behind him the volcano belches black smoke, while burning embers mix with swirling snowflakes.
In previous engines, one floating ember was enough to slow performance considerably; a shower of them was impossible. With Unreal Engine 4, there can be millions of such particles, as long as the hardware is potent enough to sustain them. Game developers overuse features of every new engine, because they are suddenly so easy to implement. In the original Unreal Engine, for example, the ability to render colored lighting led to a rash of games that employed the effect. The same may prove true for UE4's particle effects, for better or worse. (“Mark my words,” Bleszinski says, “those particles are going to be whored by developers.”)
In one 153-second clip, the Epic team has packed all the show-off effects that have flummoxed developers for years: lens flare, bokeh distortion, lava flow, environmental destruction, fire, and detail in landscapes many miles away. Plus, it’s breathtakingly photo-realistic—or would be if demon knights were, you know, a real thing.
But that’s just the opening scene. After the cinematic, Epic’s senior technical artist, Alan Willard, starts playing the demo. At this point the view switches to that disembodied first-person perspective made so ubiquitous by shooting games like the Call of Duty franchise and Epic’s own influential Unreal titles. Willard maneuvers his avatar into a dimly lit room where a flashlight turns on, revealing eddies of dust—thousands of floating particles that were invisible until exposed. In another room, globes of various sizes float in the air. Willard rolls a light-emanating orb along the floor (think of a spherical flashlight that rolls like a bowling ball) and beams of light wobble and change direction, illuminating parts of the room and revealing the clusters of floating spheres with a kind of strobe effect. At first it all seems perfectly familiar: “Well, yeah,” you think, “that’s how they’d act in the real world. What’s the big deal?” But it is a big deal: This is stuff that videogames have never been able to simulate—the effects simply aren’t possible on today’s consoles.
In the past, game developers employed a trick known as staged lighting to give the impression that light in a game was behaving as it would in the real world. That meant a lot of pre-rendering—programming hundreds of light sources into an environment that would then be turned on or off depending on in-game events. If a building collapsed in a given scene, all the light effects that had been employed to make it look like a real interior would remain in place over empty space. Shadows would remain in the absence of structure; glares that once resulted from sunlight glinting off windows would remain floating in midair. To avoid this, designers programmed the light to look realistic in any of that scene’s possible situations—one situation at a time. “You would have to manually sculpt the lighting in every section of every level,” Bleszinski says. “The number of man-years that required was astounding.” UE4 introduces dynamic lighting, which behaves in response to its own inherent properties rather than a set of preprogrammed effects. In other words, no more faking it. Every light in a scene bounces off every surface, creating accurate reflections. Colors mix, translucent materials glow, and objects viewed through water refract. And it’s all being handled on the fly, as it happens. That’s not realistic—that’s real.
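The difference between the two approaches can be sketched in a few lines of C++. This is a hypothetical, deliberately minimal illustration (the names and falloff model are ours, not Epic's): a baked pipeline computes illumination once and stores the result, so the lighting goes stale when the scene changes, while a dynamic pipeline recomputes it every frame from whatever lights actually exist at that moment.

```cpp
#include <vector>

// Illustrative sketch only -- not Epic's API. A point light in 2-D.
struct Light {
    double x, y;
    double intensity;
    bool on;  // a collapsed building might destroy its lights
};

// Dynamic lighting: evaluate illumination at a surface point from the
// lights as they exist *right now*. A light that has been destroyed
// simply contributes nothing on the next frame -- no stale shadows,
// no glints floating in midair.
double illuminate(double px, double py, const std::vector<Light>& lights) {
    double total = 0.0;
    for (const auto& l : lights) {
        if (!l.on) continue;
        double dx = l.x - px, dy = l.y - py;
        double distSq = dx * dx + dy * dy;
        total += l.intensity / (1.0 + distSq);  // inverse-square-style falloff
    }
    return total;
}
```

A baked pipeline would call `illuminate` once per scene state offline and store the result in a lightmap; if the scene then changed in a way the designers hadn't pre-sculpted, the stored value would no longer match reality. The dynamic approach trades that authoring cost for per-frame computation, which is exactly the trade UE4's hardware demands make affordable.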
Before Epic Games, there was Epic MegaGames. And before that, there was Potomac Computer Systems, the company Tim Sweeney founded in Rockville, Maryland, in 1991. At first the company released shareware games on 3.5-inch floppy disks, sometimes packed in Ziploc bags. Titles like ZZT and Jill of the Jungle were simple but showed Sweeney’s coding chops: ZZT was written in a scripting language Sweeney invented, and it was one of the first releases that allowed players to create their own games using its tools—essentially a playable experience and a development kit in one.
Work on the first Unreal engine didn’t begin until 1995, after Potomac had changed its name to Epic MegaGames. Sweeney and company were inspired (or perhaps threatened) by the rise of id Software’s classic shooting games Wolfenstein 3D and Doom. Id later became the first company to develop true 3-D graphics—not in the $10-for-special-glasses kind of way but by simulating a three-dimensional space through first-person perspective. And while 3-D graphics would eventually permeate all of gaming, id’s early titles used the technology for its original Platonic ideal: allowing you to run down cramped, claustrophobia-inducing hallways, shooting anything that moved. Faced with the prospect of obsolescence, Epic MegaGames brought together all of its smaller-project teams for a go at what Sweeney calls “the big boy’s business”: 3-D.
After three years of development, the Unreal Engine debuted in 1998 in Epic’s first-person shooter Unreal, powering the game’s wide-open outdoor areas—which had traditionally bedeviled developers with their combination of natural lighting and need to render far-off objects—and the highest level of detail ever seen. The success of the first Unreal Engine allowed the company to move to Cary, North Carolina, near Research Triangle Park, which offered a larger pool of prospective employees for the rapidly growing company, as well as a lower cost of living. Here it dropped the word Mega from its name and built its own dedicated studio. Once settled in, Epic Games set its sights on a new, improved engine. In 2002 Unreal Engine 2 debuted, featuring better graphics, animation, and lighting and the addition of rag-doll physics—so named for the way dead bodies behave when falling. While Unreal 1 had been licensed by a few dozen other developers, Unreal 2's marriage of high-end visuals and usability was a boon to smaller studios, and the engine would eventually be used to create more than 100 titles. But it wasn’t until 2006, with the release of Gears of War, that Epic cemented its position as the industry standard.
Gears was one of the first games created with Unreal Engine 3, and it became the first runaway success of the newborn Xbox 360. A brutal third-person shooting game in which the player’s steroidally muscled supersoldier moved from cover to cover while fighting alien hordes, it displayed an unprecedented graphical fidelity: Fine details, lighting, and motion-blur effects came together to create an experience that had never been seen before. For many it was the moment when the current generation of gaming truly began.
Now, six years later, Unreal 3 is everywhere. Beyond the scores of console titles it powers, UE3 has pushed the limits of tablet gaming with the Infinity Blade series on the iPad. Epic recently adapted it to work within Flash, allowing the multiplayer shooting game Unreal Tournament 3 to run at a blistering 60 frames per second—the magic threshold for a game to be truly immersive on high-definition displays—in a web browser. Moreover, Epic has courted indie developers with the release of the free Unreal Development Kit, a simplified version of UE3 that eschews expensive licensing fees in favor of a cut of any profits after the first $50,000 in sales, greatly reducing the financial overhead for first-time and noncommercial developers.
Making a splashy videogame used to be something that a small group could accomplish. Now it takes a small army. “Call of Duty was a game that a team of a few dozen could develop on PlayStation 2,” Sweeney says. “Now Activision has hundreds of people working on Call of Duty for the current-gen consoles. What’s supposed to happen in the next generation? Are they going to have 4,000 people?” To combat the bloat, Sweeney has stuffed UE4 with tools that promise shortened production pipelines and lower production costs (and all the profit that such efficiency represents).
How does that happen? For one thing, Unreal Engine 4 allows developers to see changes to the game instantly, as they work. Current production pipelines have the least WYSIWYG process imaginable: For example, when lighting elements are altered, computers have to parse the data and figure out how to render the changes. Depending on the extent of those edits, this process, sometimes called baking, can take half an hour or more. UE4 removes that bake time entirely. The effect it could have on studio workflow is staggering.
Most interesting, though, is Kismet 2, Epic’s newest visual scripting tool. Scripting is the way programmers define the attributes and actions of all the objects within the game world—everything from how doors open to when bad guys spring their preprogrammed ambush. In Unreal Engine 2 this was all accomplished using strings of code that connected objects and their behaviors in a web of cause-and-effect relationships. A good example is the connection between a switch and a lightbulb. Flip the switch one way and the light goes on; flip it the other way and the light goes off, as specified by the code. What happens, though, when turning on the light needs to trip a silent alarm that alerts guards in the next room? What if you’re wearing a stolen guard uniform when they enter? As events accumulate in a game, that web of relationships becomes significantly more elaborate, making it a Herculean task just to manage and troubleshoot the code. In Unreal 3, Epic addressed this by developing Kismet, a tool that simplified the scripting of minor tasks—that relationship between a switch and a lightbulb—by allowing the programmer to choose from a palette of options, no coding required. It was like jumping from the clunkiness of MS-DOS to the relatively intuitive world of Windows 3.0.
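The switch-and-lightbulb wiring the article describes can be sketched as a tiny event system in C++. This is an illustrative guess at the underlying pattern, not Epic's actual Kismet implementation: an object exposes an event, and scripting—whether typed as code or drawn as a flowchart—connects other objects' reactions to it.

```cpp
#include <functional>
#include <vector>

// Hypothetical sketch of scripted cause-and-effect relationships
// (names are illustrative, not from Unreal's API).
struct Lamp  { bool lit = false; };
struct Alarm { bool tripped = false; };

struct Switch {
    bool on = false;
    // Each wired-up behavior is a callback; a visual tool like Kismet
    // draws these connections as lines between nodes instead of code.
    std::vector<std::function<void(bool)>> onToggle;

    void toggle() {
        on = !on;
        for (auto& effect : onToggle) effect(on);  // fire every connected effect
    }
};
```

Wiring up the article's scenario then looks like this: the switch drives the lamp, and—unbeknownst to the player—also trips a silent alarm.

```cpp
Switch sw; Lamp lamp; Alarm alarm;
sw.onToggle.push_back([&](bool on) { lamp.lit = on; });
sw.onToggle.push_back([&](bool on) { if (on) alarm.tripped = true; });
sw.toggle();  // the light comes on, and the guards are alerted
```

As more objects and conditions join the web, the callback graph balloons—which is precisely why managing it as raw code became Herculean, and why a visual palette of pre-built connections was such a relief.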
Then something surprising happened: Kismet democratized programming. “There were people who weren’t programmers but who still wanted to create and script things,” says James Golding, senior engine programmer. In other words, some artists weren’t content simply to draw the monsters; they wanted to define how they acted as well. Kismet let them do that. “When we got them a visual system,” Golding says, “they just went completely bananas with it.” This was off-label usage, though; while it was a great secondary benefit, Kismet hadn’t been designed for this task, so it was kludgy and slow.
And thus was born Kismet 2, which again converts tedious lines of code into an interactive flowchart, complete with pulldown menus that control almost every conceivable aspect of behavior for a given in-game object. Need to determine how many bullets it will take to shatter that reinforced glass? Kismet 2 is your tool. Once behaviors are set, they can be executed immediately and edited on the fly. With Kismet 2, Epic empowers level designers—the people responsible for conceptualizing the world—to breathe life into that world directly, rather than relying on programmers to do it on their behalf. Says Golding, “We’re turning our level designers into godlike creatures who can walk into a world and create with a swipe of their hand.”
The possible applications for Unreal Engine 4—augmented reality, medical simulation, even production pipelines for television and movies—seem to stretch to the horizon. At its core, however, UE4 is a videogame engine, and its first reveal outside the office is on a March morning at the 2012 Game Developers Conference. It’s D-day for the Epic team; this is what they’ve been working feverishly toward. Inside the Moscone Center in San Francisco, though, the mood is less Normandy than it is Camp X-Ray. Thirty people file into a windowless conference room to watch Epic’s demo. Around their necks hang badges advertising the names of their employers: Nvidia, Microsoft, AMD, Sony. Video cameras dot the walls, and there’s one hulking security guard on each side of the door. (Apparently, when you’re showing off progress, ingress and egress are out of the question.)
When Alan Willard walks the audience through the demo—complete with armored demon, dancing sparks, and rolling balls of light—the room falls still. Then the twist: Willard reveals that both the cinematic scene and the following tech demo haven’t been running off a game file but in real time from within UE4's game editor. It’s like finding out that the actors on TV are actually tiny people living inside your set. It also helps him show that changes can be made to the game’s design and code, recompiled and executed nearly instantly—a technical feat that has been simply unheard-of in game development. And just like that, the silence in the room becomes reverent. The videogame industry has changed.
In June, UE4 will be revealed to the gaming public. The reactions will likely be as spontaneous as staged lighting effects used to be. It’s all pre-scripted at this point: Fanboys will wet their pants, contrarian analysts will wring their hands, message boards will explode in either fury or collective orgasm. In all of the clamor and fanfare, though, the simple truth will be lost. Epic has redefined gaming before, and with Unreal 4 the company is doing it again.