Quintin Stone - Editorials
UT2003 Mod Summit
Monday, June 24, 2002 11:00 PM

On Saturday, Epic hosted what they called a "Mod Summit" for their upcoming Unreal Tournament 2003. Among the invited were mappers, modelers, and programmers from popular UT mods... and myself. This summit was Epic's attempt to familiarize modders with the finer points of modding UT2003, mainly how the process will differ from the original UT. And, yes, there are some major changes in pretty much every category.

Okay, where to start. Let's start with mapping. Alan Willard, Cliffy B, Shane Caudle, Jack Porter, James Golding, and Dan Vogel went over a variety of mapping subjects with all of us. Probably the biggest change is the emphasis on what's called a "static mesh". The basic geometry of a level will still be based on BSP brushes; however, all of the intricate details will be these static meshes. A static mesh is a mesh that's built and skinned in a modeling program, such as 3D Studio MAX or Maya. They're kind of like the prefabs people would create for UT, but a little more intricate and involved, because once imported into a map, it's still a static mesh; it's not turned into a brush with all of a brush's limitations and restrictions. Indeed, a static mesh can be rotated in any direction without issues, and it can be scaled by any amount along the three axes (you can even scale it negatively, for instance by -1, to flip it upside down).

Collision works a little differently for a static mesh. You can set it to collide on a per-poly basis, like normal brushes do; however, in most cases this is really unnecessary and wasteful (bad for performance). This is one of the "tricks" Epic has introduced to really boost level performance and make it possible to put lots and lots of polys into a level. When you create the mesh in MAX or Maya, you also set up a simplified "primitive" version of it to work as the bounding "box". So if you were to create a control panel static mesh for your level, with lots of little buttons and levers on it, it's not really important that each and every button or lever collide with players or projectiles. Or if you think it is, your priorities may be a little off. Instead, your bounding mesh would follow the general contours of your control panel. If you don't want the hassle of making a bounding model for every static mesh, you can ask the editor to build one on its own. Apparently it has limitations in what it can do, though. We didn't see it in action, but we were told it will only auto-build a convex collision primitive, meaning no holes or openings in it. In other words, you can't have UnrealEd make a collision mesh for an archway. In general, we were told, any object/shape with over 200 polys should be a static mesh... anything else should probably be a brush.

All movers are now static meshes, not brushes. Static meshes can be set to take either raytraced lighting or vertex lighting. Smoothing groups are reportedly preserved on your model when you make it a static mesh. There are even more mapping changes to report! Lights no longer have a 255 max for brightness or radius. Textures can now be 1024x1024 and can be either 24-bit bitmaps or 32-bit TGA files (with an alpha layer). BSP brushes don't occlude anymore, which is an important change. Occluding means that the engine knows an object is completely blocked from the player's sight and won't draw it. If you don't go out of your way to handle occlusion, the game will render all of the surfaces in your level, front to back. You won't be able to tell it's doing this until you look at your framerate. However, by setting up "anti-portals" you can tell the game "players can't look through this". Anti-portals are a particular kind of "volume", which is a kind of special brush, or something like that. By setting up anti-portals throughout your level geometry, you're telling the engine what the player can and cannot see through. This is another way Epic lets you put lots of polys in your level but still get good performance. The engine doesn't have to calculate visibility through all of the surfaces in the game, just the anti-portals. Anyway, this is how I understood it. :)

Water is no longer a zone property; water is created by making a water volume. And wow, they have some damn amazing water effects. When you walk through water, you'll actually leave a wake behind you that spreads out, as do ripples and other splashes. A light can be set to sunlight, which means that any surface visible to a piece of the skybox along its direction will be lit. Something new in UT2003 is "terrain". Terrain is a tessellated plane made up entirely of triangles; you can make the plane any size, and you can make the triangles any size too. The editor then lets you use custom tools to raise and lower parts of the terrain, using customizable brushes where you can control the inner and outer diameters as well as the opacity. You also use this brush for other effects, such as smoothing, adding noise, and painting. Each terrain has a base layer, which is a texture or shader that covers the whole terrain by default. You can then create other layers with different textures and paint those textures on top of the base. This makes for amazing effects, as you can blend them together to make very smoothly transitioning textures. You can also turn off sections of the terrain, to make them both invisible and non-blocking.

I think those who've done work with Quake 3 are probably familiar with shaders. These are composite textures, where you blend images together to get different effects. For instance, you might use one texture as the base color, then another image's alpha layer to tell the engine which spots on the skin should be shiny and which parts should be dull. Or have a texture with a mask that's partly transparent, so you have one part of an image overlaid on top of another image. I'm really just scratching the surface here, because it would take forever to go into all of the possible shader options.

Good news to modders and mappers: UT2003 will have ladder support, and crouching will actually let you pass through smaller areas. Woohoo!

Something new with their Karma physics engine is the ability to link static meshes together with Karma joints. In-game, if you shoot or hit such an object, it will start swinging about its joint. We were shown hanging slabs of meat, lamps suspended from chains, and meat cleavers, all of which would start swinging or spinning after being shot. A very nice effect. And of course, there's the notorious "rag doll" effect. When a person is killed, their corpse is then controlled by the Karma engine, which detects collisions of the player's various bones with other surfaces. The way the bones are connected mimics human joints. For instance, a thigh may rotate freely in any of a number of directions, but a leg joint won't normally bend backwards as the body tumbles down the stairs or crashes down through the geometry of the map. One thing to note is that until a player is dead, their bounding box is still a cylinder as in regular UT. The presenters weren't entirely sure if it was possible to tell which bone a shot hit (something I'm sure all the realism mod makers will want to know).

The particle system in UT2003 looks just amazing. UT just faked it, with particle meshes and animated sprites. UT2003 has a real, true-to-life particle system that lets you do sprite, mesh, or beam emitters. The beam emitters they showed us were used to make very cool lightning effects (and are likely what they use for the lightning gun's electrical effects). I imagine they can also be used for lasers and similar things. Mesh and sprite emitters really only differ in the type of particles they create: a sprite particle draws a particular texture at its location, while a mesh particle draws a fully skinned mesh. In either case, you have a great deal of control over the particles, including their scale, color, and velocity at varying points in time. The particle system in UT2003 looks to be so versatile and adaptive that you can probably spend days creating particle effects and no two will be alike.

Wow, that's a lot of mapping changes, and that's probably the biggest difference between UT and UT2003. But modelers, don't feel left out. Modeling should be easier than ever. It looks like all animation in UT2003 will now use the skeletal animation system they introduced in one of the UT patches. In other words, no more vertex deformation, it seems. Maya will have direct support for the UT model and animation file formats, and MAX will probably still use their ActorX plug-in. You can do skin assignments directly in the UnrealEd animation viewer, which has a lot more functionality than before. No more trying to guess which skin was 1, which was 2, etc. You can even use UnrealEd to link different animation files with a single mesh, or various meshes with a single animation. The viewer will also let you create notifies by linking them directly with the time in the animation you're currently looking at. A notify can play a sound, spawn a particle emitter, or call an UnrealScript function directly. For example, all player models have notifies set at the points where their feet hit the floor, in order to play footstep sounds that are perfectly synced with the model's animation. These notifies are especially useful in Matinee, their custom system that lets you do movie-style animation sequences. Both Matinee and the regular game can also make use of AI scripts. When an AI script is triggered for a bot, it can contain a single instruction for the bot to follow ("defend") or a whole set of orders ("walk to this spot, turn this direction, fire weapon, crouch, then fall over dead"). So you can use these scripts to control the actions of actors in a movie or cut-scene, or you can use scripts to make a bot perform special maneuvers in a CTF or other type of map.
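
To make that notify-to-UnrealScript link a little more concrete, here's a rough sketch of what the receiving side might look like. The class, function, and variable names here are my own, purely for illustration; the actual hookup (pointing the notify at the function) is done in the animation viewer as described above.

    // Hedged sketch: a pawn function for an animation notify to call.
    // FootstepPawn, PlayFootstep, and FootstepSound are hypothetical names.
    class FootstepPawn extends Pawn;

    var Sound FootstepSound;   // assign an actual footstep sound elsewhere

    // Called by a notify placed at the frame where a foot touches the floor,
    // so the sound stays perfectly synced with the animation.
    function PlayFootstep()
    {
        if (FootstepSound != None)
            PlaySound(FootstepSound);
    }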

Programmers, don't despair. In addition to the engine being rewritten and its backwards compatibility with Unreal being removed, a lot of the class hierarchy and relationships have been reworked. A pawn is now just an object or body (the meat) that can be controlled. Bots and players both use the same pawn classes; it's just that a bot and a player are "controllers" of pawns (the mind). With this new relationship, a pawn can be attached to or detached from controllers at will. In other words, you could easily code it to jump from body to body, fully controlling each one in a Ghost Recon fashion. And you don't have duplicate code for both bots and players as you do in UT. If you want code to be shared by both, you can put it in the pawn class instead.
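
As a hedged sketch of what that split makes possible, something along these lines ought to let a controller hop between bodies. The class and function names are mine; Possess() and UnPossess() are the controller functions I'm assuming handle the attaching and detaching.

    // Illustrative only: move a controller (the mind) from its current
    // pawn (the meat) into another body.
    class BodyHopper extends Info;   // hypothetical wrapper class

    function SwitchBody(Controller C, Pawn NewBody)
    {
        if (C.Pawn != None)
            C.UnPossess();    // let go of the current body
        C.Possess(NewBody);   // take over the new one, Ghost Recon style
    }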

That's not all! No sir! Bot AI has been reworked, and Steve Polge discussed this with us. Hopefully I'll be able to remember it well enough to explain it properly. :) At the lowest level is the individual Bot AI, which controls the actions of a bot itself. These include its most primitive functions, such as walking, strafing, and firing. Then there's the Squad AI, which actually instructs the bot in performing its current role. The Squad AI will tell the bot "move here" or "strafe in this direction" or "fire at this target". Then there's the Team AI, which organizes bots into squads and gives each group its instructions. In FFA Deathmatch, each bot would be its own squad and team, but in CTF, the Team AI would organize bots into small groups (such as offense and defense) and then the Squad AIs would coordinate these groups.
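
Just to illustrate the layering (and nothing more; the class and function names below are entirely hypothetical and are not Epic's actual AI classes), a squad-level layer might look something like this:

    // Hypothetical sketch of the middle layer: the team AI hands the squad an
    // objective, and the squad turns it into per-bot orders. None of these
    // names come from Epic's code.
    class ExampleSquadAI extends Info;

    var array<Controller> Members;   // the bot controllers this squad directs

    // Called by the (equally hypothetical) team AI.
    function OrderDefend(Actor DefensePoint)
    {
        local int i;
        for (i = 0; i < Members.Length; i++)
        {
            // Each bot's own low-level AI handles the walking, strafing,
            // and firing needed to carry the order out.
            GiveOrder(Members[i], DefensePoint);
        }
    }

    // Placeholder for however orders actually get delivered to a single bot.
    function GiveOrder(Controller C, Actor Objective)
    {
        // e.g. set the bot's goal and focus here
    }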

Included with UT2003 will be "Visual UC++", I think it's called, which is the UnrealScript development environment. Think Microsoft Visual Studio, but designed to work with UnrealScript. In fact, it even has a debugger. You can set breakpoints in your code and the game will pause when it reaches one, switching your window back to the debugger. You can then actually step through the code to isolate problems or understand how the code is working. Much better than littering your code with log statements! The new Unreal engine supports dynamic arrays, so all those linked lists have gone the way of the dinosaur. Simply increase the size of your array, do the assignment, and you're done. Shrinking an array is also possible and just as easy.
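
For instance, something like this quick sketch (the class and variable names are just for illustration):

    // Dynamic arrays in UT2003's UnrealScript: no more hand-rolled linked lists.
    class ScoreList extends Info;   // hypothetical class for illustration

    var array<int> Scores;          // a dynamic array; no fixed size declared

    function AddScore(int NewScore)
    {
        Scores.Length = Scores.Length + 1;     // grow the array by one...
        Scores[Scores.Length - 1] = NewScore;  // ...then do the assignment
    }

    function ClearScores()
    {
        Scores.Length = 0;   // shrinking is just as easy
    }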

UT2003 likely won't support a simple mod-switching system like that of Half-Life right out of the box. However, the team recognized its importance (especially when all of us programmers insisted on one) and they plan to release it in one of the initial patches. This will be of great benefit to all of us modders, because it eliminates the need for stupid shortcuts and custom installers.

Coders will have a lot of control over animations, apparently. For instance, you can instruct an animation to play only on a certain bone (and its sub-bones), while a different animation plays on the other bones. In other words, you can combine a running animation (for the legs) with a shooting animation (for the torso) and a talking or head-turning animation (for the head).
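
As a very rough sketch of how that might look in script: the animation, bone, and function names below are my own, and I'm going from memory on the channel functions, so treat the exact parameters as assumptions rather than gospel.

    // Hedged sketch of per-bone blending using animation channels.
    // 'RunForward', 'FireRifle', and 'SpineBone' are hypothetical names.
    class BlendingPawn extends Pawn;

    simulated function PlayRunAndFire()
    {
        // Channel 0: the full-body run cycle.
        PlayAnim('RunForward', 1.0, 0.1, 0);

        // Channel 1: blend a firing animation onto the spine bone and its
        // sub-bones, leaving the legs running underneath.
        AnimBlendParams(1, 1.0, 0.0, 0.0, 'SpineBone');
        PlayAnim('FireRifle', 1.0, 0.1, 1);
    }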

Wow. Okay, that's about all I've got in my notes here. There's probably even more that I didn't write down and can't remember. But I think that's enough to give an idea of what modding will be like in UT2003, don't you?
