Thursday, June 18, 2009

Game Rule Goals, part 1

On my blog, I like to analyze game rules and determine whether they are effective and desirable, to suggest new rules that would improve existing games, and to design rules for new games. When thinking about role-playing games, however, it is important to note that there are different goals one might be trying to achieve with a game or a game rule, and a rule might be well suited to one goal but badly suited to another. So I thought I'd describe some of the classic role-playing game design goals.

Probably the most classic goal is realism. When a rule is judged according to the goal of realism, it is judged by whether its effect is as similar to real life as possible. Before D&D was invented, its predecessors in the hobby game field were wargames, which descend from the actual wargames used to study real military tactics. The traditional goal of old wargames was realism. This is not to say that such games were unconcerned with playability and fun; those are required considerations in the design of any game. But with the focus on realism, they are treated as constraints, the desire being to maximize realism while maintaining the desired level of playability. I always enjoy reading the Designer's Notes of old Avalon Hill wargames, with their detailed discussions of how and why the game rules were chosen so as to maximize historical accuracy.

A second, somewhat different goal is that of genre simulation. When a rule is being judged according to genre simulation, it is judged according to whether the flavor and situations resulting from the rule match the source material of the game as closely as possible. The source material is typically works of fiction, and the difference between realism and genre simulation is the difference between how the world works in fiction, and how it works in real life.

Going back to the original Dungeons & Dragons, it was clearly designed with some core game rules meant to simulate the fantasy genre. The idea of hit points that increase massively as characters gain experience is clearly not realistic, but meant to simulate the idea that the fantasy characters are larger than life, and that great mythological heroes are not going to be killed by a single stray arrow. And, of course, aspects of the game like the magic system couldn't possibly be realistic because the setting of the game is not based on real life, but on fantasy.

When I think back to old D&D, however, my memory of the rules discussions in the old game magazines is that they tended to focus on realism as a goal, rather than genre simulation. It seems to me that in 1st edition D&D, and a lot of other games, genre simulation was used more as an inspiration for the rules than as a goal of the rules. That is, the fantasy genre inspired the fundamental game rules, but once those were created, the original source material was no longer used to guide the development of new rules. Rather, the goal was what I might call "self simulation". The game became its own genre, and the focus of new rules and supplements was to logically extend the game as defined by the basic game rules.

An example of self simulation would be the third edition D&D idea of putting law vs. chaos as magical concepts on a par with good vs. evil, so that you would have lawful weapons that damaged chaotic creatures and so on. The goal here certainly isn't realism, nor is it genre simulation (there are certainly fantasy works that involve the forces of law vs. chaos, but not, I think, in the style of D&D, where you have lawful good and lawful evil in some sense banded together against chaotic good and chaotic evil). Rather, this is driven by looking at the alignment system and trying to extend it to its full logical extent. If you have already defined that there are 9 equally valid alignments, and a "great wheel" cosmology which again implies that all of the alignment axes are equally important, then it creates a pleasing sense of logical completeness to put law vs. chaos on equal terms with good vs. evil in the game rules.

Going back, however, to early D&D rules discussions and their frequent focus on realism: D&D didn't seem to be founded on the basis of realism, but it did seem to be surrounded by an atmosphere that considered realism a highly appropriate goal. The fact is that the D&D rules were the most realistic RPG rules in existence, since they were the only RPG rules in existence, so it is hard to meaningfully discuss the intended goal of the core game rules compared to other games. However, it could certainly be said that many of the individual rules were easily recognizable as realistic in intent: for instance, discussions of keeping track of your exact monetary status, or of tracking the passage of time so that you know when your six-hour supply of lantern oil is used up.

What I always found strange was the way that game rule concepts would focus on realism while simultaneously incorporating, without question, certain highly unrealistic game mechanics, creating very strange results. For instance, the idea that when you are hit by a fireball, it would be realistic to check whether each of the wooden items on your person catches on fire. This ignores the complete unreality of the character himself surviving the fireball without being totally disabled by serious burns. If the core game mechanic being modified were trying to be realistic, it would make sense to add extra rules to make it more realistic. But when the core game mechanic is not trying to be realistic, this kind of rule just sacrifices playability for no real gain.

(to be continued when I return from vacation).

Thursday, June 11, 2009

Level progression balance discrepancy - statistic bonuses

In my previous article, I described the impressive way in which the level progression in fourth edition Dungeons & Dragons has been balanced such that character abilities retain more or less the same balance relative to each other as the characters gain levels. In general, attack, defense, damage, and skills all go up at the right rate to remain generally in proportion as the characters gain levels, and prevent high-level characters from becoming progressively more polarized. However, there is one major discrepancy in this system - the fact that character statistics do not go up at the same rate. This creates a number of discrepancies as the characters gain levels.

1. Each character build has certain primary and secondary statistics needed for the class. Many of the builds, and the suggestions for those builds, imply that you would want to concentrate on 3 statistics. However, this makes no sense as far as level progression is concerned, because you can only increase 2 of your statistics at the maximum rate. Of course, it is possible to divide your statistic increases between 3 statistics, but the problem is that you will fall farther and farther behind a character who divides his statistic increases between only 2. Primary statistics are used to calculate your attack bonus, so it is absolutely essential that they be increased at the maximum rate to keep up with the mathematics of the game. Otherwise, you will become less accurate than the other characters as you gain levels. This is why the idea of a character class with 2 primary statistics and a secondary statistic, like a paladin, just doesn't work. It is relatively feasible (if not optimal) to make such a character at low levels, but as the character increases in level, one of those statistics has to be left behind, or the character has to become less and less powerful compared to the other characters. Some secondary statistics are not quite as critical, affecting only things like damage or healing that are not as sensitive to single points. In that case it is more feasible to have a character with one primary statistic who splits his points between 2 secondary statistics. However, this isn't very tempting, and it still results in a character who, at high levels, won't be quite as appealing as his 2-statistic counterparts.

Once you realize this is the case, you can banish the idea of 3-statistic characters from your mind and concentrate on 2-statistic characters, which work properly. However, it remains peculiar that the game tempts you to build multi-statistic characters: they appear to work okay at low level, but you have to rethink your design once you discover they don't translate properly to higher levels.

2. The most significant level progression imbalance is in the skill system. Fourth edition gets rid of skill points in favor of fixed bonuses for training in a skill and universal bonuses to all skills according to the level of the character. This design promises to prevent huge discrepancies in skills between the characters at high levels, and in general to provide the benefits I described in my previous blog article. However, the fact that statistics increase at an uneven rate disrupts the symmetry. At low level, the skill system allows just about everyone to participate in skill checks, whether trained or untrained. And a character who takes skill training in a skill governed by a statistic he is not especially good in still has a pretty good chance of having the best skill roll in the party if no one else took that skill. But at high levels, each character will have 2 statistics that are much higher than the others, and the skills governed by those statistics will start to get much bigger bonuses. Now the characters who are trained in skills that match their statistics far outshine every other character. Not only does this mean that the balance of the game changes at high levels, it means that which class/build you have taken has more effect on your skills than which skills you are trained in. This is an effect I have always disliked in games, where the streetwise fighter is forced to be less streetwise than the otherworldly fey warlock, regardless of the background stories and options chosen in character generation.

3. There are some other small effects which diverge in ways similar to the first 2 points. For instance, characters with Dex or Con bonuses gain a growing relative initiative or healing surge advantage over the rest of the party as they gain levels. So an infernal warlock may have fewer healing surges than a paladin at low level, but more at high level. This isn't really fatal, but it doesn't quite fit the symmetry of the uniform level progression idea. Also, there are some situations where characters make attacks which don't use primary statistics, such as longbow attacks from melee characters. At low levels these may be at least somewhat effective, but at high levels they become less and less effective compared to the main attack, to the point of uselessness.

So the next question is: what would you do if you decided you wanted to fix this imbalance? A very straightforward way would be to have every statistic increase every 4 levels, instead of just 2 statistics increasing every 4 levels. This would have the significant advantage of fixing the problems listed above. I should also note that it would make all sorts of unusual multi-class combinations possible and fun. However, it would also carry some potential disadvantages:

1. As mentioned above, primary and secondary statistics often have effects where the same amount of bonus is just as good at high level as at low level. I think a good word to capture this is that some bonuses are "logarithmic". A 2-point bonus to attack or defense is just as good for a high-level character with a +30 attack as it is for a low-level character with a +4 attack. Other benefits are non-logarithmic: a +2 damage bonus for a high-level character who does about 22 points of damage is much less effective than the same bonus for a low-level character who does about 8 points of damage.
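To make the distinction concrete, here is a quick back-of-the-envelope calculation, using the illustrative damage figures above rather than values from any rulebook:

```python
# A +2 bonus to attack or defense shifts 2 faces of the d20 -- a flat
# 10% change in hit chance at any level, hence "logarithmic".
attack_gain = 2 / 20
print(f"+2 to hit is worth {attack_gain:.0%} at every level")

# A +2 damage bonus is worth proportionally less as base damage grows,
# hence "non-logarithmic". (8 and 22 are the made-up figures from above.)
for base_damage in (8, 22):
    print(f"+2 damage on a base of {base_damage} is a "
          f"{2 / base_damage:.0%} increase")
```

With these numbers, the same +2 damage is a 25% boost for the low-level character but only a 9% boost for the high-level one, while the +2 to hit stays worth 10% of the d20 throughout.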

If every statistic increased at the same rate, then for non-logarithmic abilities controlled by secondary statistics, the actual value of your statistic would mean less and less as you gain levels. That is, at low level a barbarian with a relatively high constitution might gain 3 temporary hit points when he defeated a foe, while a barbarian who had basically ignored constitution might only gain 1 temporary hit point, a big difference. But at high level it might be the difference between 8 temporary hit points and 6 temporary hit points, not a big deal. So in this case, having all of the statistics increase at the same rate would cause some abilities to scale less well with level. I think this would be the major disadvantage of allowing every statistic to increase.

Although this definitely causes some problems with the scaling, some of the other effects it has on the game might be good or bad. It would mean that at high level, classes with 2 secondary statistics would find it very practical to take powers relying on their lesser secondary statistic. My general impression is that such powers are normally much less tempting to take than the flavor of the game would indicate (in particular, the class build descriptions seem to imply that you should take some of these powers, but I find it quite difficult to ever want a power based on a secondary statistic that I am not improving, unless the power is really broken to begin with). So feeling more free to take these powers might be a good thing. Or maybe not; I can't really say.

2. With constitution increasing at a faster rate, high-level parties would have noticeably more healing surges than low-level parties. Since they have noticeably more healing powers, I don't know whether this is a good or bad thing.

3. Many feats have prerequisites that prevent certain class builds from getting those feats. With every statistic increasing, pretty much every build would eventually be able to get most feats. If those statistic minimums were carefully designed and considered as part of the class balance (it isn't clear whether this is true), this might mean the class balance is disrupted somewhat. For instance, normally only fighters who invest in high Dex scores can take scale armor specialization, but now even Con-based axe fighters could do so.

4. I would expect that a major actual reason that not all statistics increase with level is a combination of inertia/tradition (only one statistic increased with level in third edition, and none before that) and the idea that it is fun getting to choose where your statistic points go as you gain levels. It is true that there is a small amount of discretion in the way you allocate your statistic bonuses: it is sometimes possible to make little trade-offs, not increasing a secondary statistic in order to raise constitution or a statistic necessary to qualify for a feat. But in general, the game balance so strongly demands that you increase the 2 statistics associated with your class build that I haven't found choosing where your statistic bonuses go to be a particularly interesting process in practice.

Thursday, June 4, 2009

Balanced Level Progression

A very impressive feature of fourth edition Dungeons & Dragons is that the level progression has been balanced so that, on average, the various capabilities of players and monsters go up at the same rate as levels are gained.

In previous editions of Dungeons & Dragons, the distinctions between the various characters and classes started out at a fairly modest level, but grew greater and greater at higher levels. For instance, at low levels a fighter had only a small “to hit” advantage over a cleric with the same strength. But as the fighter gained levels, his “to hit” advantage over the cleric would become greater and greater. And prior to third edition, the fighter would gain multiple attacks at a certain point, and the cleric never would. Of course, this was balanced by the fact that at low levels the cleric's spells were not all that amazing compared to the natural fighting ability of the classes, whereas at high levels they grew disproportionately and became massively powerful. However, the point is that at low levels the cleric could be a "secondary fighter" who didn't fight as well as the fighter, but fought decently. He could have some pretty useful spells, but couldn't really concentrate on being a major spellcaster; he had to do a lot of conventional fighting. At higher levels, the cleric’s fighting ability would become less and less important, and his spell ability more and more important. In other words, the essential purpose and style of different classes and characters would vary continuously as levels were gained. In theory, there is nothing intrinsically wrong with this. In practice, however, it made the game very difficult to design and balance correctly. Typically there would be a “sweet spot” level where the relative capabilities of 2 classes felt about right, and the farther you were above or below this level, the less well the game seemed to work.

This issue of gains in levels changing the essential balance of the game occurred in many other places as well. For instance, in third edition you earned skill points every level, and it was possible to put the maximum number of skill points into one skill, or to divide up your skill points more evenly. Dividing them up worked interestingly at low level, but at high level it would mean that you were so far behind someone who had maxed out the skill that the GM practically couldn't design an adventure which would challenge one skill level without being either impossible or trivial for the other skill level. And it meant that at low level, an ordinary character might notice a stealthy opponent, but at high level, a character class which didn’t specialize in perception was practically blind against a stealthy opponent.

Designing the hit bonuses and defenses of monsters was also tricky. With a d20 system it is important for the values to fall within a narrow range, in order to prevent the rolls from becoming trivial or impossible, but when the characters diverge at higher levels it soon becomes impossible to maintain this balance. You don’t want characters to hit on 2s, or to hit only on 20s, but when hit bonuses diverge tremendously you can’t cover everyone.
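The constraint can be sketched in a few lines, assuming the usual d20 convention that a natural 1 always misses and a natural 20 always hits; the bonuses and the defense value below are made-up numbers for illustration:

```python
def hit_chance(bonus, defense):
    """Fraction of d20 rolls that hit: roll + bonus >= defense,
    with a natural 1 always missing and a natural 20 always hitting."""
    hits = sum(1 for roll in range(1, 21)
               if roll == 20 or (roll != 1 and roll + bonus >= defense))
    return hits / 20

# Against a single defense of 25, a +15 attacker succeeds a bit over
# half the time, while a +5 attacker in the same party hits only on
# a natural 20.
print(hit_chance(15, 25))  # 0.55
print(hit_chance(5, 25))   # 0.05
```

Once the spread of attack bonuses within a party exceeds the roughly 18-point span of the die, no single defense value can keep every character between the trivial and the near-impossible extremes.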

Note that the fact that fighters got more hit points per level than clerics was not an example of this sort of divergence. If a fighter starts with, say, 50% more hit points than the cleric, and gains 50% more hit points than the cleric does every level, the fighter will always have 50% more hit points than the cleric. So for game design purposes, the relative abilities of the fighter and the cleric in this regard stay constant, even though both are becoming more and more powerful. But if the fighter hits in melee combat 20% more often than the cleric at low level, while at high level he hits 80% more often and gets twice as many attacks, then his power relative to the cleric has tripled, drastically changing the nature of the relationship between fighters and clerics.
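The hit point claim can be verified in a couple of lines; the starting totals and per-level gains below are invented for illustration, chosen only so that both stand in the same 3:2 ratio:

```python
# If the fighter starts with 50% more hit points than the cleric and
# also gains 50% more hit points per level, the ratio never changes.
fighter_hp, cleric_hp = 15, 10    # illustrative starting totals
fighter_gain, cleric_gain = 9, 6  # illustrative per-level gains (also 3:2)

for level in range(2, 31):
    fighter_hp += fighter_gain
    cleric_hp += cleric_gain
    assert fighter_hp == 1.5 * cleric_hp  # the 50% edge holds at every level
```

Any pair of gains in the same ratio as the starting values preserves the proportion indefinitely, which is exactly why this kind of difference never needed rebalancing across levels.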

Fourth edition Dungeons & Dragons tries to solve this problem by having the capabilities of the characters improve at a uniform rate, such that the characters remain more or less in balance with each other. I think this works really, really well and is a great idea. Classes of all kinds get new powers, feats, magic item bonuses, skill bonuses, and attack and defense bonuses at the same rate. And this rate is carefully calculated to equal the rate at which the monsters improve. Some of my previous articles have noted small discrepancies in these calculations, but in effect those mistakes are the "exception that proves the rule". That is, the fact that small discrepancies are actually noticeable and worthy of correction shows the extent to which the game has been balanced; such small differences would be lost among the vast disparities in powers and abilities between the classes in previous editions.

Now, the relative stability in capabilities between the classes as levels progress is not going to be 100% perfect. Since players have flexibility in what feats, paragon paths, and epic destinies they take, and in what magic items they discover, the various characters are likely to become more different from each other at higher levels simply because they have more choices available. But this is no big deal; it isn't a system of inevitably increasing disparity with level. Whenever I see an ability that changes disproportionately with level (that is, becomes less effective or more effective than other comparable abilities as the character level changes), I call that out as a deviation from the normal level progression balance. In most cases, such deviations detract from the game, as the ability will be "just right" at some levels, and too weak or too strong at others.

A good practical example of this is the way that attack and damage bonuses work. A bonus of +1 to hit is equally valuable at all levels, because the increase in monster defenses means that your base chance to hit monsters of your level should stay the same as you gain levels. But a bonus of +1 to damage becomes less useful at higher levels, because the total damage you are doing is higher, and thus the additional point of damage is a smaller percentage increase in your damage output. Therefore, in order for a damage bonus not to become ineffective as you gain levels, it needs to increase with level. The same is true of hit point bonuses. Attack and defense bonuses, on the other hand, need to stay the same; otherwise they become overly effective as you gain levels.

This is why the bonus from Weapon Focus increases with every tier; it has to increase in order to stay equally effective relative to the character. The fact that the damage bonus from Dwarf Weapon Training is a fixed +2 means that it does not retain its usefulness at high levels: dwarves are incredibly potent with axes and hammers at low level, but lose much of this advantage at high levels.

On the other hand, the fact that the new Weapon Expertise feats give you increasing hit bonuses at higher levels means they grow in power at a disproportionate rate. At low level they are very useful; at high levels they are absolutely incredible and indispensable. This would be broken, were it not clear that these feats are meant as mathematical fixes rather than legitimate optional feats.

The idea that attack bonuses should not increase with level while damage bonuses should is a corollary of the underlying game system, and it seems to have been fairly well followed by the designers of features and powers, although there are a few mistakes. For instance, the wizard’s orb feature gives an ever-increasing penalty to saving throws as the wizard's level increases, despite the fact that a penalty to saving throws, like just about anything else rolled on the d20, does not need to and should not increase as you gain levels.

I'm not going into examples of this right now. But there is one major area in which the game's wonderfully balanced level progression is disrupted at a more fundamental level; I shall describe this next week.