And what exactly does Dodge do? Does it dodge all attacks (DoTs, physical, elemental damage), or only physical attacks? I think I saw a video of a skeleton mage casting arcane bolts that got dodged by a monk...

I'm not too sure which attacks are dodgeable... but I thought it was only physical melee/ranged attacks? Dodging spells sounds way too imba Oo

Just came across a post on the US Monk Forum where a Beta Tester confirmed how Dodge works. Link

A total of 37% Dodge from a 30% Dodge Skill and a 10% Dodge Passive would calculate like this:

(1-0.3)*(1-0.1) = 0.63

(1-0.63) = 0.37 = 37% Dodge.

So Dodging is handled multiplicatively.
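For anyone who wants to sanity-check the multiplicative stacking rule, here's a minimal Python sketch (the function name is just for illustration):

```python
def combined_dodge(*sources):
    """Combine independent dodge chances multiplicatively: an attack
    only lands if it gets past every source."""
    miss_all = 1.0
    for chance in sources:
        miss_all *= 1.0 - chance
    return 1.0 - miss_all

# 30% dodge skill + 10% dodge passive, as in the quoted post:
print(round(combined_dodge(0.30, 0.10), 4))  # 0.37
```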

Ugh... I hate it when they include percentage-based damage reduction. Barely anyone actually understands its benefits. For example, the person you quoted mentioned Diminishing Returns, but when you look at the survival time you're getting from Dodge there are actually INCREASING returns.

For example, let's say you have 10,000 HP and are taking 100 damage per hit. It takes 100 hits to kill you.

Now let's say you stack the two aforementioned dodge bonuses and get 37% dodge. You will dodge 37% of the hits you take, so on average you'll take 100 * (1 - 0.37) = 63 damage per hit. It now takes 158.73 hits to kill you.

A 37% dodge bonus translated into a 58.7% increase in survival time. From this you could say that your HP is 58.7% more effective, which is the same thing as having +58.7% Max HP and HP restoration from all sources.
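The hits-to-kill arithmetic above is easy to script; a small Python sketch using the same numbers (the function name is mine):

```python
def hits_to_kill(hp, damage_per_hit, dodge=0.0):
    """Average hits to kill, treating dodge as a reduction in the
    average damage taken per incoming swing."""
    return hp / (damage_per_hit * (1.0 - dodge))

base = hits_to_kill(10_000, 100)             # 100 hits with no dodge
with_dodge = hits_to_kill(10_000, 100, 0.37)
print(round(with_dodge, 2))                      # 158.73
print(round((with_dodge / base - 1) * 100, 1))   # 58.7 -> +58.7% survival time
```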

As for Increasing Returns:

On their own,
Mantra of Evasion gives 1/(1-0.3) = +42.86% Survival Time
Guardian gives 1/(1-0.1) = +11.11% Survival Time

Together they give 58.7%, which is 58.7 - 42.86 - 11.11 = 4.73% of your base survival time more than the sum of their parts. This dodge setup has INCREASING returns.

Whether or not Dodge works on magic attacks, I just hate that they're making it so unintuitive to compare dodge to other stats. If I wanna see how much more damage I can withstand I have to do a bunch of subtraction and division, which greatly convolutes things.

On their own,
Mantra of Evasion gives 1/(1-0.3) = +42.86% Survival Time
Guardian gives 1/(1-0.1) = +11.11% Survival Time

Together they give 58.7%, which is 58.7 - 42.86 - 11.11 = 4.73% of your base survival time more than the sum of their parts. This dodge setup has INCREASING returns.

You have a funny definition of increasing returns. The benefit each source of dodge provides is constant and doesn't change how much dodge you already have from other sources. Diminishing or increasing means that you get less or more benefit the more dodge you have, but that's not the case. If your math shows otherwise it's wrong.

The classical example of increasing returns is going from 0 -> 1% dodge vs going from 98% -> 99%. The first step is a minuscule increase, the second step doubles average survival time compared to its previous value. We don't have this or anything similar in D3.

1/(1-0.3) = 1.42
1.42 * 1/(1-0.1) = 1.57
repeat for each source of dodge.

On their own,
Mantra of Evasion gives 1/(1-0.3) = +42.86% Survival Time
Guardian gives 1/(1-0.1) = +11.11% Survival Time

Together they give 58.7%, which is 58.7 - 42.86 - 11.11 = 4.73% of your base survival time more than the sum of their parts. This dodge setup has INCREASING returns.

You have a funny definition of increasing returns. The benefit each source of dodge provides is constant and doesn't change how much dodge you already have from other sources. Diminishing or increasing means that you get less or more benefit the more dodge you have, but that's not the case.

That's... just flat-out wrong. I showed you that stacking two dodge bonuses at once gave you almost 5% more of your base survival time than the sum of their parts. The benefit of each dodge bonus is proportional to your current survival time and to each other dodge bonus. What you're looking at is exponential scaling as you stack more dodge bonuses.

The classical example of increasing returns is going from 0 -> 1% dodge vs going from 98% -> 99%. The first is a 1% increase, the second a 50% increase over the previous average survival time. We don't have this or anything similar in D3.

Actually we have precisely that. I just mentioned exponential scaling, however that's with respect to stacking multiple separate dodge bonuses. For a single dodge bonus on its own you in fact have the exact same kind of scaling you mention here.

For example, if you have no dodge and get a 1% dodge bonus you now survive 1/(1-0.01) = +1.01% Survival Time

If I replace a 30% dodge bonus with 31% dodge, I now survive (1/(1-0.31)) / (1/(1-0.30)) = 1.0145 times as long, or +1.45% Survival Time

Almost 50% more!

The classical example of increasing returns is going from 0 -> 1% dodge vs going from 98% -> 99%. The first is a 1% increase, the second a 50% increase over the previous average survival time.

EDIT: Sorry I misunderstood what you were saying, but yes we do have increasing returns. Here, I'll show how:

You say that if there is increasing returns on Dodge, you should get more benefit the more dodge you have. To test that, let's look at your survival time with six +10% dodge bonuses.

The first dodge bonus will give you 1/(1-0.1) = 11.1% Survival time
The next will give you (1/(1-0.1))^2 = 23.46% Survival time, a marginal increase of 12.35%
The next will give you (1/(1-0.1))^3 = 37.17% Survival time, a marginal increase of 13.72%
The next will give you (1/(1-0.1))^4 = 52.42% Survival time, a marginal increase of 15.24%
The next will give you (1/(1-0.1))^5 = 69.35% Survival time, a marginal increase of 16.94%
The next will give you (1/(1-0.1))^6 = 88.17% Survival time, a marginal increase of 18.82%

Certainly looks like Increasing Returns to me.
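The six-stack table can be reproduced with a short loop (a sketch of the math in this post, not any in-game formula):

```python
# Survival-time multiplier after n multiplicative +10% dodge bonuses,
# plus the marginal gain each new bonus adds (in % of base survival).
prev = 1.0
for n in range(1, 7):
    total = (1 / (1 - 0.1)) ** n
    marginal = (total - prev) * 100
    print(f"{n} bonuses: +{(total - 1) * 100:.2f}% survival, marginal +{marginal:.2f}%")
    prev = total
```

The marginal column grows every step, which is the increasing-returns pattern described above.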

1/(1-0.3) = 1.42
1.42 * 1/(1-0.1) = 1.57
repeat for each source of dodge.

Note how 1.57 * 0.9 = 1.42

Your math is correct as far as I can tell, but I'm not sure what you're saying at the end. Why are you multiplying by 0.9?

What you are doing is subtracting previous survival time from current survival time which is the wrong way of looking at it.

I'm not good at explaining things so I'll make a very simple example:
Somebody who dodges 50 of 100 hits is only hit half as much as somebody who takes all the hits. Somebody who dodges 75 of 100 hits is also only hit half as much as somebody who dodges 50 hits. You are doing something funny with the numbers and calculating something else entirely.

What you are doing is subtracting previous survival time from current survival time which is the wrong way of looking at it.

Let's say you have 10,000 HP and six item slots. In each slot you can choose either +1,000 HP or +8% Dodge.

For the first bonus, you have:

11,000 HP with the HP bonus (+10% more HP)
10,870 effective HP with the Dodge bonus (+8.7% more HP)

So the flat bonus is better. However, if you compare six of each bonus:

16,000 HP with the HP bonus (+60% more HP)
16,492 effective HP with the Dodge bonus (+64.92% more HP)

Even though one Dodge bonus was weaker than the flat HP bonus, six of them together beat the six flat HP bonuses. That's the definition of Increasing Returns.

If you're saying Dodge does not have increasing returns, then that would imply that flat +HP bonuses, which scale linearly, have DIMINISHING returns, which is not true.
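Here's the same six-slot comparison as a quick script (slot values as in my example; treat it as a sketch):

```python
BASE_HP = 10_000

def effective_hp(flat_slots, dodge_slots):
    """Effective HP when some slots give +1,000 flat HP and the rest
    give +8% dodge (dodge stacking multiplicatively)."""
    hp = BASE_HP + 1_000 * flat_slots
    return hp / (1 - 0.08) ** dodge_slots

print(round(effective_hp(1, 0)))  # 11000 -> one flat HP bonus wins...
print(round(effective_hp(0, 1)))  # 10870
print(round(effective_hp(6, 0)))  # 16000 -> ...but six dodge bonuses win
print(round(effective_hp(0, 6)))  # 16492
```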

I'm not good at explaining things so I'll make a very simple example:
Somebody who dodges 50 of 100 hits is only hit half as much as somebody who takes all the hits. Somebody who dodges 75 of 100 hits is also only hit half as much as somebody who dodges 50 hits. According to you, the survival times are somehow not proportional to these ratios. They are proportional, but you are doing something funny with the numbers and calculating something else entirely.

I'm calculating the marginal benefit in terms of base HP (or base survival time, whichever you prefer). Base HP is constant, allowing you to compare in terms of absolutes.

You're calculating the marginal benefit in terms of current HP. Current HP varies, which makes it harder to compare any given bonus.

I'm calculating the marginal benefit in terms of base HP (or base survival time, whichever you prefer). Base HP is constant, allowing you to compare in terms of absolutes.

The term diminishing returns has a specific meaning in the context of theorycrafting. If you're calculating something else, then you should use some other term to avoid misunderstandings.

To clarify what I'm talking about, we should first talk about the number of attacks required to kill a character, for the sake of simplicity. Survivability time is just attacks required to kill * attack interval. Hitpoints are also not relevant, because we're already working with attacks required to kill, which is simply hitpoints / damage per attack. So let's take some items with a 20% dodge bonus and see how the attacks required to kill changes:

1/(1-0.2) = 1.25

This means it takes 1.25 times more attacks to kill a character with 20% dodge compared to the same character with 0% dodge.

If we add another 20% dodge
1.25 * 1/(1-0.2) = 1.5625

This means it takes 1.5625 times more attacks to kill a character with two 20% dodge bonuses compared to the same character with 0% dodge.

Note however 1.25 * 1.25 = 1.5625. The character with two 20% dodge items requires 1.25 times more attacks to kill than the character with one 20% dodge item, which in turn takes 1.25 times more attacks to kill than the character with 0% dodge.
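The constant-factor view is easy to verify: each 20% dodge item multiplies attacks-to-kill by the same 1.25 no matter how many you already have. A small Python check:

```python
factor = 1 / (1 - 0.2)  # each 20% dodge item multiplies attacks-to-kill by 1.25
totals = [factor ** n for n in range(5)]            # 0..4 dodge items
ratios = [b / a for a, b in zip(totals, totals[1:])]
print([round(t, 4) for t in totals])
print([round(r, 2) for r in ratios])  # every step is the same 1.25 ratio
```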

I'm calculating the marginal benefit in terms of base HP (or base survival time, whichever you prefer). Base HP is constant, allowing you to compare in terms of absolutes.

The term diminishing returns has a specific meaning in the context of theorycrafting. You should use some other term to avoid misunderstandings.

We should talk about number of attacks required to kill a character by the way, for the sake of simplicity. Let's take several items with a 20% dodge bonus

1/(1-0.2) = 1.25

This means it takes 1.25 times more attacks to kill a character with 20% dodge compared to the same character with 0% dodge.

If we add another 20% dodge
1.25 * 1/(1-0.2) = 1.5625

This means it takes 1.5625 times more attacks to kill a character with two 20% dodge bonuses compared to the same character with 0% dodge. Note however 1.25 * 1.25 = 1.5625. The character with two 20% dodge items requires 1.25 times more attacks to kill than the character with one 20% dodge item, which in turn takes 1.25 times more attacks to kill than the character with 0% dodge.

This is linear returns.

Then by that definition, flat +HP bonuses have Diminishing returns.

Then by that definition, flat +HP bonuses have Diminishing returns.

They do. The relative benefit decreases.

A character with 50 hitpoints can double the number of attacks required to kill him by adding 50 more hitpoints.

A 500 hitpoint character adding 50 hitpoints isn't doing that.

The more hitpoints you have, the less valuable additional hitpoints become, although the term diminishing returns isn't generally applied to flat bonuses.

Then by that definition, flat +HP bonuses have Diminishing returns.

They do. The relative benefit decreases.

A character with 50 hitpoints can double the number of attacks required to kill him by adding 50 more hitpoints.

A 500 hitpoint character adding 50 hitpoints isn't doing that.

The more hitpoints you have, the less valuable additional hitpoints become, although the term diminishing returns isn't generally applied to flat bonuses.

That's not true. Hit points are always worth the number of hits they allow you to withstand, which increases linearly. It's enemy damage you should be comparing this to, not your current HP.

Just looking at the graphs you can see what's linear and what's increasing.

Flat +HP Bonuses: LINEAR
http://www.wolframalpha.com/input/?i=Graph y = 1,000 x

20% Dodge Bonuses: INCREASING
http://www.wolframalpha.com/input/?i=graph y=(1/(1-0.2))^x from 0 to 10

Current Dodge +1%: INCREASING
http://www.wolframalpha.com/input/?i=Graph y = 1/(1-x) from 0 to 1

You can argue semantics with misleading graphs if you want, but as I said, linear, diminishing, and increasing returns have specific meanings in videogame theorycrafting. If you don't want to conform to that, fine. Just don't expect to understand others or to be understood.

You can argue semantics with misleading graphs if you want, but as I said, linear, diminishing, and increasing returns have specific meanings in videogame theorycrafting. If you don't want to conform to that, fine. Just don't expect to understand others or to be understood.

It makes no sense. You can't compare that result to your enemies or even your allies. What use are those semantics if they compare things in units that can't in turn be compared relative to any practical goal?

It makes no sense. You can't compare that result to your enemies or even your allies. What use are those semantics if they compare things in units that can't in turn be compared relative to any practical goal?

The definitions of the terms linear/increasing/diminishing returns in videogames simply reflect how this concept is used in practice.

Let's say you have a character with x% dodge, y% block and z% armor. You have the choice between an item that increases dodge, one that increases block and one that increases armor, and you need to figure out which item will give you more survivability. In that situation, you take your CURRENT survivability and use that as the point of reference. We can compare the effects of each of these items to our point of reference, and we then know what's best in that situation.

To make a more concrete example:

Let's say 100 points of armor reduce damage by 25%.
Let's say our character has 1000 hp, 0 armor and is attacked by monsters that hit for 347 damage.
Let's say that our character wants to maximize survivability and can choose between an item with 100 armor or 400 hp.

1) Our point of reference is the following:

1000/347 = 2.8818 attacks required to be killed

Choosing the 100 armor item makes it 1000 / (347 * 0.75) = 3.84 attacks to be killed.
Choosing the 400 hp item instead would make it 1400 / 347 = 4.03 attacks to be killed.

So we take the +400 hp item because it's better in that situation.

2) Then we find the same item again and need to decide again which one we put on.

Our new point of reference is 1400 / 347 = 4.0345 attacks to be killed

Choosing the 100 armor item makes it 1400 / (347 * 0.75) = 5.38 attacks to be killed.
Choosing the 400 hp item instead would make it 1800 / 347 = 5.19 attacks to be killed.

The +armor item is better this time. What has happened? Diminishing returns on hitpoints. So we take the +armor item.

3) Once again we are put in front of the same choice:

Point of reference 1400 / (347 * 0.75) = 5.38 attacks to be killed.

Choosing the 100 armor item makes it 1400 / (347 * 0.50) = 8.06 attacks to be killed.
Choosing the 400 hp item instead would make it 1800 / (347 * 0.75) = 6.91 attacks to be killed.

+Armor is better and the gap widened since last time. The increasing returns on the armor are starting to become noticeable.

4) Same choice again

Point of reference 1400 / (347 * 0.50) = 8.06 attacks to be killed.

Choosing the 100 armor item makes it 1400 / (347 * 0.25) = 16.13 attacks to be killed.
Choosing the 400 hp item instead would make it 1800 / (347 * 0.50) = 10.37 attacks to be killed.

+Armor is starting to become much, much better.

I hope it's clear now why this concept exists and how it is used. Just for fun, the value of armor/hp for each scenario. In a fifth scenario, the value would be infinite, since you'd become immune to damage.

1) 0.95
2) 1.03
3) 1.16
4) 1.55
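The four choices above can be replayed with a short greedy script (a sketch of the scenario with the same numbers; small differences from the quoted figures are just rounding):

```python
# At each step, pick whichever item (100 armor = a further 25% damage
# reduction, or +400 hp) maximizes attacks required to be killed.
hp, dr, hit = 1000.0, 0.0, 347.0

def attacks_to_kill(hp, dr):
    return hp / (hit * (1 - dr))

for step in range(1, 5):
    with_armor = attacks_to_kill(hp, dr + 0.25)
    with_hp = attacks_to_kill(hp + 400, dr)
    if with_hp > with_armor:
        hp, choice = hp + 400, "+400 hp"
    else:
        dr, choice = dr + 0.25, "+100 armor"
    print(step, choice, round(attacks_to_kill(hp, dr), 2))
```

It picks the +400 hp item on the first step and the armor item on every step after, matching the walkthrough.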

Whether or not Dodge works on magic attacks, I just hate that they're making it so unintuitive to compare dodge to other stats. If I wanna see how much more damage I can withstand I have to do a bunch of subtraction and division, which greatly convolutes things.

Dodge doesn't work on magic attacks; this was tested by a beta monk. Link

If you have 10% more HP, you will survive 10% longer (on average).
If you have 10% Dodge, you will survive 10% longer (on average).

The difference is that HP accumulates, so as you get more HP you need more of it to make the same percentage (i.e. a player with 5,000 HP needs 500 HP to get 10%). You could call this diminishing returns or linear returns; it's just semantics.

Dodge is a non-accumulative skill, so 10% is always 10% no matter what. You could call this linear returns or "ascending returns" although it's a stretch to say the latter.

Also, once Dodge stacks beyond a certain point the randomness really starts to hurt you. If you have a 65% chance to dodge but have very little HP, you run the risk of dying from a string of bad Dodge luck. Just ask anyone who tanked at level 70-80 when Avoidance percentages were much higher than at 85.

If you have 10% more HP, you will survive 10% longer (on average).
If you have 10% Dodge, you will survive 10% longer (on average).

NO. That is not true at all. If you have 10% dodge you will survive 1/(1-0.1) = 11.11% longer.

110% = 1/(1-x)
1.1 = 1/(1-x)
1.1*(1-x) = 1
1/1.1 = 1-x
1-1/1.1 = x
x = 9.09%

It takes 9.09% dodge to survive 10% longer.
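The inversion generalizes to any target survival multiplier; a sketch (function name is mine):

```python
def dodge_for_survival_multiplier(m):
    """Dodge chance x needed so survival time is multiplied by m,
    from m = 1 / (1 - x), i.e. x = 1 - 1/m."""
    return 1 - 1 / m

print(round(dodge_for_survival_multiplier(1.10) * 100, 2))  # 9.09
print(round(dodge_for_survival_multiplier(2.00) * 100, 2))  # 50.0
```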

Ugh... this kind of blatant lack of comprehension is why I wish game developers would stop making stats that reduce damage by a percentage, or at least display those stats using numbers that scaled linearly with survival time. It's obvious that anyone without a math, science, or technology degree can't understand it.

Dodge is a non-accumulative skill, so 10% is always 10% no matter what. You could call this linear returns or "ascending returns" although it's a stretch to say the latter.

If you measure it in terms of the amount of damage you can withstand, it's increasing returns as each dodge bonus allows you to withstand more additional damage than the last. If you measure its benefit as a percentage of your current HP, then it's linear.

I just don't understand why anyone would ever favor the latter semantics, because with the former you can compare the results directly to enemy damage and to other players, while the latter is only useful for comparing items for your own use (which the former handles just as well).

Ugh... this kind of blatant lack of comprehension is why I wish game developers would stop making stats that reduce damage by a percentage, or at least display those stats using numbers that scaled linearly with survival time. It's obvious that anyone without a math, science, or technology degree can't understand it.

All damage reduction stats have increasing returns with respect to survival time. Being percentage based or not has nothing to do with it.

Example:
Suppose there was a stat that reduces monster damage by 1. For simplicity's sake, monster damage is 10 damage each second by default. Players have 100 hitpoints.

Survival time with 0 DR:
playerHP/(monsterDamage-DR) = 100/(10-0) = 10 hits, or 10 seconds.

Survival time with 1 DR:
100/(10-1) = 11.1 seconds, an 11.1% increase

Survival time with 2 DR:
100/(10-2) = 12.5 seconds, a 25% increase

Survival time with 9 DR:
100/(10-9) = 100 seconds, a 900% increase (ten times the base survival time)

Survival time with 10 DR:
100/(10-10) = Error, cannot divide by 0. But it's clear that at this point, the monster is swinging for 0 damage, thus the player will live infinitely. Any further DR is wasted unless the player can find a monster that hits for more than 10 damage.

The point is that the relationship between Damage Reduction and survival time is necessarily one of so-called increasing returns.

For what it's worth, the way dodge works now, it has:
Diminishing returns with respect to dodge chance.
Linear returns with respect to damage reduction.
Increasing returns with respect to survival time (but since this is always true with DR stats, it's not necessarily worth pointing out).

Survival time is a decent stat to look at when you're comparing a damage reduction stat to, say, a health increase. But for the purposes of understanding how a new dodge buff compares to an old dodge buff, I think it's much easier to quantify and examine when you use a relative comparison (i.e. new dodge buff will reduce all future damage by 10%).
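The flat-DR example above is easy to tabulate (a sketch with the same toy numbers):

```python
# Survival time for a 100 HP player vs a monster swinging for 10 per
# second, with flat damage reduction from 0 to 9 (at 10 DR the monster
# deals nothing, so survival time is unbounded).
HP, DMG = 100, 10
base = HP / DMG
for dr in range(10):
    t = HP / (DMG - dr)
    print(f"DR {dr}: {t:.1f}s (+{(t / base - 1) * 100:.0f}%)")
```

Note the 9 DR step is ten times the base survival time, so each point of flat DR is worth more than the last.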

Ugh... this kind of blatant lack of comprehension is why I wish game developers would stop making stats that reduce damage by a percentage, or at least display those stats using numbers that scaled linearly with survival time. It's obvious that anyone without a math, science, or technology degree can't understand it.

All damage reduction stats have increasing returns with respect to survival time. Being percentage based or not has nothing to do with it.

Example:
Suppose there was a stat that reduces monster damage by 1. For simplicity's sake, monster damage is 10 damage each second by default. Players have 100 hitpoints.

Survival time with 0 DR:
playerHP/(monsterDamage-DR) = 100/(10-0) = 10 hits, or 10 seconds.

Survival time with 1 DR:
100/(10-1) = 11.1 seconds, an 11.1% increase

Survival time with 2 DR:
100/(10-2) = 12.5 seconds, a 25% increase

Survival time with 9 DR:
100/(10-9) = 100 seconds, a 900% increase (ten times the base survival time)

Survival time with 10 DR:
100/(10-10) = Error, cannot divide by 0. But it's clear that at this point, the monster is swinging for 0 damage, thus the player will live infinitely. Any further DR is wasted unless the player can find a monster that hits for more than 10 damage.

The point is that the relationship between Damage Reduction and survival time is necessarily one of so-called increasing returns.

For what it's worth, the way dodge works now, it has:
Diminishing returns with respect to dodge chance.
Linear returns with respect to damage reduction.
Increasing returns with respect to survival time (but since this is always true with DR stats, it's not necessarily worth pointing out).

Survival time is a decent stat to look at when you're comparing a damage reduction stat to, say, a health increase. But for the purposes of understanding how a new dodge buff compares to an old dodge buff, I think it's much easier to quantify and examine when you use a relative comparison (i.e. new dodge buff will reduce all future damage by 10%).

It's true that survival time is a useful stat for comparing damage reduction to a health increase, and isn't so great for comparing new bonuses to old ones. However, the reason I think survival time is better is that it can also be compared against enemies' damage and allies' survival times. But regardless, I've given up on that argument.

My main point right now is that, as PiousFlea has so graciously demonstrated, barely anyone understands what benefit they're getting from dodge. PiousFlea mistook a 10% dodge bonus for a 10% survival time bonus, which is a very easy mistake to make, but which can nevertheless drastically throw off calculations. This is why I think the game would really benefit from a stat window entry showing the benefit your dodge bonuses give as a percentage of survival time.

Rollback Post to RevisionRollBack

To post a comment, please login or register a new account.

just came across a post on the US Monk Forum where a Beta Tester confirmed how Dodge works

Link

A total of 37%Dodge from a 30%Dodge Skill and a 10%Dodge Passive would calculate like this:

So Dodging is handled multiplicatively.

I'm not too sure what attacks are dodgeable... but I thought only physical melee/ranged attacks? Dodging spells sounds way too imba Oo

For example, let's say you have 10,000 HP and are taking 100 damage per hit. It takes

100 hits to kill you.Now let's say you stack the two aforementioned dodge bonuses and get 37% dodge. You will dodge 37% of the hits you take, so on average you'll take 100 * (1- 0.37) = 63 damage per hit. It now takes

158.73 hits to kill you.A 37% dodge bonus translated into a 58.7% increase in survival time. From this you could say that your HP is 58.7% more effective, which is the same thing as having +58.7% Max HP and HP restoration from all sources.

As for Increasing Returns:

On their own,

Mantra of Evasion gives 1/(1-0.3) = +42.86% Survival Time

Guardian gives 1/(1-0.1) = +11.11% Survival Time

Together they give 58.7%, which is 58.7 - 42.86 - 11.11 = 4.73% of your base survival time more than the sum of their parts. This dodge setup has INCREASING returns.

Whether or not Dodge works on magic attacks, I just hate that they're making it so unintuitive to compare dodge to other stats. If I wanna see how much more damage I can withstand I have to do a bunch of subtraction and division, which greatly convolutes things.

You have a funny definition of increasing returns. The benefit each source of dodge provides is constant and doesn't change how much dodge you already have from other sources. Diminishing or increasing means that you get less or more benefit the more dodge you have, but that's not the case. If your math shows otherwise it's wrong.

The classical example of increasing returns is going from 0 -> 1% dodge vs going from 98% -> 99%. The first step is a minuscle increase, the second step doubles average survival time compared to its previous value. We don't have this or anything similar in D3.

1/(1-0.3) = 1.42

1.42 * 1/(1-0.1) = 1.57

repeat for each source of dodge.

Note how 1.57 * 0.9 = 1.42

Actually we have precisely that. I just mentioned exponential scaling, however that's with respect to stacking multiple separate dodge bonuses. For a single dodge bonus on its own you in fact have the exact same kind of scaling you mention here.

For example, if you have no dodge and get a 1% dodge bonus you now survive 1/(1-0.01) = +1.01% Survival Time

If I replace a 30% dodge bonus with 31% dodge, I now survive (1/(1-0.31)) / (1/(1-0.30)) = 1.45% Survival Time

Almost 50% more!

EDIT: Sorry I misunderstood what you were saying, but yes we do have increasing returns. Here, I'll show how:

You say that if there is increasing returns on Dodge, you should get more benefit the more dodge you have. To test that, let's look at your survival time with six +10% dodge bonuses.

The first dodge bonus will give you 1/(1-0.1) = 11.1% Survival time

The next will give you (1/(1-0.1))^2 = 23.46% Survival time, a marginal increase of 12.35%

The next will give you (1/(1-0.1))^3 = 37.17% Survival time, a marginal increase of 13.72%

The next will give you (1/(1-0.1))^4 = 52.42% Survival time, a marginal increase of 15.24%

The next will give you (1/(1-0.1))^5 = 69.35% Survival time, a marginal increase of 16.94%

The next will give you (1/(1-0.1))^6 = 88.17% Survival time, a marginal increase of 18.82%

Certainly looks like Increasing Returns to me.

Your math is correct as far as I can tell, but I'm not sure what you're saying at the end. Why are you multiplying by 0.9?

previous survival timefromcurrent survival timewhich is the wrong way of looking at it.I'm not good at explaining things so I'll make a very simple example:

Somebody who dodges 50 of 100 hits is only hit half as much as somebody who takes all the hits. Somebody who dodges 75 of 100 hits is also only hit half as much as somebody who dodges 50 hits. You are doing something funny with the numbers and calculating something else entirely.

For the first bonus, you have:

11,000 HP with the HP bonus (+10% more HP)

10,870 effective HP with the Dodge bonus (+8.7% more HP)

So the flat bonus is better. However, if you compare six of each bonus:

16,000 HP with the HP bonus (+60% more HP)

16,492 effective HP with the Dodge bonus (+64.92% more HP)

Even though one Dodge bonus was weaker than the flat HP bonus, six of them together beats the flat HP bonuses. That's the definition of Increasing Returns.

If you're saying Dodge does not have increasing returns, then that would imply that flat +HP bonuses, which scale linearly, have DIMINISHING returns, which is not true.

I'm calculating the marginal benefit in terms of base HP (or base survival time, whichever you prefer). Base HP is constant, allowing you to compare in terms of absolutes.

You're calculating the marginal benefit in terms of current HP. Current HP varies, which makes it harder to compare any given bonus.

The term diminishing returns has a specific meaning in the context of theorycrafting. If you're calculating something else, then should use some other term to avoid misunderstandings.

To clarify what I'm talking about, we should first talk about

number of attacks required to killa character by the way, for the sake of simplicity.Survivability timeis justattacks required to kill*attack interval.Hitpointsare also not relevant because we're already working withattacks required to killwhich is simplyhitpoints/damage per attack. So let's take some items with a 20% dodge bonus and see how theattacks required to killchanges1/(1-0.2) = 1.25

This means it takes 1.25 times more attacks to kill a character with 20% dodge compared to the same character with 0% dodge.

If we add another 20% dodge

1.25 * 1/(1-0.2) = 1.5625

This means it takes 1.5625 times more attacks to kill a character with

220% dodge bonuses compared to the same character with 0% dodge.Note however 1.25 * 1.25 = 1.56. The character with

220% dodge items requires 1.25 more attacks to kill than the amount it takes to kill the character with120% dodge item, which in turn takes 1.25 more attacks to kill than the character with 0% dodge.This is linear returns.

They do. The relative benefit decreases.

A character with 50 hitpoints can double the number attacks required to kill him by adding 50 more hitpoints.

A 500 hitpoint character adding 50 hitpoints isn't doing that.

The more hitpoints you have, the less valuable additional hitpoints become, although the term diminishing returns isn't generally applied to flat bonuses.

Just looking at the graphs you can see what's linear and what's increasing.

Flat +HP Bonuses: LINEAR

http://www.wolframalpha.com/input/?i=Graph y = 1,000 x

20% Dodge Bonuses: INCREASING

http://www.wolframalpha.com/input/?i=graph y=(1/(1-0.2))^x from 0 to 10

Current Dodge +1%: INCREASING

http://www.wolframalpha.com/input/?i=Graph y = 1/(1-x) from 0 to 1

The definition of the term linear/increasing/diminishing returns in videogames simply reflect how this concept is used in practice.

Let's say you have a character with x% dodge, y%block and z%armor. You have the choice between an item that increases dodge, one that increases block and one that increases armor and you need to figure out which item will give you more survivability. In that situation, you take your CURRENT survivability and use that as point of reference. We can compare the effects of each of these items to our point of reference and we then know what's best in that situation.

To make a more concrete example:

Let's say 100 points of armor reduce damage by 25%.

Let's say our character has 1000 hp, 0 armor and is attacked by monsters that hit for 347 damage.

Let's say that our character wants to maximize survivability and can choose between an item with 100 armor or 400 hp.

1) Our point of reference is the following:

1000/347 = 2.8818 attacks required to be killed

Choosing the 100 armor item makes it 1000 / (347 * 0.75) = 3.84 attacks to be killed.

Choosing the 400 hp item instead would make it 1400 / 347 = 4.03 attacks to be killed.

So we take the +400 hp item because it's better in that situation.

2) Then we find the same item again and need to decide again which one we put on.

Our new point of reference is 1400 / 347 = 4.0345 attacks to be killed

Choosing the 100 armor item makes it 1400 / (347 * 0.75) = 5.35 attacks to be killed.

Choosing the 400 hp item instead would make it 1800 / 347 = 5.18 attacks to be killed.

The +armor item is better this time. What has happened? Diminishing returns on hitpoints. So we take the +armor item.

3) Once again we are put in front of the same choice:

Point of reference 1400 / (347 * 0.75) = 5.35 attacks to be killed.

Choosing the 100 armor item makes it 1400 / (347 * 0.50) = 8.06 attacks to be killed.

Choosing the 400 hp item instead would make it 1800 / (347 * 0.75) = 6.91 attacks to be killed.

+Armor is better and the gap widened since last time. The increasing returns on the armor are starting to become noticeable.

4) Same choice again

Point of reference 1400 / (347 * 0.50) = 8.06 attacks to be killed.

Choosing the 100 armor item makes it 1400 / (347 * 0.25) = 16.13 attacks to be killed.

Choosing the 400 hp item instead would make it 1800 / (347 * 0.50) = 10.37 attacks to be killed.

+Armor is starting to become much, much better.

I hope it's clear now why this concept exists and how it is used. Just for fun, here is the relative value of the armor item versus the hp item (attacks-to-kill with armor divided by attacks-to-kill with hp) for each scenario. In a fifth scenario the armor item's value would be infinite, since you'd become immune to damage.

1) 0.95

2) 1.03

3) 1.16

4) 1.55
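The four scenarios above can be sketched in a few lines of Python. The function name and the greedy pick-the-better-item loop are mine; the numbers are the thread's (347 damage per hit, 100 armor = 25% reduction, items of +100 armor or +400 hp):

```python
def attacks_to_kill(hp, base_damage, armor):
    """Average number of attacks needed to kill the character,
    assuming each 100 armor removes a further 25% of incoming damage."""
    reduction = min(armor / 100 * 0.25, 1.0)
    damage_taken = base_damage * (1 - reduction)
    if damage_taken == 0:
        return float("inf")  # immune to damage
    return hp / damage_taken

base_damage = 347
hp, armor = 1000, 0
for step in range(1, 5):
    with_armor = attacks_to_kill(hp, base_damage, armor + 100)
    with_hp = attacks_to_kill(hp + 400, base_damage, armor)
    if with_armor > with_hp:
        armor += 100
        pick = "armor"
    else:
        hp += 400
        pick = "hp"
    print(f"{step}) armor item: {with_armor:.2f}  hp item: {with_hp:.2f}  -> take +{pick}")
```

Running this reproduces the choices above: the hp item wins only in scenario 1, and the armor item's lead widens in every scenario after that.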

If you have 10% more HP, you will survive 10% longer (on average).

If you have 10% Dodge, you will survive 10% longer (on average).

The difference is that HP accumulates, so as you get more HP you need more of it to gain the same percentage (i.e. a player with 5,000 HP needs another 500 HP to gain 10%). You could call this diminishing returns or linear returns; it's just semantics.

Dodge is a non-accumulative skill, so 10% is always 10% no matter what. You could call this linear returns or "ascending returns" although it's a stretch to say the latter.

Also, once Dodge stacks beyond a certain point the randomness really starts to hurt you. If you have a 65% chance to dodge but have very little HP, you run the risk of dying from a string of bad Dodge luck. Just ask anyone who tanked at level 70-80 when Avoidance percentages were much higher than at 85.
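A quick Monte Carlo makes that streakiness concrete. The numbers here are hypothetical (mine, not from the thread): a character who dies after 4 undodged hits, with a 65% dodge chance:

```python
import random

def attacks_survived(dodge, hits_to_die, rng):
    """Simulate incoming attacks until hits_to_die of them land through dodge."""
    attacks = hits = 0
    while hits < hits_to_die:
        attacks += 1
        if rng.random() >= dodge:  # this attack lands
            hits += 1
    return attacks

rng = random.Random(0)  # fixed seed for a reproducible run
samples = sorted(attacks_survived(0.65, 4, rng) for _ in range(100_000))
mean = sum(samples) / len(samples)
print(f"mean attacks survived: {mean:.1f}")  # about 4 / 0.35, roughly 11.4
print(f"worst 5% of runs die within {samples[len(samples) // 20]} attacks")
```

On average you soak about eleven attacks, but the unlucky tail dies in four or five. That gap between the average and the bad-luck runs is exactly the risk being described above.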

110% = 1/(1-x)

1.1 = 1/(1-x)

1.1*(1-x) = 1

1/1.1 = 1-x

1-1/1.1 = x

x = 9.09%

It takes 9.09% dodge to survive 10% longer.
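The same algebra, checked numerically (the helper names are mine):

```python
def survival_multiplier(dodge):
    """Average survival-time factor from a given dodge chance: 1/(1-x)."""
    return 1 / (1 - dodge)

def dodge_for_multiplier(m):
    """Invert: dodge chance needed for an m-fold average survival time."""
    return 1 - 1 / m

print(f"{dodge_for_multiplier(1.10):.4f}")   # ~0.0909: 9.09% dodge for +10% survival
print(f"{survival_multiplier(0.10):.4f}")    # ~1.1111: 10% dodge gives +11.1%, not +10%
```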

Ugh... this kind of blatant lack of comprehension is why I wish game developers would stop making stats that reduce damage by a percentage, or would at least display those stats using numbers that scale linearly with survival time. It's obvious that anyone without a math, science, or technology degree can't understand it.

If you measure it in terms of the amount of damage you can withstand, it's increasing returns as each dodge bonus allows you to withstand more additional damage than the last. If you measure its benefit as a percentage of your current HP, then it's linear.

I just don't understand why anyone would favor the latter framing: with the former you can compare the results directly to enemy damage and to other players, while the latter is only useful for comparing items for your own use (which the former handles just as well).

All damage reduction stats have increasing returns with respect to survival time. Being percentage based or not has nothing to do with it.

Example:

Suppose there was a stat that reduces monster damage by 1. For simplicity's sake, monster damage is 10 damage each second by default. Players have 100 hitpoints.

Survival time with 0 DR:

playerHP/(monsterDamage-DR) = 100/(10-0) = 10 hits, or 10 seconds.

Survival time with 1 DR:

100/(10-1) = 11.1 seconds, an 11.1% increase

Survival time with 2 DR:

100/(10-2) = 12.5 seconds, a 25% increase

Survival time with 9 DR:

100/(10-9) = 100 seconds, a 900% increase

Survival time with 10 DR:

100/(10-10) = undefined, division by zero. But it's clear that at this point the monster is swinging for 0 damage, so the player survives indefinitely. Any further DR is wasted unless the player finds a monster that hits for more than 10 damage.
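The flat-DR example above, tabulated in code with the same assumed numbers (100 HP, 10 damage per second, DR subtracting a flat amount per hit):

```python
def survival_seconds(player_hp, monster_damage, dr):
    """Seconds survived against one hit per second, with flat damage reduction."""
    damage_taken = monster_damage - dr
    if damage_taken <= 0:
        return float("inf")  # the monster can no longer hurt you
    return player_hp / damage_taken

base = survival_seconds(100, 10, 0)  # 10 seconds with no DR
for dr in (0, 1, 2, 9, 10):
    t = survival_seconds(100, 10, dr)
    if t == float("inf"):
        print(f"DR {dr:2d}: unkillable")
    else:
        print(f"DR {dr:2d}: {t:6.1f} s  (+{(t / base - 1) * 100:.0f}%)")
```

Each additional point of DR buys more extra seconds than the previous one, which is the increasing-returns shape being described.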

The point is that the relationship between Damage Reduction and survival time is necessarily one of so-called increasing returns.

For what it's worth, the way dodge works now, it has:

Diminishing returns with respect to dodge chance.

Linear returns with respect to damage reduction.

Increasing returns with respect to survival time (but since this is always true with DR stats, it's not necessarily worth pointing out).
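For the 30% skill + 10% passive stack discussed earlier in the thread, the three framings look like this (the helper name is mine; stacking is multiplicative, as established upthread):

```python
def combined_dodge(*chances):
    """Multiplicative stacking: each dodge source multiplies the chance to be hit."""
    hit_chance = 1.0
    for c in chances:
        hit_chance *= 1 - c
    return 1 - hit_chance

d = combined_dodge(0.30, 0.10)
print(f"dodge chance: {d:.0%}")                # 37%, not 40%: diminishing in chance
print(f"damage taken: {1 - d:.0%} of normal")  # linear: each hit avoided mitigates the same
print(f"survival time: x{1 / (1 - d):.4f}")    # 1.5873 = (1/0.7)*(1/0.9): multipliers compound
```

The last line is the increasing-returns view: survival-time multipliers from separate sources compound, so the combined effect exceeds the sum of the individual effects.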

Survival time is a decent stat to look at when you're comparing a damage reduction stat to, say, a health increase. But for the purposes of understanding how a new dodge buff compares to an old dodge buff, I think it's much easier to quantify and examine when you use a relative comparison (i.e. new dodge buff will reduce all future damage by 10%).

My main point right now is that, as PiousFlea has so graciously demonstrated, barely anyone understands what benefit they're getting from dodge. PiousFlea mistook a 10% dodge bonus for a 10% survival-time bonus, which is an easy mistake to make but can nevertheless drastically throw off calculations. This is why I think the game would really benefit from a number in the stat window that shows your dodge bonus as a percentage increase in survival time.