Posted 5 May 2013

The derivative of x^2, with respect to x, is 2x. However, suppose we write x^2 as the sum of x x's, and then take the derivative:

Let f(x) = x + x + ... + x (x times)

Then f'(x) = d/dx[x + x + ... + x] (x times)
           = d/dx[x] + d/dx[x] + ... + d/dx[x] (x times)
           = 1 + 1 + ... + 1 (x times)
           = x

This argument appears to show that the derivative of x^2, with respect to x, is actually x. Where is the fallacy?

Posted 5 May 2013

The derivative measures the rate at which a quantity changes as x changes. When you write x*x as a sum of x copies of x, the number of copies gets treated as a constant. For instance, if x = 5, we start with 25; if you increase x to 6, x^2 becomes 36, but (x+x+x+x+x) becomes just 30. With the plus notation, term-by-term differentiation does not mathematically capture the "x times" part: no extra d/dx(x) term magically appears when x increases.
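A quick numeric check makes the poster's point concrete (a sketch, not from the original thread; the function names are my own):

```python
def x_squared(x):
    return x * x

def frozen_sum(x, n):
    # Sum of n copies of x. The count n is held constant, which is
    # exactly what differentiating the sum term by term assumes.
    return sum(x for _ in range(n))

x = 5
# True change in x^2 when x goes from 5 to 6:
true_change = x_squared(x + 1) - x_squared(x)            # 36 - 25 = 11
# Change when the number of terms stays frozen at 5:
frozen_change = frozen_sum(x + 1, x) - frozen_sum(x, x)  # 30 - 25 = 5

print(true_change)    # 11, close to 2x = 10 (the integer step adds the +1)
print(frozen_change)  # 5, i.e. x -- the fallacious "derivative"
```

The frozen-count version only ever changes by x per step, which is why the term-by-term argument lands on x instead of 2x.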

Posted 5 May 2013 (edited)

The above poster seems right. It is not valid to write f(x) = x + x + x + ... (x times), because x is a variable: the number of terms is itself undefined until x is given a value.

Edited 5 May 2013 by dark_magician_92

Posted 5 May 2013

vigmester - nice solve. This is one of the better puzzles of the "find the fallacy" genre. Welcome to the Den!
