I don't understand how this program works. If sum is originally set to 0, and then r (the remainder) is added to sum, which is 0, how is that the correct sum of the digits? I feel like it should look like this instead: sum = num + r;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace Application
{
    class MainClass
    {
        public static void Main (string[] args)
        {
            int num, sum = 0, r;
            Console.WriteLine("Enter a Number : ");
            num = int.Parse(Console.ReadLine());
            while (num != 0)
            {
                r = num % 10;
                num = num / 10;
                sum = sum + r;
            }
            Console.WriteLine("Sum of Digits of the Number : " + sum);
            Console.ReadLine();
        }
    }
}
Don't forget that it's in a loop. sum is 0 on the first iteration, but on the second iteration it contains the value of the last digit.
So imagine you had a number of 195692. It would go through 6 iterations:
num       sum (before)   r   sum (after)
195692          0        2        2
19569           2        9       11
1956           11        6       17
195            17        5       22
19             22        9       31
1              31        1       32
So the result is 32, which is 1 + 9 + 5 + 6 + 9 + 2.
Note that you could produce that table yourself by adding some diagnostic lines within the code, by working through it with pen and paper, or by stepping through it in a debugger to see how it works "live".
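For instance, here is a minimal sketch of what those diagnostic lines could look like; the class name DigitSumDiagnostics and the hard-coded input 195692 are just for illustration:

using System;

class DigitSumDiagnostics
{
    static void Main()
    {
        int num = 195692, sum = 0, r;

        Console.WriteLine("num\tsum (before)\tr\tsum (after)");
        while (num != 0)
        {
            int numBefore = num;   // value of num before this iteration
            int sumBefore = sum;   // value of sum before this iteration
            r = num % 10;          // last digit of the current number
            num = num / 10;        // drop that digit
            sum = sum + r;         // accumulate the digit into the running total
            Console.WriteLine($"{numBefore}\t{sumBefore}\t\t{r}\t{sum}");
        }

        Console.WriteLine("Sum of Digits of the Number : " + sum);
    }
}

Each line it prints corresponds to one row of the table above.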
If you had sum = r + num
that would be adding the last digit to the result of dividing the number by 10... and completely ignoring the previous sum.
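To make that concrete, here is a sketch of that (incorrect) variant, again assuming the input 195692. Because sum is overwritten rather than added to on each pass, only the last iteration matters, and it prints 1 (that is, 1 + 0):

using System;

class WrongVariant
{
    static void Main()
    {
        int num = 195692, sum = 0, r;

        while (num != 0)
        {
            r = num % 10;
            num = num / 10;
            sum = r + num;   // overwrites sum each time instead of accumulating
        }

        // Only the final assignment survives: r = 1, num = 0, so sum is 1.
        Console.WriteLine("Result with sum = r + num : " + sum);
    }
}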
See more on this question at Stack Overflow