Java charAt getting wrong character

I am trying to make an algorithm that will do the following:

64 
797
  7
===
 79

It takes 7, multiplies it by 7, writes down the last digit of the answer, then multiplies that digit by seven and adds the first digit of 7 times 7, and so on. If you repeat this three times (writing each digit down), you get the multiplication I wrote above.

My code isn't working. Instead of showing the example above in this form:

7,9,4
9,7,6

and so on, I get something like this:

7, 9, 52
9, 55, 54

My code:

    for (int i = 0; i < 3; i++) { // run the code three times
        temp = upper*7%10 + tens; // compute temp, keeping upper because it's needed for the tens part
        tens = (upper*7+"").charAt(0); // get the first character of upper*7
        System.out.println(upper + ", " + temp + ", " + tens);
        upper = temp;
    }

As far as I can see the problem is in the charAt, because obviously the first character of 7*7 is not 52.

EDIT: Now that the charAt problem is fixed (I set tens to the int value of the string value of the char, instead of just the char), I have another problem. Hooray!

Last tens: 0  Now's plain number:7, New:9, Tens:4
Last tens: 4  Now's plain number:9, New:7, Tens:6
Last tens: 6  Now's plain number:7, New:15, Tens:4
Last tens: 4  Now's plain number:15, New:9, Tens:1
Last tens: 1  Now's plain number:9, New:4, Tens:6

My code is now the same as the old code, just with the tens part fixed. But now I am getting 15 for a number that is supposed to be a single digit. What's wrong? I honestly don't know if the code I wrote will fulfil my purpose. What code will?
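For reference, the 15 in the trace appears because the carry is added after taking the last digit: `upper*7%10 + tens` can reach 9 + 6 = 15. A minimal sketch of what I believe the intended carry-style loop is (this is my own guess at the goal: repeatedly multiply by 7, keep the last digit, and carry the rest into the next step):

```java
public class CarryLoop {
    public static void main(String[] args) {
        int upper = 7;
        int carry = 0;
        for (int i = 0; i < 3; i++) {
            int product = upper * 7 + carry; // add the carry BEFORE splitting off digits
            int digit = product % 10;        // last digit of the answer
            carry = product / 10;            // everything above the last digit
            System.out.println(upper + ", " + digit + ", " + carry);
            upper = digit;
        }
    }
}
```

This prints `7, 9, 4` then `9, 7, 6` then `7, 5, 5`, matching the expected `7,9,4` / `9,7,6` rows above, and the digit can never exceed 9 because the carry is folded in before the `% 10`.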

Jon Skeet

I strongly suspect that the problem is the type of tens. You haven't shown this in your code, but I suspect it's int.

So this line:

tens = (upper*7+"").charAt(0);

takes the first character from a string, and then stores it in an int. So, for example, the character '4' is Unicode 52, while '0' is 48. The conversion to int just converts the UTF-16 code unit from an unsigned 16-bit value to a signed 32-bit value.
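A short illustration of that widening conversion (the class name here is just for the example):

```java
public class CharWidening {
    public static void main(String[] args) {
        char c = String.valueOf(7 * 7).charAt(0); // first char of "49" is '4'
        int asInt = c;          // widening conversion: '4' becomes 52
        int asDigit = c - '0';  // subtract '0' (48) to get the numeric value
        System.out.println(asInt);   // 52
        System.out.println(asDigit); // 4
    }
}
```

Subtracting `'0'` works because the digit characters '0' through '9' are contiguous in Unicode.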

You're then displaying the value of tens - but if tens is indeed an int, it's going to display that as a number.

As far as I can see the problem is in the charAt, because obviously the first character of 7*7 is not 52.

Well, the first character of the string representation of 7*7 will be '4'. When that's been converted to an int, you'll see that as 52.

If you just want tens as a char, you should declare it as type char. You shouldn't do arithmetic with that value of course - but then when you display it, you'll see 4 displayed, because the string conversion will still treat it just as a character instead of as a number.
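A quick demonstration of that difference (again, the class name is just for the example):

```java
public class CharDisplay {
    public static void main(String[] args) {
        char tens = (7 * 7 + "").charAt(0); // stored as the character '4'
        System.out.println(tens);           // displays 4, not 52

        // If you later need it as a number, convert it explicitly:
        int tensValue = Character.getNumericValue(tens);
        System.out.println(tensValue);      // 4
    }
}
```

Declaring `tens` as `char` makes `println` use the character overload, so you see the digit itself rather than its Unicode value.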


See more on this question at Stackoverflow