How does C# know what type the literal is?

Consider this code:

double i = 0xF0000000;
Console.WriteLine(0xF0000000.GetType());
Console.WriteLine(i.GetType());

Why does C# print System.UInt32 for the first one and System.Double for the second one?

Is that because the compiler, by default, infers a literal to be of type var?

Jon Skeet

In this line:

double i = 0xF0000000;

the literal is of type uint, but it's being implicitly converted to a double. When you call i.GetType() that would always print System.Double, because the variable is of type double... the only kind of value it can hold is a double.
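
You can see the literal's own type by letting the compiler infer the variable's type with var. A minimal sketch (variable names are just for illustration): 0xF0000000 is 4,026,531,840, which is too big for int, so the literal itself is a uint; the same literal assigned to a double is converted, and that variable then reports System.Double:

using System;

public class Program
{
    static void Main()
    {
        // The literal 0xF0000000 doesn't fit in int, so its type is uint.
        var literal = 0xF0000000;
        Console.WriteLine(literal.GetType()); // System.UInt32

        // Assigned to a double, the uint value is implicitly converted,
        // and GetType() reports the variable's type: System.Double.
        double i = 0xF0000000;
        Console.WriteLine(i.GetType()); // System.Double
    }
}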

Note that this conversion to double means you can lose precision if you start off with a long or ulong. For example:

using System;

public class Program
{
    static void Main()
    {
        // Store the same value as a long (exact) and as a double (rounded).
        long x = 123456789012345678;
        double y = 123456789012345678;

        // "n" formats with group separators, making the difference easy to see.
        Console.WriteLine(x.ToString("n"));
        Console.WriteLine(y.ToString("n"));
    }
}

prints

123,456,789,012,345,678.00
123,456,789,012,346,000.00

Note how the final few digits are lost in the double, because the implicit conversion from long to double can lose precision. (Both have 64 bits available, but a double only uses 52 of those bits for the mantissa, giving 53 significant bits with the implicit leading bit.)
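
Another way to see the loss (a small sketch using the same value) is to round-trip the long through a double and compare the result with the original:

using System;

public class Program
{
    static void Main()
    {
        long x = 123456789012345678;
        double d = x;        // implicit conversion; rounds to the nearest double
        long back = (long)d; // explicit conversion back to long

        Console.WriteLine(back);      // 123456789012345680
        Console.WriteLine(x == back); // False
    }
}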

It's not a problem for int or uint literals, but it's worth being aware that the conversion is happening.
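
For example (a quick sketch), every int and uint value survives the round trip exactly, because any 32-bit integer fits comfortably into the double's 53 significant bits:

using System;

public class Program
{
    static void Main()
    {
        uint u = 0xF0000000; // 4,026,531,840
        double d = u;        // exact: 32 bits always fit in 53 significant bits

        Console.WriteLine((uint)d == u);                            // True
        Console.WriteLine((double)uint.MaxValue == uint.MaxValue);  // True
    }
}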


See more on this question at Stackoverflow