Can anyone please explain why, in 'en-GB' culture formatting, decimal.Parse() returns 15 for "1,5"?
I see the same result for the 'de' culture when inputting "1.5": the return is 15.
I know this is programmatically correct; I'm just trying to understand why it's correct. I've Googled this and come up with nothing :(
I'm trying to validate user input against a culture, populating the input field with a 0 when the parse fails. But populating the field with 15 when the user entered "1,5" doesn't feel right. I feel like it should be "failing" when "1,5" is entered for English formatting, instead of returning 15.
try
{
    validateSource.Text = decimal.Parse(validateThis, NumberStyles.Number, UserCulture).ToString();
}
catch
{
    validateSource.Text = "0";
}
"," is the NumberGroupSeparator in the UK culture, and there's no validation of how many digits appear in each group. So "1,2,3" is parsed as 123, even though that's not how we'd normally expect it to be written. While it would make sense for parsing to check the NumberGroupSizes property, it simply doesn't.
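A quick demonstration of that behaviour, using the cultures and inputs from the question (a sketch; the class name is my own):

```csharp
using System;
using System.Globalization;

class GroupSeparatorDemo
{
    static void Main()
    {
        CultureInfo enGb = CultureInfo.GetCultureInfo("en-GB");
        CultureInfo de = CultureInfo.GetCultureInfo("de");

        // "," is en-GB's group separator, so it's effectively ignored when parsing.
        Console.WriteLine(decimal.Parse("1,5", NumberStyles.Number, enGb));   // 15
        Console.WriteLine(decimal.Parse("1,2,3", NumberStyles.Number, enGb)); // 123

        // "." is the group separator in German, so the same thing happens there.
        Console.WriteLine(decimal.Parse("1.5", NumberStyles.Number, de));     // 15
    }
}
```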
You could emulate this check in a very primitive fashion by formatting the result of the parse and seeing whether it equals the original input:
decimal value;
if (decimal.TryParse(text, NumberStyles.Number, culture, out value))
{
    if (text != value.ToString("N", culture))
    {
        // Okay, looks like something's up... report an error
    }
}
Of course, that would then stop someone from entering "1234.56", as it would expect "1,234.56"... you might want to check multiple format patterns, e.g. "N" and "G", to see whether the input matches any of them.
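That multi-format round-trip check could look something like this (a sketch only; the helper name and the choice of formats are mine, not part of the answer):

```csharp
using System;
using System.Globalization;
using System.Linq;

class RoundTripValidation
{
    // Returns true if the text parses as a decimal AND matches at least one
    // culture-specific rendering of the parsed value ("N" = grouped, "G" = plain).
    static bool TryParseStrict(string text, CultureInfo culture, out decimal value)
    {
        if (!decimal.TryParse(text, NumberStyles.Number, culture, out value))
        {
            return false;
        }
        // Copy to a local: out parameters can't be captured by the lambda below.
        decimal parsed = value;
        return new[] { "N", "G" }.Any(f => text == parsed.ToString(f, culture));
    }

    static void Main()
    {
        CultureInfo enGb = CultureInfo.GetCultureInfo("en-GB");
        Console.WriteLine(TryParseStrict("1.5", enGb, out _));      // True (matches "G")
        Console.WriteLine(TryParseStrict("1,234.56", enGb, out _)); // True (matches "N")
        Console.WriteLine(TryParseStrict("1,5", enGb, out _));      // False: parses as 15
    }
}
```

This rejects "1,5" (which round-trips to "15" or "15.00", neither of which matches the input) while still accepting both the grouped and ungrouped ways of writing the same number.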
See more on this question at Stackoverflow