What is the difference between double and decimal in c#?
Answer Posted / Abdul Samad
double is a 64-bit binary (base-2) floating-point type with roughly 15-17 significant digits of precision and a very large range (about ±1.8 × 10^308). decimal is a 128-bit base-10 floating-point type with 28-29 significant digits of precision; its range is much smaller (about ±7.9 × 10^28), not greater, but it can represent decimal fractions such as 0.1 exactly. That makes decimal the right choice for money and other financial calculations, while double is faster and suited to scientific work where tiny binary rounding errors are acceptable.
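A small sketch of the difference (assuming a standard console project; the class and method names here are just illustrative):

```csharp
using System;

class Program
{
    static void Main()
    {
        // double: 64-bit binary floating point (~15-17 significant digits).
        // 0.1 and 0.2 have no exact binary representation, so the sum drifts.
        double d = 0.1 + 0.2;
        Console.WriteLine(d == 0.3);   // False

        // decimal: 128-bit base-10 floating point (28-29 significant digits).
        // Decimal fractions are stored exactly, so the comparison holds.
        decimal m = 0.1m + 0.2m;
        Console.WriteLine(m == 0.3m);  // True

        // double covers a far larger range than decimal.
        Console.WriteLine(double.MaxValue);  // ~1.8E+308
        Console.WriteLine(decimal.MaxValue); // 79228162514264337593543950335 (~7.9E+28)
    }
}
```

Note the `m` suffix on decimal literals: without it, `0.1` is a double and the assignment will not compile.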