What is the difference between double and decimal in c#?
Answer / Abdul Samad
Double is a 64-bit binary floating-point type with roughly 15-17 significant digits of precision and a very large range (up to about ±1.7 × 10^308). Decimal is a 128-bit base-10 floating-point type with 28-29 significant digits of precision but a much smaller range (up to about ±7.9 × 10^28). Because decimal stores values in base 10, it represents numbers like 0.1 exactly and avoids binary rounding errors, which makes it the preferred type for financial and monetary calculations; double is faster and better suited to scientific computation.
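A small sketch can make the precision difference concrete. The snippet below (a minimal console example, not from the original answer) shows the familiar binary rounding error with double and the exact base-10 result with decimal:

```csharp
using System;

class Program
{
    static void Main()
    {
        // double: 64-bit binary floating point (~15-17 significant digits).
        // 0.1 has no exact binary representation, so small errors accumulate.
        double d = 0.1d + 0.2d;
        Console.WriteLine(d == 0.3d);       // False
        Console.WriteLine(d.ToString("R")); // 0.30000000000000004

        // decimal: 128-bit base-10 floating point (28-29 significant digits).
        // 0.1 is represented exactly, so the sum is exact too.
        decimal m = 0.1m + 0.2m;
        Console.WriteLine(m == 0.3m);       // True

        // Range: double reaches about ±1.7E+308; decimal tops out near ±7.9E+28.
        Console.WriteLine(double.MaxValue);
        Console.WriteLine(decimal.MaxValue);
    }
}
```

Note the `d` and `m` suffixes, which make the literals double and decimal respectively; mixing the two types requires an explicit cast.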
I am using a DataTable and a DataGridView. I am creating two columns in the DataGridView: 1) a DataGridViewComboBoxColumn and 2) a DataGridViewTextBoxColumn. The items of the combo box column are the values in column 1 of the DataTable. I want the data type of the DataGridViewTextBoxColumn to match the data type of the item selected in the combo box at runtime. How can I do that? I also want each cell in a DataTable column to be able to hold a value of a variable data type. Please help. Thanks.
What is an assembly?
What is boxing and unboxing in c#?
Why use a singleton instead of static methods?
Does c# support #define for defining global constants?
Difference between debug.write and trace.write?
Hi. In my database I have an id column defined as an identity (the database increments it by itself). What I want is for the id to be generated automatically when I enter data on my website. Can you please tell me the code? Thanks in advance.
What is difference between ienumerable and iqueryable in c#?
What is field in c#?
Is a c# interface the same as a c++ abstract class?
What is an enumerator in c#?
What are readonly and const in c#?