What is the difference between ANSI and UNICODE strings when passed as arguments to a DLL?
Answers were Sorted based on User's Feedback
Answer / kaan
ANSI supports 256 different characters, so it takes 8 bits (one byte) to store a single character.
Unicode supports 65,536 different characters, so it takes 16 bits (two bytes) to store a single character, and it is more complex.
This difference must be considered when you interact with unmanaged code (the Win32 API), exported functions, etc.
Is This Answer Correct ? | 22 Yes | 1 No |
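The byte-size difference described above can be sketched in Python (used here purely for illustration; from VB6 or C the same bytes would cross the DLL boundary). Code page 1252 stands in for "the" ANSI code page, which is an assumption: the real ANSI code page varies with the system locale.

```python
# The same five-character string, encoded the two ways a DLL might
# expect to receive it. cp1252 is an assumed ANSI code page.
text = "Hello"

ansi_bytes = text.encode("cp1252")        # 8 bits (1 byte) per character
unicode_bytes = text.encode("utf-16-le")  # 16 bits (2 bytes) per character

print(len(ansi_bytes))     # 5
print(len(unicode_bytes))  # 10
```

An ANSI export thus expects a buffer half the size of its Unicode counterpart for the same text, which is why passing the wrong kind of string to a DLL corrupts or truncates data.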
Answer / vipul
ANSI: This standard provides 256 different symbols that a computer can use. It is quick, efficient, and easy to implement, and all modern operating systems understand it (its first 128 symbols are the plain ASCII set).
UNICODE: Unicode allows for up to 65,536 different characters in the 16-bit form that Windows uses. Since Unicode is more complex, not every operating system implements it natively. In Microsoft's line, Windows NT, Windows 2000, Windows XP, and Windows Server 2003 support Unicode natively, as do all later releases of Windows.
Is This Answer Correct ? | 14 Yes | 0 No |
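The 256-symbol limit mentioned above is the practical difference: a character outside the ANSI repertoire simply cannot be passed through an ANSI string, while the Unicode form handles it. A minimal Python sketch (cp1252 again assumed as the ANSI code page):

```python
# Greek capital omega is a valid Unicode character but is not among
# the 256 symbols of the cp1252 ANSI code page.
text = "\u03a9"  # Ω

try:
    text.encode("cp1252")
    print("fits in ANSI")
except UnicodeEncodeError:
    print("not representable in ANSI")

print(len(text.encode("utf-16-le")))  # 2 bytes: one 16-bit code unit
```

On an ANSI-only API, such a character would be replaced or rejected before it ever reached the DLL.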
Answer / guest
ANSI - one byte per character.
UNICODE - two bytes per character; supported natively only on the Windows NT line (Windows 9x is ANSI-based internally).
Is This Answer Correct ? | 17 Yes | 10 No |
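The one-byte-versus-two-byte point is exactly what matters when marshalling a string argument to a DLL. A minimal sketch using Python's ctypes (standing in here for VB6's Declare mechanism): `create_string_buffer` builds an ANSI-style buffer, `create_unicode_buffer` a wide one. Note that `wchar_t` is 2 bytes on Windows but 4 on most Unix systems, so the wide size is computed rather than hard-coded.

```python
import ctypes

# "Hi" plus a terminating NUL: 3 elements in each buffer.
ansi_buf = ctypes.create_string_buffer(b"Hi")  # char[3]
wide_buf = ctypes.create_unicode_buffer("Hi")  # wchar_t[3]

print(ctypes.sizeof(ansi_buf))        # 3 bytes (1 per element)
print(ctypes.sizeof(wide_buf))        # 3 * sizeof(wchar_t)
print(ctypes.sizeof(ctypes.c_wchar))  # 2 on Windows, 4 on most Unix
```

This is also why many Win32 functions come in A/W pairs (e.g. `MessageBoxA` and `MessageBoxW`): each expects the matching buffer layout.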