Tell me, is NULL always defined as 0 (zero)?
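A rough sketch of the usual answer, not an authoritative one: in standard C, NULL is a macro (provided by <stddef.h>, and also by <stdio.h>, <stdlib.h> and a few other headers) that expands to an implementation-defined null pointer constant, commonly 0 or ((void *)0). So at the source level a null pointer compares equal to 0, but the in-memory bit pattern of a null pointer is not required to be all zeros. A minimal example:

#include <stdio.h>
#include <stddef.h>   /* NULL is defined here (and in <stdio.h>, <stdlib.h>, ...) */

int main(void)
{
    int *p = NULL;     /* NULL is a null pointer constant, often 0 or ((void *)0) */

    if (p == NULL)     /* equivalent in source code to: if (p == 0) or if (!p) */
        printf("p is a null pointer\n");

    /* Note: only the source-level constant is 0; the run-time representation
       of a null pointer need not be all bits zero. */
    return 0;
}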
#include <stdio.h> int main() { if ("X" < "x") printf("X smaller than x "); } My question is: what is the mistake in this program? Please find it and tell me (see the corrected sketch at the end of this list).
How do you add two numbers at runtime without using a semicolon?
How do you print "hai" in C?
What is a method in C?
What is wrong with this program statement?
What are bit fields in C?
What is the difference between int and float?
Explain what is wrong in this statement.
What is an unsigned int in C?
If the size of the int data type is two bytes, what is the range of the signed int data type?
What do you mean by a network?
Why does an ordinary variable store the most recent value rather than the initial one?
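For the string-comparison question near the top of this list, a likely reading of the mistake: "X" < "x" compares the addresses of two string literals rather than the characters they contain, and relationally comparing pointers to unrelated objects is not well defined. A corrected sketch, assuming the intent was to compare the letters or the string contents:

#include <stdio.h>
#include <string.h>

int main(void)
{
    if ('X' < 'x')                /* compare the characters themselves (88 < 120 in ASCII) */
        printf("X smaller than x\n");

    if (strcmp("X", "x") < 0)     /* compare string contents, not addresses */
        printf("\"X\" compares less than \"x\"\n");

    return 0;
}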