The language guarantees that `int` is at least 16 bits, `long` is at least 32 bits, and `long` can represent at least all the values that `int` can represent.
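If you want to see the actual ranges on a particular implementation, the macros in `<limits.h>` report them. A minimal sketch (the values printed will of course vary by platform):

```c
#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* These macros describe the implementation's actual ranges,
       which may exceed the minimums the standard guarantees. */
    printf("int:  %d .. %d\n", INT_MIN, INT_MAX);
    printf("long: %ld .. %ld\n", LONG_MIN, LONG_MAX);
    return 0;
}
```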
If you assign a `long` value to an `int` object, it will be implicitly converted. There's no need for an explicit cast; it would merely specify the same conversion that's going to happen anyway.
On your system, where `int` and `long` happen to have the same size and range, the conversion is trivial; it simply copies the value.
On a system where `long` is wider than `int`, if the value won't fit in an `int`, then the result of the conversion is implementation-defined. (Or, starting in C99, it can raise an implementation-defined signal, but I don't know of any compilers that actually do that.) What typically happens is that the high-order bits are discarded, but you shouldn't depend on that. (The rules are different for unsigned types; the result of converting a signed or unsigned integer to an unsigned type is well defined.)
If you need to safely assign a `long` value to an `int` object, you can check that it will fit before doing the assignment:
```c
#include <limits.h> /* for INT_MIN, INT_MAX */

/* ... */

int i;
long li = /* whatever */;

if (li >= INT_MIN && li <= INT_MAX) {
    i = li;
}
else {
    /* do something else? */
}
```
The details of "something else" are going to depend on what you want to do.
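For example, you might saturate to the nearest representable value, or report the failure to the caller. Here's one hedged sketch of the latter approach, using a hypothetical helper named `long_to_int` (not a standard function):

```c
#include <limits.h>
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical helper (not a standard function): stores the converted
   value and returns true if it fits; otherwise returns false and leaves
   *out untouched. */
static bool long_to_int(long value, int *out)
{
    if (value >= INT_MIN && value <= INT_MAX) {
        *out = (int)value;   /* the cast is redundant, but documents intent */
        return true;
    }
    return false;
}

int main(void)
{
    int i;
    if (long_to_int(123456L, &i)) {
        printf("converted: %d\n", i);
    }
    else {
        printf("value out of range for int\n");
    }
    return 0;
}
```

Whether an out-of-range value is a recoverable condition or a programming error is a decision only the caller can make.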
One correction: `int` and `long` are always distinct types, even if they happen to have the same size and representation. Arithmetic types are freely convertible, so this often doesn't make any difference, but for example `int*` and `long*` are distinct and incompatible types; you can't assign a `long*` to an `int*`, or vice versa, without an explicit (and potentially dangerous) cast.
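A small standalone illustration (the commented-out line is the one a conforming compiler is required to diagnose, even where `int` and `long` have the same size):

```c
int main(void)
{
    long l = 42;
    long *lp = &l;

    /* int *ip = lp; */      /* constraint violation: a conforming compiler
                                must issue a diagnostic here */
    int *ip = (int *)lp;     /* the explicit cast compiles, but accessing the
                                long object through ip is not portable */

    (void)ip;                /* silence "unused variable" warnings */
    return 0;
}
```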
And if you find yourself needing to convert a `long` value to `int`, the first thing you should do is reconsider your code's design. Sometimes such conversions are necessary, but more often they're a sign that the `int` to which you're assigning should have been defined as a `long` in the first place.