Not specifically, no. You can create an array with dtype='object', which gives you an array of references to arbitrary Python objects (including, but not limited to, ints). This gets you a lot of Numpy's array-like functionality but few to none of its performance benefits. Which is to say, an array of Python objects is not significantly different from a Python list in terms of memory use or speed. Still, if you must use bigints, it may be preferable to a list, since you keep element-wise arithmetic operations, including operations with other Numpy arrays. For example:
In [1]: import numpy as np
In [2]: big = np.array([10**100, 10**101, 10**102], dtype='object')
In [3]: big
Out[3]:
array([10000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000,
       100000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000,
       1000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000], dtype=object)
In [4]: big + np.array([1, 2, 3])
Out[4]:
array([10000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001,
       100000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000002,
       1000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000003], dtype=object)
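To make the list comparison concrete: each slot in an object array holds only a pointer to a heap-allocated Python int, and arithmetic dispatches through those Python objects rather than running a tight C loop. A rough sketch continuing the session above (the variable names are mine, the sizes assume a 64-bit build, and the timing output is omitted since it varies by machine):

In [5]: big.itemsize   # one slot is one pointer: 8 bytes on a 64-bit build
Out[5]: 8

In [6]: big.nbytes     # 3 pointers; the bigint digits themselves live outside the array
Out[6]: 24

In [7]: native = np.arange(10**6)      # ordinary int64 array
In [8]: boxed = native.astype(object)  # same values, boxed as Python ints

In [9]: %timeit native.sum()   # tight C loop over machine integers
In [10]: %timeit boxed.sum()   # per-element Python dispatch; substantially slower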
I've never used this capability myself, though, so I'm not entirely sure what other surprising limitations might arise.
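One limitation that does come up (the details here may vary across Numpy versions): most ufuncs applied to an object array work by calling a method of the same name on each element, so math functions that Python ints don't implement simply fail:

In [11]: np.sqrt(big)  # object-dtype ufuncs delegate to an element-level .sqrt() method
---------------------------------------------------------------------------
AttributeError: 'int' object has no attribute 'sqrt'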