It just so happens that the tokens you want to split on are already valid Python tokens, so you can use the built-in tokenize
module. It's almost a one-liner; this program:
from io import StringIO
from tokenize import generate_tokens

STRING = 1  # index of the token's text in the tuples yielded by generate_tokens

print(
    list(
        token[STRING]
        for token in generate_tokens(StringIO("2+24*48/32").readline)
        if token[STRING]  # skip the trailing NEWLINE/ENDMARKER tokens, whose text is empty
    )
)
produces this output:
['2', '+', '24', '*', '48', '/', '32']
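
In Python 3, generate_tokens yields TokenInfo named tuples, so you can use attribute access instead of a numeric index. A minimal sketch of the same idea wrapped in a helper (the name split_expression is just for illustration):

from io import StringIO
from tokenize import generate_tokens

def split_expression(expr):
    # TokenInfo.string holds the token's text; the trailing NEWLINE and
    # ENDMARKER tokens have an empty string, so filter them out.
    return [tok.string for tok in generate_tokens(StringIO(expr).readline) if tok.string]

print(split_expression("2+24*48/32"))  # ['2', '+', '24', '*', '48', '/', '32']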