What is the difference between String.prototype.codePointAt() and String.prototype.charCodeAt() in JavaScript?
String.prototype.codePointAt()
String.prototype.charCodeAt()
'A'.codePointAt(); // 65
'A'.charCodeAt();  // 65
From Mozilla:
The charCodeAt() method returns an integer between 0 and 65535 representing the UTF-16 code unit at the given index (the UTF-16 code unit matches the Unicode code point for code points representable in a single UTF-16 code unit, but might also be the first code unit of a surrogate pair for code points not representable in a single UTF-16 code unit, e.g. Unicode code points > 0x10000). If you want the entire code point value, use codePointAt().
In short: charCodeAt() works on UTF-16 code units, while codePointAt() works on Unicode code points. For characters that fit in a single code unit (code points up to 0xFFFF) the two methods return the same value; for characters encoded as a surrogate pair, charCodeAt() returns only the first code unit, whereas codePointAt() returns the full code point.
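Here is a minimal sketch of that difference; the emoji is just an arbitrary example of a character outside the Basic Multilingual Plane:

const s = '😄'; // U+1F604, stored as the surrogate pair 0xD83D 0xDE04

s.length;         // 2 (length counts UTF-16 code units, not characters)
s.charCodeAt(0);  // 55357 (0xD83D), only the first code unit of the pair
s.charCodeAt(1);  // 56836 (0xDE04), the second code unit
s.codePointAt(0); // 128516 (0x1F604), the whole code point
s.codePointAt(1); // 56836 (0xDE04), because index 1 falls on the trailing surrogate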