The point of the list is not to complain, but to educate (informing people about unexpected and surprisingly broken things) and to propose workarounds.
Base64 decoding is not validated: Buffer.from() silently ignores characters outside the base64 alphabet instead of throwing, so arbitrary garbage "decodes" successfully and does not survive a round trip.
Bug report: nodejs/node#8569.
var x = Buffer.from("вот зе фак", "base64") // -> <Buffer d8 1e f9 0f 4f>
var y = x.toString('base64') // -> '2B75D08='
Validate strings manually, for example, using this regex:
function validateBase64(s) {
  if (!/^(?:[A-Za-z0-9+/]{4})*(?:[A-Za-z0-9+/]{2}==|[A-Za-z0-9+/]{3}=)?$/.test(s)) {
    throw new TypeError('invalid encoding');
  }
}
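Combined with that check, a strict decoding helper could look like this (a sketch; decodeBase64Strict is a name of my choosing):
function decodeBase64Strict(s) {
  validateBase64(s); // throws on malformed input
  return Buffer.from(s, 'base64');
}
decodeBase64Strict('aGVsbG8=')   // -> <Buffer 68 65 6c 6c 6f>
decodeBase64Strict('вот зе фак') // -> TypeError: invalid encoding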
All logging should go to stderr, not only console.error output; otherwise it's impossible to write programs that use stdio, especially when third-party libraries call console.log.
Replace the global console.log with your own:
console.log = console.error.bind(console);
Hopefully this doesn't break anything.
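With that override in place, stdout stays reserved for actual program output, so the program can still be piped safely. A minimal sketch:
// A third-party library calling console.log now writes to stderr,
console.log('debug output from some library');
// while real program output goes to stdout explicitly and survives
// piping (e.g. node app.js | jq .):
process.stdout.write(JSON.stringify({ ok: true }) + '\n');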
Modern operating systems use 64-bit inode numbers (roughly, file identifiers). In Node.js they are converted to JavaScript numbers (double in C++), which can hold exact integers only up to 2^53. If the operating system hands out numbers in the range from 2^53+1 to 2^64-1, some of them will collide during conversion: the OS gives one number, but a different number ends up in Stats.ino.
Bug report: nodejs/node#12115.
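The precision loss is easy to demonstrate in plain JavaScript:
var exact = Math.pow(2, 53)     // -> 9007199254740992
var inode = Math.pow(2, 53) + 1 // a perfectly valid 64-bit inode value
inode === exact                 // -> true, the two values collide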
Looking at the code of the fs module, I see that theoretically there could also be problems with other data types in the fs.Stats structure: libuv declares most of the fields as uint64_t for portability. From quick research, though, operating systems only use 64-bit integers for inode, size, and blksize, so currently the problem is limited to inode, which can be any random number: a file size over 2^53 bytes would be a whopping 9 petabytes, and an overflowing block size is even less plausible (for example, NTFS and ext4 don't even support files that large).
Because JavaScript doesn't have 64-bit integers yet, the current proposal that makes the most sense is to expose the inode number as a string.
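For illustration, strings preserve the exact value where doubles don't; a minimal sketch, assuming a hypothetical string-valued Stats.ino:
var a = '9007199254740993' // 2^53 + 1, as a decimal string
var b = '9007199254740992' // 2^53, as a decimal string
a === b                    // -> false, string comparison is exact
Number(a) === Number(b)    // -> true, the doubles collide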