worryingly jolly batman ([personal profile] labellementeuse) wrote 2004-10-27 08:19 am

(no subject)

I'm re-reading Snow Crash, by Neal Stephenson.

And I'm so curious as to whether he's talking out of his ass or not.

So, [livejournal.com profile] gianp, oh font of all things comp sci, what is the significance of the number 65,536? Or 256? (Anyone else can also apply. But it doesn't count if you've read the book, [livejournal.com profile] sixth_light.)

[identity profile] gianp.livejournal.com 2004-10-26 10:28 pm (UTC)(link)
65,536 is the number of distinct values a 16-bit unsigned integer can represent, so the largest such value is 65,535. Likewise, 256 is the number of values an unsigned 8-bit integer can hold, with 255 as the maximum.
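A quick Python sketch of the arithmetic behind that, for anyone following along (the loop and the printout are illustrative, not anything from the book):

```python
# An n-bit unsigned integer can take 2**n distinct values,
# running from 0 up to 2**n - 1.
for bits in (8, 16):
    values = 2 ** bits
    print(f"{bits}-bit: {values} values, 0 to {values - 1}")

# 8-bit: 256 values, 0 to 255
# 16-bit: 65536 values, 0 to 65535
```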

[identity profile] labellementeuse.livejournal.com 2004-10-27 01:20 am (UTC)(link)
I don't even know what that means.

... but I'll just take your word for it, and assume Stephenson wasn't actually making it up. How 'bout that. :D

[identity profile] gianp.livejournal.com 2004-10-27 04:37 am (UTC)(link)
What was it that he claimed?

[identity profile] labellementeuse.livejournal.com 2004-10-27 07:17 pm (UTC)(link)
I left the book at home, so I can't quote it like I was going to ... but basically he was talking about hackers, saying that "to most people... the number 65,536 is insignificant... except to hackers, who recognise it more readily than their mother's birth date." He went on to explain that 65,536 is 2^16, 256 is 2^8, and so forth, which isn't quite what you said, but I'm thinking they relate. He also said that "any number that can be created by fetishistically multiplying twos together and subtracting the occasional one is recognisable to a hacker... two is the most important number in computers, because that is the number of digits a computer can recognise: 0 and 1."

That's more of a summary than a quote. It was interesting, but the book's about ten or so years old, and I was wondering how many programmers actually code in binary any more, or how important binary is in computing... I mean, obviously it'll always be important, right, but on a day-to-day basis?
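That "multiplying twos together and subtracting the occasional one" recipe from the quote above really does generate the numbers programmers stare at all day. A small Python sketch of the pattern (my own illustration, not Stephenson's):

```python
# Powers of two, and powers of two minus one: the numbers
# "created by fetishistically multiplying twos together
# and subtracting the occasional one".
for n in (8, 16, 32):
    print(f"2**{n} = {2 ** n}    2**{n} - 1 = {2 ** n - 1}")

# 2**8 = 256    2**8 - 1 = 255
# 2**16 = 65536    2**16 - 1 = 65535
# 2**32 = 4294967296    2**32 - 1 = 4294967295
```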

[identity profile] gianp.livejournal.com 2004-10-27 08:14 pm (UTC)(link)
The 2^16, 2^8 thing is basically what I said - I just drew further conclusions.

Binary usually ceases to be important once you get about three steps removed from the hardware. There are some particular circumstances where it remains useful beyond that, but I think the majority of programmers these days could survive without any knowledge of binary beyond 'it exists'.
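For a concrete picture of the kind of close-to-hardware work where binary does still matter, here's a hypothetical Python example of bit-twiddling (the packed-colour scenario is invented for illustration):

```python
# Pulling individual bytes out of a packed word is everyday
# binary work near the hardware. Here a 24-bit RGB colour is
# stored as one integer, 8 bits per channel (0xRRGGBB).
colour = 0x1E90FF               # "dodger blue"

red   = (colour >> 16) & 0xFF   # shift the top byte down, mask off the rest
green = (colour >> 8) & 0xFF
blue  = colour & 0xFF           # 0xFF is 255, i.e. 2**8 - 1: an 8-bit mask

print(red, green, blue)         # 30 144 255
```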

[identity profile] labellementeuse.livejournal.com 2004-10-28 02:09 am (UTC)(link)
Yeah, that's what I thought. It all looked about right, anyway.

I thought it might be something like that... kind of like the way it's not all that necessary to know how to add anymore, what with calculators, or something. Actually, no, that analogy's total crap, I can't really think of a good one. *shrugs* But I can understand why that would be so.