Most of this seems silly. OS X is the only laptop OS? Since so many people buy PC laptops, that answer is clearly wrong more than 75% of the time (unless he wants newbies to try to hackintosh). Also, what's with the "no black on white" thing? Can someone explain that to me?
"In order for legibility to be achieved, a certain degree of contrast must exist between the background and foreground colors. However, it’s important to remember that computer screens have much greater black/white contrast than the typical printed page. To that end, many web designers prefer off-black to pure black on white backgrounds. Likewise, it is sometimes more elegant to use very light gray instead of pure white on black backgrounds."
The above quote is taken from "Elegant Web Typography" by Jeff Croft.
He gave his 75% answers. The readers of this article will have their own 75% answers, and if they don't, then his are just fine for the demographic. It's pretty smart really, and subtle.
They are not my answers, but that's very far from being the point. Well, actually, that IS the point!
The idea behind "no black on white" is that the contrast on most monitors is high enough that using the darkest color for text and the brightest color for background ends up being uncomfortable to read. So, use black on off-white (like HN), gray on white (like the posted article), or some similar combination.
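To put rough numbers on that (my own illustration, not from the comment): the WCAG contrast-ratio formula is one common way to quantify how harsh a text/background pairing is. Pure black on pure white maxes out at 21:1, while the off-black/off-white combinations mentioned above come out noticeably lower. A minimal C sketch, with illustrative hex values (the "HN-ish" background color is an assumption on my part):

    /* Sketch: WCAG contrast ratio for a couple of text/background pairs.
     * Compile with something like: cc contrast.c -lm */
    #include <math.h>
    #include <stdio.h>

    /* Linearize an 8-bit sRGB channel (WCAG definition). */
    static double linearize(int c8)
    {
        double c = c8 / 255.0;
        return (c <= 0.03928) ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4);
    }

    /* WCAG relative luminance of an RGB color. */
    static double luminance(int r, int g, int b)
    {
        return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
    }

    /* WCAG contrast ratio between two luminances (always >= 1). */
    static double contrast(double a, double b)
    {
        double hi = a > b ? a : b, lo = a > b ? b : a;
        return (hi + 0.05) / (lo + 0.05);
    }

    int main(void)
    {
        double black = luminance(0x00, 0x00, 0x00);
        double white = luminance(0xFF, 0xFF, 0xFF);
        double gray  = luminance(0x33, 0x33, 0x33);  /* off-black text             */
        double cream = luminance(0xF6, 0xF6, 0xEF);  /* HN-ish off-white (assumed) */

        printf("black on white: %.1f:1\n", contrast(black, white));  /* 21.0:1           */
        printf("gray on cream:  %.1f:1\n", contrast(gray, cream));   /* noticeably lower */
        return 0;
    }

The exact numbers don't matter much; the point is just that the "softer" combinations measurably reduce contrast without hurting legibility.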
No; if you want a laptop, don't know what OS to go for, and don't have a strong preference already, pick OS X. Not "OS X is the only option".
If you already want Windows, or you want the cheapest option and therefore picked Windows, then you're not asking non-questions on the web and you're not the target of the blog post.
What a horrendous answer; no new programmer should start with C in 2009. Java or PHP or Python or any damn thing but C and jockeying memory references.
And it's so much better to go "Oh, this is so much easier than C!" rather than "Oh, this is so much worse than Python!" later on. It also tends to be easier to learn a new, easier way to do something in a new language than to figure out the harder way with the easy way looming at the back of your mind.
Assembly's too high-level :) In my school's sophomore year, CS majors are required to implement a CPU on an FPGA; the instruction set, CPU architecture, everything are up to the student groups to design and implement. At the end of the class, all the student processors are compared w.r.t. size and speed of execution of various simple programs. Our group had a guy that wrote an assembler for our ISA, but most of the groups just wrote their programs in raw machine code, using a hex editor.
Really, nothing teaches low-level programming better than starting with gates and building your way up to programs. The sequel to that class teaches how to deal with pipelining, CPU caches, and multiple-execution chips like the TI DSPs, so after you design your processor, you get to see how real CPUs work. It's really fun, actually.
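For anyone who hasn't done this: "raw machine code" really does just mean packing opcodes and operands into words by hand. Purely as a hypothetical illustration (this is an invented toy ISA, not any of the student designs from that class), encoding a three-instruction program might look like:

    /* Hypothetical 16-bit toy ISA: 4-bit opcode, two 4-bit registers,
     * 4-bit immediate.  Invented for illustration only. */
    #include <stdint.h>
    #include <stdio.h>

    enum { OP_LOADI = 0x1, OP_ADD = 0x2 };

    /* Pack one instruction into a 16-bit word: oooo dddd ssss iiii */
    static uint16_t encode(unsigned op, unsigned rd, unsigned rs, unsigned imm)
    {
        return (uint16_t)((op & 0xF) << 12 | (rd & 0xF) << 8 |
                          (rs & 0xF) << 4  | (imm & 0xF));
    }

    int main(void)
    {
        /* "r1 = 5; r2 = 7; r1 = r1 + r2" -- these three hex words are
         * what you'd type into the hex editor instead of assembly. */
        uint16_t program[] = {
            encode(OP_LOADI, 1, 0, 5),
            encode(OP_LOADI, 2, 0, 7),
            encode(OP_ADD,   1, 2, 0),
        };
        for (unsigned i = 0; i < sizeof program / sizeof program[0]; i++)
            printf("%04X\n", (unsigned)program[i]);
        return 0;
    }

An assembler is essentially a program that does that packing for you from mnemonic text, which is why one person in the group could knock one out.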
The intro computer architecture course used _Computer Organization and Design_ by Patterson and Hennessy. The practical component described here was driven by handouts; I'm not sure the text covers building something like this.
Perhaps C offers a sweet spot of being able to accomplish something with minimal effort, but still gaining a deep understanding of what the computer is actually doing.
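As a small concrete example of that (my sketch, not the commenter's): even trivial C code surfaces machine-level facts that higher-level languages hide, like indexing being pointer arithmetic and an int being nothing more than a handful of bytes.

    /* Tiny sketch of the low-level details plain C makes visible. */
    #include <stdio.h>

    int main(void)
    {
        int a[4] = {10, 20, 30, 40};

        /* a[2] is literally *(a + 2): indexing is pointer arithmetic. */
        printf("%d %d\n", a[2], *(a + 2));            /* 30 30 */

        /* An int is just sizeof(int) bytes in memory; inspect them directly.
         * The order you see depends on the machine's endianness. */
        int x = 0x11223344;
        const unsigned char *p = (const unsigned char *)&x;
        for (size_t i = 0; i < sizeof x; i++)
            printf("%02X ", (unsigned)p[i]);          /* e.g. 44 33 22 11 on x86 */
        printf("\n");
        return 0;
    }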
Those are actually good ideas, and if you received a computer science degree from Carnegie Mellon University, you may have witnessed first hand their actualization in CS 213, Introduction to Computer Systems:
1. Ints are not integers, floats are not reals. Our finite representations of numbers have significant limitations, and because of these limitations we sometimes have to think in terms of bit-level representations. [See the short C sketch after this list.]
2. You've got to know assembly language. Even if you never write programs in assembly, the behavior of a program sometimes cannot be understood purely based on the abstraction of a high-level language. Further, understanding the effects of bugs requires familiarity with the machine-level model.
3. Memory matters. Computer memory is not unbounded. It must be allocated and managed. Memory referencing errors are especially pernicious. An erroneous updating of one object can cause a change in some logically unrelated object. Also, the combination of caching and virtual memory provides the functionality of a uniform unbounded address space, but not the performance.
4. There is more to performance than asymptotic complexity. Constant factors also matter. There are systematic ways to evaluate and improve program performance.
5. Computers do more than execute instructions. They also need to get data in and out, and they interact with other systems over networks.
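To make point 1 concrete, here is a short sketch of my own (not from the CMU course material):

    /* Machine ints and floats only approximate integers and reals. */
    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* Unsigned arithmetic wraps around instead of growing forever. */
        unsigned int u = UINT_MAX;
        printf("UINT_MAX + 1 == %u\n", u + 1);        /* 0 */

        /* A float has a 24-bit significand, so not every integer fits... */
        float f = 16777217.0f;                        /* 2^24 + 1 */
        printf("16777217.0f prints as %.1f\n", f);    /* 16777216.0 */

        /* ...and most decimal fractions aren't exact in binary either. */
        double sum = 0.1 + 0.2;
        printf("0.1 + 0.2 == 0.3? %s (%.17f)\n",
               sum == 0.3 ? "yes" : "no", sum);       /* no */
        return 0;
    }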
That's a 200-level class; as far as I know, there is always a CS1 100-level class that usually starts with some EXTREMELY high-level language, sometimes even a metalanguage made just for the class itself. I heard somewhere that Python is in style for this. Here is one such course: http://www.academicearth.org/courses/introduction-to-compute...
*EDIT - I made a stupid comment... they have C assignments in that class. I'll leave it for others' amusement. I have seen that style of starting with a high-level language, though...
I never liked this argument. It has no inherent limitations and no proper evidence behind it. You can just as easily say "you should not use a washing machine until you understand the pain of washing clothes with cold water." Or that you should drive stick before going automatic. Or start programming in assembly, so everything will seem easier afterwards.
Yes, modern languages are better than C. But they are not perfect. Programmers using them still have a lot of problems to solve and things to learn; they're just different things than with C, or assembly, or wiring transistors by hand. You have to face it: pointer arithmetic, while tremendously important in C, isn't really very useful in other languages. Knowing when the GC likes to start collecting, on the other hand, may be a good piece of information.
Wouldn't it be silly if a grown man got a bit of mud on his shoes and yelled "OH NO" and ran out of the room to the nearest washing machine to get his entire suit cleaned? In my experience, these are the actionscript-only coders, the people who only ever learned higher-level languages. They'll spawn a whole new object just to store a temporary number, or other silly things. They don't think it's silly, though. They don't know what's going on behind the scenes.
There's also just an assload of code out there in C. Regardless of what you think of getting close to the hardware, since most of the stuff running our desktops / servers is C / C++ / Obj-C, it's quite useful if you want to be able to dip into that world.
I certainly agree that learning C illustrates some extremely useful compsci principles. I also believe you can be a wizard at solving computational problems with very little practical programming experience; I met many such algorithms PhDs in my failed journey to become an algorithms PhD. That being said, I made this comment because the title of the post is "The 75% answer to all newbie questions": C is not a language that a newbie who needs a 75% answer is going to do well with! There's a reason they don't use C for CS1.
Helvetica Bold for headings, Times New Roman for body text.
If you want to get fancy, Gill Sans and some variety of Garamond respectively.
If you're printing code or data, Consolas is a very pretty monospaced font, followed up by Droid Sans Mono, Lucida Typewriter, DejaVu Sans Mono, or Andale Mono.
This is really good. I like the idea of using statistically relevant answers in lieu of absolute answers. Like in this case: 75% of the time, Linux may be the best type of server, but there is always that other 25% of the time where there are exceptions.
Reminds me of using the principle of mediocrity, proposed by physicist Alex Vilenkin, to make cosmological predictions when dealing with things like inflationary theory. If there is a bell curve for some constant we are trying to establish, we predict this constant will be somewhere in the middle, chopping off the two extremes at both ends. Say the theory is some version of Guth's inflationary theory, and it predicts a constant that varies over some spectrum; we then predict we will see that constant lie somewhere in the middle of the bell curve. If we observe or measure that constant to be somewhere around where we had predicted it, that lends weight to the theory we were working from. The idea being that we live in an average world, e.g. in an "average" inflationary bubble, hence the principle of mediocrity.
The key point is that we CAN actually use statistics to make predictions about the world, and need not always make precise predictions, which, in some versions of string and/or inflationary theory, one cannot, simply because of the infinite number of bubbles in inflation or the near-infinite number of possible string theories (10^500 values for the false vacuum, I think I read somewhere).
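In symbols (my own paraphrase of the "chop off the extremes" idea, not Vilenkin's notation): if the constant X has some probability distribution across the bubbles, with quantile values x_q, the mediocrity prediction is just a central interval, for example

    \[ P\left( x_{0.025} \le X \le x_{0.975} \right) = 0.95 \]

i.e. bet that the observed value lands in the middle 95% of the distribution, discarding 2.5% in each tail.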
Anyway the idea that there is no one right answer to a question that works everywhere seems sound in some cases.
"Stupid people" is a bit harsh and not very accurate. It's more accurate to say that we need more quick answer guides for the less experienced. We were all there at one point in time.
Except there are other common mistakes people make, so it's not at all clear that you'll get to 75%.