greggraham's comments | Hacker News

I was planning on buying a KIM-1 when my dad surprised me by buying an Apple II. I didn't end up writing anything in assembly on the Apple II, though. My first assembly language was IBM-370 in college. I wish now I had started with the KIM-1, though.


Exactly! The existence of something like a reliable Internet depends on some form of physical civilization with laws and law enforcement, not to mention food production and distribution required to keep the netizens alive.

Will some aspects of life move from the physical to the virtual? Sure; they have and they will continue to some extent. However, there is a limit. Cyberspace cannot exist without physical space. If you want to ascend into some kind of purely spiritual existence, you need to look elsewhere than technology.


Mesh networks.


I learned to program in BASIC on an Apple II, and although it was my entry language, I don't see it as necessary now. The school I work at uses MIT Scratch with the 5th graders, and they love it. I'm teaching Java in a high school computer science elective because that's what the AP Exam uses and what they're likely to start with in college. I think Python, Ruby, or Lua would be better choices, but Java is good enough.

However, I'm also exposing them to logic circuits, machine language, and assembly so that they have a better understanding of how the computer actually works. I want them to understand the basics of what the compiler and VM are doing so that Java is not completely abstract to them. As experienced developers know, these abstractions have leaks, so it's helpful to know what's going on underneath. I see BASIC as a dead-end abstraction. I would rather use assembly to teach a low-level understanding, and then use modern languages at the higher level.


Notice that technology is not a critical part of their success. I didn't see a lot of shots of kids in front of computers, but rather, teachers and students interacting with each other. I work at a private school that has a similar philosophy, which contrasts with the other private schools in the area with which we compete. We're not interested in 1-to-1 laptop programs because we don't want devices getting in the way of personal relationships. Of the upper-tier private schools in the area, we have the highest scores and the lowest tuition, and very good acceptance rates at top colleges. We invest in teachers who are well educated (all hold at least a master's, several have PhDs) and happy (very high retention rate).

I don't mean to boast, but as technology director at this school, I'm always being asked about laptops, SmartBoards, and other cool and popular education technology that we have decided not to use. I think it's important to carefully consider how technology can best be used, and to know when to stay with low-tech methods that work well.


I recently read _Cryptonomicon_ after having put it down last year. It was fun, and overall it was definitely worth it. I'm reading his _Snow Crash_ now. It was published in 1992, before the Web took off, and it predicts a time close to the present day. It's interesting to see what he got right, and what he got wrong. It's also a fun read if you are generous with your suspension of disbelief.


Gattaca is a well-done movie in general. My wife liked it, and she prefers art films and usually does not like science fiction.


I use Aquamacs at home on my Macintosh, and GNU Emacs on Windows XP at work. I have switched back and forth between Emacs and vi/vim over the last 25 years because they are both great editors with various advantages and disadvantages. I had been using Vim back when I was using Ubuntu at home because the fonts looked better in Vim than in Emacs. Now that I have a Macintosh and can run Aquamacs, I have switched back to the Emacs world. I admit I'm a sucker for pretty text.


Old versions of BASIC required LET for assignment, but Microsoft BASIC and other BASICs of that era made it optional, treating it as unnecessary.


It doesn't necessarily take any longer to live in the moment. It's a matter of where your mind is focused. In fact, if you are focused on what you are doing, you will likely be more efficient and make fewer mistakes. If the task ends up taking longer, it would only be because you did a better job. That is, you might be able to rush through washing the dishes with your mind on the future, but your dishes would still be dirty.


I currently use Python and Django for the following reasons:

1) A coworker is familiar with Python.

2) I had an easier time understanding the mechanics of Django than I did Rails.

3) Some libraries I needed were only available in Python.

However, from a language point of view, I have a slight preference for Ruby. Here are some things I like:

1) A more consistent commitment to the object paradigm, e.g. all function calls are actually method calls on an object. In Python, everything is an object, but sometimes you use methods and sometimes you don't, e.g. list.pop() vs. len(list).

2) I think blocks are very handy.

3) I like the more Perlish regex syntax.

4) There is a cleverness to Ruby that might not be the best thing for a corporate environment, but it makes the language fun.
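The pop/len asymmetry being described can be shown in a couple of lines (a minimal sketch, plain Python, no extra setup):

```python
# pop is spelled as a method on the list object,
# while len is spelled as a built-in function taking the list.
nums = [1, 2, 3]
last = nums.pop()   # method call; removes and returns 3
n = len(nums)       # built-in function; the list now has 2 items
print(last, n)      # 3 2
```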

So, if I were writing code just for myself, I might choose Ruby over Python, assuming performance and library support were sufficient.


Humor me for a moment, because I've been asking this question of many people for a long time and have never yet gotten a straight answer.

Why is len() always the thing people pick on? Why is it always "Python uses len(obj) instead of obj.len()", and never "Python uses str(obj) instead of obj.str()", or any of the various polymorphic built-ins like sum()? I've seen this so many times now that I'm honestly quite curious about it.

Also:

"More consistent commitment to object paradigm, e.g. all function calls are actually method calls on an object."

Every function call in Python is always an invocation of a method on an object, even if it doesn't look like it. len(), of course, delegates to a method on the object. But even standalone functions are method calls. Try this:

    >>> def add(x, y):
    ...     return x + y
    ...
    >>> add(3, 5)
    8
    >>> import types
    >>> types.FunctionType.__call__(add, 3, 5)
    8
The last couple of lines there are what's really going on when you call the function.

(Also, just as Ruby's "built-in" functions are really methods on Kernel, which is made available globally, Python's "built-in" functions really live in a module that's made available globally: __builtin__ in Python 2, builtins in Python 3, exposed in each module's namespace as __builtins__.)
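The relationship is easy to check from a Python 3 interpreter, where the module is spelled builtins (a quick sketch):

```python
import builtins

# The global name len and the module attribute are the very same object,
# so the "built-in function" is just an ordinary module-level name.
print(builtins.len is len)     # True
print(builtins.len([1, 2]))    # 2
```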


I'm starting to feel like a broken record here, but whenever anyone wants to pick on "str" I find myself compelled to point out:

  >>> type(str)
  <type 'type'>
That is to say, str isn't a magical function; it's a type, and "str()" is how you call that type's constructor. How is that not object oriented?

Of course, even if str were a function and not a constructor, it would be an object factory that delegates to its argument via a well-defined interface (the __str__ method). OOP doesn't have to mean everything-is-a-method-call.
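To make the delegation concrete, here is a sketch (Point is just an invented example class, not anything from the standard library):

```python
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __str__(self):
        # str(p) delegates to this method
        return "(%s, %s)" % (self.x, self.y)

p = Point(1, 2)
print(str(p))  # (1, 2)
```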


Even without going to the trouble of importing types, every Python callable has an __call__ method, including the __call__ method itself:

    >>> def f():
    ...     print("Hello __call__")
    ... 
    >>> f.__call__.__call__.__call__.__call__()
    Hello __call__
Don't use this in real life, though; there's a performance penalty.
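The penalty is easy to measure with timeit; exact numbers vary by machine and interpreter, so this is just a sketch, not a benchmark:

```python
import timeit

setup = "def f(): pass"
# The __call__ spelling adds an attribute lookup on every call,
# so it is typically the slower of the two timings printed here.
direct = timeit.timeit("f()", setup=setup, number=200_000)
dunder = timeit.timeit("f.__call__()", setup=setup, number=200_000)
print(direct, dunder)
```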


Of course, there's also the singleton that always returns itself:

  >>> recursive = lambda: recursive
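Calling it any number of times still yields the same object, which is easy to verify:

```python
recursive = lambda: recursive

# Every call returns the lambda itself, at any depth.
print(recursive() is recursive)        # True
print(recursive()()()() is recursive)  # True
```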


len(list) is actually a shortcut for calling list.__len__(); any object that implements the __len__ method is therefore compatible with it. So in practice, len is in fact a method call. Almost every Python function that operates on objects is a similar shorthand.
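For example, a user-defined class picks up len() support just by implementing __len__ (Deck is an invented example class):

```python
class Deck:
    def __init__(self, cards):
        self.cards = cards

    def __len__(self):
        # len(deck) delegates to this method
        return len(self.cards)

d = Deck(["ace", "king", "queen"])
print(len(d))        # 3
print(d.__len__())   # 3, the same call spelled out explicitly
```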

