I notice the Arc source contains a few references to Scheme 48, specifically to things not working in it. But if those got fixed, then you'd have access to C libraries, because IIRC Scheme 48 was built to interface well with C.
The only thing that seems to be affected, going by the comments, is the conversion from characters to integers and vice versa. If that's the only actual problem, there are only about ten places in the Arc code that need a replacement fix.
Agreed, The Little Schemer is an awesome book; nothing about Lisp made sense to me until I worked through it. Afterwards, pg's book just felt like a bunch of useful information rather than a conceptual overload.
Speaking of hash tables: I remember in ACL you explained why CL's gethash returns two values, to differentiate the nil meaning "X is stored as nil in my table" from the nil meaning "X is not stored in my table". So why not in Arc?
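(For anyone who hasn't seen it, here's a minimal Common Lisp sketch of the two-value gethash protocol; the table and keys are made up:)

    (let ((h (make-hash-table)))
      (setf (gethash 'a h) nil)                  ; 'a is present, stored as nil
      (multiple-value-bind (val present-p) (gethash 'a h)
        (print (list val present-p)))            ; => (NIL T)   -- present, value nil
      (multiple-value-bind (val present-p) (gethash 'b h)
        (print (list val present-p))))           ; => (NIL NIL) -- never stored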
Because it turns out that in practice it's rarely, if ever, an issue. If you have nil as a value in a hash table, you usually don't care whether that's because it was never set or because it was explicitly set to nil.
Dear Paul, I cannot believe you would make such a statement. Either you're living in a vastly different programming universe than the one I am living in, or you really haven't done that much programming at all. In any case, there are many situations where one stores types of values that may include nil in a hash table, and in most of these there is a very significant difference between 'value is nil' and 'value is not stored'. I understand that Arc isn't trying to be all 'enterprisey', but these are fundamental concepts that, I thought, only complete amateurs did not understand. Sincerely, Dr. Drake
You know, I do actually understand the difference between the two cases. What I'm saying is that in my experience hash tables that actually need to contain nil as a value are many times less common than those that don't.
In situations where the values you're storing might be nil, you just enclose all the values in lists.
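For concreteness, a minimal sketch of that wrapping idiom in Common Lisp (the helper names here are made up):

    (defun put-wrapped (key val table)
      (setf (gethash key table) (list val)))   ; wrap every value, even nil

    (defun get-wrapped (key table)
      (gethash key table))                     ; NIL => absent, (NIL) => stored nil

With every value wrapped, a nil lookup result unambiguously means "absent", and (nil) means "present, stored as nil", so the single return value suffices.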
My goal in Arc is to have elegant solutions for the cases that actually happen, at the expense of elegance in solutions for rare edge cases.
Seems like the Arc-philosophy response would be: let 'em redefine _ if they want to. And that philosophy will probably make me want to murder somebody one day.
Driving a Ferrari is rarely a collaborative endeavor :-)
Keeping `_' safe in fact increases the programmer's freedom. If I knew that by doing something I was going to mess up other people's code and how they expect their code to behave, I would rather *not* do it. OTOH, if there are namespaces, etc., then I know that I can enforce whatever coding conventions I want, in the privacy of my personal sandbox.
Programming is not only about communicating w/ computers; it is also about communicating w/ other people, i.e. your coworkers and so on. Nothing that makes that harder can be a Good Thing. (Unless, of course, you are the Lone Wolf coding in your cave, but then you are probably using your Own Better Language and don't care about Arc anyway ;-).)
Freedom is an interesting philosophical topic. How can a restriction make you more free? Well, if you drink the FSF GPL Richard Stallman branded Kool-Aid, you might "get it", but I sure don't.
I don't want the freedom from doing things; I want the freedom to do things.
That said, the strictness, protection, and safety of some language features are something some people would like to rely on.
I think operator overloading in C++ was a huge freaking mistake, for instance. You can't look at a piece of code without going through some contextual learning to find out what the hell "+" was just redefined to do, and I think that's crappy.
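(C++ aside, the same effect is easy to contrive in Lisp by shadowing +; a made-up Common Lisp sketch of exactly the problem I mean:)

    (defpackage :overload-demo
      (:use :common-lisp)
      (:shadow #:+))
    (in-package :overload-demo)

    ;; In this package, + no longer means addition.
    (defun + (&rest args)
      (apply #'concatenate 'string (mapcar #'princ-to-string args)))

    (+ 1 2)      ; => "12" here, but 3 anywhere else
    (cl:+ 1 2)   ; => 3, the standard +

The expression (+ 1 2) means two different things depending on which package you're reading, which is the "contextual learning" I'm complaining about.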
Other languages somehow get by without operator overloading, and often without any loss of expressiveness.
That said, judicious use, and education about how the subsystems in a piece of software work, are a necessity whether or not the language has things people might consider abominations, like operator overloading :-)