The release notes mention that there has been work on static Ruby compilation, as well as fixes for Clang/LLVM.
The one weakness of Apple's platform is that they don't have a rich, language-agnostic runtime and a modern language to match Microsoft's .NET/C#. And Google is building on top of the JVM.
I think the future Apple stack looks like:
Objective-C & Ruby
----
Language-agnostic runtime (Grand Central Dispatch, Blocks, etc)
----
Clang
----
LLVM
----
Unix (iOS or OS X)
Garbage collection was added in Objective-C 2.0, boilerplate property code can be eliminated with the latest LLVM/Clang, and it uses message passing rather than method calling. Add namespaces and remove the .h/.m split and I'd argue it would be a better language than C# or Java.
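For instance, declared properties let the compiler generate the accessor boilerplate for you; a minimal sketch (the Person class and name property are made up for illustration):

    // Person.h
    #import <Foundation/Foundation.h>

    @interface Person : NSObject
    @property (copy) NSString *name;  // no hand-written ivar or accessors
    @end

    // Person.m
    #import "Person.h"

    @implementation Person
    // Recent Clang synthesizes the getter, setter, and backing ivar
    // automatically; older compilers still needed: @synthesize name;
    @end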
To be fair, the Java and C# specs are written in a terse but exhaustive manner because standards bodies are their main audience. But it still says something about the conciseness of the languages.
The header/@interface is there more for you than for the compiler. That is, you don't have to keep the @implementation and the @interface in separate files. One way is to use a @class declaration to say that the class will exist at run time, so the compiler doesn't have to worry about the lack of a header. Another way is to #import the .m directly.
There are two issues with the first method: a warning, when you use the class, that its methods/dot-syntax accessors may not exist; and problems if you want to subclass the class. There's probably a compiler flag to suppress the warnings for the former, and class_setSuperclass for the latter, even though it's deprecated.
For the second method, you'll need some trickery to prevent duplicate symbols. It's solvable, but the exact details escape me at the moment.
If you guessed that it's a giant hack either way, you'd be right!
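To make the first method concrete, a hypothetical sketch (the Widget and Consumer classes are invented; expect the warning described above):

    // Widget.m -- no Widget.h anywhere
    #import <Foundation/Foundation.h>

    @interface Widget : NSObject
    - (void)spin;
    @end

    @implementation Widget
    - (void)spin { NSLog(@"spinning"); }
    @end

    // Consumer.m
    #import <Foundation/Foundation.h>

    @class Widget;  // promise the compiler that Widget exists at run time

    @interface Consumer : NSObject
    - (void)useWidget:(Widget *)w;
    @end

    @implementation Consumer
    - (void)useWidget:(Widget *)w {
        [w spin];  // compiles, but warns: the compiler never saw -spin
    }
    @end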
> Add namespaces
This is trickier than you'd think, mostly due to Objective-C's C heritage. We'll probably be stuck with class prefixes for a long time. At least they make classes Googleable.
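For anyone unfamiliar with the convention, a minimal sketch (the ABC prefix and class name are made up):

    #import <Foundation/Foundation.h>

    // No namespaces, so a project-specific two- or three-letter prefix
    // stands in for one, the way Apple uses NS and UI.
    @interface ABCNetworkManager : NSObject
    @end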
> One way is to use the @class declaration to say that the
> class will exist at run-time and the compiler doesn't have
> to worry about the lack of a header.
Sure, you could just put "uses NSWindowController;" in your .m file and the compiler would know all about the NSWindowController class and its methods, but what about all the typedefs, defines, functions, and other bits of C/C++ heritage that everyday Obj-C code uses?
Obj-C has been around for 25+ years, and while the libraries have changed a lot during that time, the language can still be implemented essentially as a preprocessor on top of a plain old C/C++ compiler, and I don't think that will ever change.
Long live MacRuby, though. I look forward to the day that becomes an official first-class language for OS X development.
Granted, Apple has a lot of engineering to do before they turn all these disparate parts into a coherent, battle-tested runtime on the order of .NET or the JVM. But it will be interesting to see how this evolves...
Why would "Sandbox" functionality need to be implemented as a first-class feature in any Ruby? Perhaps for a version of Ruby that will run on the iPhone, where some carrier's rules could include "no free apps can use cellular internet access". Let AT&T dig their own grave.
> ...limit potential damage that can happen if a vulnerability is exploited.
Security is a much more likely reason for sandboxing than carrier complaints. Plus, the sandbox is already available from C -- see: `man 7 sandbox`.
Anyway, I imagine tptacek (or anyone who knows more about security than I do) can explain this better than I can, but I'll give it a shot: by limiting your process, you're preventing an unknown state from happening within it. It's another level of control that you can use to ensure that your program will only do what you programmed it to do and that it can't do anything else.
One part of how this sandboxing works is that any process spawned will also be contained within the sandbox. So, say there is a vulnerability -- in your code, in a gem you use, whatever -- the point is, there is a vulnerability. Maybe it will give an evil nasty person remote access to your system, or perhaps a local user can use it to gain privileges that they shouldn't have. Or something else that I'm not thinking of.
Anyway, as a result of a vulnerability, the user now has access to your system, or maybe more access than they should have. You can use Sandbox to forcibly limit how much access a process and its child processes can have. You can make it so that a process can't write to any directory except the temporary directory it uses, and block access to the internet. Make it harder for the evil nasty person to break things, steal data, or otherwise keep doing things they shouldn't be doing.
// edit: to summarize: No matter how safe your Ruby code happens to be, if you use an unsafe C extension, you're screwed if you're not using sandbox(7).
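For reference, the C-level API is about this simple; a minimal sketch using the built-in profiles documented in sandbox_init(3):

    #include <sandbox.h>
    #include <stdio.h>

    int main(void) {
        char *err = NULL;
        // Built-in profile: writes are only allowed in the process's
        // temporary directory. kSBXProfileNoInternet is another option.
        if (sandbox_init(kSBXProfileNoWriteExceptTemporary,
                         SANDBOX_NAMED, &err) != 0) {
            fprintf(stderr, "sandbox_init failed: %s\n", err);
            sandbox_free_error(err);
            return 1;
        }
        // From here on, this process (and anything it spawns) is
        // confined to the profile's rules.
        return 0;
    }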