Coming from a .NET/C++ background, symbols were a bit of an alien concept to me, and I had a hard time really mapping them onto my mental model. While trying to do so I read a couple of posts that described symbols as frozen strings. Unfortunately that didn’t help me much. I did manage to use them by writing a colon in front of the “string”. After a while I realised that:
A symbol is a representation of an idea, but distinct from it. In a way it is a new entity that you name and create.
For example, when we need to talk about animals in our neighbourhood, we as humans already have concepts like cat, dog, pigeon…
While coding, if you don’t have the idea of symbols, you can:
- Use magic numbers - and try to remember them all while ensuring uniqueness if needed
- Give names to magic numbers with enumeration of some sort - and group them upfront
- Create a structure or class to represent the concept - usually overkill
Each of these approaches has the flaws noted above; a quick sketch of them follows below.
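A minimal Ruby sketch of these workarounds (all names here are hypothetical):

```ruby
# 1. Magic numbers: the meaning lives only in your head.
animal = 2             # ...was 2 the pigeon or the dog?

# 2. Names for the magic numbers, grouped upfront as constants.
CAT    = 0
DOG    = 1
PIGEON = 2
animal = PIGEON

# 3. A class per concept: usually overkill for a plain label.
class Pigeon; end
animal = Pigeon.new
```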
With symbols you can say :cat, :dog and :pigeon. You are giving a name to a new concept. A concept that you can now talk about with your computer.
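For instance, a quick sketch of what that looks like in practice:

```ruby
animal = :pigeon       # a new named concept, created on the spot

case animal
when :cat    then puts "meow"
when :dog    then puts "woof"
when :pigeon then puts "coo"
end
# Prints: coo
```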
Internally, symbols are stored as frozen strings in a global symbol table; the technical name for this is an interned string. Every time you create a symbol the table is looked up, and a new entry is added only if one did not exist before. In Ruby we have one global table. This is different from, for example, Lisp, where every package has its own table of symbols. Ruby symbols are most similar to the symbols found in Lisp’s “KEYWORD” package, and less powerful than “proper” Lisp symbols. An excellent article about the difference between Lisp and Ruby symbols can be found here. I believe that the main inspiration for Ruby symbols came from Smalltalk, where they are pretty much the same except for the syntax and the Smalltalk version being case insensitive.
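You can observe the interning yourself in irb. A minimal sketch, assuming default settings (no frozen-string-literal magic comment):

```ruby
# Two string literals with the same content are two distinct objects...
"cat".equal?("cat")               # => false

# ...while :cat always refers to the single interned entry:
:cat.equal?(:cat)                 # => true

# The global symbol table is inspectable:
Symbol.all_symbols.include?(:cat) # => true
```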
In conclusion, don’t let implementation details fog your view.
Don’t think of a symbol as an optimized-immutable-global-syntax-sugared string. A symbol is an entity that you’ve named.