No particular reason. Or perhaps I should say, to scratch an itch.
There is no need for this language to exist. I have no practical use for it, and there are no great innovations to justify its creation. It is mostly just a rearrangement of things that have existed for decades.
As a programmer who spends most of my life thinking in and about a range of computer languages, I simply woke up one morning with an urge to work out exactly what I consider to be the Right Thing.
Pigeon is part experiment, and part my personal statement about the aesthetics of coding. If it turns out to be useful for anything practical, that will be a pleasant side effect.
Many years ago I learned Perl. This was good for getting things done quickly, and fun to use because it encouraged clever syntactic hacks, but the results were never exactly readable.
Later on I learned Python. I liked this very much because it combines a clean and consistent syntax with a great deal of expressive power. Along the way it somehow lost the "neat hack" quality of Perl coding, though.
More recently, I've been doing a lot of metaprogramming with templates in C++. This has been both a revelation and a huge disappointment. The C++ template syntax is in fact a Turing-complete functional language that runs at compile time, and this is cool because it lets you step outside the bounds of the regular language, writing programs that generate programs in all sorts of creative ways. The trouble is, templates aren't a particularly good functional language! Beyond a certain level of complexity, the results tend toward the illegible, although fancy code like Boost.Lambda can help somewhat.
Very recently indeed, I learned Lua. This is kind of like Python, but even more so: smaller, cleaner, but still extremely powerful. Lua is actually slightly better than Python for functional programming, as it supports anonymous local functions without the restrictions of the Python lambda syntax. Lua is also simple enough that for the first time I found myself looking at a programming language and thinking "hey, I could make one of these..."
I wanted to make a language that would be clean and beautiful like Python and Lua, but hackable like Perl, and that would support higher order metaprogramming techniques like C++ templates.
I'd always been interested in the idea of Lisp, but never got around to actually learning it. As I began researching ideas for my new language, I soon figured out exactly why Lisp programmers have always been such smug bastards...
My main problem with Lisp is the (lack of) syntax. I'm sure I could get used to it over time, but most people initially find Lisp code very hard to read. So I decided to make Pigeon work like Lisp internally, but with a more conventional external syntax. The parser (written in Pigeon itself, but using only the Lisp subset of the language) can turn arbitrary input code into Lisp s-expressions.
Having this difference between input syntax and internal form does make macro programming slightly more complicated than in a conventional Lisp, but hey. I think it is worth it. This approach makes the easy things easy, while the hard things are still possible. You can always ignore the infix syntax and write s-expression code directly if you want to.
Another thing missing from Lisp is a proper object oriented type system. The Common Lisp Object System is kind of backward in this regard, passing objects to generic functions rather than sending messages to objects. That may seem superficial, but it actually makes a big difference to the way you think about your code, and affects the ability of each object to have a unique message namespace.
I reckon Smalltalk got object orientation pretty much right, so I decided to use that as the basis of my type system. You could sum up the results as "Lisp code, Smalltalk type system, C syntax".
Allow everything. Programmers outrank language designers, because only they know what kind of problem they are trying to solve. The language should not play nursemaid.
Single paradigms are all very well for fanatics, but in the real world some things just aren't objects, and some functions naturally have side effects.
Small is good, but so is comprehensive. Extremes are bad, so meet somewhere in the middle. Best of all is if you can implement comprehensive functionality over the top of a small core.
Implicit is better than explicit, except where that would be ambiguous.
Sugar is good, as long as the underlying functionality is simple and consistent. Even if a syntactic feature adds no significant new capability, it is worthwhile to make common idioms tidier and more pleasing to the eye.
Punctuation is good, up to a point. Languages like Lisp and Smalltalk tend to use textual names rather than punctuation characters. This can make code more readable for people who don't know the language, but for an expert, a single character is faster to type than a long word, and reading dense text takes longer than recognising the familiar visual shape of a special character. This can be taken to extremes (Perl), but in moderation, and especially for common features borrowed from C, punctuation is better than text.
Code formatting can affect syntax, but only if that formatting is clear and unambiguous. Indentation can be ambiguous due to varying tab sizes, so this should be ignored, but it is safe for line breaks to have significance.
It is ok if programmers choose to play dirty, but good to stop them accidentally hurting each other. In other words, if somebody really wants to go poking around in the private data of some other class, we should trust they know what they are doing, but there should be namespace features to prevent a derived class accidentally conflicting with internal implementation details of its parent.
Variables are dynamically typed, but must be explicitly declared. The alternatives would be that undeclared variables default to global (Lua) or dynamic (Perl) scope, which is rarely what you want, or to local scope (Python), which is confusing and ambiguous when working with nested closures. Explicit declaration has the advantage that typos can be trapped as errors, and that class methods can implicitly reference instance variables from their self object.