In most compiled languages you need something like them to get fully featured metaprogramming. It's not necessarily a case of liking them so much as their often being the best option. (Note that C macros are a terrible implementation of macros, and yet, despite that and the C++ committee hating them with a passion, they are still often the only way to do certain things in C++, because the committee has not sought to make better macros but has instead tried to extend other language features to accommodate some of their use cases.)
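As a minimal C sketch of one such use case (the CHECK name is invented for illustration): an assert-style macro can stringize its argument and capture the call site, which no function or template can do, because only the preprocessor ever sees the spelling of the expression.

    #include <stdio.h>
    #include <stdlib.h>

    /* Only a macro can stringize its argument (#expr) and capture the
       call site via __FILE__/__LINE__; a function sees just a value. */
    #define CHECK(expr)                                              \
        do {                                                         \
            if (!(expr)) {                                           \
                fprintf(stderr, "%s:%d: check failed: %s\n",         \
                        __FILE__, __LINE__, #expr);                  \
                abort();                                             \
            }                                                        \
        } while (0)

    int main(void) {
        int x = 2;
        CHECK(x + 2 == 4);  /* passes silently */
        CHECK(x == 3);      /* prints file, line, and "x == 3", then aborts */
        return 0;
    }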
Code generation is just external macros; it's the same thing in a worse form.
To maintain the code, you have to understand the input language to the code generator and its metaprogramming constructs. You're no better off in that regard.
The grandparent comment is saying that if you don't give people metaprogramming built-in, they will resort to outboard metaprogramming.
Code generators are programs written in an existing programming language, which produce target-language source code as output.

Macros are programs written in a separate, unique, often Turing-complete meta-language, which is implemented entirely in the compile phase of the language that supports them.
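To make the contrast concrete, here is a toy sketch in C (all names invented): the same repetitive accessor code can come from an external generator program that prints C source as a separate build step.

    /* gen.c -- a toy "code generator": an ordinary C program that
       prints C source code for accessor functions to stdout.
       Run it as a separate build step before compiling its output:
         cc -o gen gen.c && ./gen > accessors.h */
    #include <stdio.h>

    int main(void) {
        const char *fields[] = { "x", "y", "z" };
        for (int i = 0; i < 3; i++)
            printf("static inline double get_%s(const struct point *p) "
                   "{ return p->%s; }\n", fields[i], fields[i]);
        return 0;
    }

The in-language macro form of the same thing, expanded by the compiler itself with no extra build step:

    struct point { double x, y, z; };

    #define DEFINE_GETTER(name) \
        static inline double get_##name(const struct point *p) { return p->name; }

    DEFINE_GETTER(x)
    DEFINE_GETTER(y)
    DEFINE_GETTER(z)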
If done right, a macro system allows you to make your language modular and experiment with new language features without having to change the core language and the compiler. With the macro approach, languages become libraries.
The Racket people took this concept very far. The kernel of the language is very small and well defined. All Racket programs (or more precisely, expressions) are eventually reduced to a handful of syntactic core forms (see [1]). For example, thanks to forms such as (#%variable-reference id), you can specify rules for variable access, e.g. with respect to lifetime.
With tools such as the Macro Stepper you can fully step through the transformations of any expression in your program, from the highest to the lowest level.
This has numerous benefits. Extensions or modifications of the language can be rolled out (and used!) as libraries. This makes collaboration and testing far easier. Also, if a language feature turns out to be a bad idea, you deprecate the library. You do not have to change the compiler. This allows you to shrink your language and explore different directions without the burden of an ever growing language spec and implementation.
Is it a perfect solution? No. Changing a widely used language always has a big impact, but the impact can be compartmentalized, and users of the language are given a graceful migration path by updating their libraries at their own choice and pace.
Is Racket perfect? No, not by a long shot. But, frankly, language authors should at least take a look at the possibilities and consider the technological options for controlling the evolution of a language.
Macros are very helpful for many things. There are some problems with how macros work in C, but they are still beneficial at times.
One thing C does not have is something like METAFONT's "vardef"; if you had "scoped macros" that could be declared as global or local variables and as members of structures, they would cause less interference than the current system and could even provide other benefits too (a rough approximation in today's C is sketched after this list).
Other possibilities include, e.g., macros that can define other macros (probably several other programming languages have such a thing), appending to existing definitions (like {append} in Free Hero Mesh), altering existing macros with a value calculated immediately instead of each time the macro is called (like {edit} in Free Hero Mesh), etc.
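On the scoped-macros point: GCC, Clang, and MSVC all support push_macro/pop_macro pragmas, which give a rough, purely textual approximation of a locally scoped macro. This is a sketch of that workaround, not the vardef-style feature itself:

    #include <stdio.h>

    #define SCALE 10  /* "global" macro */

    static int outer(int v) { return v * SCALE; }

    static int inner(int v) {
        /* Approximate a locally scoped macro: save the definition,
           shadow it, and restore it afterwards. Supported as an
           extension by GCC, Clang, and MSVC. */
    #pragma push_macro("SCALE")
    #undef SCALE
    #define SCALE 100
        int r = v * SCALE;  /* uses the local definition, 100 */
    #pragma pop_macro("SCALE")
        return r;
    }

    int main(void) {
        printf("%d %d\n", outer(2), inner(2));  /* prints "20 200" */
        return 0;
    }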
(this is PG's take, but IIRC the thread as a whole had some interesting ideas. Unfortunately Guy Steele's replies seem to have been under a different Subject: and haven't been threaded by the archive)
Not including Unicode would not make the language unable to process Unicode, since you can implement whatever character encodings you want, without being hindered by the programming language's bad ideas about character sets.
However, processing Unicode is often unnecessary anyway. Sometimes you only need to deal with ASCII (and can pass non-ASCII bytes through unaffected). Sometimes Unicode handling can lead to bugs (and sometimes to security problems, and not usually because Unicode was implemented incorrectly). Unicode can also make code less efficient, especially when it is unnecessary, but sometimes even when you do deal with it (due to internal conversions, counting, and other work that happens when it is not helpful or is even contrary to what you are trying to do). Inherent Unicode handling can also make it difficult to work with byte strings, especially if byte strings do not get many of the same operations.
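As a small C sketch of the pass-through point (the ascii_lower name is made up): a byte-oriented ASCII lowercasing routine needs no Unicode tables and leaves multibyte UTF-8 sequences intact.

    #include <stdio.h>

    /* Lowercase ASCII letters in place, passing every other byte
       (including UTF-8 continuation bytes) through untouched. No
       Unicode tables needed, and multibyte sequences stay intact. */
    static void ascii_lower(char *s) {
        for (; *s; s++) {
            unsigned char c = (unsigned char)*s;
            if (c >= 'A' && c <= 'Z')
                *s = (char)(c + ('a' - 'A'));
        }
    }

    int main(void) {
        char text[] = "HELLO, W\xC3\x96RLD";  /* \xC3\x96 is UTF-8 for O-umlaut */
        ascii_lower(text);
        puts(text);  /* prints "hello, wÖRLD" with the O-umlaut byte pair untouched */
        return 0;
    }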
It also tends to lead to bad design (in API design and in programming language core design). Sometimes it is used even though byte strings would be more appropriate, and sometimes you might want a separate "file path" string type instead (I think Common Lisp does this). Treating source files as Unicode text can also be problematic.

Unicode is not a very good character set anyway; it is bad in many ways. I could say how it can be bad for many different languages, and for other purposes such as security, efficiency, and accessibility, too. (Some people say it is better than dealing with other character encodings for multilingual text. I have worked with it and found the opposite to be true.)
- It uses "0o" instead of "0" prefix for octal numbers.
- Underscores to separate groups of digits in numeric literals.
Not good:
- It uses Unicode.
- It does not have a "goto" command.
- There are no macros.
- Perhaps the name is too similar to that of another programming language?
This isn't everything that could be said about it; I have not examined it completely yet.