Talk:Compiled language

C# compiled?
C# is NOT a compiled language. — Preceding unsigned comment added by 210.184.231.65 (talk) 14:25, October 9, 2006 (UTC)


 * In my view, C# is compiled. csc.exe compiles C# source to MSIL code, which is then JIT-compiled to machine code, which is then executed by the CPU. Dylan Borg (talk) 21:05, 21 April 2010 (UTC)

Java compiled?
Hello, would it be appropriate to say that Java is "compiled", in that sense that it is compiled to bytecode? Or is that too strong? I ask, because the article on Clojure says, near the end: "Clojure is a compiled language producing JVM bytecode"... Or am I nitpicking... Anyone know a *reference* for this phrasing? 65.183.135.231 (talk) 15:02, 30 March 2009 (UTC)

Article organization
Please divide the list into languages compiled to native code and languages compiled to bytecode, because it's confusing. C#, Java, Cobra are compiled to bytecode. —Preceding unsigned comment added by 178.73.63.140 (talk) 21:11, 19 December 2010 (UTC)

"Compiled language" is a meaningless term
There can't be agreement on this. The list of languages that are compiled has many entries in common with the list of languages that are interpreted. The problem is that the property is not in fact intrinsic to the language. This page should be merged with the compiler page, just as the interpreted language page should be merged with interpreter. Trying to state that there is such a thing as an interpreted or compiled language causes very real confusion in people who are trying to understand programming languages. -MM — Preceding unsigned comment added by 209.6.55.158 (talk) 05:43, 22 June 2013 (UTC)
 * Agree, the term has no possible meaningful definition. This page should be deleted. 190.41.173.174 (talk) 22:45, 27 December 2015 (UTC)
 * Yep. Compiled language is not a thing. Stevebroshar (talk) 22:16, 3 June 2024 (UTC)

Traditional compiled language
I just blasted in a large amount of new text concerning the centrality within computer science of traditional compiled languages.

Every practicing computer programmer with any sense of history at all knows precisely what this means. Enough with this airy-fairy ivory tower business of "well, the term 'compiled' could really mean anything, in a pure taxonomic sense".

We're doing a terrible disservice to the reader if we fail to point out that there's an extremely concrete sub-reference implied by this phrase among all working professionals, with 70 years of deep history and still going strong.

I don't have the time or inclination to stick around and futz with my voluminous initial draft. I saw the gap, and figured I could blast something out that was representative and serviceable in two frantic hours, which I did.

For the past two weeks, after having lapsed for about a decade, I decided to bring myself back up to speed with embedded development, which led me to RISC-V, most of which was entirely new to me. What does that community talk about (almost obsessively)? The felicity of RISC-V with respect to exactly the kinds of optimizations that traditional compilers perform. Register colouring remains hot news. I watched an entire hour-long presentation about register allocation in the LLVM framework.

This new open source hardware movement is not exploding out of the gate because everyone wants to integrate a dozen tiny, dedicated RISC-V cores onto the same chip to run Python. Anyone who wanders into the modern world of DIY EDA (yikes!) has a harsh surprise coming if Haskell is their closest working approximation of a perfectly good low-level language.

Make no mistake, there's a rising tide here of "back to the machine" at the silicon/RTL layer, with plenty of academic muscle behind it, in addition to growing industry participation.

As a final note, though I am not without strong opinions, it is my personal policy to be 100% non-possessive about my contributions here on Wikipedia. Fire away at liberty on my fresh prose contribution, I've blown the joint after my hasty one-time dump and moved along. &mdash; MaxEnt 17:05, 24 September 2020 (UTC)


 * One point I'd like to stress is that a C program does not have a defined meaning until after translation. Even then, many behaviours are not defined until the translated program is run on a particular version of the OS. And even then, a great many C programs still contain undefined behaviour. For example, signed integer overflow is undefined behaviour in both C and C++, meaning that anything can happen, including the host computer reinstalling its operating system from 5.25" floppy disk or nasal daemons. A great many large C programs fail to validate input anywhere near adequately enough to ensure that the code never lapses into signed integer overflow. This issue is inherent to the tradition of compiled languages: some deliberately delay (or eschew) semantics to an ungodly degree. Java, in particular, tried to depart from this tradition, and managed to define floating point so rigidly that a fully conformant x86 implementation could only operate at 10% of its native performance level (the guard bits on the extended-precision 80-bit internal representation had to be explicitly sheared off for floating point to achieve precise compliance with Java rounding semantics; Sun apparently didn't give a shit about this, I guess because they didn't think anyone should be running on Wintel machines in the first place—and look where Sun is now: the oracfice where the Sun never shines). Anyway, my point is that the compiler (translation phase) is hardly a neutral party to compiled-program semantics, and entire shelves of books could be written about this, without turning any new stones. &mdash; MaxEnt 17:36, 24 September 2020 (UTC)


 * First, thanks for your contribution. However, that was a huge amount of text with only one sentence being referenced, and raises many WP:OR and WP:RS issues. I've reverted it, but if you have actual sources, I'd encourage you to begin re-making the changes you can support with sources. It might make sense to do so in stages so that other editors don't have to digest tons of text all at once, but that's not an essential point. The important point is avoiding the OR/RS issues. TJRC (talk) 18:04, 24 September 2020 (UTC)

Approaches to compiled language specification
Each separate language article on Wikipedia can have some text about this buried in a different subsection each time, but I'm suggesting here that it would actually be useful to cover this subject in rough overview here in this article, because it matters vastly more than the uninitiated can possibly fathom.

Source code is not written with a single compiler for a single target in mind, but often a family of similar compilers (e.g. x86 architecture) for a family of related targets (e.g. Linux/Unix). Depending on the nature of your application, you might need to do a considerable amount of work to also run that code on Windows/x86 or macOS/x86 or macOS/RISC-V (perhaps coming soon).

There's a CS 201 view of compilation as a formal translation stage. This has next to no relationship to industry practice for big languages like C or C++, where the language specification, the library specification, the language implementation, the library implementation, the target memory model, the target memory model implementation, the ABI specification, the ABI implementation, the OS specification, and the OS implementation all actively participate as distinct cogs in the gigantic mill. And all of this still gives way, in many instances, to dumb random luck. &mdash; MaxEnt 18:19, 24 September 2020 (UTC)


 * No idea what your point is. ... And source code is often written for exactly one compiler; sometimes for one processor model. Stevebroshar (talk) 22:18, 3 June 2024 (UTC)