
Programming Language Governance (1)

As software eats the world, the programming languages used to build that software themselves grow and evolve. When a new programming language starts out, it is often the product of an individual or a small group of people, though there are famous examples of top-down language design by committee from the outset (e.g., ALGOL). Once a language reaches a certain level of adoption, however, its inventors sometimes submit it to a standardization body, as happened with ISO C++, ANSI C, ANSI FORTRAN, ECMA/ISO C#, and ECMA ECMAScript. Others continue to retain their own independent governance bodies (e.g., Python, Java). In either case, the governing bodies specify a process for evolving the language.

Each of these governance bodies functions like a small government. Many have voting requirements for advancing a proposal; for ISO standards, which fall under a supranational organization, votes are sometimes cast per represented country. There is also a strong emphasis on consensus building among the core developers. For a given proposal to advance, there must be a known, vetted advocate, as well as a core developer who can vouch that the proposal can indeed be implemented in the main compiler or interpreter without adversely impacting the community.
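To make that process description concrete, here is a toy sketch in Python of what such an advancement check might look like. The Proposal fields, the two-thirds threshold, and the per-country voting model are all illustrative assumptions for the sake of the sketch, not the rules of any actual governance body.

```python
from dataclasses import dataclass, field

# Toy model of the advancement criteria described above.
# Field names and the approval threshold are illustrative only.

@dataclass
class Proposal:
    title: str
    advocate: str | None           # known, vetted advocate
    vouching_core_dev: str | None  # core dev vouching it can be implemented
    votes_by_country: dict[str, bool] = field(default_factory=dict)

def can_advance(p: Proposal, approval_threshold: float = 2 / 3) -> bool:
    """Simplified check: an advocate, a vouching core developer,
    and enough per-country votes in favor."""
    if p.advocate is None or p.vouching_core_dev is None:
        return False
    if not p.votes_by_country:
        return False
    in_favor = sum(p.votes_by_country.values())
    return in_favor / len(p.votes_by_country) >= approval_threshold

# Example: 3 of 4 member countries vote in favor, so it advances.
example = Proposal(
    title="Add pattern matching",
    advocate="A. Proposer",
    vouching_core_dev="C. Maintainer",
    votes_by_country={"US": True, "DE": True, "JP": True, "FR": False},
)
print(can_advance(example))  # True
```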
