Costs of Programing Language Operations
documenting time+space costs of computer language core operations
In my recent study of the relation between a programing language and its machine 〔see Programing Language and Its Machine〕, I noticed that the cost of operations in a programing language is critical. That is, each operator or function should have documentation of its costs: a “CPU cycle” cost and a memory cost.
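Here is a sketch of what such per-function cost documentation could look like, written as a Python docstring. The function and its 1-to-10 scores are invented for illustration, not taken from any real language manual.

```python
# A sketch of documenting costs per function. The scores below are
# made up for illustration, not measured.
def prepend(xs, x):
    """Insert x at the front of list xs, in place.

    Documented costs (score 1 = cheapest, 10 = most expensive):
      CPU cycles: 7  -- every existing element is shifted by one slot
      memory:     1  -- at most one new slot is allocated
    """
    xs.insert(0, x)
```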
Low-level programmers, e.g. in C and C++, are intimately aware of the costs of each operator, function, and statement. Programmers of scripting languages such as Python, Ruby, and JavaScript, not so much, but most know about these costs from the difference between a list, an array, and a hashtable.
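For example, the cost difference between a membership test on a Python list (a linear scan) and on a set (a hashtable lookup) is easy to see; the sizes and repeat counts below are arbitrary:

```python
# Quick illustration of list vs hashtable lookup cost in Python.
import timeit

xs = list(range(100_000))
s = set(xs)

t_list = timeit.timeit(lambda: 99_999 in xs, number=1_000)
t_set = timeit.timeit(lambda: 99_999 in s, number=1_000)

print(f"list membership: {t_list:.6f} s")  # linear scan, grows with length
print(f"set membership : {t_set:.6f} s")   # hash lookup, roughly constant
```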
One primitive way to get the CPU and memory cost of each operation (core operator, statement, function) of a language is simply to profile it: count CPU cycles and memory footprint. The results are then normalized to a scale, say 1 to 10.
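Here is a minimal sketch of that idea in Python. It assumes wall-clock time from timeit as a stand-in for CPU cycles and tracemalloc's peak allocation as the memory footprint; the chosen operations and the scoring formula are my own illustration, not a rigorous benchmark.

```python
# Minimal sketch: profile a few core operations for time and memory,
# then normalize each measurement to a 1..10 score.
import timeit
import tracemalloc

def mem_cost(stmt, setup):
    # run the setup, then measure peak bytes allocated while
    # running the statement once
    ns = {}
    exec(setup, ns)
    tracemalloc.start()
    exec(stmt, ns)
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return peak

def score(value, lo, hi):
    # map a raw measurement in [lo, hi] onto a 1..10 score
    if hi == lo:
        return 1
    return 1 + round(9 * (value - lo) / (hi - lo))

# a few core list/dict operations, chosen arbitrarily for illustration
ops = {
    "list append":   ("xs.append(0)",    "xs = []"),
    "list insert 0": ("xs.insert(0, 0)", "xs = list(range(1000))"),
    "dict lookup":   ("d[500]",          "d = {i: i for i in range(1000)}"),
    "list copy":     ("ys = xs[:]",      "xs = list(range(1000))"),
}

times = {name: timeit.timeit(stmt, setup, number=10_000)
         for name, (stmt, setup) in ops.items()}
mems = {name: mem_cost(stmt, setup) for name, (stmt, setup) in ops.items()}

t_lo, t_hi = min(times.values()), max(times.values())
m_lo, m_hi = min(mems.values()), max(mems.values())

for name in ops:
    print(f"{name:14}  time score {score(times[name], t_lo, t_hi):2}  "
          f"memory score {score(mems[name], m_lo, m_hi):2}")
```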
But note that the cost also depends on the compiler or implementation, and on the specific hardware. Still, it seems reasonable to have a “score” of time and space costs for each and every core operation of a language.
Explicit documentation of the operation costs of computer languages has strong advantages. It makes all programmers painfully aware of the costs, and aware that these costs are inherent, not man-made complexity. It also makes it easy to estimate a program's efficiency.
Sometimes when coding, no matter what you do, there always seems to be an inelegant extra variable, initialization, slot, assignment, push, copy…. When studying CPUs, e.g. superscalar, RISC, CISC…, you see lots of cycles wasted or memory copied in vain. So, “inelegance” seems INEVITABLE in algorithms.
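A small illustration of those hidden extra moves, using CPython's dis module. The exact opcode names differ between Python versions, but the extra loads, stores, and stack pushes are always there.

```python
# Even a one-line function compiles to a handful of loads, stores,
# and stack pushes.
import dis

def add(a, b):
    c = a + b   # one "inelegant" extra variable
    return c

dis.dis(add)
# Typical output (CPython 3.11-ish, abbreviated):
#   LOAD_FAST   a
#   LOAD_FAST   b
#   BINARY_OP   +
#   STORE_FAST  c
#   LOAD_FAST   c
#   RETURN_VALUE
```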
Programing Languages and Their Machines
- Programing Language and Its Machine
- Programing Languages and Their Computational Models
- Costs of Programing Language Operations
- Why C Language Has the Main Function
- What is Closure in Programing Language (2012)
- Computer Science, Modeling Modern Software and Hardware on an Abacus
- Why is Array Access Constant Time
- On GPU vs CPU Algorithms Compilation, Cpp, Parallel Computation, Wolfram Physics (2023)