# Xah Programing Blog

syntactic obsession, imperative despisal, unicode love ♥, hacker hate

### follow xah

follow me on my new mastodon account at https://mstdn.io/@xahlee and on reddit https://www.reddit.com/user/xah

### code elegance vs efficiency vs modularity vs readability

for my patreons https://www.patreon.com/posts/17567297

### to my patreons, an intro to Xah Lee, is that right?

Unicode Circled Numbers ① ② ③ (major update)

Standard Fonts on Linuxes (minor update)

### start to post to patreon again

am going to start to post to patreon again. see https://www.patreon.com/posts/17332715

for a paid audience. It's better that way: having a goal increases my quality.

the problem with just blogging or posting to twitter is that nobody really cares. With a patreon post, there's a goal, similar to a commercial org. You write; if it's good, people pay; when it's no good, people stop.

And comments tend to be more valuable. They have a basis.

20 years of open source has eliminated the power of individuals. Instead, you get these “we support opensource” mega corps like Google and Facebook. The programers by day work for these greeds, by night sing open source, put code on git, and wipe out the bread of small programers.

### Open Source, worst suck of the century

[Why There Will Never Be Another RedHat: The Economics Of Open Source By Peter Levine. At https://techcrunch.com/2014/02/13/please-dont-tell-me-you-want-to-be-the-next-red-hat/ ]

[RethinkDB: why we failed By Slava Akhmechet. At http://www.defmacro.org/2017/01/18/why-rethinkdb-failed.html ]

my day schedule for the next 3 months looks like this

- 1 hour compiler
- 1 hour golang
- 1 hour logic, proof theory
- 1 hour differential geometry, complex analysis
- 5 hour js math graphics

### Chinese Chess pieces in unicode

now you can play chinese chess! Play Chinese Chess Online

see https://twitter.com/xah_lee/status/965774858746347520 and the unicode 11 draft https://www.unicode.org/versions/Unicode11.0.0/

See also: Unicode Characters ☯ ⚡ ∑ ♥ 😄

linked from home page, bottom ∑ Xah Code

### programer's canvas reverse y-axes

one very annoying thing when coding a 2d grapher is the reversed y-axis in programer idiot's coordinates.

In svg, either u do `(ymin+ymax-y)`

for every point, or, for a group element g,

`g.setAttribute("transform", "translate(0 " + (ymin+ymax) + ") scale(1 -1)")`

but then your text gets flipped. what a fing pain.
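here's a minimal js sketch of the two fixes (flipY and flipTransform are made-up names for this post, not a standard API):

```javascript
// fix 1: remap each point's y before plotting.
// ymin, ymax bound the graph's y range.
const flipY = (y, ymin, ymax) => ymin + ymax - y;

// fix 2: one transform string for a whole <g> element, applied with
// g.setAttribute("transform", flipTransform(ymin, ymax)).
// scale(1 -1) mirrors y, the translate shifts it back into [ymin, ymax].
const flipTransform = (ymin, ymax) =>
  `translate(0 ${ymin + ymax}) scale(1 -1)`;

// each <text> inside the flipped group then needs its own counter-flip,
// e.g. transform = `translate(0 ${2 * y}) scale(1 -1)`, to read upright.
```

note flipY is its own inverse: applying it twice gives back the original y.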

why do programer canvases reverse the y-axis? not looking into history, but i surmise, cuz it was just easy in the early days, like unix and opengl. It's ok to accept it as a necessity of the time, but the programer idiots are such that they see it as good. In every lang design, you run into such idiotic problems.

from unix to c to java to opengl (or whatever gl fk). and these idiots say, well but it is fast! it's efficient! the problem with these fheads is that they don't really understand math or the meaning of efficiency; they just know micro-tuning and memory addresses.

I think the hardware-based mindset is slowly going away today, because the new generation dunno it anymore. And langs like haskell and golang are teaching people to understand the separation of efficiency and abstraction.

One example of pseudo-efficiency trumping language design is Bitmask Used as Boolean Parameters. See

http://xahlee.info/UnixResource_dir/writ/bitmask.html

The typical hacker gets confused by this, and defends it vociferously.

this happens for languages up to about 2005, but not after.
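to make it concrete, here's a hypothetical js sketch (the flag names are invented): the same call, once with packed bits, once with plain named booleans.

```javascript
// bitmask style: each boolean packed into one integer
const BOLD = 1 << 0;      // 1
const ITALIC = 1 << 1;    // 2
const UNDERLINE = 1 << 2; // 4

// caller must remember the bit encoding: renderMask("hi", BOLD | UNDERLINE)
function renderMask(text, flags) {
  return {
    text,
    bold: (flags & BOLD) !== 0,
    italic: (flags & ITALIC) !== 0,
    underline: (flags & UNDERLINE) !== 0,
  };
}

// named booleans: no bit arithmetic, self-documenting at the call site:
// renderOpts("hi", { bold: true, underline: true })
function renderOpts(text, { bold = false, italic = false, underline = false } = {}) {
  return { text, bold, italic, underline };
}
```

both produce the same result; the difference is whether the call site reads itself or requires you to decode an integer.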


### high resolution monitor

time to buy a high resolution monitor. 2560x1440. You see more, sans fidgeting with scroll swipe.

find out your screen resolution size JS: Find Window Size
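for reference, a tiny js sketch using the real browser properties (window.screen, window.innerWidth); the function takes the window object as a parameter so it can also be tried with a stub:

```javascript
// reads monitor resolution and viewport size from a window-like object.
// in a browser console, call sizes(window).
function sizes(win) {
  return {
    screen: [win.screen.width, win.screen.height], // monitor resolution
    viewport: [win.innerWidth, win.innerHeight],   // visible page area
  };
}
```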

lots of opinions on coding, eg coding style, what makes productivity, etc. Most pop blog posts are drivel; they thrive for a few years, then get buried. This is the best book on programing opinions, based on statistics. I read parts of it in the 1990s. The book is now old.

Idiocy of Computer Language Docs: Unix, Python, Perl, Haskell

software testing is one of the wart products of eXtreme Programing (which became agile). The effect is even less understanding of programing languages, math, design, but more trial n error, speedy prototyping, churning. Cuz, there's tests!

On Meta Syntax, Formal Language, and Logic (minor update)

### Clojure Lisp Has Failed

i'd have to say, clojure lisp has failed, due to the complex intermix with java/jvm, and the clojure community as a cult refusing to listen. Meanwhile, golang and @kotlin are thriving. Kotlin actually replaces clojure, with 10x more users.

it's funny that the clojure head Rich Hickey emphasizes simplicity, and makes a fuss about how simple may appear hard. But that's a tinted-glass look at simplicity. Clojure, the way it forces intermixing with java, is more complex than ANY programing language.


### safari don't have favicon in tabs

safari doesn't have favicons in tabs? what extreme idiocy. I was switching to it, but now i don't think so.

https://www.reddit.com/r/apple/comments/4sm2d1/its_2016_why_do_we_still_not_have_favicons_in/

having favicons in tabs makes it easy to identify tabs. When you have 10 tabs, not having favicons is a major problem.

### xah comp sci growth roadmap

of my personal comp sci growth roadmap, there's a few things i'd like to hone. In no particular order:

- ① get experience in writing a compiler.
- ② understand what features make language slow.
- ③ understand the relation of various kinds of strict typing to computability
- ④ understand what language features allow what degree of static analysis
- ⑤ a grip on proof systems, computability theory, math foundation

### golang syntax not composable

here's the syntactic complexity of golang.

a slice type is declared like this:

`var name []type`

but a map type is declared like this:

`var name map[keytype]valuetype`

you see, there's an inconsistency.

That is, there's no single form for both. Basically, every semantic thing in the language has its own ad hoc syntax. The syntax is not composable.


### xah reddit channel

been posting to my reddit r/Xah. see https://www.reddit.com/r/Xah/

just trying it out. If you are a reddit user and find my posts on reddit convenient, vote up my posts. If i don't see many users, i'll stop doing it.

also, since i have blogs on emacs, JavaScript, programing, and other topics, do let me know what you want to see, so i can shape up what kinda things i post there.

### brave browser annoyances

.@brave browser's allow/deny notification bar is annoying. It pops up for fullscreen, notification, and others. If you ignore it, the bar doesn't go away but adds space to the top of the browser. And when there are multiple, you might click the wrong allow/deny button.

another annoying thing about @brave browser is that when you hover the mouse over a tab, it shows that tab's content temporarily (in a grayish fog). Very annoying and confusing, especially if you have hover/dwell-on enabled in the OS.

### xahlee info popularity ranking 2018-01-15

climbing the ladder.

arduous.

### how to become a mathematician for programers

xah's edu corner extempore! episode №20180115122340 on math courses, and how to become a mathematician for programers

for young coders out there, the proper order of learning math is: highschool algebra, trig, pre-calc, calculus, linear algebra, then optionally intro diff equations. These are the basics, and must be learned in that order. These are the basics needed for engineering, but are not math-major basics yet.

for math major (to become a mathematician), you need abstract algebra and real analysis. Order doesn't matter. After you've had these, you are acquainted with the math “language”, i.e. math maturity, the way mathematicians talk. These are typically 3rd-year math-major courses.

after that, you may take the following in arbitrary order: complex analysis, differential geometry, topology, set theory. After studying these, you can consider yourself a mathematician. You know what math is about, or know where to go. All the above are the traditional main math courses.

for programers, you might wonder: where are graph theory, type theory, game theory, logic, combinatorics, statistics? These are not the typical main courses of a MATHEMATICIAN. These are called discrete math, sometimes classed as comp sci. (while statistics n probability are applied math)

the discrete math subjects do not have the elaborate prerequisite sequence that analysis/algebra have. Anyone can start to learn graph theory, game theory, combinatorics, number theory, etc. But you don't get deep without the years of analysis/algebra, the real-mathematician stuff.

See also: Free Math Textbooks

Logical Operators, Truth Table, Unicode (updated)

Programing: “or” Considered Harmful (updated)

Proliferation of Computing Languages. There are so many in the past 5 years that people don't care anymore. Certainly, i stopped listing new ones. #elmlang #purescript #TypeScript #haskell

### ask Xah, 2018-01-07

[CoffeeMiner: Hacking WiFi to inject cryptocurrency miner to HTML requests By Arnau Code. At http://arnaucode.com/blog/coffeeminer-hacking-wifi-cryptocurrency-miner.html ]

seems like a good article.

See also: http://xahlee.info/linux/linux_iptables_basics.html

### pop stars and programer stars

pop stars quietly drop out every decade or 2. And this is also true for programer stars. A star everybody knows and talks about now, but you don't know or recall who it was in the previous decade.

### classes of the rich, and classes of programers, and you don't know nothing

when something is beyond us, we can't distinguish the levels. e.g. some are millionaires, but some are billionaires. To us, we don't comprehend; we group them as just “rich”. To them, it's 1 thousand times different, literally. This goes for pop stars, politicians, businessmen.

Similarly in programing: to outsiders, we are all smart nerds beyond their comprehension. But to us, there are script kiddies and “web designers”, up to those who wrote the google search engine. This applies to any field or community you are not familiar with, which basically means everything.

### intel chip bug Meltdown and Spectre

just read a few articles/papers about the intel bugs Meltdown and Spectre. Meltdown requires an OS kernel patch, causing a 5% to 30% slowdown for system-call-intensive apps (networking/file). And Spectre is unfixable.

We are screwed. by next week, nobody will remember the problems, except hackers and gov. But have a cookie, you are screwed already anyway with ur phone and usb and ssh etc.

if you are not a computer nerd, here's the gist: hackers discovered one very big hole to your password, and there's nothing you can do.

instead of Santa Claus, unicode calls it Father Christmas.

🎄 🎅 🤶 ⿅ 𐂂 🦌

see Unicode Search

so why does Unicode call Santa Claus Father Christmas? Wikipedia has the history https://en.wikipedia.org/wiki/Father_Christmas

but be warned it probably contains biased writing.

this Wikipedia version from 2005 gives an easier-to-follow picture of the history of Father Christmas https://en.wikipedia.org/w/index.php?title=Father_Christmas&oldid=33065629

quote:

Father Christmas is a name used in the United Kingdom, Australia, New Zealand and several other Commonwealth Countries, as well as Ireland, for the gift bringing figure of Christmas or yuletide. Although Father Christmas, Saint Nicholas and Santa Claus (the latter deriving from the Dutch for Saint Nicholas: Sinterklaas), are now used interchangeably, the origins of Father Christmas are quite different.

Dating back to Norse mythology, Father Christmas has his roots in Paganism. Before Christianity came to British shores it was customary for an elder man from the community to dress in furs and visit each dwelling[citation needed]. At each house, in the guise of "Old Winter" he would be plied with food and drink before moving on to the next. It was thought he carried the spirit of the winter with him, and that the winter would be kind to anyone hospitable to Old Winter. The custom was still kept in Medieval England, and after a decline during the Commonwealth, became widespread again during the Restoration period. A book dating from the time of the Commonwealth, The Vindication of CHRISTMAS or, His Twelve Yeares' Observations upon the Times involved Father Christmas advocating a merry, alcoholic Christmas and casting aspersions on the charitable motives of the ruling Puritans.

He was neither a gift bringer nor was he associated with children. During the Victorian era when Santa Claus arrived from America he was merged with “Old Winter”, “Old Christmas” or “Old Father Christmas” to create Father Christmas, the British Santa which survives today.

### Does Automated Theorem Prover Exist? Can Neural Net Solve Math Problems?

can neural net solve the Collatz conjecture problem?

The Collatz conjecture is a conjecture in mathematics that concerns a sequence defined as follows: start with any positive integer n. Then each term is obtained from the previous term as follows: if the previous term is even, the next term is one half the previous term. Otherwise, the next term is 3 times the previous term plus 1. The conjecture is that no matter what value of n, the sequence will always reach 1.
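the rule above is a few lines of js:

```javascript
// Collatz sequence starting at positive integer n, following the rule
// above (even: halve; odd: triple plus one), until it reaches 1
function collatz(n) {
  const seq = [n];
  while (n !== 1) {
    n = n % 2 === 0 ? n / 2 : 3 * n + 1;
    seq.push(n);
  }
  return seq;
}

collatz(6); // → [6, 3, 10, 5, 16, 8, 4, 2, 1]
```

the conjecture is that this loop terminates for every starting n; nobody has proved it.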

[2017-12-14 Wikipedia: Collatz conjecture ]

this is an interesting question, because it is a simplified version of whether neural-network-based AI can answer math problems. “Math problems” here means, for example, any of

the Millennium Prize Problems

The above math problems are complex to computerize. Computerizing math is not a solved problem.

So, we might ask instead, whether neural network can solve math problems that are simple to computerize. For example,

is there an odd perfect number? (a perfect number is a positive integer that's equal to the sum of its divisors excluding itself)

or group theory List of unsolved problems in mathematics#Group theory

my understanding is, no. Neural networks cannot answer any such question. Unless an AI reaches what's called Strong AI (which basically means human-level intelligence). But at that level, the AI is simply like a human; it isn't using a neural net to solve a specific given problem, rather, it has become something else, possibly even sentient.

Also, i think it's understood that we cannot learn much from neural net other than observing the results. Other AI approaches (such as say brute force, or genetic algorithm), we can actually learn something from it. For example, by brute force, we have solved many chess end game problems, and know some theory about it.

we have also learned that some stalemate rules are bad. Our rule says after x moves the game is a draw, but computers by brute force have found forced checkmates that require more than x moves. (i think x is 50)

AlphaGo (the precursor of AlphaZero), even though it beat world champion Ke Jie and many other international go champions early in 2017, and gave us many games it played with itself, did not teach us anything about the theory of go. What we did learn is only by observing its play and theorizing ourselves.

other “simple” math problems are, for example,

is there an odd perfect number Perfect number#Odd perfect numbers

“a perfect number is a positive integer that is equal to the sum of its proper positive divisors, that is, the sum of its positive divisors excluding the number itself”.
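checking that definition by brute force is trivial in js; finding an odd example (or proving none exists) is the open problem:

```javascript
// sum the proper divisors of n (divisors excluding n itself),
// then compare with n, per the definition quoted above
function isPerfect(n) {
  let sum = 0;
  for (let d = 1; d < n; d++) {
    if (n % d === 0) sum += d;
  }
  return n > 1 && sum === n;
}

[6, 28, 496].every(isPerfect); // → true; every known perfect number is even
```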

or Boolean satisfiability problem Boolean satisfiability problem

traveling salesman problem Travelling salesman problem

am wondering, with the seemingly powerful AlphaZero, how can such neural-network-based AI tackle these kinds of concrete, absolute, seemingly simple math problems? Or is it agreed that neural networks simply cannot work on these kinds of problems?

if neural net can deal with these problems, what's the approach? are there examples?

i don't know much about neural nets, but it seems they are never used on these types of problems.

as for working on disease and other general human problems, am just wondering, what exactly are the problems in concrete terms? Since “disease” is quite general, nothing like chess or go.

suppose you write a chess program, and by brute force, you completely solve chess. That is, you've determined the optimal move for every position.

That, is the idea, and beginning, of automated theorem proving.

of course, we cannot brute force all the way, since there are more positions than we can fathom. Therefore, we try to cut corners and be smarter in our ways of enumeration, such as the neural networks of AlphaZero.

aside from that, in mathematics, we cannot even begin to brute force or neural net, since math is not codified the way chess or go are. The problem of turning a human math question into logic and into a computer is itself not a solved problem. Before we can automatically prove theorems, we need a codification of math, and that's in the realm of foundations of math.

and in this realm, even though we've made a lot of progress (or none, relative to the cosmos), there are still mysteries and unbelievers. We make do with what we can. Thus, we have “conjecture” searchers, “assisted” provers, alternative foundations such as homotopy type theory, and such. Their meaning and context evolve. Few know what they are talking about, reality speaking.

### can neural net solve math problems?

Dear Lu, here's a problem you might find illuminating.

suppose you went to RadioShack and built a tiny neural-network Artificial Intelligence. In just 1 hour of playing with itself, it plays tic-tac-toe so well that it never loses.

Now, that's some accomplishment. But now, how to solve, say, x + 1 = 2, for arbitrary constants in place of 1 and 2, with your neural net?

Can your neural net solve such math problem?

what would be your approach?

source of discussion https://plus.google.com/u/0/+johncbaez999/posts/Xk36jKsosGT

History of OCaml and Haskell Syntax, and The Next 700 Programming Languages

google's AlphaZero beats the best chess software, Stockfish. And AlphaZero learned the game from scratch, in just 4 hours.

Then, in 2 hours, it learned shogi (Japanese chess, much more complex) and beat the best shogi software.

scary AI is happening.

https://www.chess.com/news/view/google-s-alphazero-destroys-stockfish-in-100-game-match

ASCII Table (minor update)

Computer Language: Predicate Function, Terminology, and Naming Convention

### removed comment system on my site

just removed disqus comments on all my sites for now. They are now forcing image ads, and their ads are the low-quality sensational type. To go ad-free would be $10/month. But comments take 30 min/day to reply to, and 95% are garbage. (i have 5 thousand pages on my sites) might add it back, we'll see. let me know what you think.

Unicode Flags 🏁 (major rewrite)

### Unicode User Interface Icon Becomes Emoji

unicode emoji should be ban'd. Extremely annoying when you want to show a symbol and it becomes an emoji.

if you have ◀ ▶ ⏯, the last becomes an emoji.

Adding U+FE0E does not always work.

And macOS has a bug forcing emoji to be tiny, the same size as a letter. It ignores the CSS font-size spec.

and which symbols become emoji is unpredictable. On twitter, ◀ ▶ both become emoji.

ok, the whole thing is pretty fkd.
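for reference, the mechanism: append U+FE0E (text style) or U+FE0F (emoji style) after the symbol. a js sketch; whether a given renderer honors the selector is another matter:

```javascript
// U+FE0E VARIATION SELECTOR-15: request text (plain symbol) presentation
// U+FE0F VARIATION SELECTOR-16: request emoji presentation
const textStyle = (s) => s + "\uFE0E";
const emojiStyle = (s) => s + "\uFE0F";

textStyle("⏯");  // play/pause as plain symbol, where supported
emojiStyle("▶"); // play button as emoji, where supported
```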

see [Apple did not invent emoji By Eevee. At https://eev.ee/blog/2016/04/12/apple-did-not-invent-emoji/ ]

and see replies at https://twitter.com/xah_lee/status/926994405046722560

the problem of computerizing math began with: THERE EXISTS ∃, and FOR ALL ∀. #haskell #coq #typetheory

### Leon Chwistek, a Founder of Type Theory

Leon Chwistek (Kraków, Austria-Hungary, 13 June 1884 – 20 August 1944, Barvikha near Moscow, Russia) was a Polish avant-garde painter, theoretician of modern art, literary critic, logician, philosopher and mathematician.

Starting in 1929 Chwistek was a Professor of Logic at the University of Lwów in a position for which Alfred Tarski had also applied. His interests in the 1930s were in a general system of philosophy of science, which was published in a book translated in English 1948 as The Limits of Science.[1]

In the 1920s-30s, many European philosophers attempted to reform traditional philosophy by means of mathematical logic. Leon Chwistek did not believe that such reform could succeed. He thought that reality could not be described in one homogeneous system, based on the principles of formal logic, because there was not one reality but many.

Chwistek demolishes the axiomatic method by demonstrating that the extant axiomatic systems are inconsistent.[2]

2017-11-03 Wikipedia Leon Chwistek

Plants Emoji 🌵 (added a macOS screenshot)

xah edu corner extempore! episode №20171101042745, ban recursion in programing languages

1 with regards to computer science, recursion should be ban'd in programing languages.

2 it's got the chicken n egg problem: before it's defined, it's calling itself. Like russell's paradox, or 1st order logic in twilight zone

3 But in math, we have recursive relation, and comp sci recursive theory. How to resolve that?

4 in math, the nth term is defined by the previous term, and f[1] is defined non-recursively. so, it's well defined. In a sense, no “recursion”

5 in most programing langs, the body of a recursive f uses “if” to check input. So, “no recursion” really. But the chicken n egg remains in defining f.

6 some langs (Mathematica, haskell) can define f(1) and f(n) separately. So, no chicken n egg recursive definition problem.

7 actually, the chicken n egg recursive definition problem remains, with respect to order of evaluation.

need to think about this more
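to make points 5 and 6 concrete, a js sketch with factorial (factBase and factGeneral are made-up names; the second form only emulates the Mathematica/Haskell separate-clause style):

```javascript
// point 5: base case via “if” inside the body; the body names itself
function factIf(n) {
  if (n === 1) return 1;
  return n * factIf(n - 1);
}

// point 6: each case is its own entry. f(1) is a separate clause,
// and the general clause receives its recursion handle as a parameter,
// so the clause bodies never name themselves directly
const factBase = new Map([[1, () => 1]]);
const factGeneral = (n, self) => n * self(n - 1);
function fact(n) {
  const clause = factBase.get(n);
  return clause ? clause() : factGeneral(n, fact);
}
```

the dispatcher `fact` still closes the loop, which is point 7: the chicken n egg moves into the order of evaluation rather than disappearing.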

### Quiz, Write a NestList Function

Quiz. write a function r(f,x,n) that returns a list [f(x), f(f(x)), ...], length n. write in your fav lang.

f is function (e.g. f(x) = x+1), x is a number, n is a number ≥ 1. we want [f(x), f(f(x)), ...]

#haskell #javascript #golang #clojure

Someone asked why this is useful. For example, the factorial function, or the fibonacci sequence. In math it happens often. Check out the logistic map, iterated function systems, dynamical systems, the mandelbrot set.
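one possible answer in js (named nestList after the Mathematica function this quiz is modeled on):

```javascript
// r(f, x, n): apply f repeatedly, collecting [f(x), f(f(x)), ...], length n
function nestList(f, x, n) {
  const result = [];
  for (let i = 0; i < n; i++) {
    x = f(x);
    result.push(x);
  }
  return result;
}

nestList((x) => x + 1, 0, 4); // → [1, 2, 3, 4]
nestList((x) => 3.5 * x * (1 - x), 0.5, 3); // logistic map iterates
```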

comment at

https://noagendasocial.com/@xahlee/98929138430987793

### programing language popularity 2017

lol.

wait, why is haskell on the left side?

Unicode search at Unicode Characters ☯ ⚡ ∑ ♥ 😄

### programers and docs

remember, boys n girls, there's no lang that has rigorous math-like doc or spec. None.

2, in programing, if you spend 1 min with good doc, you spend 1 hour without, or even 10. When there's no doc, 10 days.

3, but then why doesn't the programing community appreciate or have good docs? because

4, ① the nature of code changes all the time. Docs usually don't keep up.

5, ② it's hard to convert how-to into what-is; the latter is math-style doc/spec.

6, ③ docs in software are literally useless, in some sense. They add nothing to the software's behavior.

7, ④ programers, partly due to the above, don't know how to write well.

### there's no lang in practical use that has rigorous math-like doc or spec

been reading math 2 hours a day in past months. what a joy, in contrast to reading programing docs n lang specs. Programers are such idiots.

programers don't appreciate good docs. n they have this nasty concept of “grok” (from unix fkheads), n in a flash they'll tell you to dig the source code.

there's no lang in practical use that has a rigorous math-like doc or spec. #Haskell? #OCaml? lol, they've got the worst “grok it” docs and specs.

yet the haskell fkheads are like, “algebraic” data structure and monoid and stuff. Each one sounds like a superior mathematician. Monad ya ass.

homotopy, a continuous deformation between 2 functions. How can such a topology / differential geometry notion be tied to logic, set theory, foundations of math? that's the story of homotopy type theory. Absolutely fascinating.

### programers and math jargons

#math if you haven't studied group theory before, do so now. Wikipedia article is very good.

after Wikipedia #math group, read http://xahlee.info/Wallpaper_dir/c0_WallPaper.html

when programers use math jargons, they dunno which side is ass, which is mouth. #haskell #lisp

if a programer mention idempotent monad directed graph, n they can't talk basic abstract algebra, tell them 2 shut piehole #haskell #lisp

programers talking garbage math jargon happens, from 1990s perl and sql to 2000 lisper homoiconicity to today js haskell category idiots.

programers and mathematicians are very distinct communities. The 2 basically don't communicate, not unlike engineers and lawyers.

mathematicians, in general, look down on programing. They dunno what a subroutine, function, object, or class is.

programers usually look up to and idolize math, yet have 0 clue. you wouldn't have a clue about math unless you had 3 years' worth of undergraduate MATH MAJOR.

now n then we see hacker idiots discuss how important math is to programing. that's, like, guys in a bar on the tao of the quantum cosmos.

### xah lee, schizoid

- 1, due to my public website since 1995, i've talked to lots people, coders, geeks, and many weird people. (same ilk attract)
- 2, Usually, they know me, but i don't know/remember people. (plus, they are often anonymous)
- 3, It has happened quite a few times, in argument about coding or other, something ticked me off, and my screed turned supporters/fans to stone.
- 4, am a schizoid. That basically means a loner, or a person with very little emotion. Any attachment or relationship troubles us greatly.

### Google Doodle? Never click it

you see those Google Doodle? Never, ever, click it or read about it. If you do, your brain is tainted. This is similar to never watch TV.

Google Doodle was fun in 2000s. It's casual, non-intentional. Today, it's commercialization plus propaganda.

### pngquant?

there's an idiotic program called pngquant.

it reduces png file size by lossy compression.

if you want lossy, go to jpg or webp

in September, i'll be blogging on my patreon account only.

https://www.patreon.com/xahlee

If you like my stuff, i hope you patreon me there.

to my patreon supporters, new article https://www.patreon.com/posts/13809835

### golang's choice of tab for indentation

golang's choice of tab for indentation is the correct one. However, emacs golang mode forcing it to be DISPLAYED as 8 spaces is most idiotic. It undoes the correct thinking.

See also: Programing: Tab vs Space in Source Code

### golang is superb

golang is truly a simple, superb, practical language. Plus real functional programing features. And fast! Puts clojure and haskell to shame.

despite my supreme love for functional programing, i'd say clojure is a complex idiocy, on so many levels. And Haskell too.

my golang tutorial is coming into shape.

See also: Clojure Tutorial

my site ranking, i think that's the highest.

find some sites you know, and let me know what you get. On twitter, Google Plus.

See also: Practical git in 1 Hour

### programer hieroglyph

[see Egyptian Hieroglyph 𓂀]

### be my first patreon

now i have a patreon account. be my first patreon. see my first post at https://www.patreon.com/xahlee

Java: Unicode in Java (minor update)

drawing a maze with Unicode. Unicode Box Lines, Shapes ┌ ┬ ┐

Jargon Lambda in Decline (expanded for the general public.)

Ask me question on patreon