Xah Programing Blog
syntactic obsession, imperative despisal, unicode love ♥, hacker hate
xah reddit channel
been posting to my reddit r/Xah. see https://www.reddit.com/r/Xah/
just trying it out. If you are a reddit user and find my posts there convenient, vote them up. If i don't see many users, i'll stop doing it.
brave browser annoyances
.@brave browser's allow/deny notification bar is annoying. It pops up for fullscreen, notifications, and other things. If you ignore it, the bar doesn't go away but keeps adding space to the top of the browser. And when there are multiple bars, you might click the wrong allow/deny button.
another thing that's annoying in @brave browser: when you hover the mouse over a tab, it temporarily shows that tab's content (in a grayish fog). Very annoying and confusing, especially if you have hover/dwell click turned on in the OS.
xahlee info popularity ranking 2018-01-15
climbing the ladder.
how to become a mathematician for programers
xah's edu corner extempore! episode №20180115122340 on math courses, and how to become a mathematician for programers
for young coders out there, the proper order of learning math is: highschool algebra, trig, pre-calc, calculus, linear algebra, then optionally intro differential equations. These are the basics, and must be learned in that order. They are what engineering requires, but they are not math-major basics yet.
for a math major (to become a mathematician), you need abstract algebra and real analysis. Order doesn't matter. After you have these, you are acquainted with the math “language”, that is, math maturity: the way mathematicians talk. These are typically 3rd-year math-major courses.
after that, you may take the following in arbitrary order: complex analysis, differential geometry, topology, set theory. After studying these, you can consider yourself a mathematician. You know what math is about, or know where to go. All the above are the traditional main math courses.
for programers, you might wonder: where are graph theory, type theory, game theory, logic, combinatorics, statistics? These are not typical main courses of a MATHEMATICIAN. They are called discrete math, sometimes filed under comp sci. (While statistics and probability are applied math.)
the discrete math subjects do not have the elaborate prerequisite sequence that analysis/algebra have. Anyone can start to learn graph theory, game theory, combinatorics, number theory, etc. But you don't get deep without the years of analysis/algebra, the real mathematician stuff.
See also: Free Math Textbooks
Logical Operators, Truth Table, Unicode (updated)
Programing: “or” Considered Harmful (updated)
Proliferation of Computing Languages. There are so many in the past 5 years that people don't care anymore. Certainly, i stopped listing new ones. #elmlang #purescript #TypeScript #haskell
ask Xah, 2018-01-07
〔 CoffeeMiner: Hacking WiFi to inject cryptocurrency miner to HTML requests By Arnau Code. At http://arnaucode.com/blog/coffeeminer-hacking-wifi-cryptocurrency-miner.html 〕
seems like a good article.
pop stars and programer stars
pop stars quietly drop out every decade or two. And this is also true of programer stars. A star everybody knows and talks about now, but you don't know or recall who it was in the previous decade.
classes of the rich, and classes of programers, and you don't know nothing
when something is beyond us, we can't distinguish the levels. e.g. some are millionaires, some are billionaires. To us, the difference is incomprehensible; we group them all as just “rich”. To them, it's a thousand times different, literally. This goes for pop stars, politicians, businessmen.
Similarly, in programing, to outsiders we are all smart nerds beyond their comprehension. But to us, there are script kiddies and “web designers”, all the way up to those who wrote the Google search engine. This applies to any field or community you are not familiar with. Which basically means, everything.
intel chip bug Meltdown and Spectre
just read a few articles/papers about the intel bugs Meltdown and Spectre. Meltdown requires an OS kernel patch, causing 5% to 30% slowdown for system-call-intensive apps (networking/file). And Spectre is unfixable.
We are screwed. By next week, nobody will remember the problems, except hackers and governments. But have a cookie: you are screwed already anyway, with your phone and USB and ssh etc.
if you are not a computer nerd, here's the gist: hackers discovered one very big hole to your password, and there's nothing you can do.
instead of Santa Claus, Unicode calls it Father Christmas.
🎄 🎅 🤶 ⿅ 𐂂 🦌
see Unicode Search
so why does Unicode call Santa Claus “Father Christmas”? Wikipedia has the history https://en.wikipedia.org/wiki/Father_Christmas
but be warned it probably contains biased writing.
this Wikipedia version from 2005 gives an easier-to-follow picture of the history of Father Christmas https://en.wikipedia.org/w/index.php?title=Father_Christmas&oldid=33065629
Father Christmas is a name used in the United Kingdom, Australia, New Zealand and several other Commonwealth Countries, as well as Ireland, for the gift bringing figure of Christmas or yuletide. Although Father Christmas, Saint Nicholas and Santa Claus (the latter deriving from the Dutch for Saint Nicholas: Sinterklaas), are now used interchangeably, the origins of Father Christmas are quite different.
Dating back to Norse mythology, Father Christmas has his roots in Paganism. Before Christianity came to British shores it was customary for an elder man from the community to dress in furs and visit each dwelling. At each house, in the guise of "Old Winter" he would be plied with food and drink before moving on to the next. It was thought he carried the spirit of the winter with him, and that the winter would be kind to anyone hospitable to Old Winter. The custom was still kept in Medieval England, and after a decline during the Commonwealth, became widespread again during the Restoration period. A book dating from the time of the Commonwealth, The Vindication of CHRISTMAS or, His Twelve Yeares' Observations upon the Times involved Father Christmas advocating a merry, alcoholic Christmas and casting aspersions on the charitable motives of the ruling Puritans.
He was neither a gift bringer nor was he associated with children. During the Victorian era when Santa Claus arrived from America he was merged with “Old Winter”, “Old Christmas” or “Old Father Christmas” to create Father Christmas, the British Santa which survives today.
Does Automated Theorem Prover Exist? Can Neural Net Solve Math Problems?
can neural net solve the Collatz conjecture problem?
The Collatz conjecture is a conjecture in mathematics that concerns a sequence defined as follows: start with any positive integer n. Then each term is obtained from the previous term as follows: if the previous term is even, the next term is one half the previous term. Otherwise, the next term is 3 times the previous term plus 1. The conjecture is that no matter what value of n, the sequence will always reach 1.
[2017-12-14 Wikipedia Collatz conjecture]
this is an interesting question, because it is a simplified version of whether neural-network-based AI can answer math problems. “Math problems” here means, for example, any of
the Millennium Prize Problems
The above math problems are complex to computerize. Computerizing math is not a solved problem.
So, we might ask instead, whether neural network can solve math problems that are simple to computerize. For example,
is there an odd perfect number? (a perfect number is a positive integer that's equal to the sum of its divisors excluding itself)
or unsolved problems in group theory [List of unsolved problems in mathematics § Group theory]
my understanding is, no. Neural networks cannot answer any such question. Unless an AI reaches what's called Strong AI (which basically means human-level intelligence). But at that level, the AI is simply like a human; it isn't using a neural net to solve a specific given problem. Rather, it has become something else, even possibly sentient.
Also, i think it's understood that we cannot learn much from a neural net other than by observing its results. From other AI approaches (such as, say, brute force, or genetic algorithms), we can actually learn something. For example, by brute force, we have solved many chess endgame problems, and know some theory about them.
we have also learned that some stalemate rules are bad. Our rule says that after x moves the game is a draw, but computers, by brute force, have found forced checkmates that require more than x moves. (i think x is 50.)
AlphaGo (the precursor of AlphaZero) beat world champion Ke Jie and many other international go champions early in 2017, and has given us many games it played with itself. However, we did not learn anything about the theory of go. What we did learn is only by observing its play, and theorizing ourselves.
other “simple” math problems are, for example,
is there an odd perfect number? [Perfect number § Odd perfect numbers]
“a perfect number is a positive integer that is equal to the sum of its proper positive divisors, that is, the sum of its positive divisors excluding the number itself”.
or the Boolean satisfiability problem
the travelling salesman problem
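The perfect-number definition above is easy to brute-force for small numbers. A minimal Go sketch (`isPerfect` is my naming, not from the post), collecting divisors in pairs by trial division up to √n:

```go
package main

import "fmt"

// isPerfect reports whether n equals the sum of its proper divisors.
func isPerfect(n int) bool {
	if n < 2 {
		return false
	}
	sum := 1 // 1 divides every n > 1
	for d := 2; d*d <= n; d++ {
		if n%d == 0 {
			sum += d
			if d != n/d {
				sum += n / d // the paired divisor
			}
		}
	}
	return sum == n
}

func main() {
	// Brute-force search: the even perfect numbers appear;
	// no odd one has ever been found.
	for n := 2; n <= 10000; n++ {
		if isPerfect(n) {
			fmt.Println(n) // prints 6, 28, 496, 8128
		}
	}
}
```

This is exactly the kind of concrete, fully codified question asked above: trivially checkable case by case, yet open in general.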
am wondering, with the seemingly powerful AlphaZero, how such neural-network-based AI can tackle these kinds of concrete, absolute, seemingly simple math problems. Or, is it agreed that neural networks simply cannot work on these kinds of problems?
if neural net can deal with these problems, what's the approach? are there examples?
i don't know much about neural nets, but it seems they are never used on these types of problems.
as for working on disease and other general human problems, am just wondering, what exactly are the problems, in concrete terms? Since “disease” is quite general, nothing like chess or go.
suppose you write a chess program, and by brute force, you completely solve chess. That is, you've determined the optimal move for every position.
That is the idea, and the beginning, of automated theorem proving.
of course, we cannot brute-force all the way, since there are more positions than we can fathom. Therefore, we try to cut corners, and be smarter in our ways of enumeration, such as with the neural networks of AlphaZero.
aside from that, with mathematics we cannot even begin to brute-force or neural-net, since math is not codified the way chess or go is. The problem of turning a human math question into logic and into a computer is itself not a solved problem. Before we can automatically prove theorems, we need a codification of math, and that's in the realm of the foundations of math.
and in this realm, even though we have made a lot of progress (or none, relative to the cosmos), there are still mysteries and unbelievers and glory holes. We make do with what we can. Thus, we have “conjecture” searchers, “assisted” provers, alternative foundations such as homotopy type theory, and such. Their meaning and context evolve. Few know what they are talking about, realistically speaking.
can neural net solve math problems?
Dear Lu, here's a problem you might find illuminating.
suppose you went to RadioShack and built a tiny neural-network Artificial Intelligence program. In just 1 hour of playing with itself, it plays tic-tac-toe so well that it never loses.
Now, that's some accomplishment. But now, how do you solve, say, x + 1 = 2 (for arbitrary constants in place of the 1 and 2) with your neural net?
Can your neural net solve such math problem?
what would be your approach?
source of discussion https://plus.google.com/u/0/+johncbaez999/posts/Xk36jKsosGT
Google's AlphaZero beats the best chess software, Stockfish. And AlphaZero learned the game from scratch, in only 4 hours.
Then, in 2 hours, it learned shogi (Japanese chess, much more complex) and beat the best shogi software.
scary AI is happening.
ASCII Table (minor update)
removed comment system on my site
just removed disqus comments on all my sites for now. They are now forcing image ads, and their ads are the low-quality sensational type. To opt out of ads would cost $10/month. But comments take 30 min/day to reply to, and 95% are garbage. (i have 5 thousand pages on my sites.) Might add it back, we'll see. Let me know what you think.
Unicode Flags 🏁 (major rewrite)
Unicode User Interface Icon Becomes Emoji
unicode emoji should be banned. It is extremely annoying to show a symbol and have it become an emoji.
if you have ◀ ▶ ⏯, the last becomes an emoji.
Adding U+FE0E does not always work.
And macOS has a bug forcing emoji to be tiny, the same size as a letter. It ignores the CSS font-size spec.
and which symbols become emoji is unpredictable. On Twitter, ◀ ▶ both become emoji.
ok, the whole thing is pretty fkd.
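You can experiment with the variation selectors directly. A minimal Go sketch (how each line actually renders depends entirely on your terminal, browser, or font, which is exactly the problem):

```go
package main

import "fmt"

func main() {
	// U+25B6 BLACK RIGHT-POINTING TRIANGLE, often forced to emoji form.
	// Appending U+FE0E (VARIATION SELECTOR-15) requests text presentation;
	// appending U+FE0F (VARIATION SELECTOR-16) requests emoji presentation.
	// Renderers honor these inconsistently.
	fmt.Println("\u25B6")       // platform default
	fmt.Println("\u25B6\uFE0E") // text style requested
	fmt.Println("\u25B6\uFE0F") // emoji style requested
}
```

The selectors are real characters in the string (3 UTF-8 bytes each), so copy-pasting the output carries the request along, for whatever the next renderer decides to do with it.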
see 〔 Apple did not invent emoji By Eevee. At https://eev.ee/blog/2016/04/12/apple-did-not-invent-emoji/ 〕
and see replies at https://twitter.com/xah_lee/status/926994405046722560
the problem of computerizing math began with: THERE EXISTS ∃, and FOR ALL ∀. #haskell #coq #typetheory
Leon Chwistek, a Founder of Type Theory
Leon Chwistek (Kraków, Austria-Hungary, 13 June 1884 – 20 August 1944, Barvikha near Moscow, Russia) was a Polish avant-garde painter, theoretician of modern art, literary critic, logician, philosopher and mathematician.
Starting in 1929 Chwistek was a Professor of Logic at the University of Lwów in a position for which Alfred Tarski had also applied. His interests in the 1930s were in a general system of philosophy of science, which was published in a book translated in English 1948 as The Limits of Science.
In the 1920s-30s, many European philosophers attempted to reform traditional philosophy by means of mathematical logic. Leon Chwistek did not believe that such reform could succeed. He thought that reality could not be described in one homogeneous system, based on the principles of formal logic, because there was not one reality but many.
Chwistek demolishes the axiomatic method by demonstrating that the extant axiomatic systems are inconsistent.
2017-11-03 Wikipedia Leon Chwistek
Plants Emoji 🌵 🎄 🌷 (added a macOS screenshot)
xah edu corner extempore! episode №20171101042745, ban recursion in programing languages
1 with regards to computer science, recursion should be ban'd in programing languages.
2 it's got the chicken-and-egg problem: before it's defined, it's calling itself. Like Russell's paradox, or first-order logic in the twilight zone
3 But in math, we have recursive relation, and comp sci recursive theory. How to resolve that?
4 in math, nth term is defined by previous term, and f is defined non-recursively. so, it's well defined. In a sense, no “recursion”
5 in most programing langs, the body of a recursive f uses “if” to check the input. So, “no recursion” really. But the chicken-and-egg remains in defining f.
6 some langs (Mathematica, Haskell) can define f(1) and f(n) separately. So, no chicken-and-egg recursive-definition problem.
7 actually, the chicken-and-egg recursive-definition problem remains, with respect to order of evaluation.
need to think about this more
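Point 5 above can be illustrated with a minimal Go sketch (my example, not from the post): the base case is an “if” check inside the single function body, since Go cannot split f(1) and f(n) into separate clauses the way Mathematica or Haskell can:

```go
package main

import "fmt"

// factorial checks its input with “if”, per point 5: the base case
// and the self-referencing case live in one body, because Go has no
// pattern-matched clauses à la Mathematica or Haskell.
func factorial(n int) int {
	if n <= 1 {
		return 1 // base case: stops the self-reference
	}
	return n * factorial(n-1) // the “recursion”
}

func main() {
	fmt.Println(factorial(5)) // 120
}
```

So at runtime there is “no recursion”, just repeated calls that bottom out at the base case; the chicken-and-egg is only in the definition referring to its own name.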
Quiz, Write a NestList Function
Quiz. Write a function r(f,x,n) that returns a list [f(x), f(f(x)), ...] of length n. Write it in your fav lang.
f is a function (e.g. f(x) = x+1), x is a number, n is a number ≥ 1. We want [f(x), f(f(x)), ...]
Someone asked why this is useful. For example, the factorial function, or the Fibonacci sequence. In math it happens often. Check out the logistic map, iterated function systems, dynamical systems, the Mandelbrot set.
programing language popularity 2017
wait, why is haskell on the left side?
Unicode search at Unicode Characters ☯ ⚡ ∑ ♥ 😄
programers and docs
remember, boys n girls, there's no lang that has rigorous math-like doc or spec. None.
2, in programing, if you spend 1 minute with good doc, you spend 1 hour without, or even 10. When there's no doc, 10 days.
3, but then why doesn't the programing community appreciate or have good doc? Because:
4, ① the nature of code, changes all the time. Docs usually don't keep up.
5, ② it's hard to convert how-to into what-is, the latter is math style doc/spec.
6, ③ doc in software is literally useless, in some sense. It adds nothing to the software's behavior.
7, ④ programers, partly due to above, don't know how to write well.
there's no lang in practical use that has rigorous math-like doc or spec
been reading math 2 hours a day in past months. What a joy. In contrast to reading programing doc and lang specs. Programers are such idiots.
programers don't appreciate good docs. And they have this nasty concept of “grok” (from unix fkheads), and in a flash they'll tell you to dig into the source code.
there's no lang in practical use that has rigorous math-like doc or spec. #Haskell? #OCaml? lol, they've got the worst “grok it” doc and spec.
yet the haskell fkheads are like, “algebraic” data structures and monoids and stuff. Each one sounds like a superior mathematician. Monad ya ass.
homotopy: a continuous deformation between 2 functions. How can such a topology and differential geometry notion be tied to logic, set theory, the foundations of math? That's the story of homotopy type theory. Absolutely fascinating.
programers and math jargons
#math if you haven't studied group theory before, do so now. Wikipedia article is very good.
after Wikipedia #math group, read http://xahlee.info/Wallpaper_dir/c0_WallPaper.html
when programers use math jargons, they dunno which side is ass, which is mouth. #haskell #lisp
if a programer mention idempotent monad directed graph, n they can't talk basic abstract algebra, tell them 2 shut piehole #haskell #lisp
programers talking garbage math jargon happens, from 1990s perl and sql to 2000 lisper homoiconicity to today js haskell category idiots.
programers and mathematicians are very distinct communities. The 2 basically don't communicate, not unlike engineers and lawyers.
mathematicians, in general look down on programing. They dunno what's a subroutine, function, object, class.
programers usually look up to and idolize math, yet have 0 clue. You wouldn't have a clue about math unless you had 3 years' worth of undergraduate MATH MAJOR courses.
now n then we see hacker idiots discuss how important is math to programing. that's, like, guys in bar on the tao of quantum cosmos.
xah lee, schizoid
- 1, due to my public website since 1995, i've talked to lots of people: coders, geeks, and many weird people. (same ilk attracts)
- 2, Usually, they know me, but i don't know/remember people. (plus, they are often anonymous)
- 3, It has happened quite a few times, in argument about coding or other, something ticked me off, and my screed turned supporters/fans to stone.
- 4, am a schizoid. That basically means a loner, a person with very little emotion. Any attachment or relationship troubles us greatly.
Google Doodle? Never click it
you see those Google Doodle? Never, ever, click it or read about it. If you do, your brain is tainted. This is similar to never watch TV.
Google Doodle was fun in 2000s. It's casual, non-intentional. Today, it's commercialization plus propaganda.
there's an idiotic program called pngquant.
it reduces png file size by lossy compression.
if you want lossy, go to jpg or webp.
in September, i'll be blogging on my patreon account only.
If you like my stuff, i hope you patreon me there.
to my patreon supporters, new article https://www.patreon.com/posts/13809835
golang's choice of tab for indentation
golang's choice of tab for indentation is the correct one. However, emacs go-mode forcing it to be DISPLAYED as 8 spaces is most idiotic. It undoes the correct thinking.
See also: Programing: Tab vs Space in Source Code
golang is superb
golang is truly a simple, superb, practical language. Plus real functional programing features. And fast! Puts Clojure and Haskell to shame.
despite my supreme love for functional programing, i'd say, clojure is a complex idiocy, on so many levels. And Haskell too.
my golang tutorial is coming in shape.
See also: Xah Clojure Tutorial
my site ranking, i think that's the highest.
find some sites you know, and let me know what you get. On twitter, Google Plus.
See also: Practical git in 1 Hour
[see Egyptian Hieroglyph 𓂀]
be my first patron
now i have a patreon account. Be my first patron. See the first post at https://www.patreon.com/xahlee
Java: Unicode in Java (minor update)
drawing a maze with Unicode. Unicode Box Lines, Shapes ┌ ┬ ┐
Jargon Lambda in Decline (expanded for the general public.)
Ask me question on patreon