climbing the ladder.

arduous climb.

xah's edu corner extempore! episode №20180115122340 on math courses, and how to become a mathematician for programmers

for young coders out there, the proper order of learning math is: high school algebra, trig, pre-calc, calculus, linear algebra, then optionally intro differential equations. These are the basics, and must be learned in that order. These are the basics needed for engineering, but are not math-major basics yet.

for a math major (to become a mathematician), you need abstract algebra and real analysis. Order doesn't matter. After you have these, you are acquainted with the math “language”, or math maturity, i.e. the way mathematicians talk. These are typically 3rd-year math-major courses.

after that, you may take the following in arbitrary order: complex analysis, differential geometry, topology, set theory. After studying these, you can consider yourself a mathematician. You know what math is about, or know where to go. All of the above are the traditional main math courses.

for programmers, you might wonder: where are graph theory, type theory, game theory, logic, combinatorics, statistics? These are not typical main courses for a MATHEMATICIAN. These are called discrete math, sometimes classed as comp sci. (while statistics and probability are applied math)

the discrete math subjects do not have the elaborate prerequisite sequence that analysis/algebra do. Anyone can start to learn graph theory, game theory, combinatorics, number theory, etc. But you don't get deep without the years of analysis/algebra, the real mathematician stuff.

See also: Free Math Textbooks

Logical Operators, Truth Table, Unicode (updated)

Programing: “or” Considered Harmful (updated)

〔 CoffeeMiner: Hacking WiFi to inject cryptocurrency miner to HTML requests By Arnau Code. At http://arnaucode.com/blog/coffeeminer-hacking-wifi-cryptocurrency-miner.html 〕

seems like a good article.

See also: http://xahlee.info/linux/linux_iptables_basics.html

pop stars quietly drop out every decade or two. And this is also true for programmer stars. A star everybody knows and talks about now, but you don't know or recall who it was in the previous decade.

when something is beyond us, we can't distinguish the levels. e.g. some are millionaires, and some are billionaires. To us, we don't comprehend; we group them as just “rich”. To them, it's a thousand times different, literally. This goes for pop stars, politicians, businessmen.

Similarly, in programming, to outsiders, we are all smart nerds beyond their comprehension. But to us, there are script kiddies and “web designers”, up to those who wrote the google search engine. This applies to any field or community you are not familiar with. Which basically means, everything.

just read a few articles/papers about the intel bugs Meltdown and Spectre. Meltdown requires an OS kernel patch, causing a 5% to 30% slowdown for system-call-intensive apps (networking/file). And Spectre is unfixable.

We are screwed. by next week, nobody will remember the problems, except hackers and governments. But have a cookie; you are screwed already anyway with your phone and usb and ssh etc.

if you are not a computer nerd, here's the gist: hackers discovered one very big hole to your passwords, and there's nothing you can do.

instead of Santa Claus, Unicode calls it Father Christmas.

🎄 🎅 🤶 ⿅ 𐂂 🦌

see Unicode Search

so why does Unicode call Santa Claus “Father Christmas”? Wikipedia has the history: https://en.wikipedia.org/wiki/Father_Christmas

but be warned it probably contains biased writing.

this Wikipedia version from 2005 gives an easier-to-follow picture of the history of Father Christmas: https://en.wikipedia.org/w/index.php?title=Father_Christmas&oldid=33065629

quote:

Father Christmas is a name used in the United Kingdom, Australia, New Zealand and several other Commonwealth Countries, as well as Ireland, for the gift bringing figure of Christmas or yuletide. Although Father Christmas, Saint Nicholas and Santa Claus (the latter deriving from the Dutch for Saint Nicholas: Sinterklaas), are now used interchangeably, the origins of Father Christmas are quite different.

Dating back to Norse mythology, Father Christmas has his roots in Paganism. Before Christianity came to British shores it was customary for an elder man from the community to dress in furs and visit each dwelling[citation needed]. At each house, in the guise of "Old Winter" he would be plied with food and drink before moving on to the next. It was thought he carried the spirit of the winter with him, and that the winter would be kind to anyone hospitable to Old Winter. The custom was still kept in Medieval England, and after a decline during the Commonwealth, became widespread again during the Restoration period. A book dating from the time of the Commonwealth, The Vindication of CHRISTMAS or, His Twelve Yeares' Observations upon the Times involved Father Christmas advocating a merry, alcoholic Christmas and casting aspersions on the charitable motives of the ruling Puritans.

He was neither a gift bringer nor was he associated with children. During the Victorian era when Santa Claus arrived from America he was merged with “Old Winter”, “Old Christmas” or “Old Father Christmas” to create Father Christmas, the British Santa which survives today.

suppose you write a chess program, and by brute force, you completely solve chess. That is, you've determined the optimal move for every position.

That is the idea, and the beginning, of automated theorem proving.

of course, we cannot brute force all the way, since there are more positions than we can fathom. Therefore, we try to cut corners and be smarter in our ways of enumeration, such as with the neural networks of AlphaZero.

aside from that, for mathematics, we cannot even begin to brute force or apply neural nets, since math is not codified the way chess or go is. The problem of turning a human math question into logic and into the computer is itself not a solved problem. Before we can automatically prove theorems, we need a codification of math, and that's in the realm of the foundations of math.

and in this realm, even though we've made a lot of progress (or none, relative to the cosmos), there are still mysteries, unbelievers, and gaping holes. We make do with what we can. Thus, we have “conjecture” searchers, “assisted” provers, alternative foundations such as homotopy type theory, and such. Their meaning and context evolve. Few know what they are talking about, realistically speaking.

Dear Lu, here's a problem you might find illuminating.

suppose you went to RadioShack and built a tiny neural network AI. In just 1 hour of self-training, it plays tic-tac-toe so well that it never loses.

Now, that's some accomplishment. But now, how do you solve, say, x + 1 = 2, for arbitrary values in place of the 1 and 2, with your neural net?

Can your neural net solve such a math problem?

source of discussion https://plus.google.com/u/0/+johncbaez999/posts/Xk36jKsosGT

ASCII Table (minor update)

just removed disqus comments on all my sites for now. They are now forcing image ads, and their ads are the low-quality sensational type. To opt out of the ads would be $10/month. But comments take 30 min/day to reply to, and 95% are garbage. (i have 5 thousand pages on my sites) might add it back; we'll see. let me know what you think.

Unicode Flags 🏁 (major rewrite)

unicode emoji should be banned. Extremely annoying: you want to show a symbol, and it becomes an emoji.

if you have ◀ ▶ ⏯, the last becomes an emoji.

Adding U+FE0E does not always work.
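To make the mechanism concrete, here's a minimal JavaScript sketch of how the variation selector is used: appending U+FE0E (VARIATION SELECTOR-15) after a character requests text presentation, and U+FE0F requests emoji presentation. As noted, whether the renderer actually honors the request is up to the font and platform.

```javascript
// U+23EF ⏯ PLAY OR PAUSE BUTTON often renders as emoji by default.
const playPause = "\u23EF";

// Append U+FE0E (VARIATION SELECTOR-15) to request text presentation,
// or U+FE0F (VARIATION SELECTOR-16) to request emoji presentation.
const asText = playPause + "\uFE0E";
const asEmoji = playPause + "\uFE0F";

// The selector is an invisible extra code point, carried in the string:
console.log(asText.length);  // 2 — base char plus selector, one UTF-16 code unit each
console.log([...asEmoji].map(c => c.codePointAt(0).toString(16))); // [ '23ef', 'fe0f' ]
```

This only expresses a preference; a renderer with no text glyph for the character can ignore it, which is why the trick “does not always work”.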

And macOS has a bug forcing emoji to be tiny, the same size as a letter. It ignores the CSS font-size spec.

and which symbols will become emoji is unpredictable. On twitter, ◀ ▶ both become emoji.

ok, the whole thing is pretty fkd.

see 〔 Apple did not invent emoji By Eevee. At https://eev.ee/blog/2016/04/12/apple-did-not-invent-emoji/ 〕

and see replies at https://twitter.com/xah_lee/status/926994405046722560

the problem of computerizing math began with: THERE EXISTS ∃, and FOR ALL ∀. #haskell #coq #typetheory
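As an illustration of what codifying a quantified statement looks like, here is a small sketch in Lean-style notation (a generic example of mine, not tied to any particular development): the claim “for every natural number there exists a bigger one”, with its proof term.

```lean
-- "for all natural numbers n, there exists an m greater than n"
example : ∀ n : Nat, ∃ m : Nat, m > n :=
  fun n => ⟨n + 1, Nat.lt_succ_self n⟩  -- witness m = n + 1, proof that n < n + 1
```

The ∀ becomes a function taking `n`, and the ∃ becomes a pair: a witness plus a proof about it. This is the kind of codification that proof assistants require before any automation can begin.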

Leon Chwistek (Kraków, Austria-Hungary, 13 June 1884 – 20 August 1944, Barvikha near Moscow, Russia) was a Polish avant-garde painter, theoretician of modern art, literary critic, logician, philosopher and mathematician.

Starting in 1929 Chwistek was a Professor of Logic at the University of Lwów in a position for which Alfred Tarski had also applied. His interests in the 1930s were in a general system of philosophy of science, which was published in a book translated in English 1948 as The Limits of Science.[1]

In the 1920s-30s, many European philosophers attempted to reform traditional philosophy by means of mathematical logic. Leon Chwistek did not believe that such reform could succeed. He thought that reality could not be described in one homogeneous system, based on the principles of formal logic, because there was not one reality but many.

Chwistek demolishes the axiomatic method by demonstrating that the extant axiomatic systems are inconsistent.[2]

2017-11-03 Wikipedia Leon Chwistek

Plants Emoji 🌵 🎄 🌷 (added a macOS screenshot)

post deleted

Quiz. Write a function r(f,x,n) that returns a list [f(x), f(f(x)), ...] of length n. Write it in your favorite language.

f is a function (e.g. f(x) = x+1), x is a number, n is a number ≥ 1. We want [f(x), f(f(x)), ...].

#haskell #javascript #golang #clojure

Someone asked why this is useful. For example: the factorial function, or the Fibonacci sequence. In math it happens often. Check out “logistic map”, “iterated function system”, or “dynamical systems”.
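Here's one possible answer to the quiz in JavaScript (just my take, many ways exist): iterate f, collecting each successive result.

```javascript
// r(f, x, n): return the list [f(x), f(f(x)), ...], of length n.
function r(f, x, n) {
  const result = [];
  let v = x;
  for (let i = 0; i < n; i++) {
    v = f(v);        // apply f once more
    result.push(v);  // collect this iterate
  }
  return result;
}

// example: f(x) = x + 1, starting at 0, 5 iterates
console.log(r(x => x + 1, 0, 5)); // → [1, 2, 3, 4, 5]
```

In languages with an iterate primitive it's a one-liner, e.g. Haskell's `take n (tail (iterate f x))`.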

comment at

https://noagendasocial.com/@xahlee/98929138430987793

lol.

wait, why is haskell on the left side?

Unicode search at Unicode Characters ☯ ⚡ ∑ ♥ 😄

remember, boys and girls, there's no lang that has rigorous math-like doc or spec. None. http://xahlee.info/comp/blog.html #haskell #ocaml

2, in programming, if you spend 1 min with good doc, you spend 1 hour without, or even 10. When there's no doc, 10 days.

3, but then why doesn't the programming community appreciate or have good doc? because:

4, ① the nature of code: it changes all the time. Docs usually don't keep up.

5, ② it's hard to convert how-to into what-is; the latter is math-style doc/spec.

6, ③ doc in software is literally useless, in some sense. It adds nothing to the software's behavior.

7, ④ programmers, partly due to the above, don't know how to write well.

been reading math 2 hours a day in the past months. what a joy. In contrast to reading programming doc and lang specs. Programmers are such idiots.

programmers don't appreciate good docs. and they have this nasty concept of “grok” (from unix fkheads), and in a flash they'll tell you to dig into the source code.

there's no lang in practical use that has rigorous math-like doc or spec. #Haskell? #OCaml? lol, they have the worst “grok it” doc and spec.

yet the haskell fkheads are like, “algebraic” data structures and monoids and stuff. Each one sounds like a superior mathematician. Monad my ass.

homotopy: a continuous deformation of one function into another. How can such a topology and differential geometry notion be tied to logic, set theory, the foundations of math? that's the story of homotopy type theory. Absolutely fascinating.

#math if you haven't studied group theory before, do so now. The Wikipedia article is very good.

after the Wikipedia #math group article, read http://xahlee.info/Wallpaper_dir/c0_WallPaper.html

when programmers use math jargon, they dunno which side is ass, which is mouth. #haskell #lisp

if a programmer mentions idempotent monad directed graph, and they can't talk basic abstract algebra, tell them to shut their piehole #haskell #lisp

programmers talking garbage math jargon keeps happening, from 1990s perl and sql, to 2000s lisper homoiconicity, to today's js haskell category idiots.

programmers and mathematicians are very distinct communities. The 2 basically don't communicate, not unlike engineers and lawyers.

mathematicians, in general, look down on programming. They dunno what a subroutine, function, object, or class is.

programmers usually look up to and idolize math, yet have 0 clue. you wouldn't have a clue about math unless you had 3 years' worth of undergraduate MATH MAJOR courses.

now and then we see hacker idiots discuss how important math is to programming. that's, like, guys in a bar on the tao of the quantum cosmos.

- 1, due to my public website since 1995, i've talked to lots of people: coders, geeks, and many weird people. (same ilk attracts)
- 2, Usually, they know me, but i don't know/remember people. (plus, they are often anonymous)
- 3, It has happened quite a few times: in an argument about coding or other things, something ticked me off, and my screed turned supporters/fans to stone.
- 4, am a schizoid. That basically means a loner, or a person with very little emotion. Any attachment or relationship troubles us greatly.

you see those Google Doodles? Never, ever click one or read about it. If you do, your brain is tainted. This is similar to never watching TV.

Google Doodle was fun in the 2000s. It was casual, non-intentional. Today, it's commercialization plus propaganda.

there's an idiotic program called pngquant.

it reduces png file size by lossy compression.

if you want lossy, go to jpg or webp.

in September, i'll be blogging on my patreon account only.

https://www.patreon.com/xahlee

If you like my stuff, i hope you become my patron there.

to my patreon supporters, new article https://www.patreon.com/posts/13809835

golang's choice of tab for indentation is the correct one. However, emacs golang mode forcing it to be DISPLAYED as 8 spaces is most idiotic. It undoes the correct thinking.

See also: Programing: Tab vs Space in Source Code

golang is truly a simple, superb, practical language. Plus real functional programming features. And fast! Puts clojure and haskell to shame.

despite my supreme love for functional programming, i'd say clojure is a complex idiocy, on so many levels. And Haskell too.

my golang tutorial is taking shape.

See also: Xah Clojure Tutorial

my site ranking, i think that's the highest.

find some sites you know, and let me know what you get, on twitter or Google Plus.

See also: Practical git in 1 Hour

〔►see Egyptian Hieroglyph 𓂀〕

now i have a patreon account. be my first patron. see the first post at https://www.patreon.com/xahlee

Java: Unicode in Java (minor update)

drawing a maze with Unicode. Unicode Box Lines, Shapes ┌ ┬ ┐

Jargon Lambda in Decline (expanded for the general public.)

Unicode 10 was released last week.

〔►see Unicode 10 New Characters〕

New is the obsolete Nüshu script, used in China, created and used by women.

On researching this writing system, you find an article from the Guardian published in 2005.

The page is gone, but here's a screenshot from the Wayback Machine.

Note the words used: forbidden, women, minority, forced arranged marriage.

You see, the Guardian spins it so that the now-obsolete script appears to be related to suppression and Western gender issues.

Almost all English-language news about China is of this nature. They sell what you want to hear, American liberal or conservative. Pretty much, it's about how China doesn't have democracy, that Chinese people want it, human rights abuses, that it's an ancient civilization struggling with modernity, such and such. All very easy to take in and sympathize with!

When you study the world's scripts (writing systems), at first it's fascinating, because it's novel, and you are introduced to many aspects of designing a writing system. But after a while, you find most of them boring, dreary, idiotic. Basically, random symbols derived from scribbles. There's no math/logic/design/architecture value, just cultural history.

if you are interested in anthropology or ethnology, then it's a different story.

if you are interested in the design/architecture aspect of writing systems, look to scifi, math, logic, communication theory, or perhaps neuroscience.

Billionaire Peter Thiel's startup book Zero to One: Notes on Startups, or How to Build the Future

A Lambda Logo Tour (and why LISP languages using λ as logo should not be looked upon kindly)

See also: Unicode Characters ☯ ⚡ ∑ ♥ 😄