Xah Programing Blog 2017-11
instead of Santa Claus, Unicode calls it Father Christmas.
🎄 🎅 🤶 ⿅ 𐂂 🦌
see Unicode Search
so why does Unicode call Santa Claus “Father Christmas”? Wikipedia has the history https://en.wikipedia.org/wiki/Father_Christmas
but be warned, it probably contains biased writing.
this Wikipedia version from 2005 gives an easier-to-follow picture of the history of Father Christmas https://en.wikipedia.org/w/index.php?title=Father_Christmas&oldid=33065629
Father Christmas is a name used in the United Kingdom, Australia, New Zealand and several other Commonwealth Countries, as well as Ireland, for the gift bringing figure of Christmas or yuletide. Although Father Christmas, Saint Nicholas and Santa Claus (the latter deriving from the Dutch for Saint Nicholas: Sinterklaas), are now used interchangeably, the origins of Father Christmas are quite different.
Dating back to Norse mythology, Father Christmas has his roots in Paganism. Before Christianity came to British shores it was customary for an elder man from the community to dress in furs and visit each dwelling. At each house, in the guise of "Old Winter" he would be plied with food and drink before moving on to the next. It was thought he carried the spirit of the winter with him, and that the winter would be kind to anyone hospitable to Old Winter. The custom was still kept in Medieval England, and after a decline during the Commonwealth, became widespread again during the Restoration period. A book dating from the time of the Commonwealth, The Vindication of CHRISTMAS or, His Twelve Yeares' Observations upon the Times involved Father Christmas advocating a merry, alcoholic Christmas and casting aspersions on the charitable motives of the ruling Puritans.
He was neither a gift bringer nor was he associated with children. During the Victorian era when Santa Claus arrived from America he was merged with “Old Winter”, “Old Christmas” or “Old Father Christmas” to create Father Christmas, the British Santa which survives today.
Do Automated Theorem Provers Exist? Can Neural Nets Solve Math Problems?
can a neural net solve the Collatz conjecture?
The Collatz conjecture is a conjecture in mathematics that concerns a sequence defined as follows: start with any positive integer n. Then each term is obtained from the previous term as follows: if the previous term is even, the next term is one half the previous term. Otherwise, the next term is 3 times the previous term plus 1. The conjecture is that no matter what value of n, the sequence will always reach 1.
[2017-12-14 Wikipedia Collatz conjecture ]
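The conjecture is trivial to state in code, which is what makes it a good test case for “simple to computerize”. A minimal sketch in Python (the function name is mine, for illustration):

```python
def collatz_sequence(n):
    """Return the Collatz sequence starting at n, stopping at 1
    (assuming the conjecture holds for n, so the loop terminates)."""
    seq = [n]
    while n != 1:
        # even: halve it; odd: triple it and add 1
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        seq.append(n)
    return seq

print(collatz_sequence(6))  # [6, 3, 10, 5, 16, 8, 4, 2, 1]
```

Verifying the conjecture for any one n is easy; proving it for all n is the open problem.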
this is an interesting question, because it is a simplified version of the question of whether neural-network-based AI can answer math problems. “Math problems” here means, for example, any of
Millennium Prize Problems
The above math problems are complex to computerize. Computerizing math is not a solved problem.
So, we might ask instead whether neural networks can solve math problems that are simple to computerize. For example,
is there an odd perfect number? (a perfect number is a positive integer equal to the sum of its divisors excluding itself)
or group theory List of unsolved problems in mathematics#Group theory
my understanding is, no. Neural networks cannot answer any such question. Unless an AI reaches what's called Strong AI (which basically means human-level intelligence). But at that level, the AI is simply like a human: it isn't using a neural net to solve a specific given problem; rather, it has become something else, possibly even sentient.
Also, i think it's understood that we cannot learn much from a neural net other than by observing its results. From other AI approaches (such as brute force, or genetic algorithms), we can actually learn something. For example, by brute force, we have solved many chess endgame problems, and now know some theory about them.
we have also learned that some draw rules are bad. Our rule says that after x moves the game is a draw, but computers, by brute force, have found forced checkmates that require more than x moves. (i think x is 50)
AlphaGo (the precursor of AlphaZero) beat world champion Ke Jie and many other international go champions in early 2017, and has given us many games it played with itself. However, we did not learn anything about the theory of go from it. What we did learn is only by observing its play and theorizing ourselves.
other “simple” math problems are, for example,
is there an odd perfect number Perfect number#Odd perfect numbers
“a perfect number is a positive integer that is equal to the sum of its proper positive divisors, that is, the sum of its positive divisors excluding the number itself”.
or Boolean satisfiability problem
traveling salesman problem Travelling salesman problem
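The odd-perfect-number question, for instance, is trivial to state as a brute-force search. A sketch in Python (my own illustration; an unbounded search for an odd perfect number would, if the conjecture that none exist is true, run forever):

```python
def is_perfect(n):
    """True if n equals the sum of its proper divisors."""
    # proper divisors of n are at most n // 2
    divisor_sum = sum(d for d in range(1, n // 2 + 1) if n % d == 0)
    return divisor_sum == n

# the first perfect numbers; all known perfect numbers are even
print([n for n in range(2, 500) if is_perfect(n)])  # [6, 28, 496]
```

Stating the problem in code is easy; the open question is whether any odd n passes this test.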
am wondering, with the seemingly powerful AlphaZero, how can such neural-network-based AI tackle these kinds of concrete, absolute, seemingly simple math problems? Or is it agreed that neural networks simply cannot work on these kinds of problems?
if neural nets can deal with these problems, what's the approach? are there examples?
i don't know much about neural nets, but it seems they are never used on these types of problems.
as for working on disease and other general human problems, am just wondering: what exactly are the problems, in concrete terms? “disease” is quite general, nothing like chess or go.
suppose you write a chess program, and by brute force, you completely solve chess. That is, you've determined the optimal move for every position.
That is the idea, and the beginning, of automated theorem proving.
of course, we cannot brute force all the way, since there are more positions than we can fathom. Therefore, we try to cut corners, to be smarter in our ways of enumeration, such as with the neural networks of AlphaZero.
aside from that, in mathematics we cannot even begin to brute force or apply neural nets, since math is not codified the way chess or go is. The problem of turning a human math question into logic and into a computer is itself not a solved problem. Before we can automate proving theorems, we need a codification of math, and that's in the realm of the foundations of math.
and in this realm, even though we have made a lot of progress (or none, relative to the cosmos), there are still mysteries, unbelievers, and gaping holes. We make do with what we can. Thus we have “conjecture” searchers, “assisted” provers, alternative foundations such as homotopy type theory, and such. Their meaning and context evolve. Few know what they are talking about, realistically speaking.
can a neural net solve math problems?
Dear Lu, here's a problem you might find illuminating.
suppose you went to RadioShack and built a tiny neural-network AI. In just 1 hour of playing with itself, it plays tic-tac-toe so well that it never loses.
Now, that's some accomplishment. But now, how do you solve, say, x + 1 = 2, for arbitrary constants in place of 1 and 2, with your neural net?
Can your neural net solve such a math problem?
what would be your approach?
source of discussion: https://plus.google.com/u/0/+johncbaez999/posts/Xk36jKsosGT
google's AlphaZero beat the best chess software, Stockfish. And AlphaZero learned the game from scratch, in just 4 hours.
Then, in 2 hours, it learned shogi (Japanese chess, much more complex) and beat the best shogi software.
scary AI is happening.
ASCII Table (minor update)
removed comment system on my site
just removed Disqus comments on all my sites, for now. They are now forcing image ads, and their ads are the low-quality sensational type. To go ad-free would cost $10/month. But comments take 30 min/day to reply to, and 95% are garbage. (i have 5 thousand pages on my sites) might add it back; we'll see. let me know what you think.
Unicode Flags 🏁 (major rewrite)
Unicode User Interface Icon Becomes Emoji
unicode emoji should be ban'd. Extremely annoying when you want to show a symbol and it gets rendered as an emoji.
if you have ◀ ▶ ⏯, the last becomes an emoji.
Adding U+FE0E (VARIATION SELECTOR-15, which requests text presentation) does not always work.
And macOS has a bug that forces emoji to be tiny, the same size as a letter. It ignores the CSS font-size spec.
and which symbols will become emoji is unpredictable. On Twitter, both ◀ and ▶ become emoji.
ok, the whole thing is pretty fkd.
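For reference, the variation selectors are just appended code points; a minimal Python sketch (whether a renderer actually honors them is up to the font and platform, as noted above):

```python
# U+FE0E = VARIATION SELECTOR-15: request text (plain glyph) presentation.
# U+FE0F = VARIATION SELECTOR-16: request emoji presentation.
VS15 = "\uFE0E"
VS16 = "\uFE0F"

symbol = "\u25B6"  # ▶ BLACK RIGHT-POINTING TRIANGLE

# the selector follows the character it modifies
print(symbol + VS15)  # renderer should show the plain text glyph
print(symbol + VS16)  # renderer should show the emoji glyph
```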
see [Apple did not invent emoji By Eevee. At https://eev.ee/blog/2016/04/12/apple-did-not-invent-emoji/ ]
and see replies at https://twitter.com/xah_lee/status/926994405046722560
the problem of computerizing math began with: THERE EXISTS ∃, and FOR ALL ∀. #haskell #coq #typetheory
Leon Chwistek, a Founder of Type Theory
Leon Chwistek (Kraków, Austria-Hungary, 13 June 1884 – 20 August 1944, Barvikha near Moscow, Russia) was a Polish avant-garde painter, theoretician of modern art, literary critic, logician, philosopher and mathematician.
Starting in 1929 Chwistek was a Professor of Logic at the University of Lwów in a position for which Alfred Tarski had also applied. His interests in the 1930s were in a general system of philosophy of science, which was published in a book translated in English 1948 as The Limits of Science.
In the 1920s-30s, many European philosophers attempted to reform traditional philosophy by means of mathematical logic. Leon Chwistek did not believe that such reform could succeed. He thought that reality could not be described in one homogeneous system, based on the principles of formal logic, because there was not one reality but many.
Chwistek demolishes the axiomatic method by demonstrating that the extant axiomatic systems are inconsistent.
2017-11-03 Wikipedia Leon Chwistek
Plants Emoji 🌹 (added a macOS screenshot)
xah edu corner extempore! episode №20171101042745, ban recursion in programing languages
1 with regard to computer science, recursion should be ban'd in programing languages.
2 it's got the chicken-and-egg problem: before it's defined, it's calling itself. Like Russell's paradox, or first-order logic in the twilight zone
3 But in math, we have recurrence relations, and in comp sci, recursion theory. How to resolve that?
4 in math, the nth term is defined by the previous term, and f is defined non-recursively. so, it's well defined. In a sense, there's no “recursion”
5 in most programing langs, the body of a recursive f uses “if” to check the input. So, “no recursion”, really. But the chicken-and-egg problem remains in defining f.
6 some langs (Mathematica, Haskell) can define f(1) and f(n) separately. So, no chicken-and-egg recursive-definition problem.
7 actually, the chicken-and-egg recursive-definition problem remains, with respect to order of evaluation.
need to think about this more
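The contrast in points 4 and 5 can be shown in code (my own example, in Python): the recursive definition's body guards with “if” and calls the name being defined, while the iterative form builds the nth term from the previous one with no self-reference:

```python
def fact_recursive(n):
    # body uses "if" to check the input, and calls the name being defined
    if n <= 1:
        return 1
    return n * fact_recursive(n - 1)

def fact_iterative(n):
    # nth term built from the previous term; no self-reference
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

print(fact_recursive(5), fact_iterative(5))  # 120 120
```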
Quiz, Write a NestList Function
Quiz. write a function r(f,x,n) that returns the list [f(x), f(f(x)), ...], of length n. write it in your favorite lang.
f is a function (e.g. f(x) = x+1), x is a number, n is an integer ≥ 1. we want [f(x), f(f(x)), ...]
Someone asked why this is useful. For example: the factorial function, or the Fibonacci sequence. In math it happens often. Check out the logistic map, iterated function systems, dynamical systems, the Mandelbrot set.
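One possible answer to the quiz, in Python (a sketch; the name r and the length-n spec follow the quiz's statement):

```python
def r(f, x, n):
    """Return [f(x), f(f(x)), ...] of length n (n >= 1)."""
    result = []
    for _ in range(n):
        x = f(x)          # apply f once more
        result.append(x)  # record the nested application
    return result

print(r(lambda x: x + 1, 0, 4))  # [1, 2, 3, 4]
print(r(lambda x: x * x, 2, 3))  # [4, 16, 256]
```

Note: Mathematica's NestList also includes the starting x as the first element; this sketch follows the quiz's wording and starts at f(x).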
If you have a question, put $5 at patreon and message me.