Here are examples of the best neural nets of a given design, with a small number of nodes and layers.
They are trying to learn the square function; the input is just a real number.
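For concreteness, here is a minimal sketch of that setup (my own toy code, not Wolfram's): a one-hidden-layer net with a handful of nodes, trained by plain gradient descent to approximate y = x². The hidden width, learning rate, and step count are arbitrary illustrative choices.

```python
# Toy sketch: tiny 1-hidden-layer net learning y = x^2 on [-1, 1].
import numpy as np

rng = np.random.default_rng(0)

# training data: real numbers and their squares
x = rng.uniform(-1, 1, size=(256, 1))
y = x ** 2

hidden = 8                              # a small number of nodes
W1 = rng.normal(0, 1, (1, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0, 1, (hidden, 1)); b2 = np.zeros(1)

lr = 0.5
for step in range(10000):
    # forward pass
    h = np.tanh(x @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    loss = np.mean(err ** 2)

    # backward pass (hand-written gradients for this tiny net)
    d_pred = 2 * err / len(x)
    dW2 = h.T @ d_pred;              db2 = d_pred.sum(axis=0)
    d_h = d_pred @ W2.T * (1 - h ** 2)
    dW1 = x.T @ d_h;                 db1 = d_h.sum(axis=0)

    # gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", loss)   # typically small, but never exactly zero
test = np.array([[0.5]])
print(float(np.tanh(test @ W1 + b1) @ W2 + b2))   # roughly 0.25 after training
```

The point of the toy: the net only ever approximates x²; the error gets small, but it never becomes an exact calculation.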
A neural net can never do simple exact calculations, such as computing pi to a thousand places.
A deep, philosophically damning conclusion.
Poignantly true and insightful.
Wolfram is saying that AI cannot compute. (read his blog for his exact words)
In other words, if we have artificial general intelligence, like Mr Data in Star Trek, it won't be able to multiply two big numbers the way a pocket calculator can.
Another way to put it: if one day we have sentient AI, it won't be able to predict the weather.
Typically you start with a 2D matrix (or an n-D array); each layer usually reduces the shape or dimension, eventually becoming 1D (just a vector, i.e. a list of numbers), and finally a single number.
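A small sketch of that shrinking shape (my own illustration; the 28×28 input and the layer widths are made-up example numbers):

```python
# Toy sketch: shape shrinks layer by layer, 2D matrix -> vector -> single number.
import numpy as np

rng = np.random.default_rng(0)

x = rng.random((28, 28))                 # start: 2D matrix, shape (28, 28)
v = x.reshape(-1)                        # flatten: 1D vector, shape (784,)

W1 = rng.normal(0, 0.05, (784, 128))     # layer 1: 784 -> 128
h1 = np.maximum(0, v @ W1)               # shape (128,)

W2 = rng.normal(0, 0.05, (128, 16))      # layer 2: 128 -> 16
h2 = np.maximum(0, h1 @ W2)              # shape (16,)

W3 = rng.normal(0, 0.05, (16, 1))        # layer 3: 16 -> 1
out = h2 @ W3                            # shape (1,): a single number

for name, a in [("input", x), ("flattened", v), ("layer 1", h1),
                ("layer 2", h2), ("output", out)]:
    print(name, a.shape)
```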
Neural net training is currently mostly sequential: each gradient-descent step depends on the parameters produced by the previous step.
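A toy sketch of what "sequential" means here (a made-up one-parameter loss (w − 3)², purely for illustration): each update reads the value written by the previous step, so the steps form a chain that can't simply run in parallel.

```python
# Toy sketch: gradient-descent steps form a sequential chain.
def loss_gradient(w):
    # gradient of the toy loss (w - 3)^2; the minimum is at w = 3
    return 2 * (w - 3)

w = 0.0
lr = 0.1
for step in range(100):
    g = loss_gradient(w)   # depends on the current w ...
    w = w - lr * g         # ... which was produced by the previous step

print(w)                   # converges toward 3
```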