AI: How ChatGPT Works
First, you need to learn how a neural network works. See Neural Network Tutorial.
How ChatGPT Works by Andrej Karpathy

- https://youtu.be/kCc8FmEb1nY
- Let's build GPT: from scratch, in code, spelled out.
- Andrej Karpathy
- Jan 17, 2023
We build a Generatively Pretrained Transformer (GPT), following the paper “Attention is All You Need” and OpenAI's GPT-2 / GPT-3. We talk about connections to ChatGPT, which has taken the world by storm. We watch GitHub Copilot, itself a GPT, help us write a GPT (meta :D!). I recommend people watch the earlier makemore videos to get comfortable with the autoregressive language modeling framework and basics of tensors and PyTorch nn, which we take for granted in this video.
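The framework the video assumes is autoregressive language modeling: predict the next token from the tokens before it, then feed the prediction back in to generate text. Here is a minimal PyTorch sketch, roughly the bigram model the video starts from; the class and variable names are mine for illustration, not Karpathy's actual code.

```python
# Minimal sketch of autoregressive next-token prediction.
# Names and shapes are illustrative, not Karpathy's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BigramLM(nn.Module):
    def __init__(self, vocab_size):
        super().__init__()
        # each token directly reads off the logits for the next token
        self.logits_table = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx):
        # idx: (batch, time) tensor of token ids
        return self.logits_table(idx)  # (batch, time, vocab_size)

    @torch.no_grad()
    def generate(self, idx, max_new_tokens):
        for _ in range(max_new_tokens):
            logits = self(idx)[:, -1, :]            # logits at the last position
            probs = F.softmax(logits, dim=-1)       # turn logits into probabilities
            next_id = torch.multinomial(probs, 1)   # sample one next token
            idx = torch.cat([idx, next_id], dim=1)  # append and continue
        return idx

model = BigramLM(vocab_size=65)  # 65 = character vocab used in the video
out = model.generate(torch.zeros((1, 1), dtype=torch.long), max_new_tokens=20)
print(out.shape)  # torch.Size([1, 21])
```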
Who is Andrej Karpathy
Andrej Karpathy (born 23 October 1986[1]) is a Slovak-Canadian computer scientist who served as the director of artificial intelligence and Autopilot Vision at Tesla. Karpathy currently works for OpenAI.[2][3][4] He specializes in deep learning and computer vision.
2023-03-11 from Wikipedia
nanoGPT
The simplest, fastest repository for training/finetuning medium-sized GPTs. It is a rewrite of minGPT that prioritizes teeth over education. Still under active development, but currently the file train.py reproduces GPT-2 (124M) on OpenWebText, running on a single 8XA100 40GB node in about 4 days of training. The code itself is plain and readable: train.py is a ~300-line boilerplate training loop and model.py a ~300-line GPT model definition, which can optionally load the GPT-2 weights from OpenAI. That's it.
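For a sense of what such a boilerplate training loop boils down to, here is a stripped-down sketch in PyTorch. This is not nanoGPT's actual train.py; the data, model, and hyperparameters are stand-ins, but the batch-sampling and optimization shape is the same.

```python
# Stripped-down sketch of a GPT training loop like nanoGPT's train.py.
# Data, model, and hyperparameters here are stand-ins.
import torch
import torch.nn.functional as F

def get_batch(data, block_size=8, batch_size=4):
    # sample random contiguous chunks; target is input shifted by one token
    ix = torch.randint(len(data) - block_size, (batch_size,))
    x = torch.stack([data[i:i + block_size] for i in ix])
    y = torch.stack([data[i + 1:i + block_size + 1] for i in ix])
    return x, y

data = torch.randint(0, 65, (1000,))  # stand-in for an encoded text corpus
model = torch.nn.Embedding(65, 65)    # stand-in for the GPT in model.py
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

for step in range(100):
    xb, yb = get_batch(data)
    logits = model(xb)  # (batch, time, vocab)
    loss = F.cross_entropy(logits.view(-1, logits.size(-1)), yb.view(-1))
    optimizer.zero_grad(set_to_none=True)
    loss.backward()
    optimizer.step()
```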
Attention Is All You Need, by Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N Gomez, Lukasz Kaiser, Illia Polosukhin, at https://arxiv.org/abs/1706.03762
The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration. The best performing models also connect the encoder and decoder through an attention mechanism. We propose a new simple network architecture, the Transformer, based solely on attention mechanisms, dispensing with recurrence and convolutions entirely.
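The core operation of the paper is scaled dot-product attention: Attention(Q, K, V) = softmax(QKᵀ / √d_k) V. Here is a minimal PyTorch sketch of that equation; the function name and the causal flag are mine (GPT uses the causal, decoder-only variant, where each position may only attend to earlier positions).

```python
# Scaled dot-product attention, the core operation of the Transformer:
# Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, causal=False):
    # q, k, v: (batch, time, d_k); returns (batch, time, d_k)
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # (batch, time, time)
    if causal:
        # GPT-style decoder: mask out attention to future positions
        t = scores.size(-1)
        mask = torch.tril(torch.ones(t, t, dtype=torch.bool))
        scores = scores.masked_fill(~mask, float('-inf'))
    weights = F.softmax(scores, dim=-1)  # each row sums to 1
    return weights @ v

q = k = v = torch.randn(2, 5, 16)
print(scaled_dot_product_attention(q, k, v, causal=True).shape)  # (2, 5, 16)
```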
Stephen Wolfram on ChatGPT
ChatGPT jailbreak
Since 2023-01, there has been heavy censorship, but many have found clever ways to bypass it.


unsorted
when artificial intelligence becomes human
- In AI, the next step is so-called AGI.
- AGI stands for artificial general intelligence.
- There is no precise definition, but basically it means an AI that is like a human.
- With an IQ like an average human's, but probably higher.
- The IQ is not the central point, though. The central point is that it is just like a human.
- Now, nobody knows if this is possible at all.
- But the current hot air says it is coming soon.
- There are lots of issues.
- One is: if it becomes human, i.e. AGI, will it have consciousness?
- And, actually, nobody knows what that means. We don't know whether behaving like a human is the same as having consciousness, aka self-awareness, a soul.
- They may be different things. Again, we don't know at all.
- Even at the current moment, GPT seems to “understand”, but we don't know if it actually understands. At least, from a behaviorist and functionalist point of view, it is fast achieving “understanding” of the meaning of things.
- Personally, I am doubtful that AI will have full understanding, or that it will have consciousness.
- If that actually happens, for one thing, I think humanity is done for.
- E.g. scenarios such as AI taking over (no, it cannot be controlled, nor can ethics be programmed into it), unless humans have tech-enhanced brains, which is what Elon Musk is trying to do.
- But the latter case is not ideal either, because we become Borg.
- Many of the AI ethics programs you hear about are total bullshit.
- They are essentially trying to program AI to be biased in their favor.
- Toward whatever is THEIR current ideology, nation, race.
- Because real ethics do not exist.
- But if AI becomes human-like, i.e. has a mind of its own, AGI, then controlling its thought is technically impossible.
- You can't control neural networks or self-learning machines.
- Any entity that can truly self-learn, you cannot control.
Geoffrey Hinton on Impact and Potential of AI

- https://youtu.be/qpoRO378qRY
- Full interview: "Godfather of artificial intelligence" talks impact and potential of AI
- CBS Mornings
- Mar 25, 2023
Geoffrey Hinton
Geoffrey Everest Hinton CC FRS FRSC[12] (born 6 December 1947) is a British-Canadian cognitive psychologist and computer scientist, most noted for his work on artificial neural networks. Since 2013, he has divided his time working for Google (Google Brain) and the University of Toronto. In 2017, he co-founded and became the Chief Scientific Advisor of the Vector Institute in Toronto.[13][14]
With David Rumelhart and Ronald J. Williams, Hinton was co-author of a highly cited paper published in 1986 that popularised the backpropagation algorithm for training multi-layer neural networks,[15] although they were not the first to propose the approach.[16] Hinton is viewed as a leading figure in the deep learning community.[17][18][19][20][21] The dramatic image-recognition milestone of the AlexNet designed in collaboration with his students Alex Krizhevsky[22] and Ilya Sutskever for the ImageNet challenge 2012[23] was a breakthrough in the field of computer vision.[24]
Hinton received the 2018 Turing Award, together with Yoshua Bengio and Yann LeCun, for their work on deep learning.[25] They are sometimes referred to as the “Godfathers of AI” and “Godfathers of Deep Learning”,[26][27] and have continued to give public talks together.[28][29]
from Wikipedia, Geoffrey Hinton
- 2023-03-13
- Incredible ChatGPT.
- Asking it what the most special element is.
- Why is carbon special in forming life?
- Are there non-carbon-based lifeforms?
- And asking it about isotopes and ions.
- And asking what a metal is, in terms of chemistry.
- Asking it how to pronounce “cation”, then asking it to show it in IPA.
- From ChatGPT, I learned quite a lot of things:
- Ductility, what gilding actually is, what a metal is in chemistry, why carbon forms life, the various binding forces, baryonic matter (then the other types).
- One hour of ChatGPT is worth five hours of digesting Wikipedia with much clicking.