I’ve watched a lot of material about AI but only ever got random results, until I decided to go as low-level as possible.

For starters, grab a copy of llama.cpp with git clone https://github.com/ggerganov/llama.cpp.git. Then compile it (you’ll need CMake) with cd llama.cpp/; cmake -B build; cmake --build build --config Release.
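Spelled out step by step (nothing beyond the commands above, just the stock CPU build; the binaries end up in build/bin/):

    # fetch the source
    git clone https://github.com/ggerganov/llama.cpp.git
    cd llama.cpp/

    # configure, then compile an optimized Release build with CMake
    cmake -B build
    cmake --build build --config Release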

Now just grab any GGUF model from Hugging Face and run it with ./build/bin/llama-cli -m your_model.gguf -p "I believe the meaning of life is" -n 128.
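If you’re not sure where to get a GGUF, one way is the huggingface_hub CLI (that tool is an assumption on my part, and the repo and file below are just an illustrative pick, not a recommendation; substitute any GGUF you like):

    # install the Hugging Face CLI, then pull a single quantized model file
    pip install -U huggingface_hub
    huggingface-cli download TheBloke/Llama-2-7B-GGUF llama-2-7b.Q4_K_M.gguf --local-dir .

The Q4_K_M file is a 4-bit quantization, which keeps the memory footprint small enough for an ordinary laptop; larger quants trade RAM for quality.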

Or, if your computer can take it, talk with it interactively using ./build/bin/llama-cli -m your_model.gguf -cnv -n 128.
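For chat mode I’d add a few extra flags (these are my additions, so treat them as a sketch and check ./build/bin/llama-cli --help, since options shift between versions):

    # -cnv: interactive chat, -c: context window, -t: CPU threads, --color: highlight your input
    ./build/bin/llama-cli -m your_model.gguf -cnv -c 4096 -t 8 --color -n 128
    # add -ngl 99 to offload layers to the GPU, if you compiled with a GPU backend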

Despite being a lot slower, it still wouldn’t tell me how to do anything illegal. That was the whole point of using it.