It's still a bit hacky in the current PyPI version of LMQL, but you can also use it from the command line, much like `python -c`:
`echo "Who are you?" | lmql run "argmax '\"Q:{await input()} A:[RESULT]';print(RESULT) from 'chatgpt'" --no-realtime`
This gives you: `I am an AI language model created by OpenAI.`
I am one of the LMQL devs, and we plan to add a more seamless CLI interface, e.g. to support processing multiple lines of text (for quick classification tasks).