



Llama.cpp now supports Vulkan, so it doesn’t matter what card you’re using.


I successfully ran Llama locally with llama.cpp on an old AMD GPU. I'm not sure why you think there's no other option.
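For reference, that setup is only a few lines with the llama-cpp-python bindings. A minimal sketch, assuming you already have a GGUF model file (the path below is a placeholder); the GPU backend, Vulkan included, is picked when the package is compiled, not in this code:

```python
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-3-8b-instruct.Q4_K_M.gguf",  # placeholder GGUF path
    n_gpu_layers=-1,  # offload all layers to whatever GPU backend was built in
    n_ctx=2048,
)

out = llm("Q: What GPU backends can llama.cpp use? A:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"])
```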


Here’s a task for you: how do you convert a folder of 5000 images from PNG to JPG, while ensuring they are scaled to at most 1024x768 and have a semi-transparent watermark on them?
I know how to do it quickly from the command line, but have no idea how to do it with a GUI.
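To give an idea of what I mean, here's a rough sketch of that kind of batch job in Python with Pillow (ImageMagick would work just as well). The folder names and watermark.png are placeholders, and it assumes the watermark is smaller than the smallest image:

```python
from pathlib import Path
from PIL import Image

SRC = Path("input_pngs")    # placeholder source folder
DST = Path("output_jpgs")   # placeholder destination folder
mark = Image.open("watermark.png").convert("RGBA")  # placeholder RGBA watermark
DST.mkdir(exist_ok=True)

for png in SRC.glob("*.png"):
    img = Image.open(png).convert("RGBA")
    # Shrink in place so the image fits within 1024x768, keeping aspect ratio.
    img.thumbnail((1024, 768))
    # Composite the semi-transparent watermark into the bottom-right corner,
    # using its alpha channel.
    img.alpha_composite(mark, dest=(img.width - mark.width, img.height - mark.height))
    # JPEG has no alpha channel, so flatten to RGB before saving.
    img.convert("RGB").save(DST / (png.stem + ".jpg"), quality=90)
```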
Want a nice project to spend your resources on? Try working on a PDF viewer that supports verifying signatures, form filling, and signing documents.
Stop fucking around with meaningless issues.