Running LLMs locally on macOS
s-e.bjorling at enviro.se
Wed Aug 30 05:55:00 UTC 2023
I recommend that you look into ollama.ai - a solution for running Meta's LLMs locally on your Mac. Several models are supported - the coding-assistant models and the uncensored ones are especially interesting.
The larger models demand a lot of memory, but performance is quite good - normal reservations apply…
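If you want to script against it rather than use the CLI, Ollama also serves a local HTTP API (on port 11434 by default, with a /api/generate endpoint). A minimal sketch in Python - the model name "codellama" is just an example, swap in whatever model you have pulled:

```python
import json
import urllib.request

def build_generate_request(model, prompt, host="http://localhost:11434"):
    # Ollama's /api/generate endpoint takes a JSON body; stream=False
    # asks for a single response object instead of streamed chunks.
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )

def generate(model, prompt):
    # Sends the request to a locally running ollama instance and
    # returns the generated text. Requires `ollama serve` to be running.
    req = build_generate_request(model, prompt)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (needs ollama running locally):
#   print(generate("codellama", "Write a hello world in Swift"))
```

Nothing fancy - just the standard library, since the API is plain JSON over HTTP.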
Take care all, all the best…
Kyrkogatan 5A 2 tr
SE-972 32 Luleå
E-Mail: s-e.bjorling at enviro.se
Mobile: +46-70-655 11 72
iChat: stene at icloud.com
Signal: +46 70 655 11 72
FaceTime: stene at icloud.com
Hotmail / Messenger: stenerikbjorling at hotmail.com
GMail: stenerikbjorling at gmail.com
More information about the omnisdev-en mailing list