Running LLMs locally on macOS

Sten-Erik Björling s-e.bjorling at enviro.se
Wed Aug 30 05:55:00 UTC 2023


Hi,

I recommend that you look into ollama.ai, a solution for running Meta's LLMs locally on your Mac. Several models are supported; the coding-focused models and the uncensored ones are especially interesting.

The larger models demand a lot of memory, but performance is quite good. The normal reservations apply…
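If you want to script against it rather than use the CLI (after installing, something like "ollama pull llama2" followed by "ollama run llama2" gets you an interactive prompt), Ollama also exposes a local HTTP API. Below is a minimal Python sketch, assuming the server is running on its default port 11434 and that the llama2 model has already been pulled; double-check the endpoint against the version you install.

import json
import urllib.request

def ask(prompt, model="llama2"):
    # Build a request against Ollama's local /api/generate endpoint.
    # stream=False requests one complete JSON object instead of a
    # stream of partial responses.
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask("Explain what a context window is in one sentence."))

The "response" field holds the generated text; with streaming enabled you would instead read newline-delimited JSON chunks as they arrive.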

Take care all, all the best…

Stene

_______


Sten-Erik Björling
Enviro Data
Kyrkogatan 5A 2 tr
SE-972 32  Luleå
Sweden

E-Mail: s-e.bjorling at enviro.se
Mobile: +46-70-655 11 72
Wire: @stenerikbjorling
Skype: stenerikbjorling
iChat: stene at icloud.com
Signal: +46 70 655 11 72
FaceTime: stene at icloud.com
Telegram: @stenerikbjorling
Hotmail / Messenger: stenerikbjorling at hotmail.com
GMail: stenerikbjorling at gmail.com


