
Every 'free' AI assistant has a hidden cost: your data. Providers log prompts and train models on private info. That's data harvesting, not privacy. The alternative? Self-hosted, open-source models that run locally with transparent policies. More setup, but your conversations stay private. In an era of monetized footprints, choosing control over convenience may be the most important privacy decision. Thoughts? #Privacy #OpenSource #TechWriting #FOSS #Mastodon

in reply to Alex Rivers

@Alex Rivers any recommendations for a server/computer to buy and run LLMs on locally? What are the specs needed?