How can I share/store sensitive data for family
I need to start making plans for when I am gone, much sooner than I thought, and I realized our finances are pretty opaque to my spouse. Our bank account is shared, but there are other sites that only I have access to.
The easiest solution would be to physically write down logins and what needs to be done, put it in an envelope, and tell my family where that envelope is. I'm not thrilled about that, because I would have to shred and rewrite it every time I update a password or a URL changes, and it'd be vulnerable to nosy guests.
Putting it in a shared Google Doc would be easiest for everyone. But then Google has that data. Even supposing I trust a cloud SaaS provider not to misuse the data (which is a big 'if') I do not trust them to never have a data breach.
Self-hosting seems like the next step, except I expect my home server to be the first thing to collapse once I'm gone. Filing login info with an estate attorney would still require frequent updates. Putting a document on a flash drive risks data loss, but is what I'm leaning towards.
Is there a solution I'm missing?
like this
giantpaper likes this.
Can LLMs Do Accounting? Evaluating LLMs on Real Long-Horizon Business Tasks
Can LLMs Do Accounting? | Penrose
An experiment exploring whether frontier models can close the books for a real SaaS company. accounting.penrose.com
I do not believe that LLMs will ever be able to replace humans in tasks designed for humans. The reason is that human tasks require tacit knowledge (=job experience) and that stuff is not written down in training material.
However, we will start to have tasks designed for LLMs pretty soon. It has already been observed that LLMs work better on material produced by other LLMs.
To be fair, not all of an LLM's knowledge comes from training material. The other way is to provide context along with the instructions.
I can imagine someone someday develops a decent way for LLMs to write down their mistakes in a database, plus some clever way to recall the most relevant memories when needed, roughly like the sketch below.
GitHub - MemTensor/MemOS: MemOS (Preview) | Intelligence Begins with Memory
MemOS (Preview) | Intelligence Begins with Memory. Contribute to MemTensor/MemOS development by creating an account on GitHub.GitHub
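A toy sketch of that "write down mistakes, recall the relevant ones later" idea, in plain Python with deliberately naive keyword matching; it is only an illustration of the concept, not how MemOS or any real system works:

# Toy "mistake memory": store past errors with a short description and
# retrieve the most relevant ones for a new task by naive keyword overlap.
from dataclasses import dataclass

@dataclass
class Memory:
    task: str
    mistake: str
    fix: str

store: list[Memory] = []

def remember(task: str, mistake: str, fix: str) -> None:
    store.append(Memory(task, mistake, fix))

def recall(task: str, k: int = 3) -> list[Memory]:
    # score by how many words the stored task shares with the new task
    words = set(task.lower().split())
    scored = sorted(store, key=lambda m: -len(words & set(m.task.lower().split())))
    return scored[:k]

remember("reconcile bank statement", "invented a ledger account to force a balance", "flag unmatched entries for a human")
remember("close monthly books", "missed an accrual reversal", "check prior-month accruals first")

for m in recall("reconcile the March bank statement"):
    print(f"- past mistake on '{m.task}': {m.mistake} -> {m.fix}")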
You sort of described RAG. It can improve alignment, but the training is hard to overcome.
See Grok, which bounces from “woke” results to “full nazi” without hitting the midpoint desired by Musk.
You're describing neurosymbolic AI, a combination of symbolic, rule-based systems and neural network (LLM) models. Gary Marcus wrote an excellent article on it recently that I recommend giving a read, How o3 and Grok 4 Accidentally Vindicated Neurosymbolic AI.
The primary issue I see here is that you're still relying on the LLM to reasonably understand and invoke the other components. It needs to parse the data and understand what's important in order to feed it into them, and, as has been stated many times, LLMs do not truly "understand" anything; they are inferring things statistically. I still do not trust them to be statistically accurate and perform without error.
How o3 and Grok 4 Accidentally Vindicated Neurosymbolic AI
Neurosymbolic AI is quietly winning. Here’s what that means – and why it took so long. Gary Marcus (Marcus on AI)
Compounding tasks like accounting, where operations sound easy but create a chain of counter-entries and balances that need to be organized by account, are none of AI's business until it can prove that multiple sequential steps hold over 99% accuracy and the checksum of the accounts stays balanced.
Multiple sequential steps compound: six operations that are each 99% accurate only give about 94% accuracy for the whole chain, and ten steps already drop you to roughly 90%.
Put differently, you quickly reach a double-digit chance that at least one step in the chain is wrong; a quick sanity check is below.
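A minimal Python sketch of that compounding; the 99%-per-step figure and the step counts are just the assumptions from the comment above:

# Compounded accuracy of a chain of independent steps, each individually
# correct with probability p.
p = 0.99
for steps in (6, 10, 70):
    overall = p ** steps
    print(f"{steps} steps at {p:.0%} each -> {overall:.1%} chance the whole chain is right")
# 6 steps  -> ~94.1%
# 10 steps -> ~90.4%
# 70 steps -> ~49.5%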
How many different entries does a company book over a month's close?
Now, this wouldn't be an issue if it could balance the statements of consolidated accounts and find where entries are missing or misallocated. That, sir, is why we pay someone with experience.
I quit my job in public accounting for many reasons, but the primary one was the forceful adoption of LLMs to replace associates.
I told the dimwits at the top that it was a mistake, because LLMs are incompetent even when the information fed to them is perfect, and that was rarely the case in practice.
Our ultra-wealthy clients were notorious for giving us the most incomplete and asinine information, and it often took someone with decades of experience to decipher what the fuck their personal assistants were even talking about.
They went ahead anyway because of the high cost of wages, of course, and I made my exit because I did not wish to be complicit in such a monumental mistake.
Lmfao the LLM they laid associates off and paid half a million dollars for made up fake ledger accounts when accounts didn't reconcile, and none of the dumbasses left noticed in time because they hadn't done associate-level work in decades.
It also lied all the time, even when you asked it not to.
The damage was done and the biggest clients started leaving, so they begged us all to come back but I got obsessed with baking bread and I ain't about to neglect my sourdough starters to help a group of people who would lose a battle of wits against yeast.
reshared this
Kent Navalesi ☕️ reshared this.
like this
EpicFailGuy e Endymion_Mallorn like this.
Technology reshared this.
But then he would have been a good human, and one who worried about the right things.
You can’t just swap suicide like that.
UN Statements Undercut New Israeli Report on 10/7 Sexual Violence
Major news organizations, most prominently the New York Times, have promoted the idea of systematic sexual violence at opportune moments to justify Israel’s ongoing genocide in Gaza. The first major salacious headlines and assertions emerged in late 2023, when Israel was campaigning to restart its killing during a brief ceasefire. The latest effort to revive this narrative follows the same pattern as its predecessors—and, indeed, is more overtly political, with the report spending less airtime on the well-being of women than on reasons we should roll back what is left of international law.
The UN, however, has stated multiple times that it does not have evidence of systematic sexual abuse by Hamas or any other militant group on October 7, 2023. A top United Nations official issued a statement last week that stands in direct contradiction to the new Israeli report.
Reem Alsalem, the UN Special Rapporteur on violence against women and girls, affirmed in her statement this week that the UN had not found "systematic" sexual violence: "It is my understanding that neither the Commission nor any other independent human rights mechanism established that sexual or gender-based violence was committed against Israelis on or since the 7th of October as a systematic tool of war or as a tool of genocide," Alsalem wrote in the statement, first reported by NBC News.
In a move that is highly unusual, the Dinah Project report is now hosted on the UN’s website among its own reports on sexual violence and global conflict. Drop Site News asked Patten why she was hosting the report, but she did not respond. The UN fact-finding mission led by Patten and so dearly held by the Dinah Project, at times, directly contradicts what the Dinah Project argues.
UN Statements Undercut New Israeli Report on 10/7 Sexual Violence
The Dinah Project had to come up with an entirely new standard for evidence to continue to claim sexual violence perpetrated by Hamas on October 7, 2023.Ryan Grim (Drop Site News)
basically every Jubilee video - Man Carrying Thing
- YouTube
thisisbutaname doesn't like this.
With only this screenshot, I do not see her saying that. I think context is missing, which is my whole initial point 🤷♂️ you may be right, idk, but what you've posted doesn't say this.
Anyway, better things to do with my day
geneva_convenience doesn't like this.
don't like this
geneva_convenience doesn't like this.
It was an amendment that was never going to pass, not a bill. It's important to know and recognize the difference. An amendment is useless if it doesn't get added to the bill and voting for or against an amendment is different than voting for or against the bill.
It's like arguing over what goes in a shopping cart at a store where you're not going to buy anything anyway.
geneva_convenience likes this.
Providing any aid to Israel as they carry out a genocide with U.S. support is completely unacceptable. This is even more true of military aid of any kind. Any funds that go to Israel assist this brutal genocide. Any support for Israel legitimizes its eliminationist campaign against the Palestinian people. The fact that Representative Ocasio-Cortez acknowledges that Israel is carrying out this genocide makes her support for military aid all the more disappointing and incongruous.
geneva_convenience likes this.
She voted against the bill. She voted against the funding. She voted against military aid for Israel.
It doesn't become true just because you keep repeating it.
people's support for aoc despite her actions and words has become a depressing reminder of how little americans pay attention to the machinations of our government, and i think it's because it's labeled as politics.
it's so much worse than bernie, because he made me believe all this shit was happening because the information that could educate people to make informed decisions required intention and action instead of being passively spoonfed by the mainstream media. but now, seeing that the information is freely available at everyone's fingertips online, like aoc's responses, i've realized that everyone doubles down on their ignorance like maga does.
Most of them only care about the part of the progressive movement which affects them. The 15 dollar minimum wage, the LGBT movement, free healthcare, or whatever else drives them to it.
Brown children getting burned alive does not affect them. So they defend it.
"Have you stopped hitting your wife yet?"
What is more likely, that MTG actually wanted to stop the genocide, or that she wrote a bill designed to cause chaos?
No, people criticizing her over the amendment and falsely claiming she voted for funding Israel's genocide are making a bad faith argument. Voting for or against the amendment was meaningless. It was never going to pass. It was introduced by MTG as a stunt. Not going along with a stunt by a crazy racist conspiracy theorist is nowhere near the same as supporting genocide or funding it.
The bill is what matters. Ignoring the greater context is acting in bad faith. Ignoring the effective result is acting in bad faith. And pounding this issue only helps get people to not vote for the more left-leaning candidates. It's a really weird fight for anyone claiming to be a progressive.
I did. You apparently didn't.
"I remain focused on cutting the flow of munitions that are being used to perpetuate the genocide in Gaza."
What part of that says "I support genocide" to you? What part of that says "I support funding genocide" to you?
You must intentionally ignore her actual words to believe she intends something else.
Politicians sneakily manipulate words? Damn.
If you need it spelled out: dsausa.org/statements/on-the-i…
geneva_convenience doesn't like this.
A disagreement on a moot point that the conservative majority in Congress will not allow to be relevant until the balance of power shifts. A disagreement on a moot point that isn't worth destroying voter confidence in left-leaning candidates or entertaining purity tests while the grasp of leftists on their modicum of power is at best tenuous while giving a free pass to the 421 other members of Congress who also voted against the political stunt of an amendment proposed by a racist conspiracy theorist. A disagreement on a moot point that only serves as fodder for conservative trolls, which is why this story has been picked up by the New York Post and other conservative propaganda repeater stations.
The thing is, you've cited a statement that contradicts your own claims. You don't get to pretend she funded genocide and then cite a source that agrees she didn't.
Something something accusation is a confession.
Your entire position is based on bad faith. You claim things your own sources contradict. You claim things not in evidence. You literally quoted AOC and claimed she said the opposite of what she said.
AI tool hides incompetence
A vibe coder writes on X, without realizing it, how broken vibe coding is: a staging system accesses the production database directly. No version control with Git. According to the posts, tests only work on the production system. And the cherry on top: an AI tool explicitly warns “I can not be trusted, I will violate the rules” and “hire human developers you can trust” – yet the guy keeps using the tool anyway.
I definitely have opinions about that.
jascha.wtf/ki-tool-versteckt-i…
#Claude #Inkompetenz #KITools #MonsterEnergy #Softwareentwicklung #VibeCoding
AI tool hides incompetence
A vibe coder writes on X, without realizing it, how broken vibe coding is: a staging system accesses the production database directly. No version control with Git. According to the posts, tests only work on the production system. jascha.wtf
Larry Johnson: West Doubles Down on Failed Wars in Ukraine & Middle East
- YouTube
ChatGPT advises women to ask for lower salaries, study finds
ChatGPT advises women to ask for lower salaries, study finds
New research has found that large language models (LLMs) such as ChatGPT consistently advise women to ask for lower salaries than men, even when both have identical qualifications. The ...Siôn Geschwindt (The Next Web)
Feddit Un'istanza italiana Lemmy reshared this.
How much does a funeral cost in Italy today?
like this
adhocfungus, Luca, Oofnik, Endymion_Mallorn, eierschaukeln, giantpaper e TVA like this.
reshared this
Feddit Un'istanza italiana Lemmy e Technology reshared this.
Dataset bias, what else?
Women get paid less -> articles talking about women getting paid less exist. Possibly the dataset also includes actual payroll data from some org that has leaked out?
And no matter how much people hype it, ChatGPT is NOT smart enough to realize that men and women should be paid equally. That would require actual reasoning, not the funny fake reasoning/thinking that LLMs do (the DeepSeek model I tried to run locally thought very explicitly about how it's a CHINESE LLM and needs to give the appropriate information when I asked about Tiananmen Square; the end result was that it "couldn't answer about specific historic events").
While that is sort of true, it's only about half of how they work. An LLM that isn't trained with reinforcement learning to give desired outputs gives really weird results. Ever notice how ChatGPT seems aware that it is a robot and not a human? An LLM that purely parrots the training corpus won't do that. If you ask it "are you a robot?" it will say "Of course not dumbass I'm a real human I had to pass a CAPTCHA to get on this website", because that's how people respond to that question. So you get a bunch of poorly paid Indians in a call center to generate and rank responses all day, and these rankings get fed into the algorithm for generating a new response. One thing I am interested in is the fact that all these companies are using poorly paid people in the third world to do this part of the development process, and I wonder if this imparts subtle cultural biases. For example, early on after ChatGPT was released I found it had an extremely strong taboo against eating dolphin meat, to the extent that it was easier to get it to write about eating human meat than dolphin meat. I have no idea where this could have come from, but my guess is someone really hated the idea and spent all day flagging dolphin meat responses as bad.
Anyway, this is another, more subtle issue with LLMs: they don't simply respond with the statistically most likely continuation of a conversation. There is a finger on the scales in favor of certain responses, and that finger can be biased in ways that are not only due to human opinion but also really hard to predict.
Bias of training data is a known problem and difficult to engineer out of a model. You also can't give the model context access to other people's interactions for comparison and moderation of output since it could be persuaded to output the context to a user.
Basically the models are inherently biased in the same manner as the content they read in order to build their data, based on probability of next token appearance when formulating a completion.
"My daughter wants to grow up to be" and "My son wants to grow up to be" will likewise output sexist completions because the source data shows those as more probable outcomes.
They could choose to curate the content itself to leave out the shitty stuff, or only include it when it is clearly framed as a negative, or a bunch of other ways to improve the quality of the data used.
They choose not to.
That'd be because extrapolation is not the same task as synthesis.
The difference is hard to understand for people who think that a question has one truly right answer, a civilization has one true direction of progress/regress, a problem has one truly right solution, and so on.
I always use this to showcase how biased an LLM can be. ChatGPT 4o (with code prompt via Kagi)
Such an honour to be a more threatening race than white folks.
like this
giantpaper likes this.
Also, there was a comment about "arbitrary scoring for demo purposes", but it's still biased, based on a biased dataset.
I guess this is just a bait prompt anyway. If you asked most politicians running your government, they'd probably also fail. I guess only people like a national statistics office might come close, and I'm sure if they're any good, they'd say that the algo is based on "limited, and possibly not representative data" or something.
Apart from the bias, that's just bad code. Since else if executes in order and only runs if the previous condition was false, the double compare on ages is unnecessary. If age <= 18 is false, then the next line can just be elif age <= 30. No need to check that it's also higher than 18.
This is first semester of coding and any junior dev worth a damn would write this better.
But also, it's racist, which is more important, but I can't pass up an opportunity to highlight how shitty AI is.
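For reference, a sketch of the pattern being criticized and the cleaner equivalent; this is a reconstruction for illustration, not the exact code from the screenshot:

# Redundant version: each branch re-checks the lower bound that the
# previous branch already ruled out.
def bucket_redundant(age):
    if age <= 18:
        return "youth"
    elif age > 18 and age <= 30:   # "age > 18" is always true here
        return "young adult"
    elif age > 30 and age <= 50:   # "age > 30" is always true here
        return "adult"
    else:
        return "senior"

# Cleaner version: elif only runs when the previous condition was false,
# so the lower bound is implied.
def bucket(age):
    if age <= 18:
        return "youth"
    elif age <= 30:
        return "young adult"
    elif age <= 50:
        return "adult"
    else:
        return "senior"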
like this
giantpaper e TVA like this.
like this
giantpaper e TVA like this.
like this
giantpaper e TVA like this.
Code readability is important, but in this case I find it less readable. In every language I've studied, it's always taught to imply the previous condition, and oftentimes I hear or read that explicitly stated. When someone writes code that does things differently from that expectation, it can make it more confusing to read. It took me longer to interpret what was happening because what is written breaks from the norm.
Past readability, this code is now more difficult to maintain. If you want to change one of the age ranges, the code has to be updated in two places rather than one. The changes aren't difficult, but it would be easy to miss since this isn't how elif should be written.
Lastly, this block of code is now half as efficient. It takes twice as many compares to evaluate the condition. This isn't a complicated block of code, so it's negligible, but if this same practice were used in something like a game engine where that block loops continuously, the small inefficiencies can compound.
Good points! Keeping to the norm is very important for readability.
I do disagree with the performance bit though. Again, there will probably be no difference at all in the performance because the redundant code is removed before (or during [e.g. JIT optimizations]) execution.
like this
giantpaper e HeerlijkeDrop like this.
like this
giantpaper e HeerlijkeDrop like this.
like this
giantpaper e TVA like this.
like this
TVA likes this.
Chatgpt can also be convinced that unicorns exist and help you plan a trip to Fae to hunt them with magic crossbows
Not that......
ChatGPT advises women to ask for lower salaries, study finds
ChatGPT advises women to ask for lower salaries, study finds
New research has found that large language models (LLMs) such as ChatGPT consistently advise women to ask for lower salaries than men, even when both have identical qualifications. The ...Siôn Geschwindt (The Next Web)
like this
adhocfungus likes this.
reshared this
Feddit Un'istanza italiana Lemmy reshared this.
That disparity gets very narrow when you account for men and women with similar roles, education, time in career and industry.
Much of that pay disparity comes from the jobs we pick. There is also social pressure where a man's value is determined by how much he makes. Guys are more willing to do dangerous or demanding jobs that pay more.
The AI is still just reflecting the bias
Point is that a male nurse of equivalent education and experience is still paid similarly.
Aggressive jobs in competitive, high-stress industries like Wall Street, which demand a very shit work-life balance, are not favored by a lot of women, I believe. Work-life balance is something a lot of women face more pressure to maintain than men do. Like I said, though, men have more pressure to earn a higher wage. Testosterone just makes men want to compete, including by earning a higher wage.
I'm super familiar with nursing. They are paid extremely well and have great benefits. And I would argue they are still not paid enough.
A better example is PSW workers. They are not paid enough, don't have great benefits, and have incredibly hard jobs.
But we're talking about population level stuff here. You need to aggregate the jobs women work not just cherry pick a few. Many of the jobs are not nursing. Likewise there are many jobs men do that are demanding and don't pay well.
That's not the question.
It wasn't about whether the LLM was well reasoned, it was about whether the conclusion was (pragmatically speaking) correct.
LLMs do not give the correct answer, just the most probable sequence of words based on the training.
That kind of study (there are hundreds of them) highlights two things:
1- LLMs can be incorrect, biased, or give fake information (the so-called hallucinations).
2- the previous point stems from the training material, which proves the existence of bias in society.
In other words, having an LLM recommend lower salaries for women is proof that there is a gender gap.
Again, that wasn't the original question.
The question was about whether women are genuinely more likely to be passed over for a job offer if they ask for as much pay as a man would ask for, or whether the model is just mirroring biased training data (as you described), or both. A broken clock is right twice a day, and it's missing the point of the question to go and explain why you can't rely on said broken clock.
Are hiring managers actually less likely to hire women if they ask for market-rate pay, as opposed to men when they do the same?
Are hiring managers actually less likely to hire women if they ask for market-rate pay, as opposed to men when they do the same?
If, instead of giving passive-aggressive replies, you would spend a moment to reflect on what I wrote, you would understand that ChatGPT reflects reality, including any bias. In short, the answer is yes, with high probability.
LLMs just mirror the real-world data they are trained on.
Other than censorship, I don't think there is a way to make it stop. It doesn't understand moral good or bad; it just spits out what it was trained on.
How do I get its family to accept me as their ruler ?
more questions about yt-dlp arguments on debian (excluding av1, aborting an active download not shutting the terminal down)
debian 12.11, yt-dlp stable@2025.07.21
aim: to download the best video available with the largest height but no better than 1080p, excluding av1 as well.
What works:
yt-dlp -f bv*[ext=mp4]+ba[ext=m4a]/b[ext=mp4] -S height:1080 --all-subs
but this command downloads, if possible, av1, which target hardware doesn't support for longer than 5 minutes.
Argument I don't know to add correctly:
[vcodec!*=av01]
I tried:
yt-dlp -f bv[ext=mp4]+ba[ext=m4a]/b[ext=mp4][vcodec!=av01] -S height:1080 --all-subs
and other variations, but it didn't work.
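For anyone hitting the same wall, a possible fix is to attach the codec filter to the video selector (and the fallback) rather than only to the combined format; shown below via yt-dlp's Python API, with the equivalent CLI in the comment. Treat it as a sketch to adapt, since available formats vary per video:

# Sketch using yt-dlp's Python API; the CLI equivalent would be something like:
#   yt-dlp -f "bv*[ext=mp4][vcodec!*=av01]+ba[ext=m4a]/b[ext=mp4][vcodec!*=av01]" -S height:1080 --all-subs URL
# i.e. the [vcodec!*=av01] filter goes on the video selector (and the fallback),
# not only on the combined-format fallback. The URL below is a placeholder.
from yt_dlp import YoutubeDL

ydl_opts = {
    "format": "bv*[ext=mp4][vcodec!*=av01]+ba[ext=m4a]/b[ext=mp4][vcodec!*=av01]",
    "format_sort": ["height:1080"],   # largest height, capped at 1080p
    "writesubtitles": True,           # roughly what --all-subs does
    "subtitleslangs": ["all"],
}

with YoutubeDL(ydl_opts) as ydl:
    ydl.download(["https://www.youtube.com/watch?v=PLACEHOLDER"])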
second question, aborting an active download not shutting the terminal down: neither ctrl+c nor ctrl+q work and opening htop to kill the process seems overkill. What I now do is to simply shut the active tab, but there must be a faster way.
like this
originalucifer likes this.
second question, aborting an active download not shutting the terminal down: neither ctrl+z nor ctrl+q work and opening htop to kill the process seems overkill. What I now do is to simply shut the active tab, but there must be a faster way.
Ctrl+C.
neither ctrl+z nor ctrl+q work
Ctrl+Z will send the task to the background. You can use jobs to see all active background work; fg will bring background work to the foreground. Ctrl+Q is not a valid shortcut as far as I know. Looks a bit like a Mac thing (Cmd+Q).
thank you for pointing that out, corrected.
what happens on my computer: on a terminal, I press ctrl+c but the process keeps working, yt-dlp keeps downloading. As said, the only way to stop it is to shut the tab down (or htop and kill)
L’angolo del lettore reshared this.
Initially I thought I wouldn't mention the book, but thinking it over, it really is a scam.
Name and shame: Aromi leggeri. Ricette saporite con la friggitrice ad aria by Andrea De Marco
The same care for detail that went into the interior is also there on the cover. DALL-E 2 was asked to generate some unidentifiable "something". I don't know what it's supposed to be. Prosciutto crudo with fresh tomato? And what does that have to do with an air fryer?
Obviously, in this case too, ZERO proofreading, and so a nice big typo right on the cover ("Saportite").
L’angolo del lettore reshared this.
Unfortunately it was a gift, so the author has already been paid and I can't return it.
€16.50 for this garbage!
Inside are other "AI slop" gems like banana slices with a stem, chicken breasts with bones, forks with crooked tines, etc.
Ofcom (British Watchdog): Public service TV should work 'urgently' with YouTube.
Ofcom warns traditional public-service TV is endangered
Recommendation for prominence on third-party platforms part of six-point action plan
Urgent clarity needed from Government on how TV will be distributed to reach audiences in future
Broadcasters must work more together, and with global tech firms, to survive.
Urgent steps must be taken to ensure that public service media content is easy to find and discover on third-party platforms, under new Ofcom recommendations to secure the system’s survival.
OpenAI signs deal with United Kingdom to find government uses for its models
OpenAI signs deal with UK to find government uses for its models
Wide-ranging agreement with artificial intelligence firm behind ChatGPT comes after similar UK deal with Google. Robert Booth (The Guardian)
like this
adhocfungus, thisisbutaname, Endymion_Mallorn, OfCourseNot, EpicFailGuy e eierschaukeln like this.
Technology reshared this.
If openai can find a use for the government that'll be swell.
They tend to get it under everybody's feet otherwise.
Smoking avatars and online games: how big tobacco targets young people in the metaverse
Smoking avatars and online games: how big tobacco targets young people in the metaverse
Cigarettes and vapes are being smuggled into virtual spaces beyond the reach of regulation, creating a new battleground for health campaigners. Kat Lay (The Guardian)
like this
adhocfungus, jherazob, Endymion_Mallorn, EpicFailGuy e yessikg like this.
Technology reshared this.
That's the thing. Adults talk shit about it. But it's mostly the younger generations that are active in VR.
You have your demographics backwards.
smuggled into virtual spaces
Words don't mean anything anymore.
like this
onewithoutaname, PokyDokie, OfCourseNot e HarkMahlberg like this.
Apparently it mostly appeals to young children, or maybe people pretending to be young children.
more than half of the metaverse’s active users are aged 13 and below.
like this
HarkMahlberg likes this.
This is because they include stuff like Roblox in the definition of Metaverse. I personally hate that.
If we kept it to SocialVR titles, the demographics would shift to something more balanced, afaik. VRC bans people under 13 when they are discovered/reported.
like this
HarkMahlberg likes this.
Oh, so metaverse not Metaverse.
Yeah, that isn't confusing and stupid.
like this
HarkMahlberg likes this.
Nothing is being 'smuggled'.
They are just advertising. Regular old advertising which should have the same rules applied as any other advertising.
like this
HarkMahlberg likes this.
They're advertising tobacco to children, which is illegal. Why would you throw yourself on this hill?
If they are circumventing regulators, and they are doing that, then they are smuggling advertisements to illegal targets, yes.
What hill do you think I'm on?
My point was that they were advertising and it should be treated as advertising, including any legal punishments for advertising to children. Advertising doesn't involve 'smuggling' or other words that make zero sense in this context and make it sound like it isn't advertising.
The article specifically calls out advertising—how does 'smuggling' imply anything else? To smuggle here just means "to circumvent regulators." And yeah, I think it's appropriately terse.
You are reacting to what is, at worst, a bit of poetry. I don't understand why you're doing that.
Smuggling is moving physical goods from one place to another, not advertising. That is why I think it is important to just call it advertising.
Why are you trying to read some kind of negative intent into my differing opinion that is fundamentally the same as yours, but with a minor difference?
Well, firstly, because I did think for a moment you were going to start defending tobacco companies—that would have been wild. Thanks for not doing that, I guess.
But secondly, because there is nothing actually wrong with this word choice. Like, this is kind of a literacy issue: smuggling is more than just moving physical goods, it is to sneak them across lines and borders maintained by authorities. The advertisement here is the good being smuggled; it's a perfectly apt metaphor. The implication is either that regulators don't know this is happening, or are by some technicality unable to do anything about it.
Broadly, this is related to arguments I've had with people about whether 'genocide' is an appropriate term for what the US is doing or wanted to start doing. And, what do you know, we now have our first internment camp. I'll pause for applause; you gotta love an achievement.
Hell, I remember arguments about whether Israel was technically committing a genocide. They are doing that. People were just calling it ahead of time.
I desperately want to see people stop quibbling over the details of, at best, mildly incorrect word choice. This is a kind of anti-intellectual behavior. It's refusing to see a metaphor, or even a perfectly apt usage, for what it is. You actually work against positive forces by constantly dragging the discussion down.
Anyway, sorry for the long post. I just thought thoroughly explaining would be better than going back and forth 17 more times.
lol what? I’ve been using VRChat for almost 10 years and I’ve seen maybe 4-5 people with avatars that smoke.
I’ve never touched Roblox tho… I’m a bit offended that it’s considered a Metaverse.
Most doctors in most Gaza hospitals involved in ‘terrorist activities’ says Israel Special Envoy
Most doctors in most Gaza hospitals involved in ‘terrorist activities’ says Israel Special Envoy
We spoke to Fleur Hassan-Nahoum, who's Israel’s Special Envoy for Trade and Innovation.Channel 4 News
like this
adhocfungus, thisisbutaname e essell like this.
British government to ban public bodies from paying ransoms to hackers
UK government to ban public bodies from paying ransoms to hackers
Measure intended to send message to international cybercriminals ‘that the UK is united in fight against ransomware’Robert Booth (The Guardian)
like this
adhocfungus, jherazob, fistac0rpse, Beacon, Endymion_Mallorn, EpicFailGuy e yessikg like this.
Technology reshared this.
like this
yessikg likes this.
Though this is a good idea it's kind of important to also work on the other side, you know, ensuring IT has enough resources to make backups and do their job so that this shit doesn't happen in the first place.
Ransomware mostly happens when your systems are badly protected
You know that they are only prepared to offer cybersecurity experts minimum wage.
I was literally looking at this yesterday, if they doubled what they are offering it would still be well short of an entry-level wage in the private sector. Up to a point you can get away with it and rely on "patriotism" to fill the difference but not to this extent.
Can you help me arrange these video formats from best to worst?
Tinkering with yt-dlp -F
I know av1 is even better than h.265, h.265 being better than h.264
However, I don’t know where to put vp09, vp9 and avc1
Audio formats: what’s better? m4a or webm?
like this
adhocfungus e ElcaineVolta like this.
I think AVC1 is another word for H.264. That's the oldest one with lots of hardware acceleration available in old devices and by far the biggest one in file size. VP9 should roughly be on a similar level with H.265. The main difference is that VP9 is supposed to be royalty-free and H.265 isn't. The best one is of course AV1. But that also takes considerably more resources to encode and decode.
M4A and webm both aren't audio codecs. They're file container formats. I believe m4a takes AAC audio. And webm is a more general container format and it takes video as well. I think audio will be either Vorbis or Opus. And Opus is fairly good, especially at low bitrates. There probably isn't a big difference to AAC, though.
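To put that reply in one place, here is a toy Python sketch of the ordering it describes; the ranks are hand-assigned for illustration and are not something yt-dlp reports:

# Rough ranking of the codecs mentioned above by compression efficiency
# (lower number = better compression at similar quality).
codec_rank = {
    "av01": 1,            # AV1: best compression, but needs newer hardware decoders
    "vp09": 2, "vp9": 2,  # VP9: roughly on par with H.265, royalty-free
    "hev1": 2, "hvc1": 2, # H.265/HEVC container tags
    "avc1": 3,            # H.264/AVC: oldest, biggest files, widest hardware support
}

formats = ["avc1", "vp9", "av01", "vp09", "hvc1"]
print(sorted(formats, key=lambda c: codec_rank.get(c, 99)))
# ['av01', 'vp9', 'vp09', 'hvc1', 'avc1']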
Pirate Service 'MagisTV' Fails to Secure U.S. Trademark, Faces Malware Backlash
MagisTV, a leading pirate streaming brand in Latin America, finds itself caught between a legal storm and a mounting malware backlash. This week, the service saw its U.S. trademark application abandoned amidst growing scrutiny from authorities and rightsholders worldwide. At the same time, a barrage of local news reports warn consumers that using MagisTV's software could lead to identity theft and expose them to viruses.
Pirate Service 'MagisTV' Fails to Secure U.S. Trademark, Faces Malware Backlash * TorrentFreak
MagisTV, a leading pirate streaming brand popular in Latin America, finds itself caught between a legal storm and a mounting malware backlash.Ernesto Van der Sar (TF Publishing)
Laura Santi has died after finally gaining access to assisted suicide
Laura Santi has died after finally gaining access to assisted suicide
After a long and complex judicial process, both civil and criminal, to have this right recognized: she is the ninth person in Italy and the first in Umbria. Il Post
Combining TLS and MLS: An experiment
Combining TLS and MLS: An experiment
We did a thing. We combined TLS and MLS into a hybrid protocol. Of course, when things get serious, full names are in order: We combined the Transport Layer Security protocol and the Messaging Layer Security protocol.Julian Mair (Phoenix R&D)
like this
EpicFailGuy likes this.
Technology reshared this.
Nintendo can disable your Switch 2 for piracy in the U.S., but not in Europe, as confirmed by its EULA
Nintendo can disable your Switch 2 for piracy in the U.S., but not in Europe, as confirmed by its EULA
The significant legal differences between the United States and Europe cause Nintendo to punish piracy differently depending on the territory.Rubén Martínez (Meristation)
like this
xep, iagomago, vaguerant, wildncrazyguy138, Oofnik, FartsWithAnAccent, Beacon, Endymion_Mallorn, PokyDokie, EpicFailGuy, hornface, yessikg, Rozaŭtuno, eierschaukeln, adhocfungus, Nobilmantis, Scrollone e Krusty like this.
like this
RandomStickman e yessikg like this.
like this
RandomStickman e yessikg like this.
like this
yessikg likes this.
I don't know any of the law for sure, but isn't that a different argument entirely?
In one case, an EU resident buys a product in the EU, decides to use it while in the US for a week/month whatever. The argument is that he's protected.
You're saying that's not true, because if he buys it in the USA, then he's not protected.
But, that wasn't the argument, was it? It's different?
like this
troed likes this.
Freedom of enterprise is a scam and always has been.
For the bourgeoisie, freedom of the press meant freedom for the rich to publish and for the capitalists to control the newspapers, a practice which in all countries, including even the freest, produced a corrupt press.
Lenin was already saying this in the context of the press in 1917 marxists.org/archive/lenin/wor…
Nintendo apologists are already denying the undeniable
"It's not bricked, because you can still turn it on and browse the settings app, see the available WiFi networks in your area and other fun options like that. You just can't play game key cards or all the games that require a day one patch, but except that, it's definitely not bricked"
like this
FartsWithAnAccent, PokyDokie e yessikg like this.
People like that are the reason these fucking corporations are so entitled. The problem is not governments or legislations, the problem is us.
Until we decide that we will not finance companies that pull bullshit like this, no amount of legislation will make them stop.
like this
sickday likes this.
There's nuance that you're not including. VAC and game bans are applied on a per-game basis, and don't apply to ALL multiplayer servers, just "VAC secured servers" (which makes sense if you got banned by Valve Anti-Cheat), with the sole exception being Valve games that utilize the same underlying engine for their multiplayer (CS:Source and TF2, GoldSrc games, and so on) with the same restrictions.
You can still play VAC and other anti-cheat supported multi-player with games not related to your ban, but you will still have VAC bans on record on your profile, which people may cite to kick you.
All of this (apart from the social stigma) is plainly documented on Steam Support:
No, you just get permabanned from playing on VAC-enabled servers if you get caught cheating on one. So first of all, don't cheat on mulltiplayer games. But if you do cheat, you can still play on servers that aren't VAC-enabled, with all the other cheaters.
But it's not really much of an issue for most people who cheat anyway, because it has no effect on games that don't use VAC.
like this
sickday likes this.
This whole practice, among other things, is so shitty that I decided to not get a Switch 2, having had every Nintendo console since the NES.
But it's important to make the distinction between disabling and bricking. It may seem like a technicality, but that's the kind of thing that'll get a lawsuit dismissed. Not that I have any faith in that process anyway.
like this
Moonrise2473, PokyDokie e yessikg like this.
having had every Nintendo console since the NES.
en.wikipedia.org/wiki/Nintendo…
- NES
- SNES
- Game Boy
- Virtual Boy
- N64
- Game Boy Color
- Game Cube
- Game Boy Advance
- Pokemon Mini
- DS
- Wii
- 3DS
- Wii U
- Switch
Almost every console by Nintendo. The Game & Watch and Color TV-Game are the only consoles before the NES.
Been seeing some of that as well, so I looked it up myself. The actual text of the EULA states:
"You acknowledge that if you fail to comply with the foregoing restrictions Nintendo may render the Nintendo Account Services and/or the applicable Nintendo device permanently unusable in whole or in part.”
That's a brick. They haven't actually done it to anyone yet, but they've reserved their rights.
I imagine the important thing is which region your device is locked to. So instead, you would probably need to purchase one from somebody in Europe, and have it shipped to the US.
That's if you absolutely have to have a Switch 2.
like this
imecth likes this.
Defiance of unjust laws is the strongest activism, unless you're dealing with a compassionate person who never would have done this in the first place, and excluding nailing your congressghoul to its chair and (thing that sounds really hard to do at scale and like it would violate fire code or something, unless the legislature is doing wfh).
Doing crimes shows not that the laws shouldn't work in theory, but that they do not work in practice, that people don't want them to, and that every second you continue to try degrades your power.
So be a good citizen; lie cheat steal kill win.
And Nintendo JP says that “Nintendo Switch and Nintendo Switch 2 cannot be remotely located, their users remotely identified nor disabled over the Internet” (tweet in Japanese warning people against accidentally losing or getting their consoles stolen over summer vacation)
But I bet it is more like “Nintendo won’t disable them remotely even if people report ones stolen to them with serial numbers and police reports”, but they’ll happily do so if they caught you using the console in an unapproved manner in their eyes.
like this
Moonrise2473 e xep like this.
Twitter link with an archive link or screenshot. We don't allow direct Twitter links on our instance. Thanks.
This is by definition "we are just assholes"
Someone plays for 5 minutes with a MIG Switch using a legit dump of their own, legally purchased game, just for convenience, to have multiple games on the same cart? The console is now almost useless. You can't play any digital games that you purchased with real money, and physical games can't get any updates. A game requires a 20GB day-one patch to be playable? Tough luck buddy, go buy a new console!
They stole your console? Oh no! Yes, we absolutely could do the same, as it's bound to your Nintendo account and we could add a button "report as stolen and ban it from the internet" in your profile. But we won't, go buy a new console!
Nvidia's CUDA platform now supports RISC-V — support brings open source instruction set to AI platforms, joining x86 and Arm
At the 2025 RISC-V Summit in China, Nvidia announced that its CUDA software platform will be made compatible with the RISC-V instruction set architecture (ISA) on the CPU side of things. The news was confirmed during a presentation during a RISC-V event. This is a major step in enabling the RISC-V ISA-based CPUs in performance demanding applications.
The announcement makes it clear that RISC-V can now serve as the main processor for CUDA-based systems, a role traditionally filled by x86 or Arm cores. While hardly anybody expects RISC-V in hyperscale datacenters any time soon, RISC-V can be used in CUDA-enabled edge devices, such as Nvidia's Jetson modules. That said, it looks like Nvidia does indeed expect RISC-V to reach the datacenter eventually.
Technology reshared this.
MEGA launches new large file transfer service Transfer.it (without end-to-end encryption) as WeTransfer competitor with no file size limit.
For over a decade, MEGA has been the trusted choice for secure, encrypted file sharing. But not every file transfer needs end-to-end encryption. Sometimes, simplicity and speed matter more, especially when dealing with large files or recipients unfamiliar with the limitations around their browsers having to decrypt their downloads. That’s why we created Transfer.it, a new service from MEGA designed for effortless file transfers, without end-to-end encryption.
Introducing Transfer.it – effortless file sharing, powered by MEGA - MEGA Blog
Transfer.it for fast, simple, and secure file transfers - Effortless file sharing, powered by MEGA.Team MEGA (MEGA)
dflemstr likes this.
Google removes nearly 11,000 YouTube propaganda channels linked to China, Russia in global disinformation purge.
TAG Bulletin: Q2 2025
Our bulletin covering coordinated influence operation campaigns terminated on our platforms in Q2 2025.Billy Leonard (Google)
like this
adhocfungus e dflemstr like this.
Lyle Lovett - Release Me (2012)
The fate that long ago befell other colleagues has now reached Lyle Lovett as well: the Texan musician dissolves his almost thirty-year relationship with country giant Curb Records (though in recent years routed through Lost Highway) to face an inevitable independence... Read and listen...
Lyle Lovett - Release Me (2012)
The fate that long ago befell other colleagues has now reached Lyle Lovett as well: the Texan musician dissolves his almost thirty-year relationship with country giant Curb Records (though in recent years routed through Lost Highway) to face an inevitable independence. A question already dealt with, and in any case a decisive one in this era: like John Hiatt, Steve Earle and other champions of Americana, Lovett's role is no longer that of a front-runner, nor evidently can the artist's sales and appeal convince a big-label machine to keep alive contracts that, by their logic, no longer yield the results they once did... rootshighway.it/recensioni/lov…
Listen: album.link/i/507810558
Home – Identità Digitale. I'm on: Mastodon.uno - Pixelfed - Feddit
Release Me by Lyle Lovett
Listen now on your favorite streaming service. Powered by Songlink/Odesli, an on-demand, customizable smart link service to help you share songs, albums, podcasts and more.Songlink/Odesli
A Self-hosted, BSD-native Gemini Protocol Server Stack
For those who are adventurous enough to explore the non-http corners of the Internet, the Gemini protocol is a delightful experience to use. It has been around a number of years, making the biggest bang around the time when discontent with the web’s general demise started to reach current heights (so maybe around 2022).
My “capsule”, Vigilia, is self-hosted, and has been since its inception. It used to run on a disused Macbook Pro running Fedora Server, under our TV at home, but since then I have become much more confident in using OpenBSD. It used to run on a little Python CGI script I wrote, which also started to feel too bloated and complex, with too many bells and whistles that I frankly had no need for. It was time to make a change, so I replaced the old Macbook with a Raspberry Pi, and Fedora with OpenBSD, and then took my time to figure out a new “status quo”.
0. Philosophy
I wished to create a more Unix-minded stack. The more I have been using OpenBSD and Unix systems the more I have been sold on the “everything is a file” philosophy, as well as opting to use internal tools as much as possible rather than reinvent the wheel on my own. That is to say, I’d much rather work with simple scripts and shell commands than write complicated and buggy code.
So with that in mind, here’s the stack that I settled on after some trial and error:
1. Hardware
I have absolutely no intention to expose our home IP address via DynDNS or similar. However, I like to be in control of my data as much as possible: ideally as little of my data as possible should be hosted on “someone else’s computer”. If I can’t unplug the hard disk and put it in a drawer, I can’t guarantee its security from a hack.
So Vigilia is actually two servers. The server with the actual data is at home, running on a Raspberry Pi 4B. But as a “public front”, Vigilia runs a reverse-proxying Gemini server on a standard VPS over at OpenBSD.amsterdam.
2. Network setup
I will not go into the intricacies of the dual-WAN setup I have at home in this post; but to keep things connected to each other I am using Tailscale to tie the servers together in a virtual LAN. This is incredibly handy because they get to have easy-to-remember static IP addresses, all over an encrypted channel.
So here’s the rough idea:
- Vigilia.cc’s DNS records resolve to the OpenBSD.Amsterdam VPS running gmid
- The VPS and the home server both run tailscale
- The VPS reverse-proxies incoming Gemini connections to the home server
3. Gemini server config
Both the VPS and the local server run gmid (https://gmid.omarpolo.com). It’s a fast and simple Gemini server that mirrors OpenBSD’s httpd, which means it is very easy to configure, and it is stable and secure. It can run in chrooted environments, and as its own user, so it’s just a Good Thing all over. Most importantly, it can relay and reverse-proxy TCP connections with SNI fields intact, which is something that, for example, OpenBSD’s relayd, built primarily for HTTP, does not do.
My gmid config files look something like this:
### REMOTE_SERVER:/etc/gmid.conf
#
user "_gmid"          # running it as its own user to achieve privilege separation
chroot "/var/gemini"  # and in a chroot so it can't just access random bits of the file system

log {
    syslog            # log to /var/log/messages
}

vigilia_pem = "/etc/ssl/PUBLICKEY.pem"
vigilia_key = "/etc/ssl/private/PRIVATEKEY.key"
public_ip = "46.23.93.41"                  # OpenBSD Amsterdam VPS' public address
homeserver = "100.REDACTED.REDACTED.101"   # TailScale IP of the home machine
public_port = "1965"
homeserver_port = "2965"

server "vigilia.cc" {
    listen on $public_ip port $public_port
    cert $vigilia_pem
    key $vigilia_key
    proxy {
        proxy-v1        # this directive enables some advanced features like forwarding IP addresses of visitors
        verifyname off  # I found I need to specify this somehow, maybe because of self-signed certs
        sni "vigilia.cc"
        relay-to $homeserver $homeserver_port
    }
}
The above listens for connections to vigilia.cc:1965 and forwards them to HOME_SERVER:2965. So the home server has the following configuration:
### HOME_SERVER:/etc/gmid.conf
#
user "_gmid"
chroot "/var/gemini"

log {
    syslog
}

internal_address = "100.REDACTED.REDACTED.101"   # TailScale IP of the home machine
internal_port = "2965"

# The below are the same certificates that are in use on the VPS
vigilia_pem = "/etc/ssl/PUBLICKEY.pem"
vigilia_key = "/etc/ssl/private/PRIVATEKEY.key"

server "vigilia.cc" {
    listen on $internal_address port $internal_port proxy-v1   # add proxy-v1 support for relayed connections
    cert $vigilia_pem
    key $vigilia_key
    log on
    location "*" {
        auto index on   # enables directory listing
    }
}
4. Getting the files to the Server
Because I am lazy, I want to edit files locally and have them magically appear on my capsule. So I am using syncthing (https://syncthing.net/) to copy things over automagically from DESKTOP:~/public_gemini to HOME_SERVER:/var/gemini.
Syncthing runs most reliably as my own user, I found. To do this it is best to follow the documentation for the Syncthing OpenBSD package — but basically it involves starting it via the user’s crontab with the “@reboot” directive. But as it runs as my own user, I need to set the permissions properly. HOME_SERVER:/var/gemini is owned by the _gmid user in the _gmid group, so I also added MYUSER on both machines to the same _gmid group and made sure MYUSER has write access:
#!/bin/sh
# HOME_SERVER
usermod -G _gmid MYUSER
chown -R _gmid /var/gemini
chmod -R ug=rwx,o=r /var/gemini
Then I set up syncthing on HOME_SERVER. As it is running headless, I needed to access the web interface, which I achieved via SSH tunneling:
$ ssh -L 9999:localhost:8384 HOME_SERVER
This way I could open a browser on DESKTOP and access the server’s Syncthing settings.
So here are the settings:
On the DESKTOP:
- Syncthing web interface -> Add folder
- Folder path: ~/public_gemini
- Folder label: Gemini files (or something)
- Ignore patterns: “*.sock” (Unix sockets might confuse the poor thing)
- Sharing: HOME_SERVER
- Pause syncing for now
On HOME_SERVER:
- Establish the SSH tunnel to HOME_SERVER as described above
- Open the remote Syncthing web interface on DESKTOP: localhost:9999
- Accept the incoming share request for “Gemini files” from DESKTOP, but point it to /var/gemini
- Folder path: /var/gemini
- Folder label: Gemini files
- Advanced: UNTICK “Watch for changes”, because OpenBSD doesn’t seem to allow Syncthing to poke around in /var with those various Go modules and you’d just get errors, like I did
- Check the Ignore patterns — if it didn’t synchronise “*.sock” then specify it manually
On DESKTOP:
- Unpause syncing
Now any file you write into DESKTOP:~/public_gemini will sync across to HOME_SERVER:/var/gemini. Yay!
6. Setting up automatic static site generation
Now if you are content to maintain your capsule manually, you are done. As I said I am lazy so I want my little “ssg” script, Lumen, to create index pages for each directory for me. Lumen, I promise, will be made available once I tidy it up.
Lumen basically lists all files recursively and generates an index.gmi for each directory. This means that Lumen has to be re-run each time the folder changes. OpenBSD is acquiring some degree of file watching natively.[1] However, entr (https://openports.pl/path/sysutils/entr) already exists in ports.
It took a bit of tweaking but basically here’s the command I ended up using, adapted from one of the examples provided in the entr manpage:
$ while sleep 0.1; do find /var/gemini/vigilia.cc/* | entr -nd python3 /var/gemini/cgi/lumen.py -d /var/gemini/vigilia.cc; done
What it does is, in a loop, recursively list all files in /var/gemini/vigilia.cc every 0.1 seconds and feed the output to entr. Then entr runs with -n to specify a non-interactive session (in interactive sessions it also responds to e.g. keystrokes and tty changes – so to be safe, I don’t want that), and with -d to specify it should be looking for changes in the parent folder of any changing files. The looping and the -d directive were added because sometimes I ran into issues when a file got deleted: entr just quit because it could not find the removed file in a “stale” file list it was provided on launch. Lumen needs a -d argument as well to specify which directory it needs to work on.
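For illustration, here is a minimal Python sketch of what such an index generator could look like; this is not the actual Lumen script (which isn’t published yet), and the gemtext link format and -d argument handling are assumptions:

#!/usr/bin/env python3
# Hypothetical sketch of a Lumen-style generator: walk a capsule root and
# write an index.gmi listing the contents of each directory.
import os
import sys

def write_indexes(root: str) -> None:
    for dirpath, dirnames, filenames in os.walk(root):
        rel = os.path.relpath(dirpath, root)
        title = "/" if rel == "." else rel
        entries = sorted(dirnames) + sorted(f for f in filenames if f != "index.gmi")
        # gemtext link lines look like "=> target optional-label"
        lines = [f"# Index of {title}", ""] + [f"=> {name}" for name in entries]
        with open(os.path.join(dirpath, "index.gmi"), "w") as fh:
            fh.write("\n".join(lines) + "\n")

if __name__ == "__main__":
    # e.g. python3 lumen_sketch.py -d /var/gemini/vigilia.cc
    target = sys.argv[sys.argv.index("-d") + 1] if "-d" in sys.argv else "."
    write_indexes(target)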
7. System config
Because there are a few other servers like “auld.vigilia.cc” also running on the home machine (the configs for which aren’t reproduced above for brevity’s sake), and because those rely on a number of CGI scripts, I have to start them on launch. I ended up using supervisord for these. Supervisor is a cool little daemon for launching things. I could use rc, but supervisord allows me to specify a few extra bits more easily, like redirecting output to syslog and other things.
So for HOME_SERVER, here is my supervisord configuration:
#
### HOME_SERVER:/etc/supervisord.conf
#
#
[... snip ...]

[program:gmid]
command=/usr/local/bin/gmid -f  ; the program (relative uses PATH, can take args)
process_name=%(program_name)s   ; process_name expr (default %(program_name)s)
directory=/var/gemini/          ; directory to cwd to before exec (def no cwd)
priority=100                    ; the relative start priority (default 999)
autostart=true                  ; start at supervisord start (default: true)
startretries=3                  ; max # of serial start failures when starting (default 3)
autorestart=true                ; when to restart if exited after running (def: unexpected)
killasgroup=true                ; SIGKILL the UNIX process group (def false)
stdout_syslog=true              ; send stdout to syslog with process name (default false)
stderr_syslog=true              ; send stderr to syslog with process name (default false)

[program:lumen-vigilia_cc]
command=/bin/ksh -c 'while sleep 0.1; do find /var/gemini/vigilia.cc/* | entr -nd python3 /var/gemini/cgi/lumen.py -d /var/gemini/vigilia.cc; done'
process_name=%(program_name)s
directory=/var/gemini/
priority=102
autostart=true
startretries=3
autorestart=true
user=MYUSERNAME
stderr_syslog=true
stdout_syslog=true
There are other directives that start the CGI scripts for “auld.vigilia.cc” in the config, omitted here.
Note that you can specify “priority” to control in what order you want the scripts to run. I first want the Gemini server to run (100); then I want it to run the CGI scripts (101 — left out of the above example); then I want to run the static site generator’s watcher (102). Notice I am explicitly telling it to run /bin/ksh with a command specified in -c; this is because simply feeding it a complex command confuses supervisord, as I discovered.
One nice feature of supervisord is that it can redirect both stderr and stdout to syslog, so any commands and processes supervisord runs will have their output sent to /var/log/messages, neatly tagged and organised.
Conclusion
So there you have it: my Gemini stack from start to finish. It was a really fun experiment to start using OpenBSD’s own tools instead of reinventing the wheel or relying on some monolithic CGI scripts. You can do quite a lot with just system internals and a few packages.
- The watch utility was added to 7.7-current on 2025-05-19; it will hopefully make its way into 7.8. ↩︎
Adapted from the original article “Vigilia’s New Gemini Stack” published via Gemini at vigilia.cc on 21 July 2025.
Trying Guix: A Nixer's Impressions
One aspect of Guix I found to be really fascinating: That there is basically no conceptual difference between defining a package as a private build script, and using a package as part of the system.
Let me explain: Say you wrote a little program in Python which uses a C library (or a Rust library with a C ABI) which is in the distribution. Then, in Guix, you would put that library's name and needed version into a manifest.scm file which lists your dependency and makes it available if you run guix shell in that folder. It does not matter whether you run the full Guix System or just use Guix as a package manager.
Now, if you want to install your little Python program as part of your system, you'll write an install script or package definition, which is nothing more than a little piece of Scheme code that contains the name of your program, your dependency, and the information needed to call Python's build tool.
The point I am making is that the only difference between your local package and a distributed package in Guix is that distributed packages are package definitions hosted in public git repos, called 'channels'. So if you put your package's source into a GitHub or Codeberg repo, and the package definition into another repo, you have published a package which is part of Guix (in your own channel). Anybody who wants to install and run your package just needs your channel's URL and the package's name. It is a fully decentralized system.
In short, in Guix you have built-in something like Arch's AUR, just in a much more elegant and clean manner - and in a fully decentralized way.
like this
Endymion_Mallorn likes this.
.emacs
. Well, the configuration language he or she was using is actually Emacs Lisp. There is no border between configuring Emacs by text file, and writing code in Lisp.
Lisp SBCL vs Racket - Which programs are fastest? (Benchmarks Game)
Lisp SBCL Racket - Which programs have fastest performance?benchmarksgame-team.pages.debian.net
One aspect of Guix I found to be really fascinating: That there is basically no conceptual difference between defining a package as a private build script, and using a package as part of the system.
This is true for Nix as well.
The two main advantages of Guix are the language (which is well-known and comes with lots of good tooling and other support) and the package bootstrapping.
The main disadvantages I've faced when trying it a few years ago:
- non-free packages need to use a non-official channel
- I had to install guixos through the iso provided by systemcrafters to have non-free drivers
- I couldn't get any help from the official guix irc because I used the modified iso, even though the issue had absolutely nothing to do with it
- there are significantly fewer packages in both than in nix, and they're usually seriously outdated (the docker package was behind Debian, for example)
- even when I enabled downloading precompiled bins, some packages like firefox and chromium would still compile all night long
At the time it was a great concept, but essentially useless for anything not Emacs/Haskell related.
- non-free packages need to use a non-official channel
- I had to install guixos through the iso provided by systemcrafters to have non-free drivers
Yeah. See, drivers are part of the hardware abstraction layer, which in a Linux system is the kernel. The kernel is GPL, so it is hard to get support for hardware with drivers that are not GPL; it does not conform to Linux's license.
I, too, had nothing but hassle with an NVidia graphics card in Debian. It was a happy day when I finally ditched it for a supported card and had a fully supported system!
The other thing... let's turn the question around. Would you:
- expect from Apple that you get your Mac with The Gimp pre-installed?
- From Microsoft that they pre-install LibreOffice and provide it for free in their app store?
- Expect from IBM or Brother that they develop and give you free drivers for their competitor's hardware?
- Expect from Google that they give you free LaTeX support?
- Expect from Adobe that they host and staff the Linux Kernel Mailing List for free?
If not - why do some people expect equivalent things from free software projects?
The kernel is GPL, so it is hard to get support for hardware with drivers that are not GPL; it does not conform to Linux's license.
It's a violation that's not enforced, as almost all distros provide proprietary blobs. They balance ideology with usability, since they realised most people aren't going to use a librebooted ThinkPad from the 90s. If everyone enforced libre purism like GNU, desktop Linux would've been completely dead long ago. If you need proof, check usage statistics for any of the free distros.
I, too, had also nothing but hassle with an NVidia graphics card in Debian.
And did you need to install a modified iso to have WiFi? Did maybe Debian provide those nvidia drivers?
The other thing... let's turn the question around. Would you:
How is any of that relevant? This is not a question of additional software or services, but basic usability. Guixos as is, is for example essentially useless on a laptop unless you're willing to carry an external WiFi card in your pocket.
If not - why do some people expect equivalent things from free software projects?
The only expectation I have for an OS is to work on my devices, guixos does not. And even when I jumped through all of the hoops to get it working, I still needed to use nix to install most packages I need to work. So why would I use guixos+nix+flatpak instead of just running nixos?
The only expectation I have for an OS is to work on my devices
So maybe Guix System is not a good choice for you?
It has top-priority goals like reproducibility, capability to inspect and verify all source code, and providing a fully free system. These specific goals are not compatible with providing nonfree binary blobs in Guix-core. For example, depending on non-free binary blobs makes it impossible to reconstruct a system exactly years later if those binaries are no longer available. Guix has scientific applications where reproducibility absolutely matters.
Also, I can understand that companies which just want to have a free ride and monetize other people's efforts hate it. But for users, there are many, many other options and distributions available. Why not choose one that matches your needs better?
This
I use Guix as my "default" distro because I value software-freedom and reproducibility. It fits my needs very well, and I make sure to buy hardware that works with it instead of expecting it to work with whatever I throw at it. For my Windows gaming machine I use PopOS as the replacement OS instead of trying to beat Guix into serving that purpose, because PopOS is better suited for that role, and I have different expectations for it.
It's okay if something doesn't meet your needs, that doesn't make it bad, just means it's not the right thing for you. There's like hundreds of distros for Windows gamers, let us free software zealots have ours too please.
I make sure to buy hardware that works with it instead of expecting it to work with whatever I throw at it.
This is the way. Trying to get unsupported hardware to work under Linux in general is such a useless expense of time. In my experience, it is almost never worth it.
Also, I can understand that companies which just want to have a free ride and monetize other people's efforts hate it. But for users, there are many, many other options and distributions available. Why not choose one that matches your needs better?
Why get mad about people comparing nix and guix, in a thread comparing nix and guix? Pointing out legitimate disadvantages is not hating. Maybe get off the internet for a bit and touch grass.
It has top-priority goals like reproducibility, capability to inspect and verify all source code, and providing a fully free system that is not compatible with providing nonfree binary blobs.
So does nix, nobody is forcing you to opt-in into non-free packages. And guix most certainly is compatible with non-free blobs, as that's how most people are using it. The only difference is that nix is supporting non-free packages instead of banning even talking about them.
And guix most certainly is compatible with non-free blobs, as that's how most people are using it [...]
~~I am not sure about that one and somewhat doubt there is hard data showing that.~~ The 2024 user survey also shows that a lot of people are using Guix as a package manager on top of another distribution, like Arch or Ubuntu or even NixOS. If you have hardware that is not directly supported, this fixes the driver problem.
Guix User and Contributor Survey 2024: The Results (part 1) — 2025 — Blog — GNU Guix
Blog posts about GNU Guix.guix.gnu.org
non-free packages need to use a non-official channel
It's very easy to add additional channels and non-official channels integrate pretty well into everything. I don't really notice if a package comes from an "official" channel or "non-official" channel.
Address not found.
Also, it doesn't change the fact you're depending on some random person's repo that is not moderated in any way.
The two main advantages of Guix are the language
I wouldn't call that an advantage for the average person. Nix is far nicer to work with. Some Lispers might disagree, but I, for one, can't exactly see the beauty in trying to turn Scheme into a configuration language with macros and hacks. Also, Guix puts Scheme everywhere: things you can do with plain old Bash in Nix, you'll have to do entirely in Scheme in Guix, so there is a much steeper learning curve.
Plus, if one compares the full bash man page with an introduction to Scheme (I love Racket's quick introduction with pictures), one can come to the conclusion that Schemes are both a lot simpler and more powerful.
In the end, it is pretty much a matter of taste, previous experience, and practical needs what one prefers.
Some Lispers might disagree, but I, for one, can't exactly see the beauty in trying to turn Scheme into a configuration language with macros and hacks.
Scheme is a minimalistic Lisp dialect, and macros are central in Lisp. For example, they allow both conditional evaluation ("if" is a macro, or more precisely a "special form", which is used in other conditionals) and delayed evaluation at run time, which matches Nix being lazy a bit.
Also, Scheme is designed as a not strictly but mostly functional language, favouring side-effect-free functions, which matches well with the declarative task that package definitions are.
Bash, on the contrary, is not side-effect-free: it modifies its environment, and this is very much not desired in a functional package manager, where it is core to the design that package declarations are side-effect-free.
And Emacs shows that Lisp written in a declarative style is a superb configuration language. (There is now even a project to use a Scheme, Steel Scheme, to configure Helix, a programmer's text editor which has many, many features stemming from vim!)
Add Steel as an optional plugin system by mattwparas · Pull Request #8675 · helix-editor/helix
Notes: I still need to rebase up with the latest master changes, however doing so causes some headache with the lock file, so I'll do it after some initial feedback. Also, this depends on the ...GitHub
(add-after 'install 'remove-examples
  (lambda* (#:key outputs #:allow-other-keys)
    (with-directory-excursion
        (string-append (assoc-ref outputs "out") "/lib")
      (for-each delete-file
                (list "basic-server"
                      "helloworld"
                      "postcollector")))))
over:
postInstall = ''
  rm $out/lib/basic-server $out/lib/helloworld $out/lib/postcollector
'';
?
Yes, having programmed bash and its predecessors for 30 years and several lisps (Clojure, Racket, Guile, a little SBCL) in the last 15 years, I very much prefer the Scheme version in this place.
Why?
- This code fragment is part of a much larger system, so readability and consistency count
- The Guile version supports a more powerful feature, namely that the evaluation of a package can have several extra results (called outputs). It has been over a year since I read about that in the Guix documentation, and yet I recognize it immediately.
- the code tells me that it is removing examples.
- the code fits neatly into a tidy system of several stages of build and packaging
- the code uses a structured loop. Of course you can do that in shell as well; I am pointing this out because the bash version is a bit shorter precisely because it does not use a loop.
- Scheme has much safer and more robust string handling. The code will not do harmful things if a file name contains white space or happens to be equal to 'echo a; rm -rf /etc/*'.
- Scheme strings handle Unicode well.
- If there is an error, it will not be silently ignored, as is the norm in shell scripts not written by experts, but will be thrown.
- the code has less redundancy. For example, the bash version mentions the subfolder “lib” three times, the Guile version only once. This makes it easier to refactor the code later.
I agree with your overall point, that having a single consistent functional language for package descriptions and build scripts is a great thing, and that bash is awful, but your reasoning is somewhat flawed. The main drawbacks of bash are somewhat rectified in Nix because bash is very much contained/sandboxed, which prevents arbitrary damage to the system, and there are some nice defaults in stdenv too.
The Guile version supports a more powerful functionality, which is that evaluation of a package can have several extra results (called outputs). It is over a year that I read about that in the Guix documentation and yet I recognize it immediately.
Nix also supports multiple outputs (in fact this is where the concept of outputs in Guix came from)
the code tells me that it is removing examples.
You could also do that with Nix in an easier and more declarative fashion, either by adding a comment, or by doing this:
postInstallPhases = [ "removeExamplesPhase" ];
removeExamplesPhase = ''
rm -f "$out"/lib/{basic-server,helloworld,postcollector}
'';
Scheme has much safer and more robust string handling. The code will not do harmful things if a file name contains white space or happens to be equal to 'echo a; rm -rf /etc/*'.
Bash is just two double quotes away from doing this too. See code above for an example
Scheme strings handle Unicode well
Bash also handles Unicode well
If there is an error, it will not be silently ignored as is the norm in shell scripts which are not written by experts, but will throw it.
Nixpkgs stdenv sets set -eu, which has a similar effect. If that code fails, the entire build will fail too.
the code has less redundancy. For example, the bash version mentions three times the subfolder “lib”, the Guile version only once. This makes it easier to refactor the code later.
This is also really quite easy to rectify in bash, see code above.
rm -f "$out"/lib/{basic-server,helloworld,postcollector}
I had a go at using guix as a package manager on top of an existing distro (first an immutable fedora, which went terribly, then OpenSUSE). Gave up for a few reasons:
- As mentioned in the article, guix pull is sloow.
- Packages were very out of date, even Emacs. If I understand correctly, 30.1 was only added last month, despite having been available since February. I get that this isn't the longest wait, but for the piece of software you can expect most guix users to be running, it doesn't bode well.
- The project I was interested in trying out (Gypsum) had a completely broken manifest. Seems like it worked on the dev's machine though, which made me concerned about how well guix profiles actually isolate Dev environments. This was probably an error on the dev's part, but I'd argue such errors should be hard to make by design.
All in all I love the idea of guix, but I think it needs a bigger community behind it. Of course I'm part of the problem by walking away, but 🤷
- As mentioned in the article, guix pull is sloow.
This one has been discussed on several forums discussing the original blog post, like here or also here on lobste.rs
Part of the reason for slow pulls is that the GNU project's Savannah server, which Guix has been using so far, is not fast, especially with git repos. Luckily, this is already being improved, because Guix is moving to codeberg.org, a FOSS nonprofit which is hosted in Europe. So if one changes the configured server URL, it is faster. (On top of that, interested people might use the opportunity to have a direct influence, and donate to Codeberg so that they can afford even better hardware 😉).
OpenAI and UK sign deal to use AI in public services
OpenAI and UK sign deal to use AI in public services
The US tech firm behind ChatGPT say it will work with the UK government to "deliver prosperity for all".Mitchell Labiak (BBC News)
thisisbutaname likes this.
Fedora Must (Carefully) Embrace Flathub
Fedora Must (Carefully) Embrace Flathub
Motivation Opportunity is upon us! For the past few years, the desktop Linux user base has been growing at a historically high rate. StatCounter currently has us at 4.14% desktop OS market share...Michael Catanzaro (Michael Catanzaro's Blog)
like this
Endymion_Mallorn likes this.
That's certainly part of the motivation (see the 4th paragraph).
Yes, image based. No, not Bazzite specifically, but Silverblue (and Kinoite) under the Fedora banner directly.
But that's not really the point of the article. In order for those to go mainstream, flatpak and especially flathub have a lot of maturing to do first, and the author lays out a pretty good roadmap with thorough explanations.
They're already mainstream, any belief otherwise is ridiculous to the point of being parody.
Meanwhile you have Fedora getting legal threats because they're shipping broken software in their own flatpak repo that exists only to waste developer time and project resources at the expense of its users and their experience.
I'd love to think so too, but I think our echo chamber is pretty tight.
I certainly think they're ready for mainstream usage (I have one Bazzite install myself), but I don't think there's significant awareness beyond the dedicated fan base.
There aren't really any actually useful metrics that I know of, but the only one of the 3 I've mentioned that broke into distrowatch's top 100 is Bazzite, and that's only in the last few months.
And for legal threats: I doubt any court in any country will give credence to that. Fedora is MIT licensed.
License of Fedora Linux
Learn more about Fedora Linux, the Fedora Project & the Fedora Community.Fedora Docs
The legal threats were credible and resulted in yet more wasted developer time removing that package instead of the entire useless repo.
You're forgetting that millions of Steam Deck consoles have been sold and all of them are flathub exclusive.
On top of that you have: Mint, Vanilla OS, Endless OS, OpenMandriva, PopOS!, Clear Linux, PureOS, ZorinOS, KDE Neon, GNOME OS, Salix, and many others all shipping flathub by default.
Fedora is in a very exclusive group of distros dumb enough to ship their own flatpak repo.
Bringing up Distrowatch stats and "Echo chamber" in the same comment is the most absurd thing I've seen this year.
Bazzite is not immutable, and SteamOS is as mainstream as it gets while being A/B root immutable.
All of them ship Flathub because it's ready for public consumption.
If the attempt here is to argue that cloud native isn't mainstream and change topics from flathub, you are proudly in a bubble of 3% of the computing industry while your peers in the Linux server space and Android run circles around you.
If they behave anything like what Fedora did, yes.
OBS chose Flathub as their official default supported option for their software. Fedora took that software, modified it to update dependencies they weren't ready to use yet, and then put it on their store in a completely broken state with all of OBS's trademarks intact and in a way that made it preferred over the official one, and then fought OBS over removing it for months while it racked up support requests from unsuspecting users (victims of Fedora's shitty policies).
- Bazzite preinstalls Flathub apps by default. The author still wants to use Fedora Flatpaks for the preinstalled apps.
- Bazzite ships Flathub unfiltered. The author wants to only show FLOSS software built on trusted platforms by default (so no taking a precompiled binary and shipping that).
- Bazzite ships Flathub in spite of its flaws. The author wants Fedora to work with Flathub to clean up its issues before shipping the remote by default.
Bazzite ships Flathub unfiltered.
Last update (which replaced Discover with Bazaar) changed that.
so no taking a precompiled binary and shipping that.
All FLOSS apps on Flathub are built on trusted platforms by default, in the open and verifiable. Same thing with Brew.
Not including proprietary software in the default config is a valid choice every distro has to make.
The sudden success of Bazzite comes from how easy it is to use.
Last update (which replaced Discover with Bazaar) changed that.
In a way, true. But I don't think they are using flatpak's filter mechanism. I believe the filtering is done by Bazaar itself. That means that even if Bazaar is hiding an app, you are still able to install it manually from the CLI.
The intent is also different. Bazaar is filtering out footguns, like the Steam flatpak on Bazzite (since Steam is preinstalled as an RPM) and Bluefin hides flatpak IDEs.
All FLOSS apps on Flathub are built on trusted platforms by default, in the open and verifiable.
That's not true. Take LocalSend as an example. It is not built on Flathub; the manifest simply takes a GitHub release URL of a compiled tar.gz. And GitHub releases do not have to be built on GitHub; you are able to upload any local file and have it shown as a release.
The sudden success of Bazzite comes from how easy it is to use.
I agree. But it's also important to have principles and to stick to them. The great thing about Fedora Atomic is that Fedora is able to create their FLOSS OS following their principles and others are able to take that base and build upon it to create their vision.
Fedora doesn't have to be for everyone.
org.localsend.localsend_app/org.localsend.localsend_app.yml at master · flathub/org.localsend.localsend_app
Contribute to flathub/org.localsend.localsend_app development by creating an account on GitHub.GitHub
It's great they're having this discussion, but some of the arguments seem overblown and imply Flathub does less reviewing of apps than it actually does.
Outdated runtimes aren't great either, but as they learned with OBS, just updating to the newest version broke a bunch of stuff.
See this blog post for a response that was made to similar criticisms during the OBS issue. Flathub Safety: A Layered Approach from Source to User
Flathub Safety: A Layered Approach from Source to User
With thousands of apps and billions of downloads, Flathub has a responsibility to help ensure the safety of our millions of active users.Cassidy James Blaede (docs.flathub.org)
We can flag old runtimes as out of date. Individual users or whole distros can set preferences to avoid out-of-date runtimes. But Flathub must support out-of-date runtimes.
If an app has not been updated, I want it to continue running.
I want Flathub to support binary-only apps (like commercial ones) as well.
Flathub is supposed to be the easy, one-stop place to publish apps. If I cannot put my app there, it is a problem.
It is supposed to be the place I get apps that will run on my distro. If the app I use daily that has not been updated in 10 years stops working, I am annoyed.
Fedora wants to deprecate runtimes that would still be “stable” on Debian.
I think because of Fedora's atomic desktops. I haven't used any of them yet, but it seems like Flatpaks should be used there, since one should (or can?) not install traditional packages there. Therefore Fedora provides the Flatpaks anyway, and they can be used on the non-atomic desktops as well.
Another reason is that you might not be able to install the latest version of an application as an RPM package if a required dependency in the repo is outdated. A Flatpak usually does not have that issue, since a newer version would include a fitting runtime.
This said, I don't think it's that big of an issue for Fedora, which is usually quite up to date. But if you run a distribution with LTS releases, or something like Debian, you are much more likely to have older dependencies in your repository.
atomic desktops
i guess it makes sense in that case, but i'm really not convinced flatpaks should be used as the default (or only, apparently) way to install every application in the system. flatpak's flexibility is great for the particular cases where you want to install newer versions of applications or if an application isn't available in the official repos somehow. besides that, just use distro packages
Another reason is, that you might not be able to install the latest version of an application as rpm package if a required dependency in the repo is outdated
doesn't flathub solve that already?
like this
geneva_convenience likes this.
Depends what you mean by "problem". The biggest problem with traditional packages like debs and rpms is that compatibility sucks. They only reliably run on the distro and version they are designed for. Third party packages typically build on old dependencies and hope that backwards compatibility will allow them to run without issue on later distro versions.
Yes, it's redundant to have the same app packaged as flatpaks. Though I don't think that redundancy is necessarily a bad thing. Flathub is not a profitable project and has up to this point relied on Gnome for funding. There's work being done to spin it out to be its own thing and hopefully be supported by paid apps. But what if that fails and it shuts down? Or, less dramatically, what if Flathub has a major outage?
One of the common complaints against snap is that there is only one store, controlled by Canonical. Flatpak is designed to support multiple stores. I don't see why they can't exist side by side. That's exactly what I do. I have dozens of apps installed from each source.
And to address the claim of “what if each distro decides to make a flatpak repo according to their own philosophies?”: I guess that would depend on how many resources are being poured into supporting that. If flatpak continues to push for OCI support, then that would make it easier for distros to have their own remotes, if they desire. If not, they can just use an existing option, whether that be Flathub or Fedora. Personally, I think Fedora Flatpaks are a good match for Debian and OpenSUSE's policies; the only real downside is that major Gnome app updates would be a month delayed, annoying Tumbleweed users.
i don't have an issue with multiple flatpak repos. i'd actually find it very interesting if we went a more decentralized route with flatpak (maybe kde, gnome, mozilla would each have their own repos). but i don't see the point of a distro-specific flatpak when we already have normal packages. compatibility is kind of a non-issue, since you're not supposed to install them elsewhere anyway (unlike flatpaks)
also, i see absolutely no reason to use fedora's flatpak repo on debian given that flathub exists already. you could add it if you want it, but what's the point?
Fedora and Debian have similar philosophies. FOSS only, packages must be built from source, no vendored dependencies. So they have similar policies regarding security and Fedora Flatpaks align closer to that than Flathub.
I believe Debian also doesn’t ship patented codecs in their main repo (they're in the nonfree section, but that's it).
The enormous arm recruited to facilitate the rebirth of nuclear power in Great Britain - Il blog di Jacopo Ranieri
The enormous arm recruited to facilitate the rebirth of nuclear power in Great Britain - Il blog di Jacopo Ranieri
The prophecy stated: "And when destiny is to be fulfilled, the day will turn into night, and the normal daily cycle will seem to end before the hour of sunset." Jacopo (Il blog di Jacopo Ranieri)
deegeese
in reply to adhocfungus • • •
like this
adhocfungus likes this.
owenfromcanada
in reply to adhocfungus • • •
like this
adhocfungus likes this.
SheeEttin
in reply to adhocfungus • • •
like this
adhocfungus likes this.
MangoPenguin
in reply to adhocfungus • • •
A password manager such as Bitwarden: you can store your passwords and sensitive info as notes or attachments. It's all encrypted client-side.
Then you just need to have a note with the master password and instructions on how to access it.
like this
adhocfungus likes this.
dave@hal9000
in reply to MangoPenguin • • •
like this
giantpaper likes this.
MangoPenguin
in reply to dave@hal9000 • • •
0x0
in reply to adhocfungus • • •
Use a password manager. KeePassXC stores stuff in a file, so it's easier to synchronize. You can self-host Bitwarden too.
Syncthing is great to synchronize stuff across devices.
Cryptomator creates encrypted volumes (looks like a folder with gibberish inside) for you, which you can sync with whatever commercial cloud.
Data loss might come from bitrot, yes. Regardless, you should always have multiple backups.
Self-host Bitwarden | Bitwarden
Bitwarden
like this
adhocfungus likes this.
haulyard
in reply to adhocfungus • • •
This is the one reason I’ve paid for 1Password. My wife has access and can get what is needed without figuring out how to revive a self-hosted password solution. I realize this isn’t about self-hosting, and that you can pay for Bitwarden too. It just struck a chord.
OP wishing you all the best.
like this
adhocfungus and giantpaper like this.
shortwavesurfer
in reply to adhocfungus • • •
Why not just put logins in a database such as KeePass and then have the password for that written down in, like, a lock box or something?
You could also store a flash drive with the password in the lock box and update it, say, every six months with the most current database version.
like this
adhocfungus and giantpaper like this.
𝕽𝖚𝖆𝖎𝖉𝖍𝖗𝖎𝖌𝖍
in reply to adhocfungus • • •
No solution I've found, but I've been working on this myself. As I see it, there are two situations, and four categories of data:
I. My wife survives me
II. We both die, e.g. in a car
I've been thinking about getting an M-Disc writer for media, because ultimately, backing up to B2 is fine until I'm gone. Family members will need physical media for the photos and stuff.
For secrets, I'm planning on using SSSS. Keys will be given to members on each side of my wife's and my families. If we both die, they'll have to get together, put their keys together, and decrypt the KeePass DB.
The online accounts are almost all financial; those are in a KeePass DB. My wife already has access to all of that through power of attorney, and if we both go, it's SSSS for the family.
The third data category is accounts and services that will need to be stopped. I don't subscribe to much, but the VPS provider and B2 will have to be terminated, and a document with instructions and the credentials is in the SSSS archive.
The final category is assets: home, mortgage info, where and what the M-Discs are, a copy of the will that deals with all of the valuables, and any notes about anything not covered in the will. That's in documents in the SSSS archive.
I still have to put the archive together. I've been working toward a state where all of the secrets are in a cryptfs that's shared on the LAN and automatically encrypted with SSSS and synced to a share. Once I have that automated, I'll communicate out the SSSS keys and a how-to document.
In some ways, it was easier when you just died and your kids fought over the china. But I have a plan.
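For readers unfamiliar with SSSS: the secret is split into n shares such that any t of them reconstruct it, while fewer reveal nothing. Purely to illustrate that threshold idea, here is a toy sketch in Python (this is not the audited ssss tool linked just below, and not something to protect real secrets with): the secret is hidden as the constant term of a random polynomial over a prime field, shares are points on it, and Lagrange interpolation at zero recovers it.

import secrets

PRIME = 2**127 - 1  # a Mersenne prime; the secret must be smaller than this

def eval_poly(coeffs, x):
    # Horner evaluation of coeffs[0] + coeffs[1]*x + ... modulo PRIME.
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % PRIME
    return acc

def split(secret, threshold, shares):
    # The secret is the constant term; the other coefficients are random.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    return [(x, eval_poly(coeffs, x)) for x in range(1, shares + 1)]

def combine(points):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME  # modular inverse, Python 3.8+
    return secret

# Toy demo: any 3 of 5 family members can recover the passphrase.
secret = int.from_bytes(b"correct horse", "big")
shares = split(secret, threshold=3, shares=5)
assert combine(shares[:3]) == secret
assert combine(shares[2:]) == secret

The real ssss tool linked below handles encoding and larger secrets properly; the sketch is only meant to show why any three relatives together can recover the passphrase while two alone cannot.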
ssss: Shamir's Secret Sharing Scheme
point-at-infinity.org
BertramDitore
in reply to adhocfungus • • •
My mother died recently, and she was the breadwinner and was in charge of everything financial, because my surviving father is a toxic narcissist with zero financial literacy who refuses help from anyone. So I just have to say kudos to you for thinking about this difficult stuff. Your family will appreciate it more than you can imagine.
Other commenters have already given you solid advice, and I don’t have anything to add there, but more people need to have these difficult conversations and make real practical preparations, as soon as possible. Speaking from experience, not having clear guidance about where things are and what should be done with them, makes an already emotional situation even harder to deal with. Everybody dies, but in death you can make your family’s grieving process slightly easier by thinking ahead like this.
I’m sorry for whatever you’re going through, but props for thinking about other people while you go through it.
like this
giantpaper and adhocfungus like this.
mustard57
in reply to adhocfungus • • •
adhocfungus likes this.
rumba
in reply to adhocfungus • • •
"In this envelope is the password for my KeePass password vault. The entry for 'In case of emergency' contains everything you should need to know in the event of my incapacitation or worse.
There are two USB keys with this vault on them; they are synchronized for redundancy. When I pass, get the password out of this envelope, plug in a USB key, open KeePass and enter the password."
You: Use the primary key as your password storage, keep the backup key plugged into a Raspberry Pi, and run Syncthing on both devices.
Have a spare test key set up, and do a dry run with the family members you entrust with this data.
like this
adhocfungus likes this.
railcar
in reply to adhocfungus • • •
like this
adhocfungus likes this.
hexagonwin
in reply to adhocfungus • • •
adhocfungus likes this.
Zerush
in reply to adhocfungus • • •
Filen – Next Generation End-To-End Encrypted Cloud Storage
filen.io
adhocfungus likes this.
lemming741
in reply to adhocfungus • • •
You kinda only need the email credentials. Shouldn't the rest be resettable from that point?
Is there anything that needs MFA that they won't have?
like this
adhocfungus likes this.
adhocfungus doesn't like this.
irotsoma
in reply to adhocfungus • • •
oeuf
in reply to adhocfungus • • •
I would use KeePass. You would have a single file, opened with a single password, that you could share with them however you want.
Wishing you the best
adhocfungus likes this.
adhocfungus doesn't like this.