

AI in war: "We need more critical debate and civil-society engagement"


netzpolitik.org/2025/ki-im-kri…


The Pirate Post reshared this.


BEETLEPAD: the latest lightweight but insidious backdoor in APT41's toolkit (and affiliates'?)
#CyberSecurity
insicurezzadigitale.com/beetle…

The Pirate Post reshared this.


30/6 Digest roundup: the most relevant news of the last 24 hours
#CyberSecurity
insicurezzadigitale.com/306-di…

The Pirate Post reshared this.


The bill on the 2030 Winter Olympic Games in the Alps was adopted by the Senate last week. Among other things, the text extends the experiment with algorithmic video surveillance (VSA) until the end of 2027. While we wait for facial recognition... senat.fr/dossier-legislatif/pj…

See our dedicated page on VSA: laquadrature.net/vsa/

This entry was edited (2 months ago)

The Pirate Post reshared this.



Digitalisation of public administration: working group recommends the Matrix protocol for government agency communication


netzpolitik.org/2025/verwaltun…


The Pirate Post reshared this.


🍹 Log Out @ Roma

🕒 08 July, 18:30 - 08 July, 19:30

📍 Casa del parco Vigna Cardinali, Rome, Lazio

🔗 mobilizon.it/events/e39a912f-a…


🍹 Log Out @ Roma


🍹 Log Out @ Roma
Starts: Tuesday, July 08, 2025 @ 6:30 PM GMT+02:00 (Europe/Rome)
Ends: Tuesday, July 08, 2025 @ 9:30 PM GMT+02:00 (Europe/Rome)

About this event

On Tuesday 8 July we're back with the TWC Roma Logout, the meetup for tech workers who want to get together after work: a chance to socialise, get to know each other, talk about our work, and plan how to organise in the coming months!

See you on Tuesday 8 July, at 18:30, at the casa del parco della Caffarella

Join the Telegram group!


reshared this






The Pirate Post reshared this.


🍹 Log Out @ Roma


8 July 2025 18:30:00 CEST - GMT+2 - Casa del parco Vigna Cardinali, 00179, Rome, Italy
Jul 8
🍹 Log Out @ Roma
Tue 18:30 - 21:30
Tech Workers Coalition Italia

About this event

On Tuesday 8 July we're back with the TWC Roma Logout, the meetup for tech workers who want to get together after work: a chance to socialise, get to know each other, talk about our work, and plan how to organise in the coming months!

See you on Tuesday 8 July, at 18:30, at the casa del parco della Caffarella

Join the Telegram group!

This entry was edited (2 months ago)

reshared this



Online event: Going sustainable – Is the energy transition possible?


The energy sector is key to becoming sustainable. Moving away from fossil fuels to renewable energy would eliminate the majority of human-caused CO2 emissions and enable other sectors to reduce their footprint.

But is it possible to convert an industrialized nation to 100% renewable energy? What about renewable energy drought? The sun does not shine at night! Is there even enough space to build all the necessary systems? Who is going to pay for that?

Our presentation shows the numbers for Germany and how they change when going renewable. After the presentation, there will be an open discussion.

The event is held twice to allow people to participate from different time zones. Save the dates!

Saturday, July 12th, 11:00 CEST (09:00 GMT)
Saturday, July 12th, 19:00 CEST (17:00 GMT)

We will meet on Big Blue Button: Piratensommer

The 19:00 (CEST) session will be hosted from the German Pirate Party main office in Berlin (Pflugstr. 9a, 10115 Berlin).

The event will be recorded for publication on our YouTube channel. If you want to stay anonymous, just keep your camera and mic off and use a neutral username; the recording will not contain the participant list or the chat. You can ask questions via chat and stay anonymous.

Of course, the recording can be shared on any pirate party channels.

About the presenter: Guido Körber (aka TheBug) is the spokesperson for energy policy of the German Pirate Party. He runs a company in industrial electronics and is also vice chair of the Energy Committee of the German Association for Small and Medium-Sized Businesses (BVMW).


pp-international.net/2025/06/o…


The Pirate Post reshared this.


⚠️ WATCH OUT! The European Commission's questionnaire for the public consultation on #DataRetention has some really tricky and misleading questions! ⚠️

But don't worry - we'll be publishing an answering guide in early August to help you understand the questions and avoid the traps. Well in time for the September 12 deadline! 😌

Stay tuned for the guide! 🗺️

in reply to EDRi

💪🏾 It's here! We've put together an answering guide to help you navigate the tricky and misleading questionnaire by the European Commission for the public consultation on #DataRetention.

This proposal can legalise #MassSurveillance and threaten our #privacy and #FundamentalRights - speak up against this agenda!

⚠️ The deadline for the consultation is September 12.

Read through the answering guide ⤵️ edri.org/our-work/public-consu…

Respond to the Commission consultation here ➡️ ec.europa.eu/info/law/better-r…

in reply to fgaz

@fgaz Unfortunately we don't have the capacity to translate this guide into other languages at the moment. However, if anyone is willing to do it, that would be really beneficial and we'd be happy to (re)share the translations on Mastodon for the community to use!
@fgaz
in reply to EDRi

Can I, as a holder of a passport from a member state of the European Union, c.q. the Netherlands, participate without my answers and grievances being tossed out because I live outside the European Union?
in reply to EDRi

Seems like the EU Commission has made logging in to the consultation site harder than before?

At least I don't remember this weird hoop from June: after entering your email, there's a separate screen with a password field. Under the password field is a popup full of what looks like 2FA methods with weird names. If you just try to continue after entering your password, you get an error that the selected authentication method has not been enabled on your account.

To actually log in with a password, you have to explicitly select "Password" from the popup. The default for me was something that sounded like FIDO2 in bureaucratese.



Member of the Federal Headquarters of the Pirate Party of Russia took part in the UN Internet Governance Forum 2025

Alexander Isavnin, member of the Federal Headquarters of the Pirate Party of Russia and General Secretary of Pirate Parties International, at the UN Internet Governance Forum 2025.

Pirate Parties International, of which Alexander Isavnin is also a member, has participated in the Forum for many consecutive years. This year, the main theme of the 20th-anniversary forum, held in Norway, was "Building Digital Governance Together".

For the second year in a row, Pirate Parties International organised a Day 0 workshop. This year the workshop was titled Ethical Networking: Sustainability and Accountability, and Alexander took part as a speaker. The workshop was held in collaboration with the University of Cambridge's Ethics in Mathematics programme, with the participation of professors and internet experts from around the world. Alexander spoke about the particular features of networking ethics in Russia; about the development, by the state and its structures, of technologies that are presented as a benefit but are used for surveillance and control; about the need to educate young people; about the underdevelopment of socially significant technologies under the pretext of security; about the fact that different parts of the world have different ethics; and about how the IGF makes it possible, first of all, not to synchronise but to understand each other's approaches.

Alexander:

I'm from country which for a long time pretended to be global north, but now we're pretending to be global south. But you also have to oversight really clearly, because in not very democratic developed countries, especially in countries of so-called global south, technology can easily be abused by the government, which will make gap to the north, economical gap, civilizational, well, not civilizational, societal gaps, democratic gaps, much bigger than it exists. Technology could not close gaps. You have to oversight really, really accurately and constantly and not releasing it.

Key issues raised during the workshop discussions:

• Ethical networking requires interdisciplinary collaboration, as demonstrated by the participation of mathematicians, engineers, social scientists and political scientists in this session
• Technology design must take long-term sustainability into account, since even minor technical decisions today can significantly affect future energy consumption
• Ethics education for the younger generations is essential for responsible technology use
• Ethical oversight of technology varies considerably across different regions of the world
• Engineers must integrate ethics and sustainability into the foundations of technological progress
• Societal impact assessments must be carried out in parallel with technology development
• Ethical networking requires inclusive international perspectives and dialogue, similar to the practice of IGF meetings
• Citizen oversight is necessary for the governance of data-driven systems
• Continuous monitoring of technological impacts is critical to prevent misuse, and accountability mechanisms must include long-term enforcement measures
• Dedicated resources and time are needed for assessing ethical implications and for community engagement
• Human error remains a major challenge in socio-technical evaluations
• More interdisciplinary research is needed on how people interact with machines, their ability to learn advanced technologies, and the types of educational interventions that motivate the adoption of new technologies
• Practical research tools such as surveys are useful for understanding these human factors

Transcript of the workshop on the Pirate Parties International website: pp-international.net/2025/06/2…

Summary on the Forum website: https://www.intgovforum.org/en/content/igf-2025-day-0-event-197-ethical-networking-sustainability-and-accountability

The article A member of the Federal Headquarters of the Pirate Party of Russia took part in the United Nations Internet Governance Forum 2025 appeared first on Pirate Party of Russia | PPRU.


@Pirati Europei


A member of the Federal Headquarters of the Pirate Party of Russia took part in the UN Internet Governance Forum 2025


Alexander Isavnin, member of the Federal Headquarters of the Pirate Party of Russia and General Secretary of Pirate Parties International, at the United Nations Internet Governance Forum 2025 (Internet Governance Forum 2025).

Pirate Parties International, represented among others by Alexander Isavnin, has taken part in the Forum for many years in a row. This year, the main theme of the anniversary 20th forum, held in Norway, was "Building Digital Governance Together".

For the second year in a row, Pirate Parties International has held a Day 0 workshop. This year the workshop was titled Ethical Networking: Sustainability and Accountability, and Alexander took part as a speaker. The workshop was held jointly with the University of Cambridge's Ethics in Mathematics programme and with professors and internet experts from around the world.

Alexander spoke about the particular features of networking ethics in Russia; about the development, by the state and the structures it controls, of technologies that are presented as a public good but are used for surveillance and control; about the need to educate young people; about the underdevelopment of socially significant technologies under the pretext of security; and about how different parts of the world have different ethics, with the IGF making it possible, first of all, not to synchronise but to understand each other's approaches.

Alexander:

I’m from country which for a long time pretended to be global north, but now we’re pretending to be global south. But you also have to oversight really clearly, because in not very democratic developed countries, especially in countries of so-called global south, technology can easily be abused by the government, which will make gap to the north, economical gap, civilizational, well, not civilizational, societal gaps, democratic gaps, much bigger than it exists. Technology could not close gaps. You have to oversight really, really accurately and constantly and not releasing it.

Key issues raised during the workshop discussion:

• Ethical networking requires interdisciplinary collaboration, as demonstrated by the participation of mathematicians, engineers, social scientists and political scientists in this session
• Technology design must take long-term sustainability into account, since even minor technical decisions today can significantly affect future energy consumption
• Ethics education for the younger generation is essential for responsible technology use
• Ethical oversight of technology varies considerably across different regions of the world
• Engineers must integrate ethics and sustainability into the foundations of technological progress
• Societal impact assessments must be carried out in parallel with technology development
• Ethical networking requires inclusive international perspectives and dialogue, similar to the practice of IGF meetings
• Citizen oversight is necessary for the governance of data-driven systems
• Continuous monitoring of technological impacts is critical to prevent misuse, and accountability mechanisms must include long-term enforcement measures
• Dedicated resources and time are needed for assessing ethical implications and for community engagement
• Human error remains a major challenge in socio-technical evaluations
• More interdisciplinary research is needed on how people interact with machines, their ability to learn advanced technologies, and the types of educational interventions that motivate the adoption of new technologies
• Practical research tools such as surveys are useful for understanding these human factors

Transcript of the workshop on the Pirate Parties International website: pp-international.net/2025/06/2…

Summary on the Forum website: https://www.intgovforum.org/en/content/igf-2025-day-0-event-197-ethical-networking-sustainability-and-accountability

The post A member of the Federal Headquarters of the Pirate Party of Russia took part in the UN Internet Governance Forum 2025 appeared first on Pirate Party of Russia | PPRU.


reshared this






Reviewing PPI's Participation at the Internet Governance Forum


The Internet Governance Forum (IGF) just ended, and PPI was lucky enough to host a workshop and have several representatives on site, together with our colleagues from PPEU. We have participated in this event for many consecutive years and view this forum as one of our main avenues to influence transnational governance on an issue that is very important to us: the Internet.

This was the 20th IGF, held in Norway. The overarching theme was “Building Digital Governance Together”.

PPI’s Ethical Networking Workshop
For the second year in a row we hosted a Day 0 workshop. This year's workshop was titled “Ethical Networking: Sustainability and Accountability.” Our session was a collaboration with Cambridge University's Ethics in Mathematics Program and professors and internet experts from around the globe. At the end of this blog we provide a copy of our report, which can also be read on the IGF website:
intgovforum.org/en/content/igf…

We also provide a transcript below of the entire discussion, which we hope will be the basis for future academic publications.

Other Takeaways from the IGF
Beyond the workshop, PPI shared a booth with the European Pirate Party (PPEU). Besides meeting with our PPEU colleagues, whom we have known and worked with for years, we were able to network and share information with other organizations. Many people have heard of Pirates but never met one. Others have never even heard of our movement. It is shocking to some to consider how successful our movement has been on a global scale and how central we are to the discussion of internet governance.

Looking ahead, this year's IGF especially highlighted the accomplishments and dangers of AI. Prior IGFs have discussed AI, but the current IGF was able to showcase real projects that are in much more advanced stages of development. Technological developments with AI are happening at a breakneck pace, and the Pirate movement definitely needs to put itself at the center of this debate.

PPI’s Ethical Networking Workshop Report

List of Speakers and their institutional affiliations:
• Daphne Tuncer, Institut Polytechnique de Paris
• Marc Bruyere, Civil Society
• Maurice Chiodo, University of Cambridge, Ethics in Mathematics Project
• Dennis Muller, University of Cologne, Ethics in Mathematics Project
• Alexander Isavnin, Free Moscow University, Pirate Party Russia
• Keith Goldstein, Pirate Parties International, University of Potsdam
• Sara Hjalmarsson, European Pirate Party

Key Issues raised:

• Ethical networking requires interdisciplinary collaboration, which was represented by the mathematicians, engineers, sociologists, and political scientists who participated in this session.
• Technology design must consider long-term sustainability, as minor technical design choices today can significantly affect energy consumption in the future.
• Educating young people on ethics is crucial for responsible technology use.
• Ethical oversight of technologies varies significantly across different global regions.
• Engineers should integrate ethics and sustainability at the core of technological development.
• Societal impact assessments must be made alongside the technology development process.
• Ethical networking demands inclusive international perspectives and dialogue, such as the practice of meeting at the IGF.
• Citizen oversight is essential for governance of data-driven systems.
• Continuous monitoring of technological impacts is critical to prevent misuse, and accountability mechanisms must include long-term enforcement measures.
• Dedicated resources and time are needed for ethical impact assessments and community engagement.
• Human failures are a primary concern in sociotechnical evaluations.
• We must conduct more interdisciplinary studies that examine how humans interact with machines, how they are able to learn state of the art technology, and what types of educational interventions will motivate the adoption of new technology.
• Practical research tools like surveys are valuable for understanding these human factors.

There were no formal presentations during this session.

The session began with the broad question of how to align network practices with ethical principles. Daphne Tuncer emphasized the need to make underlying assumptions explicit, noting that network research narratives “are just taken for granted” and arguing that we must “reserve time” to question them. She also warned that fast systems can hinder reflection: “We tend to value high speed as something good… but to some extent, I believe this is not really aligned with ethical principle where we require time to think.”

The next discussion focused on emerging technologies, and how ethics cannot be added later. Dennis Muller stressed that “ethics is not an optional extra or a bolt-on,” calling for “a fundamental systemic shift” in design processes. He argued that developers must balance technical success with social good at every stage: “Technical success must be balanced with success from an ethical and sustainability perspective.”

The third discussion addressed IGF’s role. Maurice Chiodo described the IGF as an organization that “breaks down the silos between the technical and non-technical experts” and spreads insights across communities. Marc Bruyere added that IGF should forge stronger ties with standards bodies like the IETF and W3C, urging “more participation from the IGF community” in those forums.

The final discussion addressed the way forward and potential next steps. Sara Hjalmarsson asked the speakers to sum up how they proposed to evaluate and learn about the human element in systems. Maurice Chiodo highlighted that socio-technical evaluations must assess not just failures of the technical or human components but also the human-machine interface itself, which he called “the primary site of miscommunication and error.” Keith Goldstein agreed, advocating surveys and both quantitative and qualitative methods to capture human experiences. The speakers recommended creating working groups that mix technical and social science experts, suggested surveys and questionnaires to track human factors in networks, proposed case studies to document ethical failures, and called for ethics training as part of engineering curricula.

There were two questions from the audience. The first question asked how to democratize ethical networking so that ordinary citizens can meaningfully oversee data-driven public systems. Dennis Muller observed that “we need to respect the different cultures and different regions of this world.” Alexander Isavnin added that “technology cannot insure you in something. You are your own insurance. You have to communicate, you have to oversight, you have to think about what’s going on with your data. IGF is a good starting venue but your participation is also really important.” The second question probed how global North–South power dynamics influence networking ethics. Isavnin explained that while “Internet and these technologies could shorten the gap between what we call West world and the others, or North and South,” they can also be “easily abused by the government so technology alone cannot close gaps.”

Overall, the speakers indicated that one limitation was the lack of dedicated budgets for impact assessments. Ethics must be a core competency for all stakeholders, but it is too often treated as secondary. Bridges must also be built between academics and standards organizations like the IETF and W3C. The speakers urged the IGF to publish periodic policy briefs and recommended that local IGF chapters engage with communities on these issues.

Transcript of Ethical Networking Workshop
Transcription:
(00:00:40.615) [Keith]: Okay, thank you everybody for coming.
(00:00:42.600) [Keith]: It’s a pleasure to see you all.
(00:00:44.805) [Keith]: This is the IGF Workshop on Ethical Networking, Sustainability and Accountability.
(00:00:52.122) [Keith]: Rather than introduce everyone, I’ll turn over to my colleague Sarah over here to ask our first question.
(00:00:58.334) [Sara]: Thanks, Keith.
(00:01:00.236) [Sara]: First of all, my name is Sarah.
(00:01:03.520) [Sara]: I’m the vice chair of the European Pirate Party.
(00:01:06.764) [Sara]: We have a booth here if you’re on site.
(00:01:09.707) [Sara]: So if you like what we’re talking about, please feel free to stop by.
(00:01:15.133) [Sara]: I’d like to start by letting our speakers introduce themselves first before we start with question one.
(00:01:22.882) [Sara]: So I’d like to hand over to Daphne.
(00:01:30.042) [Sara]: We have Daphne with us.
(00:01:35.050) [Sara]: Welcome, Daphne.
(00:01:35.892) [Sara]: Please introduce yourself and tell us a bit about what you do.
(00:01:39.478) [Sara]: Tell me something about yourself, your project, and how it relates to ethical networking.
(00:01:54.528) [Sara]: We’re having a bit of a… Yeah, now we can hear you.
(00:01:57.475) [Daphne]: Sorry, sorry.
(00:01:58.457) [Daphne]: I couldn’t turn on my mic.
(00:01:59.981) [Daphne]: Sorry about that.
(00:02:00.562) [Daphne]: That’s okay.
(00:02:02.487) [Daphne]: Happens sometimes.
(00:02:03.369) [Daphne]: Go for it.
(00:02:04.371) [Daphne]: Yeah.
(00:02:04.832) [Daphne]: Hi, hi, everyone.
(00:02:05.754) [Daphne]: Sorry, let me turn on the video as well.
(00:02:07.578) [Daphne]: It should be working now.
(00:02:08.621) [Daphne]: Yeah, great.
(00:02:09.563) [Daphne]: So hi everyone.
(00:02:11.808) [Daphne]: Thanks a lot for joining this session.
(00:02:13.451) [Daphne]: So my name is Daphne Tuncer.
(00:02:15.916) [Daphne]: I’m academic.
(00:02:17.038) [Daphne]: My research is in the domain of computer science, more specifically computer networks.
(00:02:22.009) [Daphne]: I’m affiliated with Institut Polytechnique de Paris in France.
(00:02:26.017) [Daphne]: So, over the years, I’ve been trying to work on putting together kind of actionable resources, both for research and education on what I call responsibility in our digital development.
(00:02:41.533) [Daphne]: So, thank you.
(00:02:43.455) [Sara]: Okay, wow.
(00:02:44.616) [Sara]: That’s a big responsibility.
(00:02:45.858) [Sara]: All right.
(00:02:47.660) [Sara]: Thank you, Daphne.
(00:02:51.323) [Sara]: Next.
(00:02:57.286) [Sara]: Next, we have Marc Bruyere.
(00:03:00.841) [Sara]: Marc, are you with us?
(00:03:02.628) [Marc]: Yes.
(00:03:03.371) [Marc]: Can you hear me and see me?
(00:03:04.837) [Sara]: Yes.
(00:03:05.741) [Sara]: Loud and clear.
(00:03:08.002) [Marc]: OK, quickly.
(00:03:09.624) [Marc]: Actually, I did a PhD when I was 40, like 10 years ago, coming from a long path from an industry and so on.
(00:03:18.035) [Marc]: And when you’re actually starting to do research, and you know what implication it is in research, you are actually influencing things and innovating stuff and so on.
(00:03:30.030) [Marc]: And it always questioned me how to do this without hurting
(00:03:35.457) [Marc]: society with an integral way.
(00:03:37.840) [Marc]: Then that’s what we with Daphne had a very first conversation about it.
(00:03:42.126) [Marc]: And I’m actually working for a large company back for 10 years in research for Airbus with everything’s do count in the choice you do.
(00:03:52.841) [Marc]: And it’s very valuable that we are actually all thinking of the impact of the choices we do.
(00:03:58.609) [Marc]: And I really appreciate we have this time together.
(00:04:02.947) [Sara]: Oh, wow.
(00:04:04.449) [Sara]: So you’ve had a lot of insight to share there.
(00:04:07.815) [Sara]: Looking forward to it.
(00:04:09.598) [Sara]: Next, we have Maurice.
(00:04:14.847) [Sara]: Maurice, are you with us?
(00:04:17.110) [Maurice]: Yes, thank you.
(00:04:17.892) [Maurice]: Can you hear me?
(00:04:18.573) [Sara]: Yep.
(00:04:18.994) [Sara]: Loud and clear.
(00:04:20.581) [Maurice]: Excellent.
(00:04:21.562) [Maurice]: Thank you.
(00:04:21.882) [Maurice]: It’s a pleasure to be able to speak here today.
(00:04:24.525) [Maurice]: So my name is Maurice Chiodo, and I’m a research associate at the Center for the Study of Existential Risk at the University of Cambridge.
(00:04:31.511) [Maurice]: I’m also the principal investigator and co-founder of the Ethics in Mathematics Project.
(00:04:36.016) [Maurice]: So a research mathematician by training, I specialized in computability theory and abstract algebra.
(00:04:41.341) [Maurice]: My work now looks at the ethical challenges and risks posed by mathematics, mathematicians, and mathematically-powered technologies.
(00:04:48.087) [Maurice]: I’ve been working on this for over nine years and have insights and industry experience as an ethics and safety consultant in AI and blockchain technologies.
(00:04:56.045) [Sara]: Oh wow, you’ve done a bit of everything.
(00:04:58.991) [Sara]: Thank you, Maurice.
(00:05:00.274) [Sara]: Next we have Dennis, Dennis Muller.
(00:05:03.521) [Sara]: Dennis, are you with us?
(00:05:05.350) [Dennis]: Yeah, I’m here.
(00:05:07.713) [Dennis]: Thank you very much.
(00:05:08.493) [Dennis]: It’s an honor to be here.
(00:05:09.415) [Dennis]: I’m also a co-founder of the Ethics in Mathematics Project.
(00:05:13.139) [Dennis]: I’m currently a research associate at the University of Cologne, where I work on mathematics education for sustainable development.
(00:05:19.986) [Dennis]: And I work with Maurice at the Center of the Study of Existential Risk, where I study extreme technological risks related to AI and the Internet.
(00:05:28.317) [Dennis]: Overall, my work sort of connects to ethics, education, mathematics, and I’m particularly interested in studying how mathematics and mathematically powered technologies are shaping our world.
(00:05:39.595) [Sara]: Okay, wow.
(00:05:42.640) [Sara]: All right, very good.
(00:05:43.862) [Sara]: Great to have you with us.
(00:05:45.806) [Sara]: Next, we have Alexander Isavnin.
(00:05:49.391) [Sara]: Alexander, are you with us?
(00:05:52.600) [Alex]: Yeah for sure.
(00:05:54.362) [Alex]: Hello.
(00:05:55.444) [Alex]: I’m Alexander.
(00:05:56.906) [Alex]: I’m a member of the Council of Russian Pirates Party.
(00:06:01.392) [Alex]: We live in very difficult countries and our party and citizens of our country constantly need to face ethical and sustainability challenges.
(00:06:13.508) [Alex]: I’m also a mathematician by education, but have no relations to ethics and mathematic projects.
(00:06:20.515) [Alex]: Thanks.
(00:06:21.376) [Sara]: Okay, welcome.
(00:06:23.498) [Sara]: Next, we have Keith.
(00:06:25.400) [Keith]: And I’ll just introduce myself.
(00:06:26.962) [Keith]: I’m Keith Goldstein, Chair of Pirate Parties International.
(00:06:31.367) [Keith]: I also have been involved with Daphne and Mark here on drafting a research project on computer networking ethics and looking at how humans are able to learn new systems.
(00:06:43.782) [Keith]: Okay, thanks.
(00:06:44.363) [Keith]: So, why don’t we move on to the next question?
(00:06:47.566) [Sara]: So, let’s start with the first question there.
(00:06:51.992) [Sara]: We’re sharing a little bit.
(00:06:54.715) [Sara]: So, how can we ensure that mathematics and computer networking practices align with ethical principles, including privacy, transparency, and accountability?
(00:07:09.893) [Keith]: So, Daphne, would you like to start?
(00:07:14.258) [Daphne]: Yeah sure.
(00:07:14.719) [Daphne]: I’m happy to start.
(00:07:16.042) [Daphne]: I mean so as I said earlier I’m a computer scientist.
(00:07:18.507) [Daphne]: But in the recent year I started working a lot with people from social science.
(00:07:23.739) [Daphne]: And through this collaboration I got to learn a lot about the role of narratives in how this contributes to how we approach and develop new technologies.
(00:07:33.580) [Daphne]: And if you take computer network research as an example, so a lot of the narratives that we have today have to do with hyperperformance, optimization, measurements.
(00:07:45.277) [Daphne]: So of course, there’s nothing wrong with that.
(00:07:47.901) [Daphne]: But my point is that very often, these things are just taken for granted.
(00:07:52.528) [Daphne]: We never really question these narratives.
(00:07:57.115) [Daphne]: And so it does subconsciously, us, like a researcher in computer networks,
(00:08:02.082) [Daphne]: influence the way we think.
(00:08:04.707) [Daphne]: So to me, spending time on talking about this narrative to make them explicit and also having a space to confront them is an essential part and also ingredient to get an alignment between our practices, for example, in computer networks and ethical principles.
(00:08:23.632) [Daphne]: So I think what is really important is to reserve time for that.
(00:08:28.058) [Daphne]: So today, and I think this has been driven a lot by all these developments in the computing technologies, we tend to value high speed as something good.
(00:08:36.510) [Daphne]: So it’s fast, it’s good.
(00:08:39.354) [Daphne]: But to some extent, I believe this is not really aligned with ethical principle where we require time to think.
(00:08:45.162) [Daphne]: So I think time is very key here.
(00:08:48.666) [Sara]: Thank you.
(00:08:49.127) [Sara]: Next, Maurice.
(00:08:53.534) [Maurice]: What are your thoughts on this?
(00:09:08.857) [Maurice]: Ensuring that networking practices align with core ethical principles requires us to address three distinct but ultimately interconnected challenges of the alignment problem.
(00:09:19.333) [Maurice]: So from the perspective of ethics and mathematics, we must first define what we want to achieve.
(00:09:24.922) [Maurice]: Second, we must determine how to achieve these outcomes by developing the right mathematical tools, technologies, and practices.
(00:09:32.496) [Maurice]: This involves examination of the methods we use.
(00:09:35.562) [Maurice]: For instance, a commitment to privacy requires not just policy, but the implementation of privacy-preserving mathematics from the ground up.
(00:09:42.655) [Maurice]: Third, and most crucially,
(00:09:45.543) [Maurice]: sticks.
(00:09:46.425) [Maurice]: This is the long-term challenge.
(00:09:47.587) [Maurice]: To get this right, we must scrutinize three areas simultaneously.
(00:09:51.174) [Maurice]: As I said, the ethical vision of our outcomes, the integrity of our tools, and the robustness of our processes.
(00:09:56.785) [Maurice]: Any one of these can undermine the others.
(00:09:59.490) [Maurice]: For example, an ethical process can still lead to a harmful outcome if the underlying technology is flawed.
(00:10:05.017) [Maurice]: Therefore, we must move beyond just analyzing intent and design aims.
(00:10:08.282) [Maurice]: We have to rigorously investigate the technologies and the technologists’ ability to do good or cause harm.
(00:10:14.352) [Maurice]: We must understand not only what they want to do, but also… Oh, we have a bit of a lag there.
(00:10:22.746) [Sara]: We missed the last thing you said, Maurice.
(00:10:26.272) [Maurice]: Oh, sorry.
(00:10:27.554) [Maurice]: So, I was saying that, therefore, we must move beyond just analyzing intent and design aims.
(00:10:32.455) [Maurice]: We have to rigorously investigate the technologies and the technologists’ ability to do good or cause harm.
(00:10:38.545) [Maurice]: And we must understand not only what they want to do, but also what they can do.
(00:10:43.494) [Maurice]: Okay.
(00:10:44.976) [Maurice]: Yeah, that’s a big point.
(00:10:47.761) [Sara]: Let’s see.
(00:10:48.923) [Sara]: We have Alexander.
(00:10:50.666) [Sara]: You have a slightly different cultural environment.
(00:10:54.753) [Sara]: What’s your perspective?
(00:10:56.873) [Alex]: Let me give perspective, not just from my cultural environment, but from my experience.
(00:11:04.203) [Alex]: We all know that technology and instrumentation and tools are being developed much faster than regulations or even spelling norms of what’s going on.
(00:11:17.582) [Alex]: At the beginning of the internet, there was no privacy considerations or security considerations because scientists have created internet
(00:11:26.675) [Alex]: for their own needs.
(00:11:28.798) [Alex]: They thought that only such good guys with scientific approaches will exist on the Internet.
(00:11:37.250) [Alex]: But actually, a lot happens in that.
(00:11:40.635) [Alex]: A lot of people came here, evil people, bad people, governments, corporations, and so on.
(00:11:47.144) [Alex]: So I think that our idea of sustainability and ethical networking should go towards understanding of what people need first of all and only then such formulated needs need to shape technology developments.
(00:12:09.942) [Alex]: Back to my cultural background, in Russia it’s happening always.
(00:12:15.309) [Alex]: The state and the state-controlled corporations are developing technologies.
(00:12:20.997) [Alex]: They are announcing that technologies are for the good of the people, but lately it appears that even network applications are developed for surveillance or control of people’s activities.
(00:12:36.819) [Alex]: Thanks.
(00:12:38.461) [Sara]: All right, very good.
(00:12:39.623) [Sara]: We’ll actually get into that topic in a moment.
(00:12:43.949) [Sara]: In the meantime, we have Dennis.
(00:12:49.917) [Dennis]: What’s your perspective?
(00:12:51.780) [Dennis]: I think to truly align our practices with ethical principles, we must understand that ethics is not an optional extra or a bolt-on.
(00:12:58.790) [Dennis]: It’s something that we must fundamentally embed within everything we do.
(00:13:03.723) [Dennis]: Principles like safety and sustainability cannot be bolted on at the end of a project, especially with decentralized technologies such as the internet, where retrospective fixes can be very difficult or even impossible.
(00:13:19.788) [Dennis]: I think that achieving this requires a fundamental systemic shift in how we work.
(00:13:24.917) [Dennis]: We need to communicate, hire and train with ethics as a core competency.
(00:13:31.149) [Dennis]: Technical success must sort of be balanced with success from an ethical and sustainability perspective and this can be quite challenging from my experience and from working with other
(00:13:41.928) [Dennis]: engineers and it requires sort of like an adjustment because engineers can be accustomed to viewing their work as sort of like a technological optimization problems and this perspective demands that technical and non-technical experts and the affected communities of those technologies must find a common language and build a shared understanding of the goals and risk involved.
(00:14:03.308) [Dennis]: And so ultimately, technical expertise and ethical expertise are sort of like two sides of the same coin.
(00:14:10.976) [Dennis]: And only by fostering a community that sort of like equally values forward-thinking responsibility and backward-looking accountability, we can ensure that this happens.
(00:14:21.108) [Sara]: Okay, very good.
(00:14:22.649) [Sara]: Thank you, Dennis.
(00:14:25.873) [Sara]: And finally, we have Mark.
(00:14:29.785) [Marc]: I think multidisciplinary groups and thinking is always a benefit.
(00:14:35.557) [Marc]: And then that’s something in the discipline of engineering, design, and so on, what those imply.
(00:14:42.046) [Marc]: Choices as well, with nautical think, ideas and thoughts actually have placed sometime in industry.
(00:14:49.596) [Marc]: Even in research and so on, that’s very important.
(00:14:52.319) [Marc]: We have also feedback and time to give proper response and ideas, review from other,
(00:15:01.291) [Marc]: people with all their field of research or activities and so on.
(00:15:06.358) [Marc]: For a simple story to illustrate this, we are actually, and I just verified, using IPv4 to communicate through Zoom.
(00:15:15.252) [Marc]: Think of a teeny details who has very profound impact today.
(00:15:20.119) [Marc]: Then the design on that IPv4, they actually place the source address before the destination address.
(00:15:26.869) [Marc]: What do you do when you are actually
(00:15:29.911) [Marc]: checking where the packets need to go.
(00:15:32.895) [Marc]: You’re expecting the destination, not the source, to be first.
(00:15:37.182) [Marc]: And these teeny details is actually using a lot of power and electricity every time for a very long time.
(00:15:44.292) [Marc]: Big impact on the consumption of electricity and so on.
(00:15:49.239) [Marc]: Because all the routers have to wait, have to wait for the destination field
(00:15:54.667) [Marc]: before having the source.
(00:15:57.110) [Marc]: Kind of a teeny mistakes, but big impact.
(00:16:00.615) [Marc]: Then obviously reviewing and so everyone and all disciplinary things for such things is very difficult.
(00:16:07.344) [Marc]: We don’t know, they didn’t know that actually that design they did will remain for that long.
(00:16:12.651) [Marc]: And in IPv6, destination comes first.
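
As a reference point for Marc's example, here is a minimal sketch (added for illustration, not part of the talk) of the address-field byte offsets defined in RFC 791 for IPv4 and RFC 8200 for IPv6. Note that both specifications lay out the source address before the destination address, so a router parsing the header sequentially reaches the destination field only after the source field.

# Minimal sketch: address-field byte offsets in the IPv4 header (RFC 791)
# and the fixed IPv6 header (RFC 8200). In both layouts the source address
# precedes the destination address.
HEADER_LAYOUT = {
    # version: {"src": (offset, length), "dst": (offset, length)} in bytes
    4: {"src": (12, 4), "dst": (16, 4)},   # IPv4: 32-bit addresses
    6: {"src": (8, 16), "dst": (24, 16)},  # IPv6: 128-bit addresses
}

for version, fields in HEADER_LAYOUT.items():
    src, dst = fields["src"], fields["dst"]
    print(f"IPv{version}: source at byte {src[0]} ({src[1]} bytes), "
          f"destination at byte {dst[0]} ({dst[1]} bytes)")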
(00:16:16.925) [Sara]: All right, thank you very much.
(00:16:20.388) [Sara]: And I think that’s actually a wonderful segue into our next question.
(00:16:26.554) [Sara]: So we’ve already, Mark, sorry, Mark has mentioned the technology that we’ve had for quite a while now and how we’ve learned from that and made things more efficient.
(00:16:43.729) [Sara]: But we’re also seeing emerging technologies
(00:16:46.912) [Sara]: How can these emerging technologies such as automated language models, artificial intelligence, the Internet of Things, and so on, be ethically developed and deployed to ensure they have positive social, cultural, political, academic, and environmental impacts?
(00:17:10.469) [Sara]: It’s like 10 minutes for that one.
(00:17:13.675) [Sara]: So let’s go back to Mark for that one.
(00:17:19.925) [Sara]: Everyone will have a chance to answer, but we’ll just do it in the opposite order this time.
(00:17:25.114) [Sara]: Go ahead, Mark.
(00:17:28.661) [Marc]: It’s a hard practice to have
(00:17:31.411) [Marc]: all the view and impact of what we do.
(00:17:33.993) [Marc]: But what I actually, when we started to open up ideas and thoughts with Daphne, we did find people who are working hard on those questions from the root practice of what we call computer science today, with mathematicians, or both of your, as Maurice and Dennis.
(00:17:55.413) [Marc]: They put together a lot of questions, a lot of way of asking yourself,
(00:18:00.638) [Marc]: It is a good project, and so on and so on.
(00:18:03.422) [Marc]: That practice needs to be every time for everything, mostly.
(00:18:07.147) [Marc]: It was very hard to have the time for this, but it’s necessary.
(00:18:11.172) [Marc]: Giving time for this kind of practice is essential.
(00:18:15.458) [Marc]: And it does, it has to cover a minimum of different payouts that’s been introduced by their works.
(00:18:23.709) [Marc]: And I think we rely on actually kind of future projects on their approaches, and it’s very valuable.
(00:18:30.558) [Marc]: And that’s why, yes, I let all the people already spend a lot of time thinking of it.
(00:18:38.175) [Sara]: OK.
(00:18:39.016) [Sara]: Very good.
(00:18:39.517) [Sara]: Thank you, Marc.
(00:18:43.501) [Sara]: Alexander, you have something specific, go ahead.
(00:18:49.090) [Alex]: Yeah, you asked a really broad question about the impact of very, very difficult fields of human society.
(00:19:00.008) [Alex]: But I would like to point two issues.
(00:19:02.693) [Alex]: First of all, for technologies, development of technology is something funny.
(00:19:08.442) [Alex]: So that’s more than young people who are rushing into technology, into education, into testing something.
(00:19:16.736) [Alex]: They don’t think about impact of their activities at all.
(00:19:21.805) [Alex]: So that’s why we have script kiddies, we have young hackers and so on.
(00:19:26.513) [Alex]: That’s, I think, the lack of education, overall education, general education, not technology education.
(00:19:34.902) [Alex]: That’s an issue.
(00:19:36.844) [Alex]: And I remember myself when I was young, the Internet was a university and so on.
(00:19:43.413) [Alex]: I definitely can confess I did some unethical things which I would not do now having understanding all this impact.
(00:19:52.164) [Alex]: So first of all, we need to educate young.
(00:19:54.967) [Alex]: The second approach, and this is actually a kind of experience from local, from Russia, because officials, corrupted officials or corporations which have ties to the government, stating nearly the same things, that technology needs to be ethical, technology needs to provide sustainability and be available for everyone.
(00:20:24.953) [Alex]: But in contrary, technology does not develop.
(00:20:31.443) [Alex]: For example, in Russia, we do not have 5G cellular networks because all their frequencies are stockpiled by few companies or militaries under the name of protecting common resource and so on.
(00:20:49.430) [Alex]: So it’s a development
(00:20:50.452) [Alex]: 5G networks is not possible, not because of sanctions, not because of some retrospective things, but just because somebody tries to keep us sustainable.
(00:21:03.968) [Alex]: So I think that’s two points I would like to bring to the table and maybe discuss later.
(00:21:11.998) [Alex]: Thanks.
(00:21:14.819) [Sara]: All right.
(00:21:15.420) [Sara]: Thank you very much, Alex.
(00:21:17.802) [Sara]: That’s an interesting point.
(00:21:19.784) [Sara]: And of course, we also invite questions from our online participants.
(00:21:26.011) [Sara]: Next, we have Maurice.
(00:21:28.294) [Sara]: Go ahead, Maurice.
(00:21:31.557) [Maurice]: Thank you very much.
(00:21:32.759) [Maurice]: So I’m going to sort of try and give this from an engineer’s viewpoint.
(00:21:36.403) [Maurice]: So from an engineer’s viewpoint, there are three key aspects to ethical development here.
(00:21:41.789) [Maurice]: Perspective, perspective, and perspective.
(00:21:45.700) [Maurice]: Even the most conscientious engineers cannot ensure positive impacts on their own.
(00:21:50.890) [Maurice]: We work deep within technical systems, but technologies like AI in the Internet of Things are fundamentally human endeavors.
(00:21:57.023) [Maurice]: They connect people and the object people use.
(00:21:59.828) [Maurice]: Therefore, human insights…
(00:22:01.732) [Maurice]: and a range of perspectives must be central throughout the entire development and deployment process, not just as an afterthought.
(00:22:09.000) [Maurice]: This requires a shift in resources.
(00:22:10.803) [Maurice]: Ethical development isn’t free.
(00:22:12.324) [Maurice]: It takes dedicated time and effort to consult with domain experts, conduct impact assessments, and engage with impacted communities.
(00:22:18.752) [Maurice]: This work must be budgeted for as a core project requirement, not an optional extra.
(00:22:23.197) [Maurice]: Furthermore, our motivation must be scrutinized.
(00:22:25.640) [Maurice]: We should focus on applying our skills to solve recognized societal problems rather than inventing new problems to fit a fancy technological tool.
(00:22:33.931) [Maurice]: With every step forward, we have to ask a critical question.
(00:22:36.394) [Maurice]: Who wins and who loses?
(00:22:38.477) [Maurice]: True ethical networking requires us to see and account for everyone.
(00:22:44.605) [Sara]: Absolutely.
(00:22:45.466) [Sara]: So, very good.
(00:22:46.727) [Sara]: Thank you, Maurice.
(00:22:48.410) [Sara]: And Daphne, what are your thoughts?
(00:22:53.285) [Daphne]: So I think kind of the key word here in this question is positive impacts, because positive for who and relative to what?
(00:23:00.416) [Daphne]: I mean, everything is subjective.
(00:23:01.878) [Daphne]: And that’s related to, I mean, what Alexander, you were saying about the situation in Russia, because what one might consider as being positive might be well perceived as negative by another.
(00:23:10.612) [Daphne]: Meanwhile, one can be, of course, a person.
(00:23:12.836) [Daphne]: They can be a community, a group of interest, can be a government, et cetera, et cetera.
(00:23:17.804) [Daphne]: So as the question shows, impact is multidimensional.
(00:23:20.508) [Daphne]: So we can’t expect there will be one group of people that will decide what positive impacts are.
(00:23:25.355) [Daphne]: So maybe here, I will answer as a researcher, because that’s my community.
(00:23:28.759) [Daphne]: But I think as researcher, the very important thing for us now is ready to engage in a practice that goes beyond this kind of mode of organization and silos that we’ve seen for research.
(00:23:39.715) [Daphne]: So you are a computer scientist, you are a mathematician, you are a biologist, you are a sociologist.
(00:23:43.680) [Daphne]: But at the end, what really matters is that we really work together.
(00:23:47.065) [Daphne]: so that we agree or at least we get some shared value on what positive impact we are aiming at, but also how we assess this impact.
(00:23:57.547) [Sara]: Okay, thank you Daphne.
(00:24:00.042) [Sara]: And Dennis, of course, go ahead.
(00:24:03.928) [Dennis]: The development of technologies like large language models or the Internet of Things hinges critically on understanding the interconnected nature.
(00:24:13.483) [Dennis]: So from an engineer’s perspective or from a management perspective, that means that we cannot compartmentalize ethics within single sub teams because things will just get overlooked.
(00:24:24.140) [Dennis]: Nor can we sort of like overlook that sort of like the social, cultural, political and environmental aspects are deeply intertwined.
(00:24:32.429) [Dennis]: So we cannot usually address one without affecting the other.
(00:24:35.472) [Dennis]: And so that means for developers, there’s sort of a dual responsibility here, building safety into the technical architecture or into the technical system, and also earning the public’s trust.
(00:24:46.804) [Dennis]: One does not necessarily imply the other in an interconnected world.
(00:24:51.528) [Dennis]: And we cannot assume that engineers or mathematicians or computer scientists by default understand how to navigate this complexity or how to raise the right questions.
(00:25:02.347) [Dennis]: They need to be taught this and given the space to think beyond immediate, localized, often monetary incentives.
(00:25:09.980) [Dennis]: And they need to be taught how to do this in a way that earns trust from society.
(00:25:16.351) [Dennis]: And once again, this sort of like requires balancing technical expertise and technical incentives with non-technical knowledge and non-technical incentives.
(00:25:27.324) [Dennis]: In this sense, I can only reiterate what Maurice said.
(00:25:30.207) [Dennis]: Perspective is really what matters here from my perspective.
(00:25:36.214) [Sara]: Okay, thank you very much.
(00:25:37.335) [Sara]: I think we have quite a few overlaps there.
(00:25:41.200) [Sara]: I think a common thread is education.
(00:25:45.022) [Sara]: education and integration with our interdisciplinary teams and interdisciplinary working environments.
(00:25:57.458) [Sara]: And in that sense, we kind of have this big interdisciplinary environment with the IGF.
(00:26:08.289) [Sara]: And that leads us to the next question.
(00:26:10.751) [Sara]: What role can the IGF and its stakeholders play in promoting sustainable and responsible internet governance?
(00:26:22.082) [Sara]: So let’s start with Daphne this time.
(00:26:24.985) [Sara]: Go ahead, Daphne.
(00:26:27.749) [Daphne]: Thanks.
(00:26:28.611) [Daphne]: Well, I think that really the idea is a platform to connect and get the visibility on what’s going on.
(00:26:34.285) [Daphne]: So, as I said earlier, I really think that understanding for who and relative to what technology, a model, a development demonstrates certain qualities is not simple.
(00:26:44.759) [Daphne]: So to me the idea really has the ability to reach out to a very worldwide audience.
(00:26:51.557) [Daphne]: So it must capitalize on that to provide I think a medium through which we can confront our perspective especially coming from different parts of the world.
(00:26:59.879) [Daphne]: because this raises perspectives that we need to embed into sustainable and responsible Internet governance.
(00:27:07.768) [Daphne]: I don’t think we should get a top-down approach where a small group of people would decide on the definition of these qualities for governance.
(00:27:15.376) [Daphne]: So I really believe that the IGF has a key role to play in supporting the diversity of background, cultural heritage, point of views that are really necessary to design and build this governance framework.
(00:27:28.448) [Sara]: OK, thank you, Daphne.
(00:27:30.313) [Sara]: Maurice, what’s your perspective?
(00:27:32.419) [Sara]: What do you think?
(00:27:35.628) [Maurice]: Thank you.
(00:27:36.109) [Maurice]: So in my view, the IGF’s most powerful role here is that of a convener.
(00:27:41.763) [Maurice]: It provides the room and sets the table for the essential multi-level ethical engagement that sustainable internet governance requires.
(00:27:48.993) [Maurice]: This is the space where dialogue is not just possible, but is the primary purpose.
(00:27:53.079) [Maurice]: By its very nature, the IGF assembles the diverse array of stakeholders needed to generate genuine perspective, from governments and corporations to academics and activists.
(00:28:02.371) [Maurice]: As we’ve discussed, perspective is the single most critical ingredient for the ethical development of emerging technologies.
(00:28:08.948) [Maurice]: An engineer in a lab cannot foresee and understand all the implications of their work, just as a policymaker cannot grasp all the technical nuances.
(00:28:16.021) [Maurice]: The IGF is a place where these worlds connect.
(00:28:18.505) [Maurice]: It breaks down the silos between the technical and non-technical experts that often exist in industry and governments, which is crucial for finding and nurturing a common language.
(00:28:27.882) [Maurice]: In this way, the IGF already acts as the essential first step.
(00:28:31.149) [Maurice]: It gathers the necessary people and perspectives, creating the foundation upon which responsible governance of a decentralized mathematical technology like the Internet can be built.
(00:28:41.850) [Sara]: Okay.
(00:28:42.632) [Sara]: Thank you.
(00:28:43.073) [Sara]: Thank you, Maurice.
(00:28:46.980) [Sara]: Next, Alexander.
(00:28:48.323) [Sara]: Go ahead.
(00:28:50.461) [Alex]: Yes, for sure.
(00:28:52.624) [Alex]: But first I would like to point out that ethics and sustainability might be really different in different parts of the world.
(00:29:04.860) [Alex]: So I think that the locations where a full-scale IGF has been held have completely different approaches to what’s ethical and what’s not ethical.
(00:29:13.782) [Alex]: And events like the Internet Governance Forum allow us, first of all, to understand each other.
(00:29:20.250) [Alex]: Not to synchronize, but to understand each other’s approaches.
(00:29:24.936) [Alex]: So the Internet Governance Forum not only connects different stakeholders from the same group, it also builds an understanding of what’s going on in different regions and different countries.
(00:29:38.352) [Alex]: Overall, the IGF connects all positively minded people who are looking forward to developing the internet for good.
(00:29:51.246) [Alex]: And not just the IGF: other platforms like the World Summit on the Information Society, which actually spun off the IGF 20 years ago, still have forums that are more populated by governmental people.
(00:30:08.390) [Alex]: So I think we should continue not just in the IGF, in our local IGFs, in our local communities, but also have broader interaction within the United Nations and intergovernmental organizations.
(00:30:26.966) [Sara]: Okay, thank you, Alexander.
(00:30:28.309) [Sara]: Dennis, what about you?
(00:30:32.572) [Dennis]: This is sort of a follow-up from Maurice’s answer.
(00:30:35.397) [Dennis]: I think that assembling the right people is only half of the process.
(00:30:39.164) [Dennis]: The IGF’s next crucial role is to ensure that the insights also radiate outwards.
(00:30:46.297) [Dennis]: And the IGF is already highly effective at collectively identifying emergent issues.
(00:30:52.969) [Dennis]: I think the question of what can be done next is how we
(00:30:57.040) [Dennis]: translate that awareness into action, because our research on ethics and mathematics has demonstrated that many technical practitioners, like mathematicians, computer scientists, network engineers, quite often view their work as separate from ethics, sustainability, and also from policy.
(00:31:15.552) [Dennis]: So while many people who are in this room understand that technology and ethics, or technology and sustainability, are inseparable, in our experience that understanding is not very widespread.
(00:31:30.608) [Dennis]: And so the primary role that we see here is for IGF stakeholders to act as ambassadors, championing this integrated perspective, spreading awareness within their respective fields and their respective companies, and bringing it to where
(00:31:45.404) [Dennis]: there are people who are not yet convinced that this is important.
(00:31:52.004) [Sara]: Very important point.
(00:31:54.191) [Sara]: Thank you.
(00:31:54.632) [Sara]: Thank you, Dennis.
(00:31:56.257) [Sara]: And finally, Marc.
(00:31:57.622) [Sara]: Go ahead.
(00:32:00.119) [Marc]: Well, when we look at the history of the IGF and why it exists: when we talk about the internet, that is not something that originally came out of the ITU.
(00:32:13.954) [Marc]: The ITU has been there, I mean, from its very beginning, even before the United Nations existed.
(00:32:20.321) [Marc]: But in 1947, in the very first chapter after the Second World War, it became
(00:32:26.208) [Marc]: part of the United Nations, before UNESCO and so on and so on.
(00:32:30.552) [Marc]: The ITU is still there, doing standardization for telecommunications.
(00:32:37.059) [Marc]: But in the meantime the internet came up, which, as we all know, is governed in a very different way, has a different origin, and the way its standards are decided is very, very different.
(00:32:49.252) [Marc]: And it is actually winning compared to the ITU standards:
(00:32:55.418) [Marc]: what we call the RFCs, the IETF, the IRTF, the different things that come out of this community.
(00:33:01.794) [Marc]: Very different.
(00:33:03.197) [Marc]: Then the United Nations created the IGF because they realized that something was missing.
(00:33:09.933) [Marc]: It grew out of the ITU process.
(00:33:12.648) [Marc]: So the IGF is actually a good place to bring together many, many people, with very different perspectives, to take ownership of what we call the Internet today.
(00:33:21.523) [Marc]: But it’s not only about the tubes; it’s about the way the protocols have been designed, and also the content, all the different aspects.
(00:33:30.417) [Marc]: And the fact that this is very, very open, and that we have this occasion today, is very important.
(00:33:35.227) [Marc]: The missing part of it is how we can influence things a little bit more,
(00:33:43.597) [Marc]: and how the IGF community can participate a little bit more, interacting with the IETF, where the technical standards are designed as they are.
(00:33:57.934) [Marc]: There is a gateway, with people coming a little bit more into the IGF from the IETF, and vice versa.
(00:34:03.661) [Marc]: But I think it’s very important as well.
(00:34:05.944) [Marc]: And then the W3C and all the different bodies and so on.
(00:34:11.532) [Marc]: I haven’t been participating much in understanding the relationships between these standards bodies and organizations like this one.
(00:34:18.422) [Marc]: But that’s very important.
(00:34:20.806) [Marc]: That could be for the future.
(00:34:22.949) [Sara]: So having those platforms in place gives us more leverage.
(00:34:29.160) [Sara]: Okay, thank you very much, everyone.
(00:34:32.666) [Sara]: Thank you, Marc and everyone.
(00:34:35.171) [Sara]: We’re doing okay time-wise, so we have time for one more question.
(00:34:38.477) [Keith]: Yeah, we have time for one more question.
(00:34:40.280) [Keith]: I’ll take that over as a question for myself as well.
(00:34:44.224) [Keith]: And then we’ll try and get some questions from the audience.
(00:34:46.247) [Keith]: So we have just about nine minutes left.
(00:34:48.990) [Keith]: And the last question is, how can we evaluate the human component of networks?
(00:34:52.614) [Keith]: We’ve talked a lot about the fact that these aren’t just systems; there are people behind them.
(00:34:56.859) [Keith]: What can we do to learn more about how people learn to operate new networks?
(00:35:00.723) [Keith]: What practical tools can we use to evaluate computer networking practices?
(00:35:05.068) [Keith]: So let’s go in reverse order, maybe with Marc first.
(00:35:12.663) [Marc]: The human part of it is a good question.
(00:35:17.310) [Marc]: And we have some ways of trying to understand this.
(00:35:24.582) [Marc]: But that is social science work and so on.
(00:35:27.386) [Marc]: The one thing I’ve learned recently, and I mean it’s a fact, is that the quantitative space has nothing to do with the qualitative space.
(00:35:40.507) [Marc]: And we are trying to understand these two different spaces in order to decide what quality we want to give to the evaluations we do as engineers, to get a better optimization process or better performance of whatever system.
(00:35:57.683) [Marc]: So it’s about finding the right way to bridge that gap, so that the quantitative design we want also has good quality from the beginning.
(00:36:06.612) [Marc]: We need people who guide us in pushing towards the right questions and finding the right way of making these choices.
(00:36:20.668) [Keith]: Really quick, so maybe over to Daphne next.
(00:36:26.204) [Daphne]: Yeah, I don’t know if I have much to add to this question.
(00:36:29.348) [Daphne]: So I think that’s really the typical kind of question
(00:36:32.452) [Daphne]: where we need collaboration across disciplines.
(00:36:35.095) [Daphne]: And yeah, for us, as people working on computer networks, that’s quite important.
(00:36:40.221) [Daphne]: We understand the importance of the human perspective.
(00:36:41.663) [Daphne]: But we don’t necessarily have the tools that we can use to actually access human perception and
(00:36:48.631) [Daphne]: human feedback on this.
(00:36:50.313) [Daphne]: So I think that’s where we need to collaborate, for example, with social scientists.
(00:36:53.979) [Daphne]: I mean, we started working with you for that purpose, to learn how we can run surveys, how we can do consultations, and how we analyze the feedback we get through these methods.
(00:37:07.358) [Keith]: Okay.
(00:37:08.680) [Keith]: Since we’re short on time, Maurice, Dennis, Alexander, would any of you like to chime in?
(00:37:14.585) [Maurice]: I’d be happy to, at this stage.
(00:37:17.508) [Maurice]: So I think the more pertinent question really to consider here is how to evaluate the network as a socio-technical system.
(00:37:25.237) [Maurice]: So humans and technical components cannot be assessed in isolation.
(00:37:28.340) [Maurice]: Their value and risks emerge from their interaction.
(00:37:31.003) [Maurice]: This becomes evident by looking at a socio-technical system’s potential points of failure.
(00:37:34.807) [Maurice]: So we must assess the potential for a failure of the technical or AI component, a failure of the human component, or a failure of the process or workflow they are meant to follow.
(00:37:43.677) [Maurice]: Crucially, we must also evaluate the human-machine interface itself, as this is the primary site of miscommunication and error.
(00:37:49.145) [Maurice]: And finally, we must account for failures caused by exogenous circumstances, acknowledging that no system operates in a vacuum.
(00:37:55.334) [Maurice]: This method ensures a comprehensive socio-technical evaluation.
(00:37:58.318) [Maurice]: And as you can clearly see, three-fifths of the problems listed above are neither purely human nor purely technical, instead stemming from their interaction.
(00:38:07.842) [Maurice]: Great.
(00:38:09.365) [Maurice]: Dennis or Alex?
(00:38:11.329) [Alex]: I would just like to add briefly that our main task is simply not to lose our focus, and to continue observing developments.
(00:38:23.772) [Alex]: If we stop paying attention to the latest developments, to technological advances, even briefly, they could, and I think will, go the wrong way.
(00:38:36.679) [Alex]: So just keep an eye on things, follow them, and communicate with each other.
(00:38:41.188) [Alex]: That’s important.
(00:38:43.668) [Dennis]: Last thoughts, Dennis?
(00:38:46.493) [Dennis]: I think the really big first step is not to view the human components of a network as similar to technical or mathematical components.
(00:38:55.489) [Dennis]: Our experience of working with mathematicians, engineers, but also with users,
(00:39:00.037) [Dennis]: is that their actions, their awareness, and their motivation are almost equally important when it comes to eventual outcomes.
(00:39:08.010) [Dennis]: And the failure modes that Maurice outlined are deeply connected to who a human is.
(00:39:14.181) [Dennis]: So from that perspective, we really need to think about this question: how do we understand who the humans involved in these networks are?
(00:39:23.290) [Keith]: Thanks.
(00:39:24.351) [Keith]: And just to chime in myself: this workshop itself really began as a questionnaire that Marc, Daphne, and I developed ourselves to try to learn about how humans are learning difficult new methods for operating computer networks.
(00:39:42.230) [Keith]: And for our last four or five minutes, I’ll ask Bailey, who’s with us online, to collect some questions from the audience, and maybe she can read them out to us.
(00:39:53.851) [SPEAKER_01]: Hello, everybody.
(00:39:55.875) [SPEAKER_01]: So we do have a couple of questions in the chat here.
(00:39:58.479) [SPEAKER_01]: I’ll start with the first question from Henan Zahir.
(00:40:04.950) [SPEAKER_01]: I apologize if I mispronounce anybody’s name.
(00:40:07.635) [SPEAKER_01]: But her question is, how can ethical networking be democratized to ensure meaningful citizen oversight over data-driven public systems?
(00:40:24.192) [Keith]: Would any of you like to quickly answer it? We have three minutes and 30 seconds, so half of that time. No? Alex, did you want to go ahead? Dennis?
(00:40:42.753) [Dennis]: I think
(00:40:44.105) [Dennis]: it goes back to what Alex said.
(00:40:46.528) [Dennis]: We need to respect that different cultures and different regions of this world have different perspectives on this very question.
(00:40:53.736) [Dennis]: So in this sense, the IGF should probably try to be even more international and to really bring in these different cultures and perspectives.
(00:41:09.753) [Dennis]: But it’s a hard question.
(00:41:11.257) [Alex]: Yeah, and I would like to reply to this question by noting that technology cannot be your insurance.
(00:41:22.418) [Alex]: You are your own insurance.
(00:41:24.843) [Alex]: You have to communicate, you have to exercise oversight, you have to think about what’s going on with your data and how it’s being used.
(00:41:33.920) [Alex]: So the IGF is a good starting venue for discussions like this.
(00:41:39.732) [Alex]: But your participation is also really important.
(00:41:44.763) [Keith]: Great.
(00:41:44.903) [Keith]: And Bailey, one more question.
(00:41:46.186) [Keith]: Two minutes.
(00:41:46.988) [Keith]: Question and answer.
(00:41:48.668) [SPEAKER_01]: Yep, so there’s one more question here from Anna Gretel Ichazu, and she’s asking: I would like to know how you think about global north and global south dynamics across the issues you are raising.
(00:42:09.109) [Alex]: Yeah, let me answer this question, because I’m from a country which for a long time pretended to be global north, but now we’re pretending to be global south.
(00:42:17.677) [Alex]: So, the internet and these technologies could actually narrow the gap between what we call the Western world and the others, or the North and the South.
(00:42:32.334) [Alex]: But you also have to exercise oversight really carefully, because in countries that are not very democratically developed,
(00:42:39.682) [Alex]: especially in countries of the so-called global south, technology can easily be abused by the government, which will make the gap with the North, the economic gap, the societal gaps, the democratic gaps, much bigger than they already are.
(00:42:58.588) [Alex]: So I will repeat my answer to the previous question.
(00:43:02.674) [Alex]: Technology cannot
(00:43:06.192) [Alex]: close gaps on its own.
(00:43:07.576) [Alex]: You have to exercise oversight really, really carefully and constantly, and not let go of it.
(00:43:13.714) [Alex]: Thanks.
(00:43:16.281) [Keith]: Last 40 seconds.
(00:43:17.164) [Keith]: Any other ideas?
(00:43:22.780) [Keith]: Okay, well then I will close off this session and thank everybody for coming.
(00:43:27.265) [Keith]: It was really interesting.
(00:43:29.248) [Keith]: I hope we can make a routine of this and produce some studies that also look into these very difficult questions.
(00:43:36.276) [Keith]: And hopefully we’ll have a publication or some other outputs for you all to read soon.
(00:43:40.120) [Keith]: So thank you everybody for coming.


pp-international.net/2025/06/2…





Kirsten introduces herself


Dear all, I am Kirsten Zimmerman, and since 2022 I have been serving with great pleasure as a district committee member in Amsterdam-Noord for De Groenen Basis Piraten. My priorities are preserving green spaces, direct democracy, (online) privacy, and social connection. I am happy to help the people of Noord and try to empower residents to come up with their own solutions to societal problems […]

The post Kirsten introduces herself first appeared on Piratenpartij.



Greene County says employees aren’t prohibited from talking to press


Earlier this month, Freedom of the Press Foundation (FPF) and the Society of Professional Journalists led a letter to Steve Catalano, chair of the Greene County Board of Supervisors, objecting to a policy that reportedly prohibited county employees from speaking to the press and required them to label anything they provide to the press as “opinion.”

In response, the county claimed that the policy does not exist, despite several county employees reportedly telling local newspaper The Daily Progress that they could not speak to the media about public records requests they’d denied due to the policy.

We don’t know why county employees would be “confused” by a nonexistent gag policy if such a policy had not been communicated to them but, to resolve any confusion, here is the email chain that includes the denial of the policy’s existence.

freedom.press/static/pdf.js/we…

We hope any reporters whose requests for information are denied under the supposed policy will show their sources this email, and that any Greene County employees who are confronted for not abiding by the policy will show it to their supervisors.

We note that even the above email denying the existence of the policy acknowledged that it is the county’s position that “When staff are asked about the County’s official position on policy matters, they are to represent the majority position of the Board or defer the question to the Chairman.”

We informed the County of the ambiguity of that position and the need to put its rules in writing, so that there won’t be further confusion if they’re constitutional, and so those impacted can challenge them if they’re not. As you can see, they said they would — we’ll hold them to it and will keep you posted.


freedom.press/issues/greene-co…



Trump-Paramount mediation is toxic for all involved


Dear Friend of Press Freedom,

It’s the 94th day that Rümeysa Öztürk is facing deportation by the United States government for writing an op-ed it didn’t like. More press freedom news below.

Paramount should abandon mediation with Trump. So should the mediator


Earlier this week, CBS News filed a strong brief outlining why President Donald Trump’s lawsuit over the editing of its 2024 interview with Kamala Harris is completely frivolous and an affront to the First Amendment.

But just days later, The Wall Street Journal reported that a mediator had proposed CBS owner Paramount Global settle the suit for $20 million. It’s been reported that Paramount — believing the Trump administration will block its merger with Skydance Media if it doesn’t settle – had previously offered $15 million, which Trump declined, demanding at least $25 million.

We wrote about why the unnamed mediator needs to consider their ethical obligations to not facilitate what may amount to an illegal bribe. Read more here.

Stop prosecuting journalist who exposed antisemitism


The Trump administration loves to position itself as an enemy of antisemitism. But it has continued its predecessor’s legally dubious prosecution of journalist Tim Burke, who found outtakes of a 2022 antisemitic rant by Ye, formerly Kanye West, that Fox News cut from a Tucker Carlson interview.

We’ve written before about why the government’s legal theories are nonsense, but there’s also the issue of why two presidential administrations thought it was a wise use of prosecutorial discretion to go after someone who clearly did a public good.

Freedom of the Press Foundation (FPF) Advocacy Director Seth Stern, along with Bobby Block of the Florida First Amendment Foundation, urged the administration to drop the case in USA Today. Read more here.

Putting public records to use


Federal agencies are closing their Freedom of Information Act offices, disappearing information from their websites, no longer creating records, and possibly even inappropriately destroying them. At the local level, we’re seeing legislation introduced across various states that would make it easier for local governments to ignore requests they don’t like.

To discuss how the public can use public records requests to fight back against mounting secrecy, we hosted a webinar June 24 with Washington Post FOIA Director Nate Jones, MuckRock CEO Michael Morisy, investigative journalist and author Miranda Spivack, and FPF’s Daniel Ellsberg Chair on Government Secrecy Lauren Harper. Watch it here.

What’s going on in Greene County?


Earlier this month, FPF and the Society of Professional Journalists led a letter to Steve Catalano, chair of the Greene County Board of Supervisors, objecting to a policy that reportedly prohibited county employees from speaking to the press and required them to label anything they provide to the press as “opinion.”

In response, the county claimed that the policy does not exist, despite several county employees reportedly telling local newspaper The Daily Progress that they could not speak to the media about public records requests they’d denied due to the policy.

To resolve any confusion among reporters and county employees under the impression that there’s a gag order, we posted the exchange on our website. Read more here.

What we’re reading


‘They’re not breathing’: Inside the chaos of ICE detention center 911 calls (Wired). This important story is available to read for free, thanks to Wired’s partnership with FPF to unpaywall FOIA-based reporting. Other outlets should follow suit.

DeKalb solicitor-general dismisses charges against journalist Mario Guevara (Atlanta Civic Circle). Dropping charges is nice, but too little, too late after they handed the journalist (who has a legal work authorization) over to Immigration and Customs Enforcement.

‘Giving information to the enemy’: Israel’s ban on Al Jazeera extends to foreign broadcasters (The Dissenter). It was obvious from the outset that Israel’s Al Jazeera ban was a pretext for further censorship. The same goes for U.S. efforts to crack down on foreign media.

The Paramount risk in settling Trump’s lawsuit: ‘Bribery’? (The Wall Street Journal). The Journal’s editorial board writes that, instead of caving to Trump, Paramount should “win the legal case [and] vindicate its CBS journalists and the First Amendment.”

White House to limit intelligence sharing, skip Gabbard at Senate Iran briefing (The Washington Post). The public should know if the Trump administration is lying about its Iran strike. Leaks undermining the administration’s claims aren’t an excuse for secrecy — they are a reason to declassify the underlying records.


freedom.press/issues/trump-par…


The Pirate Post reshared this.


26/6 Digest roundup: the most relevant news of the last 24 hours
#CyberSecurity
insicurezzadigitale.com/266-di…



Despite the deadline extension: Black-red coalition rams the BKA Act amendment through the Bundestag


netzpolitik.org/2025/trotz-fri…




Brandung-Live #94 on June, 29th


The next “Brandung-Live” will be on 29.06.2025 at 20.00h CEST/DST.

News from Potsdam, Brandenburg, the Pirates of Germany and international news – in German.

If you want to join the conversation, just contact info@PiratesOnAir.net.


piratesonair.net/brandung-live…



Paramount should abandon mediation with Trump. So should the mediator


Earlier this week, CBS News filed a strong brief outlining why President Donald Trump’s lawsuit over the editing of its 2024 interview with Kamala Harris is completely frivolous and an affront to the First Amendment.

But just days later, The Wall Street Journal reported that a mediator had proposed CBS owner Paramount Global settle the suit for $20 million. It’s been reported that Paramount — believing the Trump administration will block its merger with Skydance Media if it doesn’t settle – had previously offered $15 million, which Trump declined, demanding at least $25 million.

A mediator’s job is to find a compromise number, so, from that perspective, it’s easy to understand why they’d propose $20 million (mediators get paid the big bucks to do that kind of math). But mediators are still under an obligation not to facilitate criminality. And, as both federal and state lawmakers have said, a settlement by Paramount may amount to illegal bribery.

The Model Standards of Conduct for Mediators — adopted by the American Bar Association, American Arbitration Association, and Association for Conflict Resolution — provide as follows: “If a mediation is being used to further criminal conduct, a mediator should take appropriate steps including, if necessary, postponing, withdrawing from or terminating the mediation.” The ethics guidelines of Judicial Arbitration and Mediation Services go a step further: “A mediator should withdraw from the process if the mediation is being used to further illegal conduct.“

The mediator’s identity hasn’t been publicly disclosed, but presumably, they’re a lawyer, also bound by the rules of professional conduct. Those rules also include obligations to uphold the integrity of the legal profession and not facilitate illegal conduct. Violations are punishable by discipline up to disbarment. That said, it should not require citation of specific rules and standards to establish that it’s improper to facilitate the abuse of the legal system to funnel bribes disguised as settlements to politicians.

Everyone knows the case is not worth $20 million, or even 20 cents, in terms of legal merit. It’s beyond frivolous — and that’s saying something given the myriad frivolous lawsuits Trump has filed. As Paramount’s own lawyers note, Trump often doesn’t even attempt to cite cases supporting his cockamamie legal theories or refuting the decades of First Amendment precedent that obviously protects CBS’ editorial judgment. Any first-year law student can see this case should be thrown out at the first opportunity. It’s a joke.

“If a mediation is being used to further criminal conduct, a mediator should take appropriate steps including, if necessary, postponing, withdrawing from or terminating the mediation.”


The Model Standards of Conduct for Mediators

Yet it’s been widely reported that Paramount directors believe they need to pay up if they want Trump’s Federal Communications Commission to approve the Skydance merger. Possibly the only thing stopping them is the fact that the directors are reportedly concerned that settling could put it at risk of liability for bribery.

They should be worried. In addition to the lawmakers that have launched probes into potential criminality, as Paramount shareholders and defenders of press freedom, we’ve threatened to bring a shareholder derivative suit against Paramount directors and officers if the company pays Trump off, and retained counsel to do so on our behalf.

Any settlement agreed to by Paramount to facilitate its merger is not only tantamount to bribery but throws its own storied news outlet under the bus and invites Trump to extort countless other news corporations as soon as the check clears.

The Wall Street Journal reported that directors at Paramount have been mulling over a number it can offer that reduces the odds of a bribery case. When a party to a mediation is calculating settlement numbers not based on the merits of the lawsuit and cost of litigation but on the maximum it can pay without officers and directors getting indicted, that’s a gigantic red flag that no ethical mediator should ignore (as if investigations from federal and state lawmakers aren’t enough of a warning).

We hope the mediator is taking their obligations seriously. And we also hope Paramount directors are reading their own lawyers’ legal briefs. Editing interviews is something news outlets across the political spectrum do every hour of every day (even Trump-aligned Fox News hosts have openly acknowledged that). As Paramount’s lawyers wrote, if this case were to go forward, it “would amount to green-lighting thousands of consumer claims brought by individuals who merely disagree with a news organization’s editorial choices.”

The same can be said for if Paramount settles. Trump sued The Des Moines Register right after ABC owner Disney shamefully settled with Trump earlier this year. If he is gifted $20 million (and no, it doesn’t help matters if the money goes to his purported library foundation) for disliking the way an interview was edited, there’s no telling how many news outlets he’ll target next.


freedom.press/issues/paramount…



Do you want to learn more about European institutions and policymaking? Take advantage of the Pirate Academy’s online course


The online Pirate Academy, organized by MEP Markéta Gregorová and the Greens/EFA political group, will take place in the fall of 2025. David, tell us, what exactly will the course cover? It will consist of five interactive online sessions, plus one live meeting in Brussels at the European Parliament for the top participants. The course will prepare you to work as an advisor…

Source



The Pirate Post reshared this.


25/6 Digest roundup: the most relevant news of the last 24 hours
#CyberSecurity
insicurezzadigitale.com/256-di…

in reply to The Pirate Post

Sorry for the downvotes, OP, but in good old reddit tradition this community is for US politics.

I'm sure you can find better communities for this.



Do you want to learn about European institutions and policymaking? Take advantage of the Online Pirate Academy course


Markéta Gregorová has opened applications for the Pirate Academy, and that’s exactly what we’re going to talk about here. Are you interested in applying for the Pirate Academy but want to know more? We asked one of the organizers to share what you can expect. David Wagner gave us an overview of what’s coming up.

The online Pirate Academy, hosted by MEP Markéta Gregorová and the Greens/EFA political group, will take place in fall 2025. So David, tell us, what exactly will the course cover?

It’ll be five interactive online sessions, plus one extra live meeting in Brussels at the European Parliament for the top participants. The course will prepare you for working as a policy advisor, not only in the European Parliament. You’ll learn how real political conflicts are handled, get hands-on experience with practical tools for shaping European legislation, understand EU values and how it works — and of course, the course includes concrete real-life examples.

That sounds really interesting, David. How much time will the course take? And do applicants need any special education?

No special education is needed. It’s good if you’re enthusiastic about topics related to the European Union and its institutions, since that’s what the whole online academy is about. Each of the five online sessions lasts 180 minutes, including preparation time. Applicants will need to read and review some materials we provide before the sessions.

Can you tell us more about the topics of the individual online sessions?

There’s a lot to cover. For example: Understanding the structure and logic of the European institutions, the EU budget vs national budgets, pillars of European security, energy transition, and China’s dominance in renewables, fossil fuels, prices, the international market, and relations with the biggest suppliers. All topics are very current and linked to today’s political situation.

And the last question: Why should people apply to the Online Pirate Academy?

It’s a great chance for anyone who wants to gain knowledge about European legislation. It’s also a stepping stone for a future career in European institutions. With this knowledge, you’ll be able to actually influence political developments — and that’s something every European citizen should do.
The post Do you want to learn about European institutions and policymaking? Take advantage of the Online Pirate Academy course first appeared on European Pirate Party.



Bastian’s Night #431 June, 26th


Every Thursday, Bastian’s Night is broadcast from 21:30 CET (new time).

Bastian’s Night is a live talk show in German with lots of music, a weekly round-up of news from around the world, and a glimpse into the host’s crazy week in the pirate movement aka Cabinet of Curiosities.


If you want to read more about @BastianBB: –> This way


piratesonair.net/bastians-nigh…


The Pirate Post reshared this.


In a tumultuous galaxy, dynes across the network exchange know-how and action items. Resilient, sovereign, and steady. They source the hope within and spread it throughout, elevating their peers to root and reclaiming reality.

Home is not a page; it is a private key.
news.dyne.org/planet-dyne-s202…

reshared this


The Pirate Post reshared this.


BEEPFREEZE: a new Iranian wiper targeting Albania
#CyberSecurity
insicurezzadigitale.com/beepfr…


European Pirate Academy: learn all about negotiating EU legislation


Are you passionate about European politics and security? Don’t miss the opportunity to take part in the Pirate Academy, which will run from September to November 2025. Thirty selected applicants will take part in online sessions focused on key challenges and problem areas, where they will gain a deeper understanding of how the European institutions work. Ten of them will have the…

Source

informapirata ⁂ reshared this.


The Pirate Post reshared this.


Are you passionate about European politics and security? Don’t miss the opportunity to take part in the Pirate Academy, which will run from September to November 2025

Are you passionate about European politics and security? Don’t miss the opportunity to take part in the Pirate Academy, which will run from September to November 2025.

pirati.io/2025/06/european-pir…

@pirati@feddit.it

reshared this



“Going Dark”: EU Commission presents roadmap for police access to data


netzpolitik.org/2025/going-dar…


The Pirate Post reshared this.


24/6 Digest roundup: the most relevant news of the last 24 hours
#CyberSecurity
insicurezzadigitale.com/246-di…

The Pirate Post reshared this.


🚨 #BudapestPride is just days away but Hungary has criminalised Pride marches and is using real-time #FacialRecognition to identify protesters ❌

We, together with 46 other civil society organisations, are urging the European Commission to take IMMEDIATE action to defend human rights in Hungary.

People at Budapest Pride should be able to safely exercise their right to peaceful assembly and freedom of expression ✊🏽 🏳️‍🌈

Read our call to the Commission ⤵️ edri.org/our-work/open-letter-…

in reply to EDRi

Here are a few helpful ideas for avoiding facial recognition tech.

8 Genius Ways to Trick Surveillance Systems

Free archived link:

archive.nytimes.com/www.nytime…