Old Laws & New Tech: As Courts Wrestle with Tough Questions under US Biometric Laws, Immersive Tech Raises New Challenges
Extended reality (XR) technologies often rely on users’ body-based data, particularly information about their eyes, hands, and body position, to create realistic, interactive experiences. However, data derived from individuals’ bodies can pose serious privacy and data protection risks. It can also create substantial liability risks for organizations, given the growing volume of lawsuits under the Illinois Biometric Information Privacy Act (BIPA) and scrutiny of biometric data practices by the Federal Trade Commission (“FTC” or “Commission”) in its recent Policy Statement. At the same time, there is considerable debate and lack of consensus about what counts as biometric data under existing state privacy laws, creating significant uncertainty for regulators, individuals, and organizations developing XR services.
This blog post explores the intersection of US biometric data privacy laws and XR technologies, particularly whether and to what extent specific body-based data XR devices collect and use may be considered “biometric” under various data protection regimes. We observe that:
- Face templates and iris scans used to authenticate an individual’s identity are regulated biometrics; therefore, those XR use cases are covered by biometric laws.
- Laws with broad definitions of biometrics may apply to systems that use face detection, as seen in emerging case law from Illinois regarding virtual try-on XR applications.
- Organizations have taken steps that reduce their liability risk regarding face-based biometric systems, including by minimizing collection of identifying data or processing biometric data on individuals’ devices.
- Other body-based data not used for identification in XR, like eye-tracking and voice analysis, may also be considered “biometric” if the technology and data are capable of identifying an individual.
A. Face Templates, Hand Scans, and Iris Scans Used to Authenticate an Individual’s Identity Are Regulated Biometrics; Therefore, User Authentication in XR Is Covered by Biometric and Comprehensive Privacy Laws
In the US, there are three state biometric data privacy laws: the Illinois Biometric Information Privacy Act (BIPA), the Texas Capture and Use of Biometric Identifier Act (CUBI), and the Washington Biometric Privacy Protection Act (BPPA). Other states like California, Connecticut, and Virginia also maintain comprehensive data privacy laws that regulate biometric data, and the FTC regulates “unfair and deceptive” biometric data practices. But because BIPA is the only biometric privacy law that grants a private right of action to any person “aggrieved” by a violation, Illinois cases abound, and Illinois courts play a fundamental role in determining the scope of what policymakers, companies, and consumers consider to be “biometric data.”
With the exception of CUBI (and to a certain extent, BIPA), most biometric and comprehensive data privacy statutes tie their definitions of “biometric data” to identification, meaning the laws are intended to regulate unique physiological characteristics that entities use to identify an individual. Generally, each biometric and comprehensive law focuses on five forms of biometric data: retina or iris scan, fingerprint, voiceprint, hand scan, and face scan. BIPA, in particular, applies to “biometric identifiers,” defined as a “retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry,” as well as “biometric information,” which includes “any information…based on an individual’s biometric identifier used to identify an individual.” Therefore, any entity that uses technology to scan an individual’s retina, iris, finger, hand, or face to uniquely identify an individual (1:many) or authenticate their identity (1:1) must comply with BIPA’s requirements, unless it falls within one of BIPA’s exemptions or exclusions. The same conclusion applies to CUBI, BPPA, and comprehensive data privacy laws.
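To make the 1:1 versus 1:many distinction concrete, the minimal Python sketch below contrasts authentication (matching a probe scan against one enrolled template) with identification (searching a gallery of enrolled templates). It is an illustration of the general technique only; the toy templates and threshold are hypothetical, not drawn from any statute or real product.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Compare two biometric templates (feature vectors)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

MATCH_THRESHOLD = 0.9  # hypothetical tuning value

def authenticate(probe: np.ndarray, enrolled: np.ndarray) -> bool:
    """1:1 verification: does the probe match THIS user's enrolled template?"""
    return cosine_similarity(probe, enrolled) >= MATCH_THRESHOLD

def identify(probe: np.ndarray, gallery: dict[str, np.ndarray]) -> str | None:
    """1:many identification: who, among ALL enrolled users, is this?"""
    best_id, best_score = None, MATCH_THRESHOLD
    for user_id, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = user_id, score
    return best_id

gallery = {"alice": np.array([0.1, 0.9]), "bob": np.array([0.9, 0.1])}
probe = np.array([0.12, 0.88])                # a fresh scan
print(authenticate(probe, gallery["alice"]))  # True  (1:1 authentication)
print(identify(probe, gallery))               # "alice" (1:many identification)
```

Both flows create and compare scans of face, iris, or hand geometry, which is why both identification and authentication use cases fall within these statutes.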
XR devices often use iris, face, or hand scans to authenticate a user’s identity to log in to their profile or enable in-app payments. Much like computers or smartphones, more than one user may use a single XR device, so authenticating the specific person using the device at a given time allows for more personalization and secure transactions. As a result, iris or face authentication systems in XR devices are likely covered by U.S. biometric and comprehensive data privacy laws. The laws typically require organizations to obtain user consent before enrolling individuals in this sort of biometric XR system, and BIPA has potentially thorny provisions requiring “written consent,” which can be challenging to implement for many XR applications. The face and eye scans that XR technologies use for authentication may also be considered “sensitive data” under comprehensive data privacy laws, such as the California Privacy Rights Act (CPRA) or the Connecticut Data Privacy Act, requiring organizations to provide individuals opt-out rights, including the right to opt out of data sales and other transfers.
Most XR authentication technologies employ live capture of biometrics. Iris, face, or hand scans are captured in real time when an individual first enrolls, and subsequent scans are likewise captured in real time when the individual authenticates their identity to the device or app. These scenarios are typically covered by biometrics laws as described above. However, there is some uncertainty regarding biometric laws’ application to XR devices that create a biometric template from non-real-time photos, videos, and audio recordings. Most biometric and comprehensive privacy laws exclude photos, videos, and audio recordings from the scope of “biometric data” to varying degrees (with the exception of CUBI and the CPRA). Utah and Virginia’s comprehensive privacy laws, for example, broadly exempt photographs “or any data generated therefrom” from coverage, making their biometric regulations perhaps less likely to apply to photographic scanning. But case law under BIPA shows that these provisions may not exclude “biometric templates” derived from non-real-time photos, videos, or audio recordings. In Monroe v. Shutterfly, the United States District Court for the Northern District of Illinois concluded that narrowly reading “biometric identifier” to mean only real-time scans would swallow the intent of the law; thus, photographic scanning to create “templates” was still within scope. Laws that do not exclude photos or data generated from them when an entity uses them for identification purposes, such as the Connecticut Data Privacy Act (CTDPA) and the final rules under the Colorado Privacy Act (CPA), or CUBI, which contains no exemption for photographs at all, are likely to follow this analysis. The FTC’s conception of biometric information similarly and explicitly encompasses photos, videos, audio recordings, and certain data derived from these sources, making it likely that most regulators will still consider “biometric templates” created from photographic scanning subject to applicable biometric regulations.
B. Laws with Broad Definitions of Biometrics May Apply to Systems that Use Face Detection, as Seen in Emerging Case Law from Illinois Regarding Virtual Try-On XR Applications
Although most laws aim to regulate biometric data that is uniquely identifying, several statutes’ text can be interpreted to apply to biometric technologies that merely distinguish a face from other objects or analyze facial characteristics, without identifying a particular individual. Depending on a privacy law’s definition of “biometric data,” courts may hold that the term regulates technologies that utilize data derived from an individual’s face, eyes, or voice even when they are not used for identification purposes. In XR, devices may use inward-facing cameras to conduct facial analysis for non-identification purposes, such as rendering expressive avatars. Augmented reality (AR) products like “virtual try-on” (VTO) may also use facial analysis to let people visualize how different products – like eyeglasses – might look on them. Like many other XR applications, VTO primarily uses facial scans to detect and correctly align the product with an individual’s physical features, rather than for identification purposes.
Some laws with broad definitions can apply to these non-identification technologies unless a specific exception applies. CUBI does not require “biometric identifiers” to uniquely identify an individual, which has prompted the Texas Attorney General to claim that CUBI applies to the capture of face geometry regardless of whether an entity uses these facial maps for individual identification. The FTC’s conception of biometric technologies also broadly encompasses “all technologies that use or purport to use biometric information for any purpose.” But most notably, BIPA is complex because its definition of “biometric identifiers” does not explicitly require that the data be used for identification (in contrast to the statute’s definition of “biometric information,” which does require identification). As a result, Illinois courts have largely found that any facial scan may create a “biometric identifier,” such as with doorbell cameras, photo grouping, and Snapchat filters. This is true even when that technology’s facial scan feature was not used to identify the individual in the photo or video frame.
Recent BIPA lawsuits brought against companies that offer VTO illustrate how broad biometric laws might apply to XR devices that use facial analysis. In Theriot v. Louis Vuitton North America, Inc., a federal court permitted BIPA claims to proceed against Louis Vuitton’s VTO sunglasses application, finding that the technology’s use of facial scans was analogous to BIPA case law holding that face scans derived from photographs constitute biometric identifiers. Other VTO cases have had similar outcomes. Only VTO technology used for healthcare-related purposes, such as trying on prescription eyeglasses, has been found by courts to be outside the scope of BIPA. This result did not rest on BIPA’s overall definition of biometric data, but rather arose from a narrow exception for “information captured from a patient in a health care setting.” So BIPA may not apply to medical providers’ use of XR apps or other immersive technologies, such as brain-computer interfaces (BCIs), for diagnostic purposes, but BIPA’s coverage of non-identifying, non-medical uses remains a source of substantial confusion. This confusion undermines individuals’ understanding of their privacy rights and presents liability risks for organizations.
C. Organizations May Reduce Their Liability Risk by Minimizing Collection of Identifying Data or Processing Biometric Data on Individuals’ Devices
Some organizations have taken steps to limit their liability risks by minimizing the collection of identifying data or processing biometric data on individuals’ devices. Case law suggests that some facial detection technologies fall outside the scope of BIPA and other biometric regulations if (1) there is no mechanism for the technology to retain facial scans or link scans to a user’s individual identity or account; and/or (2) all of the data is stored on-device.
First, in Daichendt and Odell v. CVS Pharmacy, the Northern District of Illinois dismissed a case against CVS for its passport photo system, which scans facial geometry in photos to confirm that they meet government requirements for passports (e.g., a person’s eyes are open, their mouth is closed and not smiling, and eyeglasses are not present). The court held that the plaintiffs failed to allege that CVS’ photo system enabled CVS to determine their identities, and that the plaintiffs had not provided CVS “with any information, such as their names or physical or email addresses, that could connect the voluntary scans of face geometry with their identities.”
Separately, in Barnett v. Apple, an Illinois appellate court held that Apple was not subject to BIPA requirements regarding its Face ID feature on iPhone because the company was not “collecting” or “possessing” users’ biometric data, since the data was stored entirely on the device and never on Apple’s servers. Thus, XR devices that do not retain facial scans that can link to users’ accounts, or that only store data on-device (such as Apple’s recently announced Vision Pro), may be out of scope of even some of the broadest biometrics laws.
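Below is a minimal sketch of the on-device pattern the Barnett court found persuasive, under the assumption of a simplified headset API. The storage path and matching logic are hypothetical stand-ins, not Apple’s implementation: the enrolled template is written only to local storage, and only a pass/fail boolean ever leaves the matching routine.

```python
import json
from pathlib import Path

import numpy as np

# Stand-in for hardware-backed secure storage on the headset (hypothetical path).
LOCAL_TEMPLATE = Path("device_secure_storage") / "face_template.json"

def enroll(template: np.ndarray) -> None:
    """Persist the enrollment template only on the device; nothing is uploaded."""
    LOCAL_TEMPLATE.parent.mkdir(exist_ok=True)
    LOCAL_TEMPLATE.write_text(json.dumps(template.tolist()))

def unlock(probe: np.ndarray, threshold: float = 0.9) -> bool:
    """Match on-device and expose only a pass/fail result to apps and servers."""
    enrolled = np.array(json.loads(LOCAL_TEMPLATE.read_text()))
    sim = float(np.dot(probe, enrolled) /
                (np.linalg.norm(probe) * np.linalg.norm(enrolled)))
    return sim >= threshold  # the template and raw scan never leave this function

enroll(np.array([0.1, 0.9]))
print(unlock(np.array([0.12, 0.88])))  # True; no biometric data was transmitted
```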
D. Eye Tracking and Voice Analysis May Also Be Considered “Biometric” if the Technology and Data Are Capable of Identifying an Individual
In addition to face-based biometric technologies, most XR devices also use other forms of body-based detection or characterization systems for device functionality, such as voice analysis and eye tracking. As with facial detection, these features are designed to detect or make predictions about bodily characteristics or behavior, but the subject is typically not identifiable and personally identifiable information (PII) is typically not retained. For example, XR devices often contain microphones to capture a user’s voice and surroundings, which can enable voice commands, verbal interactions with other users, spatial mapping, and realistic sound effects. XR devices may also maintain inward-facing cameras that collect data about a user’s gaze—where they look and for how long—to enable eye tracking. This may be used to improve graphics and allow for more expressive avatars, including avatars that can display microexpressions.
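For readers unfamiliar with what “gaze data” looks like in practice, the sketch below illustrates the kind of samples an inward-facing eye tracker typically produces and a simple dwell-time aggregation over them. The field names are hypothetical, not taken from any vendor’s SDK.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    timestamp_ms: int  # when the sample was taken
    target_id: str     # the scene object the user is looking at
    x: float           # normalized gaze coordinates on the display
    y: float

def dwell_times_ms(samples: list[GazeSample], interval_ms: int = 10) -> dict[str, int]:
    """Aggregate 'where they look and for how long' per target."""
    totals: dict[str, int] = {}
    for s in samples:
        totals[s.target_id] = totals.get(s.target_id, 0) + interval_ms
    return totals

samples = [GazeSample(0, "menu_button", 0.40, 0.60),
           GazeSample(10, "menu_button", 0.41, 0.61),
           GazeSample(20, "avatar_face", 0.70, 0.30)]
print(dwell_times_ms(samples))  # {'menu_button': 20, 'avatar_face': 10}
```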
Whether these systems that collect voice or gaze data are covered by biometric or comprehensive data privacy laws may depend on whether an organization could use the technology to identify an individual, even if it is not used in that capacity. As seen in CVS Pharmacy, many Illinois courts focus on the capacity of the technology to identify an individual. As an initial matter, biometric and comprehensive privacy laws typically apply to “voiceprints,” not voice recordings. As stated by the Illinois Attorney General, “a voiceprint, which is a record of mechanical measurement, is not the same as a simple recording of a voice.”
However, the line between a voice recording and a voiceprint is blurry, particularly as it relates to the gray area of natural language processing (NLP)—a kind of artificial intelligence (AI) that can use audio to understand, interpret, and manipulate language. In Carpenter v. McDonald’s Corp., the U.S. District Court for the Northern District of Illinois found that McDonald’s drive-through voice assistant technology could be used for identification purposes, and thus could be considered a “voiceprint” under BIPA, since the technology’s patent application states that the technology may capture voice characteristics “like accent, speech pattern, gender, or age for the purpose of training the AI.” In a similar ongoing case against Petco, an Illinois federal judge permitted BIPA claims to proceed regarding employee voice data, stating “[w]hat matters [at the dismissal stage] is not how defendant’s software actually used plaintiffs’ data, but whether the data that Petco gathered was capable of identifying [the employees].” As a result, if an XR device captures vocal characteristics that are capable of unique identification, certain voice data may be considered a “voiceprint” under BIPA. This analytical framework will likely apply to jurisdictions that define biometric data to include biological characteristics that have the potential to identify an individual, such as in the final rules under the Colorado Privacy Act regarding biometric identifiers, or under the FTC’s Policy Statement on Biometric Information.
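The recording-versus-voiceprint line can be illustrated in code. Storing raw audio yields only a recording; computing stable, comparable measurements from it begins to resemble the “record of mechanical measurement” the Illinois Attorney General described. The toy features below are a sketch standing in for the far richer speaker embeddings real systems extract:

```python
import numpy as np

def voice_features(audio: np.ndarray, sample_rate: int) -> np.ndarray:
    """Toy 'mechanical measurements' of a voice; any stable, comparable
    measurement like this is what may edge a system toward a regulated
    'voiceprint', as opposed to the raw recording itself."""
    energy = float(np.mean(audio ** 2))                                    # loudness
    zero_cross_rate = float(np.mean(np.abs(np.diff(np.sign(audio))))) / 2  # rough pitch proxy
    spectrum = np.abs(np.fft.rfft(audio))
    freqs = np.fft.rfftfreq(audio.size, d=1.0 / sample_rate)
    centroid = float(np.sum(spectrum * freqs) / np.sum(spectrum))          # spectral centroid, Hz
    return np.array([energy, zero_cross_rate, centroid])

rng = np.random.default_rng(0)
recording = rng.standard_normal(16_000)               # one second of fake 16 kHz audio
print(voice_features(recording, sample_rate=16_000))  # derived measurements, not the recording
```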
Whether privacy laws apply to gaze data, however, is even less clear. BIPA lawsuits against online exam proctoring services, autonomous vehicles, and “smart advertising screens” suggest that eye-tracking could be a biometric identifier under BIPA, even if not used for identification. In each of these cases, the technology conducted eye-tracking to determine where a user was looking—whether on the screen, the road, or in the store—but did not identify the individual. Instead, these technologies made inferences about whether someone may be cheating, not paying attention to the road, or what product they were looking at. Plaintiffs in these cases argue that eye-tracking is part of the technology’s collection and analysis of facial geometry, thus making it a “biometric identifier” under BIPA.
Unfortunately, state and federal courts in Illinois have not analyzed whether and to what extent eye tracking, without additional face analysis, constitutes a biometric identifier, nor whether it is a subset of facial analysis. Rather, most cases proceed based solely on the software’s overall facial analysis features, if at all. If courts are prone to equate facial detection scans to “facial geometry,” and voice analysis to “voiceprints,” they may also conflate eye tracking with “a retina or iris scan,” and thus treat eye tracking as a biometric identifier. Or they may follow the BIPA plaintiffs’ analysis, lumping eye tracking into facial analysis as “facial geometry.” Alternatively, courts could characterize eye tracking as altogether separate from BIPA’s “facial geometry” and “retina or iris scan” categories. In any event, as with voice analysis, if an XR device collects gaze data that could be used for identification purposes, laws with broad biometrics definitions will apply, while other laws with narrower definitions focused on the data or technology’s current use may exclude the technology.
Takeaways
Statutory language and court opinions vary in how they define and apply to biometric data and identifiers. Though the plain text of most U.S. biometric and comprehensive data privacy laws ties the definition of a “biometric” to the identification of an individual, some laws may be applied more broadly to technologies that use body-based data for non-identification purposes. While most of the body-based data XR collects is not used for identification, litigation brought under BIPA and other state laws suggests that lawmakers and judges may consider certain kinds and uses of such data—for example, AR “facial scans,” eye tracking, and voice—to be biometrics. Whether this will be the case (or continue to be the case) depends on how policymakers draft these laws, and how courts, enforcement bodies, and other parties to litigation interpret statutes regulating biometrics.
#NoiSiamoLeScuole this week features the Istituto Comprensivo “Cesalpino” in Arezzo!
Thanks to the PNRR, the school’s old building will be demolished and rebuilt with a modern organization of spaces and, above all, in compliance with the …
Ministero dell'Istruzione
FPF Submits Comments to the FTC on the Application for a New Parental Consent Method
Today, the Future of Privacy Forum (FPF) submitted comments to the Federal Trade Commission (FTC) regarding the use of “Privacy-Protective Facial Age Estimation” as a potential mechanism for verifiable parental consent (VPC) under the Children’s Online Privacy Protection Act (COPPA) Rule.
FPF observes:
- The “Privacy-Protective Facial Age Estimation” technology may improve the existing landscape for verifiable parental consent, provided appropriate privacy safeguards are in place;
- The “Privacy-Protective Facial Age Estimation” technology and associated risks are distinct from the biometric privacy risks associated with facial recognition technologies; and
- If the FTC approves the application, the Commission’s approval should require ongoing implementation of the privacy and fairness safeguards outlined in the application.
In June, FPF published The State of Play: Is Verifiable Parental Consent Fit for Purpose?, investigating the shortcomings and opportunities presented by the current framework for verifiable parental consent (VPC) under COPPA and encouraging ingenuity to address key challenges. As federal lawmakers seek more comprehensive ways to update the 1998 law to match the 2023 online landscape, the approval of a new method for obtaining VPC has the potential to improve a process that is grappling with changing technologies, business practices, and individuals’ expectations.
FPF’s comments do not discuss the merits of using technology as a method of age estimation or verification for all users of a child-directed or mixed-audience service, which may impose disproportionate privacy risks and burdens on all users. Rather, we confine our analysis to the proposed context of this application, which we understand to refer only to the limited use of verifying that a purported parent granting COPPA consent is, in fact, an adult.
FPF’s full comments to the Commission are available here.
Data Sharing for Research: A Compendium of Case Studies, Analysis, and Recommendations
Today, the Future of Privacy Forum (FPF) published a report on corporate-academic partnerships that provides practical recommendations for companies and researchers who want to share data for research. The Report, Data Sharing for Research: A Compendium of Case Studies, Analysis, and Recommendations, demonstrates how, for many organizations, data-sharing partnerships are transitioning from being considered an experimental business activity to an expected business competency.
Corporate data-sharing partnerships offer compelling benefits to companies, researchers, and society to drive progress in a broad array of fields. However, organizations have long faced complex commercial, legal, ethical, and reputational risks that accompany the activity and act as disincentives to sharing data for academic research.
This report contains eight case studies that look at specific corporate/academic data-sharing partnerships in depth, from initiation through the publication of research findings. These case studies illuminate practical challenges for implementing corporate data sharing with researchers. Some common themes that emerged from the case studies include:
- Successful data-sharing partnerships use Data-Sharing Agreements that require both the company and researchers to take steps to protect privacy.
- Some of the challenges of data sharing include technical knowledge and infrastructure gaps between companies and researchers, and the continuing need for ethics and privacy review for industry-based research.
- Promising practices for data sharing include the use of Privacy Enhancing Technologies and company-created, public-facing data-sharing menus to facilitate new partnerships.
- While data sharing has significant costs and inherent risks, the risks can be managed, and the benefits to researchers, companies, and society make data sharing worth the effort.
This report builds upon prior FPF research, including the publication of The Playbook: Data Sharing for Research and the companion infographic in 2022. The case studies examine how data sharing works in a practical environment. By analyzing the case studies as a group, we arrived at recommendations for all parties interested in pursuing an ethical data-sharing partnership that protects against privacy risks.
For companies considering data sharing for research, we recommend the following:
- Create a public webpage listing all data the company is willing to share, describe any requirements for potential data-sharing partnerships, and create a public form for researchers to ask questions.
- Bolster privacy by using Privacy Enhancing Technologies (PETs), reduce data sensitivity through data minimization and aggregation (see the sketch after this list), and include metadata as part of internal privacy reviews before sharing.
- Promote rigorous data governance by assigning multiple people with expertise to manage data sharing, connect core team members to the data-sharing team, and adapt Data Sharing Agreements to align with the company’s available budgetary and personnel support.
- Ensure researchers maintain authorial control over research methods, data analysis, interpretation, and publishing/communication venue. Where appropriate, companies may reserve the right to review data before publication to assess privacy risks and consult on the analytical limitations of the data.
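As a rough illustration of the minimization-and-aggregation recommendation above (a sketch under assumed column names and a hypothetical suppression threshold, not a prescription from the report), the snippet below drops direct identifiers and releases only counts for groups large enough that no individual stands out:

```python
import pandas as pd

DIRECT_IDENTIFIERS = ["name", "email", "device_id"]  # hypothetical columns
K = 5  # suppress any group smaller than this (a k-anonymity-style floor)

def minimize_for_sharing(df: pd.DataFrame, group_cols: list[str]) -> pd.DataFrame:
    """Drop direct identifiers, then release only aggregate counts for
    groups large enough that no individual stands out."""
    deidentified = df.drop(columns=DIRECT_IDENTIFIERS, errors="ignore")
    counts = deidentified.groupby(group_cols).size().reset_index(name="n")
    return counts[counts["n"] >= K]

# Example: share only coarse usage counts by region and age band.
raw = pd.DataFrame({
    "name": ["a", "b", "c", "d", "e", "f"],
    "email": ["a@x", "b@x", "c@x", "d@x", "e@x", "f@x"],
    "device_id": list("123456"),
    "region": ["north"] * 5 + ["south"],
    "age_band": ["18-25"] * 5 + ["26-35"],
})
print(minimize_for_sharing(raw, ["region", "age_band"]))  # the lone "south" row is suppressed
```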
For researchers interested in using data held by a company for research, we recommend the following:
- Proactively contact companies that may hold data of interest and maintain continuous communication, especially about publication expectations.
- Cultivate internal partnerships by involving the university general counsel early on and checking to see if the university has a standard Data Sharing Agreement. Contact the university’s Research Integrity Office and Information Technology Office before any data is shared, and consult the library for research support.
- Receive training on how to integrate Privacy Enhancing Technologies in research and include privacy-related technical infrastructure in all funding proposals.
- Coordinate with the company about any requirements for publishing, data sharing, data retention, and citation while maintaining academic independence.
You can access each of our individual case studies at these links:
- AIMS Collaboratory and External Partners
- Gravy Analytics and the University of Florida
- IBM and External Partners
- Johnson & Johnson and The YODA Project
- Khan Academy and External Partners
- LinkedIn and External Partners
- Meta and External Partners
- Microsoft and the United Nations
Download accessible versions of these documents here.
FPF offers the Ethics and Data in Research Working Group, which analyzes US legislation impacting research and data, discusses ethical and technological research challenges, and develops best practices for privacy protection, risk reduction, and data sharing in research. Learn more and join the Working Group here.
For inquiries about this report, please contact Shea Swauger, Senior Researcher for Data Sharing and Ethics, at sswauger@fpf.org.
This project is supported by the Alfred P. Sloan Foundation, a not-for-profit grantmaking institution whose mission is to enhance the welfare of all through the advancement of scientific knowledge.
Hearing of the President of the Italian Data Protection Authority (Garante per la protezione dei dati personali), Prof. Pasquale Stanzione, on the management of data from meters connected to electricity service accounts
"andrebbero definiti i limiti di ordine oggettivo e soggettivo dell’accesso al Portale, specificando, sotto il primo profilo, il novero delle “terze parti” abilitate a fruire della messa a disposizione dei dati di consumo dei clienti finali, nonché, sotto il secondo, le tipologie di dati “relativi all'immissione” e al “prelievo” di energia elettrica e al prelievo di gas naturale. Il Portale consumi contiene, infatti, una molteplicità di dati personali (es. dati anagrafici, POD e PDR, pratiche di switching, dati contrattuali, ecc.) la cui ampia disponibilità, da parte di soggetti diversi dall’interessato, potrebbe determinare implicazioni importanti sulla riservatezza (si pensi al fenomeno diffuso dell’attivazione fraudolenta di utenze nel mercato libero dell’energia). Andrebbero, inoltre, più dettagliatamente individuate le finalità dell’accesso, con una formulazione più circoscritta rispetto a quella relativa al confronto tra “offerte comparabili” o “all’erogazione di servizi da parte dei predetti soggetti terzi”, di per sé inadeguata a escludere scopi ulteriori (es. profilazione dei clienti, elaborazione di dati statistici, etc.) rispetto a quelli più strettamente connessi alla valutazione dell’impronta energetica del cliente finale o del confronto di offerte comparabili perseguite dalla normativa (v. in merito anche le direttive nn. 2019/944/UE e 2012/27/UE).
L’ampliamento delle possibilità di accesso ai dati di consumo degli utenti finali da parte di terzi imporrebbe, infatti, proprio per le sue possibili implicazioni, una valutazione più analitica dell’opportunità di ridefinire il ruolo del Portale consumi all’interno del Sistema informativo integrato, secondo un congruo bilanciamento tra l’interesse alla promozione dell’efficienza energetica e il diritto alla protezione dei dati personali."
The Digital Personal Data Protection Act of India, Explained
Authors: Raktima Roy, Gabriela Zanfir-Fortuna
Raktima Roy is a Privacy Attorney with several years of experience in India, holds an LLM in Law and Technology from Georgetown University, and is an FPF Global Privacy Intern.
The Digital Personal Data Protection Act of India (DPDP) sprinted through its final stages last week after several years of debates, postponements and negotiations, culminating with its publication in the Official Gazette on Friday, August 11, 2023. In just over a week, the Bill passed the lower and upper Houses of the Parliament and received Presidential assent. India, the most populous country in the world with more than 1.4 billion people, is the largest democracy and the 19th country among the G20 members to pass a comprehensive personal data protection law – which it did during its tenure holding the G20 Presidency.
The adoption of the DPDP Bill in the Parliament comes 6 years after Justice K.S. Puttaswamy v Union of India, a landmark case in which the Supreme Court of India recognized a fundamental right to privacy in India, including informational privacy, within the “right to life” provision of India’s Constitution. In this judgment, a nine-judge bench of the Supreme Court urged the Indian Government to put in place “a carefully structured regime” for the protection of personal data. As part of India’s ongoing efforts to create this regime, there have been several rounds of expert consultations and reports, and two previous versions of the bill were introduced in the Parliament in 2019 and 2022. A brief history of the law is available here.
The law as enacted is transformational. It has a broad scope of application, borrowing from the EU’s General Data Protection Regulation (GDPR) approach when defining “personal data” and extending coverage to all entities that process personal data regardless of size or private status. The law also has significant extraterritorial application. The DPDP creates far-reaching obligations, imposing narrowly defined lawful grounds for processing any personal data in a digital format, establishing purpose limitation obligations and their corollary – a duty to erase the data once the purpose is met, with seemingly no room left for secondary uses of personal data – and creates a set of rights for individuals whose personal data are collected and used, including rights to notice, access and erasure. The law also creates a supervisory authority, the Data Protection Board of India (Board), which has the power to investigate complaints and issue fines, but does not have the power to issue guidance or regulations.
At the same time, the law provides significant exceptions for the central government and other government bodies, the degree of exemption depending on their function (such as law enforcement). Other exemptions include those for most publicly available personal data, processing for research and statistical purposes, and processing the personal data of foreigners by companies in India pursuant to a contract with a foreign company (such as outsourcing companies). Some processing by startups may also be exempt, if notified by the government. The Act also empowers the central government to act upon a notification by the Board and request access to any information from an entity processing personal data, an intermediary (as defined by the Information Technology Act, 2000 – the “IT Act”) or from the Board, as well as to order suspension of access of the public to specific information. The Central Government is also empowered to adopt a multitude of “rules” (similar to regulations under US state privacy laws) that detail the application of the law.
It is important to note that the law will not come into effect until the government provides notice of an effective date. The DPDP Act does not contain a mandated transitional period akin to the two-year gap between the GDPR’s 2016 enactment and the date it became applicable in May 2018. Rather, it empowers the Government to determine the dates on which different sections of the Act will come into force, including the sections governing the formation of the new Board that will oversee compliance with the law.
This blog will lay out the most important aspects of the DPDP Act, understanding nonetheless that many of its key provisions will be shaped through subsequent rules issued by the central government, and through practice.
- The DPDP Act Applies to “Data Fiduciaries” and “Significant Data Fiduciaries,” and Provides Rights for “Data Principals”
The DPDP Act seeks to establish a comprehensive national framework for processing personal data, replacing a much more limited data protection framework under the IT Act and rules that currently provide basic protections to limited categories of “sensitive” personal data such as sexual orientation, health data, etc. The new law by contrast covers all “personal data” (defined as “any data about an individual who is identifiable by or in relation to such data”) and does not contain heightened protection for any special category of data. The definition of “personal data,” thus, relies on the broad “identifiability” criterion, similar to the GDPR. Only “digital” personal data, or personal data collected through non-digital means that have been digitized subsequently, are covered by the law.
The DPDP Act uses the term “data principal” to refer to the individual that the personal data relates to (the equivalent of “data subject” under the GDPR). A “data fiduciary” is the entity that determines the purposes and means of processing of personal data, alone or in conjunction with others, and is the equivalent to a “data controller” under GDPR. While the definition of data fiduciaries includes a reference to potential joint fiduciaries, the Act does not provide any other details about this relationship.
The definition of fiduciaries does not distinguish between private and public, natural and legal persons, technically extending to any person as long as the other conditions of the law are met.
Specific Fiduciaries, Public or Private, Are Exempted or May Be Exempted from the Core Obligations of the Act
The law includes some broad exceptions for government entities in general, and others apply to specific processing purposes. For instance, the law allows the government to exempt activities that are in the interests of the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, maintenance of public order, or preventing incitement to commit crimes, provided it gives notice of the exemptions. Justice Srikrishna, who, as head of an expert committee set up to recommend a data protection law for India, led the creation of the law’s first draft in 2017, has been critical of these government exemptions, as have been several Members of Parliament during the legislative debate.
Some targeted exceptions also apply to companies, and are either well defined in the law or left to the government for specification. Under what can be called an “outsourcing exception,” the Act exempts companies based in India that process the personal data of people outside of India, pursuant to a contract with a company based outside of India, from core DPDP obligations, including the rights of access and erasure normally held by data principals. Instead, such companies are largely required to only comply with data security obligations.
In addition, the government is empowered to exempt any category of data fiduciaries from some or all of the law, with the DPDP itself referring to “startups” in this context. These are fairly broad provisions and do not include any guidance on how they will apply or who could benefit from them. The government will need to make a specific designation for this exception to operate.
Significant Data Fiduciaries Have Significant New Obligations, such as DPOs, DPIAs and Audits
The DPDP Act empowers the Government to designate any data fiduciary or class of data fiduciaries as a “Significant Data Fiduciary” (SDF), which is done using a series of criteria that lack quantifiable thresholds. These factors range from assessing characteristics of the processing operations (volume and sensitivity of personal data processed and the risk posed to the rights of data principals), to broader societal and even national sovereignty concerns (potential impact of the processing on the sovereignty and integrity of India; risk to electoral democracy; security of the state; and public order).
The designation of companies as SDFs is consequential, because it comes with enhanced obligations. Chief among them, SDFs will need to appoint a Data Protection Officer (DPO), who must be based in India and be the point of contact for a required grievance redressal mechanism. SDFs must also appoint an independent data auditor to carry out data audits and evaluate the SDF’s compliance with the DPDP Act, and to undertake periodic Data Protection Impact Assessments.
It is important to note that appointing a DPO is not an obligation for all data fiduciaries. However, all fiduciaries are under an obligation to establish a “readily available” mechanism for redressing grievances by data principals in a timely manner. In order for such a process to be operationalized, usually an internal privacy compliance function or a dedicated privacy officer would be helpful.
The DPDP Act Recognizes the Role of Data Processors
Data processors are recognized by the DPDP Act, which makes it clear that fiduciaries may engage, appoint or otherwise involve processors to process personal data on their behalf “only under a valid contract” (Section 8(2)). There are no prescribed rules for what a processing contract should entail. However, the DPDP Act places all obligations on data fiduciaries, which remain liable for complying with the law.
Data fiduciaries remain liable for overall compliance, regardless of any contractual arrangement to the contrary with data processors. The DPDP Act requires data fiduciaries to mandate that a processor delete data when a data principal withdraws consent, and to be able to share information about the processors they have engaged when a data principal so requests.
- The DPDP Act Has Broad Extraterritorial Effect and Almost No Restrictions for International Data Transfers
The DPDP Act applies to the processing of “digital personal data” within India. Importantly, the definition of “data principal” does not include any condition related to residence or citizenship, meaning that fiduciaries based in India that process the personal data of foreigners within the country may conceivably be covered by the Act (outside of the “outsourcing exception” mentioned above).
The Act also applies extraterritorially to processing of digital personal data outside India, if such processing is in connection with any activity related to offering of goods or services to data principals within India. The extraterritorial effect is similar in scope to the GDPR, and it may leave room for a broader interpretation through its inclusion of “any activity” connected to the offering of goods or services.
The DPDP Act does not currently restrict the transfer of personal data outside of India. It reverses the typical paradigm of international data transfer provisions in laws like the GDPR, by presuming that transfers may occur without restrictions unless the Government specifically restricts transfers to certain countries (blacklisting) or enacts any other form of restriction (Section 16). No criteria for such restrictions are mentioned in the law. This is a significant departure from previous versions of the Bill, which at one point contained data localization obligations (2018), and at another point evolved into “whitelisting” of countries (2022).
It should also be noted that other existing sectoral laws (e.g., those governing specific industries like banking and telecommunications) already contain restrictions on cross-border transfers of particular kinds of data. The DPDP Act clarifies that existing localization mandates will not be affected by the new law.
- Consent Remains the Primary Means for Lawful Processing of Personal Data Under the Act
Data fiduciaries are under an obligation to process personal data for a lawful purpose and only if they either obtain consent from the data principal for that purpose or identify a “legitimate use” consistent with Section 4. This approach is conceptually similar to the GDPR’s, requiring a lawful ground before personal data can be collected or otherwise processed. However, in contrast to the GDPR (which provides for six possible lawful grounds), the DPDP Act includes only two: strictly defined “consent” and “legitimate use.”
Which lawful ground is used for a processing operation is consequential. Based on the wording of the Act and in the absence of further specification, the obligations of fiduciaries to give notice and respond to access, correction and erasure requests (see Section 4 of this blog) are only applicable if the processing is based on consent or on the voluntary sharing of personal data by the principal.
Valid Consent Has Strict Requirements, Is Withdrawable, And Can be Exercised Through Consent Managers
The DPDP Act requires that consent for processing of personal data be “free, specific, informed, unconditional and unambiguous with a clear affirmative action.” These conditions are as strict as those required under the GDPR, highlighting that the people whose personal data are processed must be free to give consent, and their consent must not be tied to other conditions.
In order to meet the “informed” criterion, the Act requires that notice be given to principals before or at the time that they are asked to give consent. The notice must include information about the personal data to be collected, the purpose for which it will be processed, the manner in which data principals may exercise their rights under the DPDP Act, and how to make a complaint to the Board. Data principals must be given the option to receive the information in English or a local language among the languages specified in the Constitution.
The DPDP Act addresses the issue of legacy data for which companies may have received consent prior to the enactment of the law. Fiduciaries should provide the same notice to these data principals as soon as “reasonably practicable.” In that case, however, the data processing may continue until the data principal withdraws consent.
Data fiduciaries may only process personal data for the specific purpose provided to the data principal and must obtain separate consent to process the data for a new purpose. In practice, this will make it difficult for data fiduciaries to rely on “bundled consent.” Provisions around “secondary uses” of personal data or “compatible purposes” are not addressed in the Act, making the purpose limitation requirements strict.
Data principals may also withdraw their consent at any time – and data fiduciaries must ensure that the process for withdrawing consent is as straightforward as that for giving consent. Once consent is withdrawn, personal data must be deleted unless a legal obligation to retain data applies. Additionally, data fiduciaries must ask any processors to cease processing any personal data for which consent has been withdrawn, in the absence of legal obligations imposing data retention.
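For a sense of what operationalizing these consent provisions might involve, here is a minimal sketch of a per-purpose consent record with a withdrawal hook. The data model and field names are hypothetical, as the Act itself prescribes no particular implementation:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    principal_id: str
    purpose: str                        # one record per specific purpose
    given_at: datetime
    withdrawn_at: datetime | None = None
    legal_retention_duty: bool = False  # e.g., tax or accounting rules

    def withdraw(self) -> None:
        """Withdrawing must be as straightforward as giving consent."""
        self.withdrawn_at = datetime.now(timezone.utc)

    def must_erase(self) -> bool:
        # After withdrawal, data must be deleted (and processors told to
        # stop processing) unless a legal obligation requires retention.
        return self.withdrawn_at is not None and not self.legal_retention_duty

record = ConsentRecord("principal-42", "order-delivery", datetime.now(timezone.utc))
record.withdraw()
print(record.must_erase())  # True -> trigger deletion and notify processors
```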
The DPDP Act allows principals to give, manage, review and withdraw their consent through a “Consent Manager,” which will be registered with the Board and must provide an accessible, transparent, and interoperable platform. Consent Managers are part of India’s “Data Empowerment And Protection Architecture” policy, and similar structures have been already functional for some time, such as in the financial sector. Under the DPDP Act, Consent Managers will be accountable to data principals and act on their behalf as per prescribed rules. The Government will notify (in the Gazette) the conditions necessary for a company to register as a Consent Manager, which may include fulfilling minimum technical or financial criteria.
“Legitimate Uses” Are Narrowly Defined and Do Not Include Legitimate Interests or Contractual Necessity
As an alternative to consent, all other lawful grounds for processing personal data have been amalgamated under the “legitimate uses” section, including some grounds of processing that previously appeared under a “reasonable purposes” category in earlier iterations of the bill. It is notable that the list of “legitimate uses” in Section 7 of the Act does not include provisions similar to the “contractual necessity” and “legitimate interests” grounds found in GDPR-style data protection laws, leaving private fiduciaries limited options for grounding processing of personal data outside of consent, including for routine or necessary processing operations.
Among the defined “legitimate uses”, the most relevant ones for processing personal data outside of a government, emergency or public health context, are the “voluntary sharing” of personal data under Section 7(a) and the “employment purposes” use under Section 7(i).
The lawful ground most likely to raise interpretation questions is “voluntary sharing.” It allows a fiduciary to process personal data for a specified purpose for which a principal has voluntarily provided their personal data to the data fiduciary (presumably, provided it without the fiduciary seeking to obtain consent), and for which the principal has not indicated to the fiduciary an objection to the use of the personal data. For instance, one of the illustrations included in the law to explain Section 7(a) is the hypothetical of a buyer requesting a receipt of purchase at a store be sent to her phone number, permitting the store to use the number for that purpose. There is a possibility that subsequent rules may expand this “legitimate use” to cover instances of “contractual necessity” or “legitimate interests.”
A fiduciary may also process personal data without consent for purposes of employment or those related to safeguarding the employer from loss or liability, such as prevention of corporate espionage, maintenance of confidentiality of trade secrets, intellectual property, classified information or provision of any service to employees.
- Data Principals Have a Limited Set of “Data Subject Rights,” But Also Obligations
The DPDP Act provides data principals a set of enumerated rights that is limited compared to those offered under modern GDPR-style data protection laws. The DPDP guarantees a right of access and a right to erasure and correction, in addition to a right to receive notice before consent is sought (similar to the right to information in the GDPR). Thus, a right to data portability, a right to object to processing based on grounds other than consent, and the right not to be subject to solely automated decision-making are missing.
Instead, the DPDP Act provides for two other rights – a right to “grievance redressal,” which entails the right to have an easily accessible point of contact provided by the fiduciary to respond to complaints from the principal, and a right to “appoint a nominee,” which permits the data principal to nominate someone who can exercise rights on their behalf in the event of death or incapacity.
Notably, the rights of access, erasure and correction are limited to personal data processing based on consent or the “voluntary disclosure” legitimate use, which means that whenever government bodies or other fiduciaries rely on any of the other “legitimate uses” grounds, they will not need to reply to access or erasure/correction requests, unless further rules adopted by the government specify otherwise.
In addition, the right of access is quite limited in scope. It only gives data principals the right to request and obtain a summary of the personal data being processed and of the relevant processing activities (as opposed to obtaining a copy of the personal data), and the identities of all fiduciaries and processors with whom the personal data has been shared by the fiduciary, along with a summary of the data being shared. However, Section 11 of the law leaves space for subsequent rules that may specify additional information to be given access to.
Data principals have the right to request erasure of personal data pursuant to Section 12(3), but it is important to highlight that erasure may also be required automatically – after the withdrawal of consent or when the specified purpose is no longer being served (Section 8(7)(a)). Similarly, correction, completion and updating of personal data can be requested by the principal, but must also occur automatically when the personal data is “likely to be used to make a decision that affects” the principal (Section 8(3)).
Data Principals May Be Fined if They Do Not Comply With Their Obligations
Unlike the majority of international data protection laws, Section 15 of the DPDP Act imposes duties on data principals, similar to Article 10 of Vietnam’s recently adopted Personal Data Protection Decree (titled “Obligations of data subjects”).
These obligations include, among others, a duty not to impersonate someone else while providing personal data for a specified purpose, not to suppress any material information while providing personal data for any document issued by the Government, and, significantly, not to register a false or frivolous grievance or complaint. Noncompliance may result in a fine (see clause 5 of the Schedule). This may hamper the submission of complaints with the Board, per expert analysis.
- Fiduciaries are Bound by a Principle of Accountability and Have Data Breach Notification Obligations
The DPDP Act does not articulate Principles of Processing, or Fair Information Practice Principles, but the content of several of its provisions put emphasis on purpose limitation (as explained in previous sections of the blog) and on the principle of accountability.
Section 8 of the Act includes multiple obligations for data fiduciaries, all under an umbrella expectation in paragraph 1 that they are “responsible for complying” with the provisions of the Act and any subsequent implementation rules, both regarding processing undertaken by the data fiduciary and by any processor on its behalf. This specification echoes the GDPR accountability principle. In addition, data fiduciaries are under an obligation to implement appropriate technical and organizational measures to ensure the effective implementation of the law.
Data security is of particular importance, considering that data fiduciaries must both adopt reasonable security safeguards to prevent personal data breaches and notify the Board and each affected party if such breaches occur. The details related to modalities and timeline of notification will be specified in subsequent implementation rules.
A final obligation of data fiduciaries to highlight is the requirement they establish a “readily available” mechanism for redressing “grievances” by data principals in a timely manner. The “grievance redress” mechanism is of utmost importance, considering that data principals cannot address the Board with a complaint until they “exhaust the opportunity of redressing” the grievance through this mechanism (Section 13(3)). The Act leaves determination of the time period for responding to grievances to delegated legislation, and it is possible that there may be different time periods for different categories of companies.
- Fiduciaries Have a Mandate to Verify Parental Consent for Processing Personal Data of Minors under 18
The DPDP Act creates significant obligations concerning the processing of children’s personal data, with “children” defined as minors under 18 years of age, without any distinguishing sub-category for older children or teenagers. As a matter of principle, data fiduciaries are forbidden to engage in any processing of children’s data that is “likely to cause any detrimental effect on the well-being of the child.”
Data fiduciaries are under an obligation to obtain verifiable parental consent before processing the personal data of any child. Similarly, consent must be obtained from a lawful guardian before processing the data of a person with disability. This obligation, which is increasingly common to privacy and data protection laws around the world, may create many challenges in practice. A good resource for untangling its complexity and applicability is FPF’s recently published report and accompanying infographic – “The State of Play: Is Verifiable Parental Consent Fit For Purpose?”
Finally, the Act also includes a prohibition on data fiduciaries engaging in tracking or behavioral monitoring of children, or targeted advertising directed at children. Similar to many other provisions of the Act, the government may issue exemptions from these obligations for specific classes of fiduciaries, or may even lower the age of digital consent for children when their personal data is processed by designated data fiduciaries.
- The Act Creates a Data Protection Board to Enforce the Law, But Reserves Regulatory Powers For the Government
The DPDP Act empowers the Government to establish the Board as an independent agency that will be responsible for enforcing the new law. The Board will be led by a Chairperson and will have Members appointed by the Government for a renewable two-year mandate.
The Board is vested with the power to receive and investigate complaints from data principals, but only after the principal has exhausted the internal grievance redress mechanism set up by the relevant data fiduciaries. The Board can issue binding orders against those who breach the law, direct urgent measures to remediate or mitigate a data breach, impose financial penalties, and direct parties to mediation.
While the Board is granted “the same powers as are vested in a civil court” – including summoning any person, receiving evidence, and inspecting any documents (Section 28(7)), the Act specifically excludes any access to civil courts in the application of its provisions (Section 39), creating a de facto limitation on effective judicial remedy similar to the relief provided in Article 82 GDPR. The Act grants any person affected by a decision of the Board the right to pursue an appeal in front of an Appellate Tribunal, which is designated the Telecom Disputes Settlement and Appellate Tribunal established under other Indian law.
Penalties for breaches of the law have been stipulated in a Schedule attached to DPDP Act and range from the equivalent in rupees of USD $120 to USD $30.2 million. The Board can determine the penalty amount from a preset range based on the offense.
However, the Board does not have the power to pass regulations to further specify details related to the implementation of the Act. The Government is conferred broad discretion in adopting delegated legislation to further specify the provisions of the Act, including clarifying modalities and timelines for fiduciaries to respond to requests from data principals, the requirements of valid notice for obtaining a data principal’s consent for processing of data, details related to data breach notifications, and more. The list of operational details that may be specified by the Government in subsequent rules is open-ended and detailed in Section 40(2)(a) to (z). Subsection (z) of this provision provides a catch-all permitting the Central Government to prescribe rules on “any other matter” related to the implementation of the Act.
In practice, it is expected that it will take time for the new Board to be established and for rules to be issued in key areas for compliance.
Besides rulemaking power, the Central Government has another significant role in the application of the law. Pursuant to Section 36, it can require (or “call for”) any information it wants, presumably including personal data, from the Board, data fiduciaries, and “intermediaries” as defined by the IT Act. No further specifications are made in relation to such requests, other than that they must be made “for the purposes of the Act.” This provision is broader and subject to fewer restrictions than provisions on data access requests in the existing IT Act and its subsidiary rules.
Additionally, the Central Government may order any governmental agency or “intermediary” to block information from public access “in the interests of the general public.” Before such an order can issue, the Board must have sanctioned the data fiduciary concerned at least twice in the past, and the Board must advise the Central Government to issue the order. A blocking order may cover “any computer resource” that enables data fiduciaries to offer goods or services to data principals within the territory of India. While it is now common among modern comprehensive data protection laws for independent supervisory authorities to order the erasure of unlawfully processed personal data, or to order international data transfers or sharing to cease when the law's conditions are not met, these provisions of the DPDP Act are atypical both because the orders come directly from the Government and because they more closely resemble online platform regulation than privacy law.
- Exceptions for Publicly Available Data And Processing for Research Purposes Are Notable for Training AI
Given that this law comes in the midst of a global conversation about how to regulate artificial intelligence and automated decision-making, it is critical to highlight provisions in the law that seem directed at facilitating development of AI trained on personal data. Specifically, the Act excludes from its application most publicly available personal data, as long as it was made publicly available by the data principal – for example, a blogger or a social media user publishing their personal data directly – or by someone else under a legal obligation to publish the data, such as personal data of company shareholders that regulated companies must publicly disclose by law.
Additionally, the Act exempts the processing of personal data necessary for research or statistical purposes (Section 17(2)(b)). This exemption is extremely broad, with only one limitation in the core text: the Act will still apply to research and statistical processing if the processing activity is used to make “any decision specific to the data principal.”
There is only one other place in the DPDP Act where processing data to “make decisions” about a data principal appears. Data fiduciaries are under an obligation to ensure the “completeness, accuracy and consistency” of personal data used to make a decision that affects the data principal. In other words, while the Act does not provide a GDPR-style right not to be subject to automated decision-making, it does require that when personal data are used to make individual decisions, presumably including automated or algorithmic decisions, the data be kept accurate, consistent, and complete.
Additionally, the DPDP Act remains applicable to any processing of personal data through AI systems, if the other conditions of the law are met, given the broad definitions of “processing” and of “personal data.” Further rules adopted by the Central Government or other notifications may provide more guidance in this regard.
Notably, the Act does not exempt processing of personal data for journalistic purposes, a fact criticized by the Editors’ Guild of India. In previous versions of the Bill, especially the expert version spearheaded by Justice Srikrishna in 2017, this exemption was present. It is still possible that the Central Government will address this issue through delegated legislation.
Key Takeaways and Further Clarification
India’s data protection Act has been in the works for a long time, and its passage is a welcome step forward following the Supreme Court's recognition of privacy as a fundamental right in India in its landmark Puttaswamy judgment.
While the basic structure of the law resembles many other global laws like the GDPR and its contemporaries, India's approach has its differences: more limited grounds for processing, wide exemptions for government actors, regulatory powers for the government to further specify the law and to exempt specific fiduciaries or classes of fiduciaries from key obligations, no built-in definition of or heightened protection for special categories of data, and the rather unusual powers for the Government to request access to information from fiduciaries, the Board, and “intermediaries”, as well as to block access by the public to specific information in “computer resources”.
Finally, we note that many details of the Act are still left to be clarified once the new Data Protection Board of India is set up and further rules for the specification of the law are drafted and officially notified.
Editors: Lee Matheson, Dominic Paulger, Josh Lee Kok Thong
On the 2% for Defense, Italy Must Show Seriousness. Pinotti Speaks
In a recent speech, the president of Copasir and former Defense Minister Lorenzo Guerini said he was worried by what he perceived as a potential "retreat" by the Partito Democratico on Italy's commitment to reach the target of 2% of GDP for defense spending. For the Democratic deputy, the need to guarantee security in the Euro-Atlantic space, directly threatened by a war on Europe's borders, and the need to demonstrate the country's credibility on the international stage demand a gesture of responsibility from all political forces. Airpress discussed this with Roberta Pinotti, former Defense Minister and former chair of the Defense Committee, first in the Chamber of Deputies and then in the Senate.
Do you share Guerini's concerns?
Being part of an Alliance entails the need, for the sake of reputation and seriousness, to honor the commitments made together. The final declaration of the NATO summit, in which the leaders of the allied countries pledged, after years of sometimes drastic cuts to defense spending, to aim for the guideline of 2% of GDP by increasing it progressively, was signed in Cardiff in July 2014, a few months after Russia's occupation of Crimea. The Partito Democratico, like the parties whose histories merged to form it, has always based its foreign policy on a sense of responsibility and a firm adherence to international alliances. I believe the PD should not deviate from this line of seriousness and international reliability.
During the last legislature, the PD was on the front line in building a shared vision in Parliament that set the target for 2028. Doesn't this change of pace by the party leadership risk looking like a contradiction on the Party's part?
I remember that discussion well, and I believe it was important to build a shared vision in Parliament, a vision to which the PD made a significant contribution, not only through the Defense Minister, Lorenzo Guerini, but also thanks to the work of its parliamentary groups. A serious road map was devised, leading to a progressive increase up to 2% by 2028, a timeline chosen by weighing the country's actual spending capacity and industrial growth. Germany at the time opted for a much faster leap in resources: contrary to what is being said in many quarters, it is not lowering the bar of the target; it has rescheduled it precisely in order to respect its actual spending capacity.
Could a delay, moreover, make the country the "tail light" of Europe, with consequent damage to Italy's international credibility as well?
To build European defense, a cause to which I devoted myself with determination, going so far as to promote the first enhanced cooperation, signed in 2017, resources are necessary; if anything, the commitments of national states on security and defense will have to become even more credible. In the long run a common Defense will allow rationalizations that avoid wasted resources and useless duplication, but as of today the European states, the vast majority of which are NATO members, demand the same commitment and seriousness from all partners. All the more so since there is still a war in the heart of Europe.
Both defense institutions and industry insiders agree in warning of the risks to Italian, European, and transatlantic security posed by a delay in bringing spending up to the 2% level. A commitment that, we recall, was made at NATO in 2014, when you were Defense Minister, and that has been confirmed by every Italian government since. What is your position on this point?
I have seen first-hand how much our country's good reputation abroad is tied to the seriousness and undoubted skill with which our Armed Forces have operated, and still operate, in crisis scenarios. In 2014 we were very far from 2% of spending, but since then I have worked, even when the general conditions of public finances did not always allow it, to ensure that the commitments were not written in sand. Pacta sunt servanda.
17th edition of the "Juvenes Translatores" contest, to promote language learning in schools and give young people a taste of the translator's profession.
Ministero dell'Istruzione
Piers Paul Read – Tabù
The article Piers Paul Read – Tabù originally appeared on Fondazione Luigi Einaudi.
Conoscenza e processo sociale – Lorenzo Infantino
Friedrich A. von Hayek is known above all for his writings on economics and political philosophy. But he also left an extensive legacy in the fields of theoretical psychology and the theory of knowledge. Indeed, one can say that it is precisely what he argued in that domain that provides the tools for grasping the meaning of his entire oeuvre. Raised in what was still Greater Vienna, Hayek took up the study of economics equipped with a vast cultural endowment, whose presence is clearly perceptible even in his early writings on economic theory. This progressively pushed him to grapple with questions that, in explaining individual and collective life, precede economic and social problems and give them a more adequate identification. The reader will see that, brought together here for the first time, the writings collected in this volume trace an itinerary that runs from the transformation of the brain into a human mind to why the sensory world is not the starting point; from the existence of a pre-sensory order to the observation that what we are conscious of is a secondary phenomenon; from science as a hypothetico-deductive system to the degrees of our explanations and to complex phenomena; from the dispersion of knowledge within society to the social process as an exploration of the unknown; from the presumption of omniscience to the "abuses of reason." It is a journey that casts a powerful light on the breadth of Hayek's work and on its fruitfulness. It is not surprising, then, that Hayek brought the methodological teaching of Carl Menger, the founder of the Austrian School of economics, to a higher level of theoretical elaboration. More precisely, he showed how that teaching can be regarded as one province of a much vaster continent containing, to recall only the main ones, the contributions of Bernard de Mandeville, David Hume, and Adam Smith. These are authors united by the same gnoseological premise, that is, by the recognition of the condition of ignorance and fallibility to which all human beings are unfailingly subject: for there is nothing that can make us omniscient, and no precaution that can shield us from error.
The article Conoscenza e processo sociale – Lorenzo Infantino originally appeared on Fondazione Luigi Einaudi.
Palestinians in Israel: the State leaves us in the hands of organized crime
by Michele Giorgio*
(photo by Ibrahim Husseini/The New Arab)
Pagine Esteri, September 5, 2023 – On a hot, humid Sunday in early August, a crowd gathered in Habima Square in Tel Aviv for a demonstration organized for reasons quite different from those tied to the protests against the judicial overhaul that have made this square symbolic. The majority were Palestinian citizens of Israel, but there were Jews as well. Young people and adults, men and many women, not a few of them veiled. They carried signs in Arabic and Hebrew with slogans like "We pay taxes, not protection money." Then the "March of the Dead" set off, with dozens of young people walking alongside white coffins. It was the most significant protest organized by Arab civil society against the Netanyahu government's inaction in the face of criminal violence that now claims victims nearly every day in Arab towns and villages in Israel. "I want the police to enforce the law for everyone," added Naja Nasrallah, an activist. "I want them to dismantle the criminal organizations in the Arab community. We do not want to be second-class citizens."
Very few Jewish political leaders took part. Labor party leader Merav Michaeli was spotted, along with former lawmakers Michael Melchior, Yair Golan, and Mossi Raz. "The authorities' indifference is a disgrace. The State must eradicate these phenomena; it cannot stand and watch," Daniel Spitz, 72, told reporters.
The Netanyahu government and the commands of the police and intelligence services do not see it the same way. Since that sultry Sunday evening nothing has changed except the number of people murdered, 160 in eight months, at the hands of organized crime or in feuds between families. Just a few days ago there was a massacre in the majority-Druze town of Abu Snan, on the outskirts of Acre: four people were shot dead, among them Ghazi Saab, a local politician. "The State and law enforcement cannot turn a blind eye to the rampant criminal terror. It is the exclusive responsibility of the government and the security forces," protested the Druze spiritual leader Sheikh Muwaffaq Tarif.
Opposition leader Yair Lapid, a centrist, laid the blame on minister Ben Gvir and the far-right government. "But these killings, the crime in Arab towns, are not a phenomenon of these past months. They were there when Lapid governed and when other prime ministers were in power," comments Fares, a teacher from Haifa. "Jews," he continues, "ask us why we do not take part in the demonstrations for democracy, but democracy in this country is Jewish, and therefore for them. We are second-class citizens; where we live, the police of the Jewish state do not send their officers to fight the criminals."
To date, the participation of Arab citizens in the rallies against the judicial overhaul, which paralyze the center of Tel Aviv and other cities with hundreds of thousands of people, has been minimal. "For Arab citizens," explains analyst Nadim Nashef, president of the association Amli, "the clash over the judiciary is a struggle for hegemony between secular Jews, often Ashkenazi, who look back to the Israel of the first 40 years, and the others, of various origins, who are in government today. The Palestinians of Israel want democracy, but not a Jewish democracy. They ask that Israel no longer be the state of the Jews but the state of all its citizens. They are not passionate about the clash underway, because it aims either to preserve or to dismantle a status quo that does not suit them either way." Pagine Esteri
*This article was originally published by the daily Il Manifesto
The article "Palestinians in Israel: the State leaves us in the hands of organized crime" originally appeared on Pagine Esteri.
In China and Asia – For the first time, Xi will skip the G20
For the first time, Xi will skip the G20
China: a new government office for the private sector
Japan-Turkey cooperation on the reconstruction of Ukraine
Huawei's founder urges the tech giant to cultivate talent
Chinese "gate-crashers" worry the US over espionage risks
200,000 rally in Seoul to demand greater protections for teachers
The article "In China and Asia – For the first time, Xi will skip the G20" originally appeared on China Files.
Ventotene: national seminar on federalist training
The forty-second edition of the National Seminar on Federalist Training will take place on the island of Ventotene from September 3 to 8, promoted by the Istituto di Studi Federalisti "Altiero Spinelli".
Founded in 1982 at the suggestion of Altiero Spinelli, who on that island wrote the "Ventotene Manifesto" together with Ernesto Rossi, the Seminar has become one of the most important occasions for reflection on the future of Europe and the world, attended over the years by prominent European political and cultural figures.
Each year around 150 young Europeans take part, through 60 hours of training and debate led by some 30 speakers.
Within this framework, with an opening debate focused on the European Parliament's capacity, at this moment in history, to give the EU integration process a federal turn, two important anniversaries tied to the founding of the most significant organizations working for European federal unity will also be commemorated: the 75th anniversary of the European Movement and the 80th anniversary of the European Federalist Movement.
On the occasion of the training Seminar, the Municipality of Ventotene, in collaboration with the Istituto Altiero Spinelli, is also offering a number of events from September 3 to 6: PROGRAMMA.
The Fondazione Luigi Einaudi will be represented by its Project Manager, Avv. Gian Marco Bovenzi.
The article "Ventotene: national seminar on federalist training" originally appeared on Fondazione Luigi Einaudi.
"Strengthening relations." Tajani's balancing act in Beijing
For Tajani, the strategic partnership is now "more important than the Silk Road." The Foreign Minister also invited China to "use its influence" to promote a "just peace" in Ukraine. Africa is also said to have been discussed.
The article "'Strengthening relations.' Tajani's balancing act in Beijing" originally appeared on China Files.
Speech by our comrade Vincenzo Colaprice at the debate "Struggles and alternatives for a Europe of peace, progress, cooperation" at the Festival do Avante
Vincenzo Colaprice* Dear comrades, I would like to begin this speech of mine by confessing that it is an immense honor and pleasure for me to be able to speak in… Rifondazione Comunista
LGBTQ+ rights have ALWAYS been tied to privacy, while violations of privacy have often been used to oppress LGBTQ+ people by criminalizing them on the basis of their behavior
One of the observations made to us as far back as the first Privacy Pride, on November 13, 2021, concerns the name "Pride" itself.
The name is not meant only to evoke the principle underlying this initiative, namely the proud assertion of privacy, a human right that by its nature is hardest to claim precisely for those who need it most; it is also a tribute to the struggles of the LGBTQ+ community, which understood that the courageous choice to occupy public spaces in order to assert its existence in society was a fundamental step toward giving public standing to the claim to its rights.
But the name Privacy Pride also reminds us that LGBTQ+ rights have always been tied to privacy, and that the violation of privacy has often been used to oppress LGBTQ+ people by criminalizing them on the basis of their behavior.
Two years ago, FPF and LGBT Tech reviewed three of the most significant privacy violations to have affected the LGBTQ+ community in modern US history:
1. Anti-sodomy laws and sexual privacy
2. The "Lavender Scare" that began in the 1950s and its impact on employment protections
3. The HIV/AIDS epidemic and the importance of protecting personal data.
These examples, together with many others, will be analyzed in the FPF and LGBT Tech white paper "New Decade, New Priorities: A summary of twelve European Data Protection Authorities’ strategic and operational plans for 2020 and beyond".
The lessons learned from the past about #privacy and #LGBTQ+ history can and should continue to shape today's conversations. For example, in the COVID era, we can apply the lessons of the HIV/AIDS epidemic to examine questions around the medical disclosures required for COVID-19. As we contemplate issues ranging from the deployment of digital contact tracing to mandatory medical disclosures for people who test positive for COVID-19, we must understand that the collection of medical data, at least for the LGBTQ+ community, is an issue deeply rooted in history, steeped in stigma, and marked by the absence of legal protection.
Today, connected devices and services allow members of the LGBTQ+ community to participate more fully in online life. Data about an individual's sexual orientation, gender identity, or the details of their sex life can be important for the provision of social and health services, for public health, and for medical research. Yet data concerning a person's gender identity, sexual orientation, and sex life can be extraordinarily sensitive, and the collection, use, and sharing of such data can raise unique privacy risks and challenges. Debates about LGBTQ+ data privacy must take the harms of the past into account.
See also:
1. Gender Identity, Personal Data and Social Networks: An analysis of the categorization of sensitive data from a queer critique
2. Data collection in relation to LGBTI People
XMPP + SNIKKET: a guide in Italian to self-hosting an XMPP server
Here is a short guide on a #misskey page (yes, Misskey lets you create pages...) created by Lorenzo Sintoni
Agreed; in theory the link is visible even without a fediverse account. It goes straight to the page; I tried refreshing. Let us know.
Thanks for your interest!
One coup after another: Françafrique falls to pieces
by Marco Santopadre*
The article "One coup after another: Françafrique falls to pieces" originally appeared on Pagine Esteri.
The new issue of the newsletter of the Ministero dell’Istruzione e del Merito is now available.
Ministero dell'Istruzione
Tajani in Beijing to head off retaliation after the exit from the Silk Road
ITALY/CHINA. The Foreign Minister will meet his Chinese counterpart and the official responsible for trade
The article "Tajani in Beijing to head off retaliation after the exit from the Silk Road" originally appeared on China Files.
#laFLEalMassimo – Episode 100: Prometheus, the Destroyer of Worlds
While the bombs of the Russian invader continue to fall on the heroically resisting Ukrainian people, the cinema screens of the West flash with the nuclear explosions of Christopher Nolan's new film devoted to the physicist Oppenheimer and his controversial figure, torn between his contribution to building the first atomic bomb and his resistance to the subsequent developments that led America to marginalize him.
There is a logical, semantic, and symbolic thread linking reality and fiction, not unlike the intellectual fallacy by which partisan intellectuals and distorted media too often hypocritically conceal their support for Putin's criminal regime.
Like Prometheus's fire, nuclear weapons technology has forever changed how international politics and its balances of power can be understood. Russia can afford to invade a free neighboring state with impunity, without the rest of the world being able to intervene directly, because it possesses nuclear weapons.
Nolan's film, rich in philosophical insights and historical references, shows us the starting point of the civic dilemma that characterizes the war underway on Europe's borders: there are weapons no one should have, because in the wrong hands they can bring about the end of the human race. But Pandora's box has already been opened; the chain reaction is underway, and those weapons are already in the hands of a genocidal criminal.
There is therefore no alternative to the West's full support for the resistance of the Ukrainian people, and on the victory of this glorious people depends the only possibility of peace that exists for the whole world, trusting that, as Sting sang back then, "the Russians too have their children's future at heart".
youtube.com/embed/2XqJY-54x3o?…
The article "#laFLEalMassimo – Episode 100: Prometheus, the Destroyer of Worlds" originally appeared on Fondazione Luigi Einaudi.
Apple's decision to drop its CSAM photo-scanning tool sparks new controversy (and new reflections on the dreadful chat control proposal)
Apple has responded to the Heat Initiative, laying out the reasons why it abandoned development of its iCloud CSAM-scanning feature and focused instead on a set of on-device tools and resources for users, known collectively as communication safety features. The company's response to the Heat Initiative, which Apple shared with WIRED this morning, offers a rare look not only at its rationale for focusing on communication safety, but also at its broader views on building mechanisms that circumvent user privacy protections, such as encryption, in order to monitor data. This stance is relevant to the encryption debate more broadly, especially as countries like the United Kingdom weigh laws that would require tech companies to be able to access user data to comply with law-enforcement requests.
"Il materiale pedopornografico è ripugnante e ci impegniamo a spezzare la catena di coercizione e influenza che rende i bambini suscettibili ad esso",
wrote Erik Neuenschwander, Apple's director of user privacy and child safety, in the company's response to the Heat Initiative. He added, however, that after working with a range of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with developing a CSAM-scanning mechanism, not even one built specifically to preserve #privacy.
The next Privacy Pride will be held on September 23, 2023. Here's why...
🏖 Many of you are still on vacation, but as you well know, the standard-bearers of surveillance are always at work, so we believe it is time to launch a new #PrivacyPride for September 23!
🇪🇺 As some of you may already know, the governments of the EU Member States are planning to adopt their official position, known as the "general approach", on the child sexual abuse regulation (ChatControl to its friends 😅) at the meeting of justice and home affairs ministers on September 28, 2023.
💪🏼 We need to mobilize quickly in order to raise the attention of public opinion, so far very indifferent, to this epochal change in the rule of law and in the inviolability of correspondence.
🚸 One of the hobby-horses of privacy's enemies is the protection of children, the very children whom states abandon into the hands of the big centralized platforms now universally adopted in schools.
But as we all know, there is children's privacy and then there is children's privacy... 😁
🏙 The goal will therefore be to organize many rallies in Italian cities and, if possible, in some European cities as well.
❤ The theme of this Privacy Pride will be: "children and privacy: ignored when their privacy needs protecting, but exploited when it comes to diminishing that of all citizens".
🕒 As soon as we are able, we will make available all the tools needed for the organization!
@Emanuele If a recipe were enough to solve problems, it would suffice to replace legislators and bureaucrats with chefs and waiters, respectively...
And maybe that would not even be a bad idea: at least chefs and waiters know that children are not customers to be served the adults' leftovers; on the contrary, they are special customers with more delicate stomachs, who need greater care and much more respect
Do not forget, and speak of it
Only take heed to yourself, and be careful not to forget the things your eyes have seen, nor let them slip from your heart as long as you live. Rather, make them known to your children and to your children's children. Deuteronomy 4:9
The imperative of this verse of Deuteronomy is not so much a call to faithfulness as a call not to forget. If the great works the Lord has done are forgotten, His Word soon loses its value. And if no one tells of the great works the Lord has accomplished, how can one remain with the Lord?
This "do not forget" is a constant of Israel's faith, starting precisely with Deuteronomy: recognizing oneself as part of a people, across many ages, whom the Lord has chosen and with whom He has made a covenant.
As Christians we too speak of a covenant, of a new covenant in Jesus Christ. It is like an extension of the covenant of Sinai, but with people and nations from all the earth. Often, however, we lack this idea of a people, of being part of the people of God, and even this idea of a covenant.
On the one hand, because faith is of course something personal, individual. And faith cannot be taught; at most it can be witnessed to. On the other hand, this individualism leads us to think that everyone must ultimately fend for themselves. But not forgetting matters first of all for us, and then for the others to whom we will speak of it, and thus for our children, our children's children, and the generations to come...
Pastor D'Archino – Do not forget, and speak of it
Deuteronomy gets its name in our Bibles from the ancient Greek translation, which identified it as the "second law." Indeed, the laws and prescriptions found in Exo… Pastor D'Archino
Why the Digital Services Act is a risk to freedom of speech on the internet | L'Indipendente
"A part of public opinion sees the law as a way of imposing a kind of disguised censorship aimed at preventing the expression of theses and opinions that diverge from the 'dominant' ones. The power to oversee the correctness of information and content, determining, therefore, what is true and what is false, has been entrusted first and foremost to a political body: the European Commission and, specifically, the European Board for Digital Services, which will keep a close watch on companies and content. An architecture of control that has led several political and media figures to speak of a threat to democracy."
Meta is finally launching a much more powerful web app for Threads
@Informatica (Italy e non Italy 😁)
You will be able to post, interact with other posts, and view your feed, spokesperson Christine Pa tells The Verge
Starting Thursday, the web version is online for everyone, Instagram head Adam Mosseri said in a post
Florence: an attack on civilization launched at the Vasari Corridor
On the night between August 22 and 23, someone took black paint and scrawled a great deal of football-themed graffiti on the pillars of the Vasari Corridor.
Ah, nothing doing. Further than this they will not go; it all has to come down.
In the hours that followed, the gazettes gazetted away, the well-dressed well-dressed away, and a certain number of adherents of representative democratism adhered and democratized about arrests and exemplary punishments, following a script as familiar as it is irritating. For those who treat gazettes, the well-dressed, and the rest with the icy aloofness of statisticians and entomologists, what stands out is the statement of one of the best-paid individuals in the field, a certain Eike Schmidt, who prattled on about an attack on civilization.
The same gazettes the well-dressed are so fond of report that on the single day of November 14, 1978, 6 (six) explosive devices were planted in Florence.
Quite another matter than graffiti and football.
Civilization stayed right where it was.
BRICS, the most anticipated summit opens in South Africa: the goal is to change the global balance of power | L'Indipendente
"The fifteenth BRICS summit kicks off tomorrow in Johannesburg, South Africa, running until August 24. It is hotly anticipated as one of the most important since the founding of the group, whose main objective is to lay the groundwork for a new, more equitable multipolar international order in place of the current 'unipolar' one, capable of countering and challenging Western hegemony."
The impressive expansion of the mafia in Veneto revealed by court documents | L'Indipendente
"Unspeakable entanglements between business and organized crime, exponential growth of the shadow economy, but also mafia-style intimidation of journalists and trade unionists at gunpoint, with businessmen in suits and ties pulling the strings: all this is in the new maxi-investigation into the 'ndrangheta by the Venice District Anti-Mafia Directorate, which has trained its magnifying glass on the dangerous pairing of mafia and white-collar crime in a part of the boot that, at least according to the authorities, in recent decades seemed to have defended itself rather well against the 'colonization' carried out by organized crime in northern Italy."
Culture and social engagement close out the summer in the Pigna district. Scambi Festival returns: workshops and performances from August 24 to 27 in the historic center
Among the #ScambiFestival events, we highlight the workshop "Direzione FediVerso!" in collaboration with the @Etica Digitale collective and #Slimp (Software Libero #Imperia): mapping expeditions of the neighborhood on #OpenStreetMap will be launched, along with expeditions to discover the #Fediverso, the universe of alternative social networks.
cc @Tommi @Scambi Festival
Pay and you skip the line at the emergency room: neoliberal healthcare arrives in Bergamo | L'Indipendente
"Quello dei pronto soccorsi a pagamento rappresenta uno dei risultati più evidenti del processo di smantellamento del Sistema sanitario nazionale e si può considerare l’anticipazione della sanità del futuro nel suo complesso se non ci sarà un’inversione di tendenza in quest’ambito. Si tratta della vittoria del neoliberismo e del business sullo stato sociale e sulla cura e i servizi ai cittadini."