Naked in the Garden: Privacy and the Next Generation Digital Learning Environment


Key Takeaways

  • By defining NGDLE Privacy Interoperability Standards, we elevate our stewardship of learning data to more fully support the next generation of students as they cultivate their digital learning environments.

  • Whether an institution's NGDLE looks more like a meticulously designed garden, a patchwork of community plots, or a field of native flowers and grasses, designing privacy requirements into interoperability standards helps ensure alignment with institutional and personal privacy values.

  • A crosswalk between the various elements of multiple privacy frameworks categorizes related concepts from these distinct schemes into five broad themes that offer the basis for an NGDLE Privacy Interoperability Standard.

[Photo: UC Berkeley campus montage]

It's lunchtime, and the campus is bustling with activity. To reach a noon-hour meeting about using social media for instruction, I skirt a eucalyptus grove, a stand of redwoods along a creek, and a wide variety of blossoming trees, drought-tolerant landscapes, and manicured lawns between modern glass, historic brick, and classic stone buildings. The variety of settings on our physical campus — reflecting and stimulating the community's diversity of inspirations and collaborations — is mirrored in our virtual learning environment: a dynamic and complex amalgamation of rapidly evolving educational technology and consumer applications.

The meeting was prompted by recognition that faculty frequently reach for tools beyond those offered in the campus-licensed learning management system (LMS) in order to engage students with their favorite consumer apps. In response, an academic senate committee seeks to publish guidance for evaluating use of social media assignments in course proposals. Our small group offers considerations on issues ranging from device availability and support for assistive technology, to privacy, long-term data control, ownership, and curation.

In my next meeting across campus, I spend an hour and a half in conversation with a data analyst about how privacy and online monitoring issues relate to institutional research. We discuss the Institutional Review Board (IRB) and Family Educational Rights and Privacy Act (FERPA) regulatory obligations associated with research involving student data1 and how studies conducted on behalf of the institution fall into a special niche between those protections. We contemplate the many disparate data points we collect on students and envision the benefits of correlating and analyzing more information to evaluate, quantify, and improve their experience. We also ruminate over the impact of pervasive monitoring in a creative learning environment where individuals need space to experiment without intimidation and freedom to take risks without fearing the consequence of closing doors.

These two sequential conversations portend the future of educational technology. Phil Hill describes today's learning management system as a "walled garden" where user-created content and transactional data stay protected inside an institutionally curated landscape. He contrasts this with the coming next generation digital learning environment (NGDLE), an "open garden" where data freely traverses disparately sourced applications. When the online campus ecosystem is assembled by instructors and learners into personalized learning tool mash-ups, where metrics-oriented data standards enable us to measure, correlate, and analyze every point of interaction, how do we safeguard privacy so we don't leave students and other users overly exposed in our virtual open garden?

Privacy Interoperability Standards

Privacy impact assessments (PIAs) are the standard measuring tape in a privacy officer's toolkit for evaluating a potential system's privacy risks. The E-Government Act of 2002 requires PIAs before collecting personally identifiable information or developing or purchasing federal systems that handle personal information. The PIA outlines a systematic process of defining the scope and purpose of information collection, appraising compliance requirements, and analyzing privacy risks and mitigations.2 While PIAs are an accepted convention, formally performing them might be more aspirational than routine in much of higher ed because of resource limitations. Conducting PIAs as part of a centralized vetting process is especially untenable when students and instructors construct their custom learning environments by assembling apps from an uncharted universe of offerings. We need a way to vet privacy practices on the fly.

Interoperability with "unimpeded exchange of data"3 is touted as an imperative for the NGDLE in order to support an ecosystem of diverse components. I recommend a measure of caution. While the ultra-interchangeable Lego system might be ideal for plastic toys, perhaps the electric socket model is more appropriate when student privacy and responsible stewardship of data are at stake. Having once been thrown to the floor of a cottage in Bali when plugging an inappropriate adapter into the wall, I consider effort spent finding the correct three-prong plug to be worth any sacrifice of convenience. Standard two-prong plugs may be acceptable for maximum interoperability in low-risk scenarios, but due diligence calls for grounding data exchanges in approved privacy practices.

Institutions have different cultures and hold different norms around responsible data use. The 2016 Asilomar II Convening on responsible use of student data proposed creating a "template or broad set of principles which can be expanded to include institutionally-specific context/priorities." This concept could be realized in the form of an NGDLE Privacy Interoperability Standard. Institutions that prioritize interoperability and have minimal privacy restrictions might accept a generic template, like a world-traveler's adapter plug that accepts any-shaped prongs. Other institutions with more rigorous privacy expectations could require assertion of a specific privacy standard in the interoperability handshake.

The IMS Global Privacy and Security Task Force has already laid out a model for a Data Use Label,4 like a nutrition label, to be displayed to the system administrator at the time of tool registration with the Learning Tools Interoperability (LTI) standard. Enhancing this type of handshake to perform automated vetting based on institutionally defined settings for a broader set of privacy standards would create a path for effective privacy management in the distributed next-generation digital learning environment.
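
To make the idea concrete, here is a minimal sketch of what such an automated vetting handshake might look like: a tool presents a machine-readable data use label, and the institution's defined settings are checked against it at registration time. The field names and policy settings are hypothetical illustrations, not the IMS Data Use Label schema or any published LTI extension.

```python
# Hypothetical sketch of automated privacy vetting at tool registration.
# Field names are illustrative, not the IMS Data Use Label schema.

INSTITUTION_POLICY = {
    "max_retention_days": 365,           # ceiling on how long tools may keep data
    "allow_third_party_sharing": False,  # no sharing beyond the tool provider
    "require_encryption_in_transit": True,
}

tool_label = {  # asserted by the tool during the registration handshake
    "tool": "example-discussion-app",
    "retention_days": 180,
    "third_party_sharing": False,
    "encryption_in_transit": True,
}

def vet_tool(label: dict, policy: dict) -> list:
    """Return a list of policy violations; an empty list means the tool passes."""
    violations = []
    if label["retention_days"] > policy["max_retention_days"]:
        violations.append("retains data longer than the institutional maximum")
    if label["third_party_sharing"] and not policy["allow_third_party_sharing"]:
        violations.append("shares data with third parties")
    if policy["require_encryption_in_transit"] and not label["encryption_in_transit"]:
        violations.append("does not encrypt data in transit")
    return violations

print(vet_tool(tool_label, INSTITUTION_POLICY))  # [] -> registration may proceed
```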

What would those privacy standards comprise? Multiple privacy frameworks provide source material for crafting an NGDLE Privacy Interoperability Standard.

Fair Information Practice Principles (FIPPs) and the OECD Privacy Principles define core tenets that have served as the foundation for privacy law and policies around the world for decades.5 The Consumer Privacy Bill of Rights, a more recent attempt by the Obama White House to define a privacy framework for the Internet age, recognizes that we share personal information more freely than in the past. Several frameworks focused specifically on the learning data context have also emerged in recent years, including the UC Learning Data Privacy Principles, the IMS Learning Data Principles, the LACE Project's DELICATE framework, the Asilomar I Principles, and the General Policy for Responsible Use of Student Data. Finally, the NISTIR 8062 Privacy Engineering Objectives6 attempt to bridge the gap between high-level privacy principles and their implementation within systems by identifying categories of system capabilities needed to implement privacy policies.

As a crosswalk between the various elements in each privacy framework, table 1 categorizes related concepts from these distinct schemes into five broad themes. These themes offer the basis for an NGDLE Privacy Interoperability Standard. Some privacy concepts are consistent across the schemes, though the frameworks differ in labeling, emphasis, goal, and approach to each concept. One framework may express multiple ideas within a given theme, or may address the theme's basic concept as part of a different theme, or may not touch that theme at all. The sections following the table provide a high-level guided tour through the crosswalk.


Table 1. Crosswalk of digital learning environment privacy principles

Frameworks compared: UC Learning Data Privacy Principles; IMS Learning Data Principles; DELICATE Framework (LACE Project)*; Asilomar I Principles; General Policy for Responsible Use of Student Data; OECD Guidelines; Consumer Privacy Bill of Rights; NIST 8062.

I. Degree of Transparency and Predictability

UC Learning Data Privacy Principles
  • Transparency: Data owners have a right to understand the particulars, methods and purposes for which their data are collected, used and transformed, including what data are being transmitted to third-party service providers (and their affiliated partners) and the details of how algorithms shape summaries, particularly outputs and visualizations.
  • Freedom of Expression: Faculty and students retain the right to communicate and engage with each other in the learning process without the concern that their data will be mined for unintended or unknown purposes.

IMS Learning Data Principles
  • Transparency: Individuals have the right to understand the specific reasons, methods, and purposes for which their learning data is collected, used, and transformed. This includes any learning data being shared with third-party service providers and other institutional affiliates or partners. Individuals also have the right to know how their data is transformed and/or used through processes such as summative or algorithmic modifications, particular outputs, and visualizations.

DELICATE Framework (LACE Project)*
  • Involve: Talk to stakeholders and give assurances about the data distribution and use.
  • Explain: Define the scope of data collection and usage.

Asilomar I Principles
  • Respect for the rights and dignity of learners: Data collection, retention, use, and sharing practices must be made transparent to learners, and findings made publicly available, with essential protections for the privacy of individuals. Respect for the rights and dignity of learners requires responsible governance by institutional repositories and users of learner data to ensure security, integrity, and accountability. Researchers and institutions should be especially vigilant with regard to the collection and use of identifiable learner data, including considerations of the appropriate form and degree of consent.

General Policy for Responsible Use of Student Data
  • Transparency: Clarity of process and evaluation are hallmarks of humane education systems, and must be maintained even while those systems grow more complex. Students are entitled to clear representations of the nature and extent of information describing them that is held in trust by [name] and relevant third-party organizations. Students are entitled to explication of how they are being assessed, whether by human beings, machines, or hybrid protocols. Students also are entitled to request that assessments be reviewed through a clearly articulated governance process.
  • Shared understanding: Data describing student interactions with [name] are joint ventures. Instructors, administrators, students, and third-party vendors all contribute to the process of data production. All of these parties deserve to have a shared understanding of the basic purposes and limits of data collection. Shared understanding is maintained by clear, brief, and explicit messaging to students about the nature of [name]'s data collection systems; the provision of detailed information and consultation about data collection, administration, and use upon request; common contractual language between [name] and all relevant third-party vendors; and continuous consideration of student messaging and data use protocols throughout the [University/College].

OECD Guidelines
  • Openness: There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller.
  • Purpose Specification: The purposes for which personal data are collected should be specified not later than at the time of data collection and the subsequent use limited to the fulfillment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.
  • Collection Limitation: There should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.

Consumer Privacy Bill of Rights
  • Transparency: Consumers have a right to easily understandable and accessible information about privacy and security practices.
  • Respect for Context: Consumers have a right to expect that companies will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data.

NIST 8062
  • Predictability: Enabling of reliable assumptions by individuals, owners, and operators about personal information and its processing by an information system.

II. Degree of Anonymity or Choice

DELICATE Framework (LACE Project)*
  • Anonymize: De-identify individuals as much as possible.
  • Consent: Seek consent through clear consent questions.

Asilomar I Principles
  • Respect for the rights and dignity of learners: (as under theme I)

OECD Guidelines
  • Use Limitation: Personal data should not be disclosed, made available or otherwise used for purposes other than those specified in accordance with Paragraph 9 except: a) with the consent of the data subject; or b) by the authority of law.

Consumer Privacy Bill of Rights
  • Individual Control: Consumers have a right to exercise control over what personal data companies collect from them and how they use it.

NIST 8062
  • Disassociability: Enabling the processing of personal information or events without association to individuals or devices beyond the operational requirements of the system.
  • Manageability: Providing the capability for granular administration of personal information including alteration, deletion, and selective disclosure.

III. Degree of Access, Ownership and Control

UC Learning Data Privacy Principles
  • Access and Control: Data owners have the right to access their data. Given that faculty and students own their learning data and share in its disposition, access to and ultimate authority and control of the data rests with the faculty and student owners, and the UC data stewards acting on their behalf. Data retention, access, and control practices will be governed under UC policies and supplier contractual agreements.
  • Ownership: The University of California (UC), its faculty, and students retain ownership of the data and subsequent computational transformations of the data they produce. Individual data owners have the right to determine how their data will be used. The UC acts as stewards of data on behalf of its faculty and students.

IMS Learning Data Principles
  • Access: Learning data, whether generated locally or in a vendor-supplied system, is strategic to an institution's business and mission and must be available to the institution.
  • Ownership: Faculty, staff, and students generate and own their learning data. As governed by institutional policies, individuals, being owners of the data they generate, have the right to access, port, and control the disposition of their data stored by the institution, its service providers, and their affiliated partners.

Asilomar I Principles
  • Openness: Learning and scientific inquiry are public goods essential for well-functioning democracies. Learning and scientific inquiry are sustained through transparent, participatory processes for the scrutiny of claims. Whenever possible, individuals and organizations conducting learning research have an obligation to provide access to data, analytic techniques, and research results in the service of learning improvement and scientific progress.

OECD Guidelines
  • Individual Participation: An individual should have the right: a) to obtain from a data controller, or otherwise, confirmation of whether or not the data controller has data relating to him; b) to have communicated to him, data relating to him within a reasonable time; at a charge, if any, that is not excessive; in a reasonable manner; and in a form that is readily intelligible to him; c) to be given reasons if a request made under subparagraphs (a) and (b) is denied, and to be able to challenge such denial; and d) to challenge data relating to him and, if the challenge is successful, to have the data erased, rectified, completed or amended.

Consumer Privacy Bill of Rights
  • Access and Accuracy: Consumers have a right to access and correct personal data in usable formats, in a manner that is appropriate to the sensitivity of the data and the risk of adverse consequences to consumers if the data is inaccurate.

NIST 8062
  • Manageability: (as under theme II)

IV. Accountability for Ethical Use, Stewardship, and Governance

UC Learning Data Privacy Principles
  • Ethical Use: Learning data collection, use, and computational transformation are governed by pedagogical and instructional concerns, with an aim toward student success through prescriptive, descriptive, or predictive methodologies. As with grades and other sensitive data, uses of learning analytics should be pursued on a "need to know" basis.

IMS Learning Data Principles
  • Stewardship: As stewards of learning data, institutions should have a data governance plan and governance policies that protect the data and the interests of its owners. These should transcend, but encompass, existing protocols, such as IRB.
  • Governance: Learning data use and retention will be governed by institutional policies, and faculty and students retain the right of data access and retrieval.
  • Efficacy: Learning data collection, use, and computational transformation is aimed at student and instructor success and instructional concerns through prescriptive, descriptive, or predictive methodologies.

DELICATE Framework (LACE Project)*
  • Determination: Decide on the purpose of learning analytics for your institution.
  • Legitimate: Explain how you operate within the legal frameworks; refer to the essential legislation.

Asilomar I Principles
  • Respect for the rights and dignity of learners: (as under theme I)
  • Beneficence: Individuals and organizations conducting learning research have an obligation to maximize possible benefits while minimizing possible harms. In every research endeavor, investigators must consider potential unintended consequences of their inquiry and misuse of research findings. Additionally, the results of research should be made publicly available in the interest of building general knowledge.
  • Justice: Research practices and policies should enable the use of learning data in the service of providing benefit for all learners. More specifically, research practices and policies should enable the use of learning data in the service of reducing inequalities in learning opportunity and educational attainment.
  • Continuous consideration: In a rapidly evolving field there can be no last word on ethical practice. Ethically responsible learner research requires ongoing and broadly inclusive discussion of best practices and comparable standards among researchers, learners, and educational institutions.
  • The humanity of learning: Insight, judgment, and discretion are essential to learning. Digital technologies can enhance, do not replace, and should never be allowed to erode the relationships that make learning a humane enterprise.

General Policy for Responsible Use of Student Data
  • Open futures: Education should enable opportunity, not foreclose it. [Name's] instructional, advisement, and assessment systems must always be built and used in ways that enable students to demonstrate aptitude, capacity, and achievement beyond their own or others' prior accomplishments. [Name] and its personnel engage in continuous consideration of how our educational environments equitably enable humane learning and academic progress.
  • Informed improvement: [Name] is a learning organization. We study student data in order to learn how our own educational environments can be made more effective and to contribute to the growth of relevant knowledge generally. Any and all research with student data is governed by the [University's/College's] science governance protocols.

OECD Guidelines
  • Data Quality: Personal data should be relevant to the purposes for which they are to be used, and, to the extent necessary for those purposes, should be accurate, complete and kept up-to-date.
  • Accountability: A data controller should be accountable for complying with measures which give effect to the principles stated above.

Consumer Privacy Bill of Rights
  • Focused Collection: Consumers have a right to reasonable limits on the personal data that companies collect and retain.
  • Accountability: Consumers have a right to have personal data handled by companies with appropriate measures in place to assure they adhere to the Consumer Privacy Bill of Rights.

NIST 8062
  • Predictability: (as under theme I)
  • Manageability: (as under theme II)
  • Disassociability: (as under theme II)

V. Security and Technical Standards

UC Learning Data Privacy Principles
  • Protection: Stewards, on behalf of data owners, will ensure learning data are secure and protected in alignment with all federal, state, and university regulations regarding secure disposition.

IMS Learning Data Principles
  • Security & Privacy: Individuals' security and privacy relating to collecting, using, and algorithmically transforming learning data is fundamental and must not be treated as optional. It must also be balanced with the effective use of the data.
  • Interoperability: The collection, use, and access to learning data requires institutional and supplier collaboration, which is dependent upon interoperability standards, protocols, data formats, and content to achieve institutions' goals.

DELICATE Framework (LACE Project)*
  • Technical aspects: Monitor who has access to data, especially in areas with high staff turn-over.
  • External partners: Make sure externals provide highest data security standards.

Asilomar I Principles
  • Respect for the rights and dignity of learners: (as under theme I)

OECD Guidelines
  • Security Safeguards: Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorised access, destruction, use, modification or disclosure of data.

Consumer Privacy Bill of Rights
  • Security: Consumers have a right to secure and responsible handling of personal data.

NIST 8062
  • Security Objectives: Confidentiality, Integrity, Availability.
* https://lace.apps.slate.uib.no/ethics-privacy/

I. Degree of Transparency and Predictability

How specific is notice regarding the purpose of data collection and how predictable (based on collection context) are data uses?

Transparency is a theme found across privacy schemes, but the expected degree of specificity in the disclosure about data use varies by scheme. Traditional methods of providing transparency involve giving notice regarding the purpose of collection, offering the data subject a choice about whether or not to participate, and limiting collection and use to the specified purpose. With the rise of data analytics — where the next significant correlation between disparate data points is yet to be discovered, and research funding requirements often stipulate data sharing — "purpose specification" is notably missing from newer privacy frameworks.

One alternative approach suggests shifting the focus away from the data collector's responsibility to provide notice of specific uses so the data subject can make informed choices about participation, toward the data processor's responsibility to make contextually appropriate decisions about use of the data. The Consumer Privacy Bill of Rights, for example, calls for "Respect for Context," meaning data should be used only in a manner consistent with the original data collection or, if different, only with heightened transparency and individual choice regarding participation. Proponents of this approach cite the difficulty of predicting future data uses and the fact that few data subjects read notices anyway. Others remain skeptical of relying on data processors to keep the best interests of data subjects in mind when determining "appropriate" data uses and call for just-in-time notices to alert users regarding data collection and use. The NIST framework leads system designers to identify the desired approach (notice and choice, respect for context, or otherwise) to meeting the system's predictability goals so that stakeholders are not surprised by data handling, even when contexts change.

II. Degree of Anonymity or Choice

To what degree is data anonymized or is the individual given a choice regarding participation?

As already mentioned, the concept of choice is frequently tied to the concept of notice. Like the spectrum of approaches available within the transparency theme, the extent of an individual's ability to choose whether to participate in data collection and allow use of their data also spans a range of possibilities. When data uses are not predictable and consistent with the context in which the data was collected, it is ideal to give subjects the choice whether or not to participate. Institutional and application-specific values and context will dictate the level of control an individual has over the use of their data. For example, does the institution allow students to opt out of research conducted using student learning data?

Techniques for de-identifying data play a key role in this arena. For some uses, e.g., research studies, we can disassociate data from personal identities, while in other cases data must remain identifiable, e.g., to provide actionable information to a student or advisor. Some data may need to move back and forth between anonymized and identifiable states, e.g., anonymous grading of law school exams to prevent bias and to protect students with disability accommodations from stigma. An NGDLE Privacy Interoperability Standard is needed to specify the parameters for institutional, application, and individual requirements and preferences regarding choice, and these specifications need to be embedded and transferred together as attributes of the data.
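
As a concrete illustration, keyed pseudonymization is one common technique for moving data between identifiable and de-identified states: the institution, holding the key, can re-derive a student's pseudonym when identifiable follow-up is needed, while downstream consumers see only stable pseudonyms. This is a minimal sketch under assumed requirements, not a prescribed NGDLE mechanism.

```python
# Minimal sketch: keyed pseudonymization of learning records.
# The institution holds SECRET_KEY (hypothetical; store it in a key vault),
# so it alone can re-link pseudonyms to students when needed.
import hashlib
import hmac

SECRET_KEY = b"institution-held-key"

def pseudonymize(student_id: str) -> str:
    """Derive a stable pseudonym from a student ID using a keyed hash (HMAC)."""
    return hmac.new(SECRET_KEY, student_id.encode(), hashlib.sha256).hexdigest()[:16]

record = {"student_id": "s1234567", "quiz_score": 0.82}
research_view = {  # de-identified view suitable for a research study
    "pseudonym": pseudonymize(record["student_id"]),
    "quiz_score": record["quiz_score"],
}
print(research_view)
```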

III. Degree of Access, Ownership, and Control

What level of individual, institutional, vendor, and public access to and ownership of data is defined?

Access, as originally described in privacy frameworks, focused on the rights of data subjects to be engaged in the data collection process and to have access to their own data in an understandable format where they can ensure the accuracy of data affecting them. With the increasing complexity of data algorithms, machine learning decision support, and predictive analytics, these goals have renewed importance. The European Union's General Data Protection Regulation (GDPR) requires that data controllers provide information to data subjects about the existence of automated decision making, with meaningful information about the logic involved and the significance and consequences of such processing. The broad scope of the GDPR is expected to impact institutions worldwide. When new information is produced through algorithms that are difficult to understand with human logic, explanations highlighting the factors most important to the algorithmic model need to be embedded with the data as it is exchanged through the NGDLE.
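
For instance, a risk prediction exchanged through the NGDLE might carry its explanation as embedded attributes, as in the sketch below. The structure is purely illustrative; no published NGDLE specification defines these fields, and the model name and feature weights are invented for the example.

```python
# Hypothetical payload: a prediction travels with a summary of the factors
# that most influenced the model, supporting GDPR-style explanations.
at_risk_flag = {
    "pseudonym": "9f2c41ab07d3e655",
    "prediction": "at_risk",
    "model": "retention-model-v2",
    "explanation": {
        "top_factors": [
            {"feature": "logins_last_30_days", "weight": -0.42},
            {"feature": "assignments_submitted", "weight": -0.31},
            {"feature": "forum_posts", "weight": -0.12},
        ],
        "note": "Negative weights: lower activity raised the risk score.",
    },
}
print(at_risk_flag["explanation"]["top_factors"][0])
```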

A new facet of access and ownership arises in the NGDLE stemming from dependence on external providers and the need to protect the interests of higher ed institutions. Embedding data ownership arrangements within the interoperability handshake between plug-and-play systems is needed to ensure educational institutions maintain their access to the learning data that is core to the educational mission.

The NGDLE Privacy Interoperability Standard needs robust assignment of student, instructor, and institutional data-sharing rights in order to support multi-institution collaborations, data sharing for publicly funded research, and competency-based course credit models where students retain evidence of their learning throughout their academic careers and professional lives. The NGDLE must support the ability for student work to move back and forth between private experimental spaces, limited collaboration modes, and public exchange of ideas according to granular definitions of intellectual property rights.

IV. Accountability for Ethical Use, Stewardship, and Governance

What ethical guideposts and accountability measures are demonstrated by entities handling student data?

Privacy is just one aspect of ethical use of learning data. An application's benefits to pedagogy and instruction, such as strengthening the student-instructor relationship, supporting student success, and enabling opportunity for all learners without foreclosing futures, must be weighed against the risks that come with sharing learning data. An application provider's attestation or demonstration of adherence to governance processes and ongoing accountability to ethical principles extends the level of trust between entities and thus allows the exchange of more sensitive data.

V. Security and Technical Standards

Privacy standards that govern what is deemed appropriate for authorized uses of personal data are supported by security standards that protect systems and data from unauthorized use. Authentication, authorization, and encryption specifications are important security mechanisms for safeguarding data. The Caliper and xAPI open standards address technical and security standards to some extent, and the robustness of the security model is key to the strength of the NGDLE Privacy Interoperability Standard.
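
As a simple illustration of the authentication mechanisms involved, the sketch below signs a learning event with a shared secret so the receiver can verify its integrity and origin. This is a simplified stand-in: LTI, Caliper, and xAPI implementations rely on established schemes such as OAuth- or JWT-based signing and bearer tokens rather than this hand-rolled example.

```python
# Simplified sketch of message authentication for a learning-data exchange.
# A shared secret (hypothetical) lets the receiver verify integrity and origin.
import hashlib
import hmac
import json

SHARED_SECRET = b"tool-institution-shared-secret"

def sign(payload: dict) -> str:
    """Serialize deterministically and compute an HMAC-SHA256 signature."""
    body = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(SHARED_SECRET, body, hashlib.sha256).hexdigest()

def verify(payload: dict, signature: str) -> bool:
    """Constant-time comparison guards against timing attacks."""
    return hmac.compare_digest(sign(payload), signature)

event = {"actor": "9f2c41ab07d3e655", "verb": "completed", "object": "quiz-7"}
signature = sign(event)
assert verify(event, signature)  # receiver accepts only authenticated events
```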

Baselines and Extensions

Like urban planners laying out utilities and grids for a new city, architects of the NGDLE Privacy Interoperability Standard can define baseline specifications and optional extensions on these five privacy themes:

  1. Degree of Transparency and Predictability
  2. Degree of Anonymity or Choice
  3. Degree of Access, Ownership and Control
  4. Accountability for Ethical Use, Stewardship and Governance
  5. Security and Technical Standards

For example, the minimum assertion for the NGDLE Privacy Interoperability Standard I: Transparency and Predictability could be "Data collected will be used or disclosed primarily for purposes consistent with pedagogy and student success. Data subjects will have the ability to opt out of participation in any data use unrelated to pedagogy and student success." Optional extensions might be, for example: (a) general information about data collection, retention, use, and sharing practices will be made available to all users in the form of a published privacy statement; (b) data uses unrelated to pedagogy and student success are only allowed if active (opt-in) consent is first obtained; and (c) just-in-time notice of data access and use will be provided to users.
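
One way to picture this is as a machine-readable set of assertions accompanying an application, as in the sketch below. The identifiers ("I", "I.a", and so on) and the encoding are invented for illustration; a real standard would define its own vocabulary.

```python
# Hypothetical encoding of one application's privacy assertions for theme I.
app_assertions = {
    "I": True,     # baseline: data used primarily for pedagogy and student
                   # success, with opt-out for unrelated uses
    "I.a": True,   # extension: published privacy statement
    "I.b": False,  # extension: opt-in consent required for secondary uses
    "I.c": True,   # extension: just-in-time notices of data access and use
}
```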

Before allowing any exchange of data, applications and institutions can assess their mutual privacy compatibility on the fly by comparing each other's claims of compliance with established standards for each privacy theme. Institution A may not require any Transparency and Predictability assertions and may accept all plug-ins indiscriminately; Institution B might deem the baseline and any extension acceptable; and Institution C might accept only application plug-ins that conform to standards I(b), opt-in for secondary use, and I(c), just-in-time notice.
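
Continuing the hypothetical encoding above, the on-the-fly check reduces to verifying that an application claims every assertion an institution requires, as this sketch shows:

```python
# Sketch of the on-the-fly compatibility check for Institutions A, B, and C.
app_assertions = {"I": True, "I.a": True, "I.b": False, "I.c": True}

institution_requirements = {
    "A": [],              # requires nothing; accepts any plug-in
    "B": ["I"],           # requires the baseline; any extension is acceptable
    "C": ["I.b", "I.c"],  # requires opt-in consent and just-in-time notice
}

def compatible(app: dict, required: list) -> bool:
    """An app is compatible if it claims every required assertion."""
    return all(app.get(assertion, False) for assertion in required)

for institution, required in institution_requirements.items():
    print(institution, "accepts app:", compatible(app_assertions, required))
# A: True, B: True, C: False (the app does not assert I.b)
```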

Enterprising institutions and students would take on the work of verifying the compliance assertions of applications. Standards bodies would take on the role of collecting and publishing black, white, and grey lists of applications whose assertions have been verified as compliant or not compliant, and this, too, could factor into an institution's privacy requirements based on its risk tolerance. This low-overhead model allows institutions to engage in the NGDLE ecosystem regardless of where they stand on the spectrum between stringent privacy requirements and "unimpeded exchange of data."

Tending the Garden

[Photo: white picket fence with colorful flowers growing along it]

Whether an institution's NGDLE looks more like a meticulously designed garden bed, a patchwork of community plots, or a field of native flowers and grasses, designing privacy requirements into data interoperability standards gives us tools to ensure that practices around transparency, choice, ownership, governance, and security align with institutional and personal privacy values.

By defining NGDLE Privacy Interoperability Standards, we elevate our stewardship of learning data to more fully support the next generation of students as they cultivate their digital learning environments.


Acknowledgment

I would like to thank Jenn Stringer of the University of California, Berkeley, for her contributions to this article.

Notes

  1. Federal regulations (Common Rule 45 CFR 46) require IRB approval of research involving humans. Separately, FERPA provides exemptions from the requirement to obtain individual consent for use of student data in studies conducted for or on behalf of the school to improve instruction.
  2. Department of Homeland Security, "Privacy Impact Assessment Guidance," published June 2010; template available as of 2017.
  3. Malcolm Brown, Joanne Dehoney, and Nancy Millichap, "Next Generation Digital Learning Environment: A Report on Research," EDUCAUSE Learning Initiative, April 2015, 4.
  4. "Improving Privacy for Humans and Robots," slides for Learning Impact Leadership Institute, IMS Global Learning Consortium, May 18, 2017.
  5. FIPPs (https://a130.cio.gov/appendix1/): 1973 Federal Government report from the Department of Health, Education, and Welfare Advisory Committee, "Records, Computers, and the Rights of Citizens"; OECD: Organisation for Economic Co-operation and Development Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, adopted in 1980.
  6. Sean Brooks, Michael Garcia, Naomi Lefkovitz, Suzanne Lightman, and Ellen Nadeau, "NISTIR 8062: An Introduction to Privacy Engineering and Risk Management in Federal Systems," Information Technology Laboratory, U.S. Department of Commerce and National Institute of Standards and Technology, January 2017.

Lisa Ho is campus privacy officer at the University of California, Berkeley.

© 2017 Lisa Ho. The text of this article is licensed under Creative Commons BY-NC-ND 4.0.