
The Power of Shared Data: Building Blocks for Interoperability


Introduction

Imagine having every tool and dataset needed to turn good intentions into measurable impact and solve complex policy challenges, only to find that they can't connect or work together. Many state agencies face this challenge. Datasets sit isolated, technical standards diverge, and legal or procedural silos prevent meaningful integration. Without shared access to the data, insights can't flow, programs can't scale, and potential impact is left untapped.

This is a particular problem for state higher education data. Unlike K–12 schools, which are largely state-run, higher education spans public and private universities, community colleges, workforce programs, and state agencies. Without linked systems, tracking students across these diverse institutions is extremely difficult, and disconnected systems cannot inform programs that would improve student outcomes, optimize public spending, and guide institutions toward greater equity.

Rather than treating their systems as a collection of isolated parts, agencies can instead approach them as building blocks, with each database designed to fit, stack, and connect to a larger structure through shared formats and connective points. This concept, known as interoperability, is the foundation for building systems that can work together effectively.


There's more to this than just making data sharing a priority: true interoperability requires establishing common standards and designing infrastructure so that different databases, tools, and applications can consistently recognize, trust, and interpret one another. It requires structuring data fields and metadata so they align across systems and agencies. Most critically, it means viewing data and systems not as a collection of isolated parts but as a cohesive blueprint that is continually refined.

This brief delineates the challenges and opportunities specific to education data interoperability, though many issues affecting one sector or agency often extend to others. While data is critical to most good policymaking, higher education outcomes (such as graduation, employment, and wage growth) unfold over years or even decades. Disconnected systems make longitudinal tracking almost impossible, undermining policy, funding, and equity analysis, particularly for students who have historically faced barriers. When data flows efficiently and responsibly, policymakers and educators gain a clearer, more comprehensive understanding of who is succeeding, who needs additional resources, and where systemic gaps persist. This understanding empowers targeted interventions that promote fairness, opportunity, and success for all students.

States and institutions are being asked to answer urgent, complex questions: Are public investments translating into real student outcomes? Who is making it to graduation, and who is leaving school without a degree? What happens to students after their studies, and how does that vary by race, income, or geography? The answers can't be found in a single dataset, or even by a single agency. To fully address them, we need systems that can securely connect education data to workforce, social service, and financial aid data.

Many of our current systems were created before the digital era and weren't built to support cross-agency interoperability. They were designed to meet internal reporting requirements, not to answer collective questions or adapt to shifting policy demands and technological innovation.


This brief is a guide for how agencies can change that, beginning with readiness. True interoperability doesn't start with code; it starts with aligning leadership, legal frameworks, technical infrastructure, and a collective understanding of what good data sharing entails. In addition to addressing common barriers, priorities for ensuring readiness, and practical steps to start the process, this brief also introduces a self-assessment tool agencies can use to evaluate their own interoperability readiness.

When done well, interoperability isn't just efficient; it's transformative. The goal isn't merely to replicate datasets and systems or to connect them without a clear purpose: it's to ensure they fit together in a way that's useful and sustainable.

Why Interoperability?

Interoperability means the ability to securely and meaningfully exchange data across systems without requiring custom workarounds or constant translation. As one K–12 interoperability initiative puts it, interoperability is the "seamless, secure, and controlled exchange of data between applications." According to the United Nations, interoperability also entails the ability to access, process, and integrate data "without losing meaning." To put it succinctly, interoperability makes data useful beyond its original context.

In practical terms, interoperability means that different databases are either technically connected to each other or structured in analogous ways. For example, while a post-graduate survey can generate some insights into alumni careers, linking student data to workforce outcomes yields a much larger sample, allowing the state to better surface disparities and identify necessary changes. Interoperability can also mean adopting common data formats or standards that align definitions and metadata, enabling different systems to exchange information automatically without manual intervention. For instance, common course codes used across all levels of education allow K–12 and postsecondary systems to recognize the same course or subject across schools, enabling student progress and achievements to be compared and analyzed consistently. The applications of interoperability are endless: Shared analysis, service delivery, and program evaluation all depend on making sense of fragmented data.
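
To make this concrete, here is a minimal sketch, in Python, of what aligning field definitions and vocabularies across two systems can look like. Every system, field name, and course code here is hypothetical; a real effort would map to a published standard rather than a hand-built crosswalk.

```python
# Minimal sketch: normalizing two systems' records to one shared schema.
# All field names, formats, and course codes here are hypothetical.

# Each source system names and encodes the same concepts differently.
K12_RECORD = {"student_ident": "00123", "crs": "ALG1", "grade_lvl": "09"}
POSTSEC_RECORD = {"sid": "123", "course_code": "MATH-ALGEBRA-1", "level": 9}

# A crosswalk maps each system's local vocabulary onto shared definitions.
COURSE_CROSSWALK = {"ALG1": "algebra_1", "MATH-ALGEBRA-1": "algebra_1"}

K12_MAP = {"student_ident": "student_id", "crs": "course", "grade_lvl": "grade_level"}
POSTSEC_MAP = {"sid": "student_id", "course_code": "course", "level": "grade_level"}

def to_shared_schema(record: dict, field_map: dict) -> dict:
    """Rename source fields to the shared schema and normalize values."""
    shared = {target: record[source] for source, target in field_map.items()}
    shared["student_id"] = str(int(shared["student_id"]))  # strip zero-padding
    shared["course"] = COURSE_CROSSWALK[shared["course"]]
    shared["grade_level"] = int(shared["grade_level"])
    return shared

# Both systems now describe the same student and course identically.
assert to_shared_schema(K12_RECORD, K12_MAP) == to_shared_schema(POSTSEC_RECORD, POSTSEC_MAP)
```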

Breaks in the Blueprint: Common Barriers to Interoperability

Agencies and institutions want to make better use of their data. But interoperability can't be built overnight, and they face real barriers to connecting their data in usable ways. These barriers are more than technical: They're also institutional, legal, and cultural. Often, they're hiding in plain sight:

  • Legacy systems are one of the most stubborn challenges. Many agencies still run on decades-old infrastructure, patched together to meet past program needs. These systems weren't designed to connect externally, and vendors don't always make it easy to change that.
  • Manual processes and incompatible platforms compound the problem. From spreadsheets to custom logins, workarounds have piled up in the absence of shared systems. There are often too many tools with too little alignment.
  • Data silos are a persistent issue (but not always for the reasons you think). Federal open data law was created to reduce silos and make data more open, accessible, and usable across government. But not all silos are accidental or malicious; data segregation can even be intentional as an added safeguard, though such safeguards should still be built within interoperable frameworks. Without shared standards or alignment, simply pooling data can still risk duplication, limited utility, and barriers to timely access.
  • Legal confusion can be paralyzing. Staff may not know what kinds of data sharing are permitted under federal privacy laws, such as the Family Educational Rights and Privacy Act (FERPA) and the Health Insurance Portability and Accountability Act (HIPAA). In the absence of clarity, agencies sometimes see inaction as the best way to reduce liability.
  • Unclear leadership and misaligned priorities can stall even the best ideas. Limited capacity, uncertain authority, and unfamiliarity with new tools can halt initiatives before they gain momentum, even when the underlying ideas are strong.
  • Funding constraints are a common barrier. Interoperability requires real investment in staff capacity, infrastructure, and system updates. Yet a larger initial investment often ensures long-term sustainability and proves more cost-effective than repeatedly fixing a poorly designed system.
  • The lack of a coherent plan is often the biggest obstacle. Interoperability isn't a feature you tack on at the end; it needs to be part of the blueprint from the beginning. Even with the right staff and technical capacity in place, treating coordination as an afterthought can prevent seamless collaboration.

These barriers are real, but agencies can effectively navigate them once they develop high interoperability readiness: that is, the degree to which an organization, agency, or system is prepared to integrate, exchange, and use data across multiple systems securely and consistently.

Defining what "ready" looks like will provide teams with a clear mandate to work across boundaries. The process is more than a technical checklist; it's a commitment to engaging with the people, policies, and infrastructure that make connection possible.

Prepared agencies and institutions ask questions like:

  • Are our systems capable of exchanging data with other internal or external systems without requiring manual workarounds?
  • Are key datasets mapped to standardized field definitions and data types, both across our agency and against sector-wide conventions, so they can be interpreted and integrated consistently?
  • Do we have metadata and documentation to support shared meaning? (A minimal sketch of what this can look like follows this list.)
  • Do we have formal policies, agreements, or procedures that govern the responsible use and sharing of data, including technical and legal safeguards?
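
As one illustration of the second and third questions, the sketch below shows a machine-readable data-dictionary entry that documents a shared field. The field, values, and office named here are hypothetical; the CEDS note simply illustrates tying a local field to a sector-wide convention.

```python
# Hypothetical data-dictionary entry documenting one shared field.
ENROLLMENT_STATUS = {
    "field": "enrollment_status",
    "type": "string",
    "allowed_values": {"FT": "full-time", "PT": "part-time", "W": "withdrawn"},
    "definition": "Student's enrollment intensity as of the fall census date.",
    "collected_by": "Office of Institutional Research",
    "aligned_to": "CEDS",  # sector-wide convention this field maps to
    "last_reviewed": "2024-09-01",
}

def validate(value: str) -> bool:
    """Reject values that fall outside the documented vocabulary."""
    return value in ENROLLMENT_STATUS["allowed_values"]

assert validate("FT") and not validate("full time")
```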

These questions are complex, but agencies that answer them are more likely to have systems that communicate seamlessly, decisions that reflect the whole picture, and data that works as hard as the people behind it.

Designing an Interoperability Framework

A data system is only as strong as its foundation. While you can retrofit a system to be more interoperable, a system that builds it in from the start will be stronger, as long as the fundamentals are correct.

Think of data systems like buildings on a city block. Each building may look different on the outside, designed for its own purpose, but behind the facades, they all rely on the same underlying infrastructure. The plumbing, electrical, and internet systems in each building link together across the entire block, feeding back into centralized networks. This is the kind of cooperative infrastructure that interoperability requires. Systems don't need to be identical or replicate each other's designs, but they do need to be designed with the greater context in mind.

This brief outlines six essential "building blocks" of interoperability. These key characteristics were identified through a review of interoperability frameworks from multiple sectors before narrowing to those most relevant to education. The building blocks draw from commonly cited resources, including a published interoperability rubric, the Diagnosis and Recommendations to Integrate Administrative Records Related to Children, the Data Ready Playbook, and an "Interoperability Maturity Model."

Together, these frameworks underscore the importance of: (1) data quality, (2) data governance, (3) grounded leadership, (4) technical capacity, (5) risk management, and (6) legal frameworks.

[Figure: The six essential building blocks of readiness. Alex Briñas/New America]

These core components make any data system capable of flexing, connecting, and growing across agencies and contexts. They're the shared foundation that lets agencies align systems without erasing differences, designing not for uniformity but for compatibility. Adopting these building blocks intentionally is more important than adopting them perfectly.

This brief includes a self-assessment tool to help agencies and institutions take stock of their progress toward interoperability readiness, spot potential obstacles, and map a path forward.

The Six Building Blocks of Architectural Readiness

1. Data Quality

Interoperability only works if the underlying data is reliable. Incomplete submissions, mismatched timelines, inconsistent survey instruments, missing metadata, and basic entry errors can all undermine confidence in shared data. Without reliable, accurate, and well-structured data, leaders risk arriving at misleading and meaningless conclusions. Quality isn't a one-time achievement; it's a continuous process of cleaning, deduplicating, validating, and documenting. Tools like data validation rules, checks at the point of entry, and automated audits improve trust in both generated and shared data. Agencies also benefit from tracking simple metrics, such as the percentage of missing or unmatchable records, to monitor quality over time. That said, the best defense is context: people who know the data well enough to spot issues before they become problems.
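
As a rough illustration of tracking such metrics, the sketch below flags fields whose share of missing values exceeds a review threshold. The column names are hypothetical, and the 10 percent trigger mirrors the case study that follows.

```python
# Flag fields whose share of missing values exceeds a review threshold.
import pandas as pd

THRESHOLD = 0.10  # review is triggered above 10 percent missing

records = pd.DataFrame({
    "student_id": ["001", "002", "003", "004", None],
    "graduation_date": ["2024-05-15", None, None, "2023-12-18", None],
    "employer_match": ["y", "y", None, "y", "y"],
})

missing_share = records.isna().mean()  # fraction missing, per column
for column, share in missing_share[missing_share > THRESHOLD].items():
    print(f"REVIEW: {column} is {share:.0%} missing")
```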

How to implement: Start by assessing whether your current datasets actually reflect the questions they are trying to answer. For example, are they reliable indicators of student experiences and outcomes? Prioritize clarity: Inventory what's in use, identify what's immovable, and consider how the data was collected. Pay special attention to manual processes, which are often the source of inconsistency. Updating metadata as part of this process also improves interpretability and long-term usability. These initial steps help identify where quality might break down in fragmented data systems. Once that's accomplished, data users can normalize and standardize data where needed, such as aligning inconsistent grading scales or formatting addresses to match across districts.
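
For instance, here is a minimal sketch of aligning two grading scales onto one shared scale. The scales and cutoffs are illustrative, not a recommendation.

```python
# Normalize two districts' grading scales to a shared 0.0-4.0 scale.
# Both the scales and the cutoffs below are hypothetical.

def letter_to_gpa(letter: str) -> float:
    """District A reports letter grades."""
    return {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}[letter.upper()]

def percent_to_gpa(percent: float) -> float:
    """District B reports 0-100 percentages."""
    for cutoff, gpa in [(90, 4.0), (80, 3.0), (70, 2.0), (60, 1.0)]:
        if percent >= cutoff:
            return gpa
    return 0.0

# Records from both districts now land on one comparable scale.
assert letter_to_gpa("B") == percent_to_gpa(85) == 3.0
```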

Case study: One statewide initiative draws data from six agencies and institutions to evaluate students' career and college readiness. Early challenges with missing and inconsistent submissions led to the adoption of statewide submission standards and a shared reporting calendar, which together ensure agencies' data is chronologically comparable and checked for matching issues before publication. These tools aligned timing, clarified expectations, and set a threshold where review is triggered if more than 10 percent of any variable is missing. Together, they've supported a decade of productive collaboration and credible recommendations.

2. Data Governance

Good data governance turns scattered inputs into usable systems that preserve, secure, and transmit data throughout their lifecycle, from collection to access to archiving. Defining the policies, processes, and roles that govern data answers practical questions like: Who updates the spreadsheet? Where do survey results live? Can two datasets be linked over time? For this reason, many agencies establish a data governance body, policy, or manual that covers topics such as data collection, encryption, and the linkage of data systems. Without governance, agencies risk confusion, security breaches, or tech investments that don't meet long-term needs.

How to implement: Start by designating data stewards: people responsible for managing specific datasets and ensuring they're accurate, protected, and shareable. As a guiding principle, these assignees should have intimate knowledge of how the data is collected and interpreted. Their first job should be to audit current practices and align them with relevant standards. From there, they can develop clear protocols for how data is gathered, stored, migrated, protected, and linked. These stewards should also produce guidance or training to help partners comply. As systems evolve, they'll need to manage vendor contracts and clarify who has the right to access or modify data. Data governance should be focused on long-term outcomes, not just short-term functionality.
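
One lightweight way to make stewardship explicit is a registry that records, for every dataset, who is responsible, whether it contains PII, and how it may be shared. The sketch below is hypothetical in every detail; a real registry would live in governance tooling, not a script.

```python
# Hypothetical stewardship registry: every dataset gets a named steward,
# a review cadence, and explicit sharing rules.
STEWARDSHIP_REGISTRY = {
    "enrollment_2024": {
        "steward": "jdoe@agency.example.gov",
        "contains_pii": True,
        "review_cadence_days": 90,
        "sharing": "DUA required",  # ties governance to the legal framework
    },
    "campus_facilities": {
        "steward": "asmith@agency.example.gov",
        "contains_pii": False,
        "review_cadence_days": 365,
        "sharing": "public",
    },
}

def steward_of(dataset: str) -> str:
    """Answer the practical question: who is responsible for this data?"""
    return STEWARDSHIP_REGISTRY[dataset]["steward"]

print(steward_of("enrollment_2024"))  # jdoe@agency.example.gov
```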

Case study: The Washington Office of Superintendent of Public Instruction (OSPI) has developed a robust, comprehensive governance manual. The manual outlines roles across local, state, and federal levels; lays out questions to ask before adding or eliminating a data element; and provides frameworks for linking K–12 and postsecondary data. When the state needed new data on dual language programs, OSPI already had the protocols in place, including a template for a project timeline and a standardized way of collecting student and teacher data. The result: a timely, coordinated effort that provided actionable data, without the need to start from scratch, in support of Washington's widespread dual language expansion.

3. Grounded Leadership

Interoperability is often treated as a technical endeavor, but at its core, it is an organizational effort that depends on strong leadership and vision. Lasting progress requires champions who understand the landscape, bridge agencies, and sustain momentum across political cycles. Too many initiatives start with one passionate administrator, only to stall after that person moves on. To endure, interoperability efforts need shared vision, cross-agency coordination, a range of invested stakeholders, and governance structures that outlast individuals. Whether it's a governor-appointed board, a multi-agency task force, or a working group of district representatives and IT staff, leadership bodies must connect the dots between vision, strategy, and operations. Most critically, they are the people who can communicate the "why" and bring others along.

How to implement: Leaders should start by aligning stakeholders on a clear purpose: a shared perspective on interoperability, core questions to explore in the data, and a timeline. What are we trying to answer with this data? Who benefits? What can we feasibly achieve given our environment? From there, they can coordinate with policy and funding bodies, shepherd interagency agreements or legislation, and ensure interoperability is baked into long-term strategic and budget plans. Crucially, strong leaders also make sure that costs and benefits are distributed fairly across participants, especially when resources are uneven.

Case study: California's statewide data system brings together 15 stakeholder organizations through a 21-person governing board that includes higher education administrators, state social service leaders, elected officials, nonprofit research professionals, governor and state appointees, and, critically, members of the public. The board is tasked with strategic planning, reviewing governance protocols, and elevating community priorities. A key strength of the initiative has been the cultivation of data champions who have successfully communicated the importance of data to the public. For example, since 2022, the governing board meetings have been shared on YouTube in both English and Spanish. This cross-cutting and intentional structure, along with its focus on continuity, has been credited with the initiative's staying power.

4. Technical Capacity

Once leadership and governance are in place, agencies need the technical capacity to connect systems. That includes infrastructure (like student information or learning management systems), integration tools (like application programming interfaces, or APIs), and adherence to shared data standards (like the Department of Education's Common Education Data Standards, or CEDS). But capacity isn't just about using the right software; it's also about who can maintain, troubleshoot, and evolve systems over time.
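
To illustrate the difference an API makes over emailed spreadsheets, here is a sketch of one system pulling records from another over a documented, authenticated endpoint. The URL, parameters, and response shape are invented for illustration and do not correspond to any real state system.

```python
# Sketch: request records over an authenticated API instead of manual transfers.
# The endpoint, token, and response fields are all hypothetical.
import requests

BASE_URL = "https://data.example-state.gov/api/v1"

def fetch_completions(institution_id: str, year: int, token: str) -> list[dict]:
    """Pull completion records in the shared schema both parties agreed on."""
    response = requests.get(
        f"{BASE_URL}/completions",
        params={"institution": institution_id, "year": year},
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    response.raise_for_status()  # fail loudly on auth or server errors
    return response.json()["records"]
```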

How to implement: Begin by inventorying the tools currently in use: what they cost, how well they work, and whether they meet program needs. Many agencies are managing thousands of technology tools, often with overlapping or underused features. One nationwide survey found that districts used far more tools in the 2023–2024 academic year than the average of 841 in 2018. Clarifying which ones are essential helps reduce bloat, align systems, and map gaps. From there, develop procurement practices that prioritize interoperability, with specs for secure login and scalability. Professional associations like CoSN also stress the importance of tools that protect verified access to platforms through multi-factor authentication (MFA), single sign-on (SSO), and tiered levels of authorized access. As systems evolve, ensure that tools keep pace and that staff are equipped to manage them.
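
As a sketch of what tiered, authorized access can look like in practice, the snippet below filters a record down to the fields a given role may see. The roles, fields, and record are hypothetical.

```python
# Hypothetical tiered access: each role sees only its authorized fields.
ACCESS_LEVELS = {
    "state_analyst": {"student_id", "graduation_date", "wage_quarter"},
    "institution_staff": {"student_id", "graduation_date"},
    "public_dashboard": {"graduation_date"},
}

def redact(record: dict, role: str) -> dict:
    """Return only the fields this role is authorized to see."""
    allowed = ACCESS_LEVELS[role]
    return {field: value for field, value in record.items() if field in allowed}

record = {"student_id": "123", "graduation_date": "2024-05-15", "wage_quarter": 18500}
print(redact(record, "public_dashboard"))  # {'graduation_date': '2024-05-15'}
```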

Case study: The New Jersey Education to Earnings Data System (NJEEDS) generates insights using technology from the Coleridge Initiative, whose platform provides a cloud-based dashboard for the state to manage data stewards' profiles, datasets from four state agencies, and project details, along with tools for queries and statistical analysis.

Thanks to this scaffolding, NJEEDS published reports on "wage scarring" post-COVID-19, the impact of scholarships on educational attainment, and post-graduation retention of out-of-state students. The state now has trackable metrics on higher education outcomes that will inform its work with 90 higher education institutions in the state. New Jersey is also able to use the platform to share and receive data through the multistate collaborative organized by the National Association of State Workforce Agencies.

5. Risk Management

As data flows more freely, so does risk. Interoperability requires strong safeguards to prevent misuse, breaches, or violations of privacy. Risk management isn't about saying no to data sharing; it's about designing systems that are secure, ethical, and compliant, protecting students and institutions from harm. While FERPA, HIPAA, the Children's Online Privacy Protection Act (COPPA), and related laws protect some of students' personal information from certain types of disclosure, they are not comprehensive enough to assess risk within an interoperable context.

How to implement: Start by defining your agency's privacy principles, as modeled by the Data Quality Campaign's published principles. Then assess current tools and practices through a risk analysis that considers factors like the presence of personally identifiable information (PII), encryption practices, and levels of access. Don't forget to consider each tool and system in light of changing legal requirements. Agencies should clarify who can access what, under what conditions, and with what oversight. More advanced systems may implement statistical disclosure controls, automated auditing, and similar safeguards. Additionally, everyone, from contractors to internal teams, should receive regular privacy training and operate under clear enforcement policies.
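
One widely used safeguard in this family is replacing direct identifiers with keyed hashes before linkage, so two agencies can match records without exchanging raw IDs. The sketch below assumes a shared secret key managed under the parties' data use agreement; it is an illustration, not a complete disclosure-control regime.

```python
# Replace raw student IDs with a keyed hash so records can be linked
# without exposing the IDs themselves. The key here is hypothetical and
# would be stored in a secrets manager, not in code.
import hashlib
import hmac

SHARED_KEY = b"rotate-me-per-agreement"

def linkage_token(student_id: str) -> str:
    """Deterministic, non-reversible token: same input, same token."""
    return hmac.new(SHARED_KEY, student_id.encode(), hashlib.sha256).hexdigest()

# Both agencies derive the same token for the same student...
assert linkage_token("00123") == linkage_token("00123")
# ...but the token cannot be trivially reversed to recover the ID.
print(linkage_token("00123")[:16])
```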

Case study: Indiana appointed privacy officers across various state agencies, along with a chief privacy officer position. These officers track legislative developments, distribute grants for capacity-building projects, audit compliance, and support agency-wide training. Beyond strengthening policies to avoid inadvertent leaks, they've also helped vet third-party vendors, ensuring student data isn't being misused or sold. By building a formal privacy infrastructure, Indiana has both reduced risk and reinforced public trust.

6. Legal Framework

Even the best-intentioned collaboration can't proceed without the legal scaffolding to support it. Data use agreements (DUAs) and memorandums of understanding (MOUs) are critical tools that define roles, responsibilities, and protections, including the project's purpose, what PII will be shared, and how access permissions will be distributed. DUAs specify what data is shared, with whom, for what purpose, and for how long. MOUs clarify broader terms: who owns the data, how it will be secured, and how staff will be allocated. Together, these documents lay the legal foundation for safe, purposeful data exchange. They also establish mechanisms for agencies and institutions to assess and ensure that data-sharing partners comply with federal and state law, safeguarding privacy and maintaining trust throughout the collaboration.

How to implement: Use template agreements as starting points. Tailor them to clarify each party's obligations, governance roles, and mechanisms for amendment. Legal agreements should connect directly to the data governance structures discussed earlier, reinforcing consistent workflows and compliance. Periodically revisit agreements to reflect changes in law, technology, or policy priorities.

Case study: To better understand outcomes for noncredit students, one state built upon an existing data-sharing agreement to request data on students who complete noncredit coursework, with the goal of understanding employment and performance outcomes for community college students. The state updated an existing MOU between five state agencies to form a new data integration workgroup with eight key stakeholders, all of whom fed individual data into a newly designed data warehouse. The revised system enabled the state to collect and aggregate information across many previously unlinked public workforce and higher education systems, ultimately revealing new insights and patterns. As a result, a public dashboard is now available.

From Readiness to Action

Every interoperability project is distinct, but building readiness can start anywhere: in any institution, of any size, at any time. Despite two decades of literature on interoperability in education, identifying where to begin can often be one of the biggest challenges. But the path forward doesn't need to be complicated. Fortunately, organizations can take several practical steps to build momentum:

Start with the questions. What gaps or opportunities exist in your state's evidence-based policymaking? What education, workforce, or health questions are leaders wrestling with? What new insights could be gained by integrating data across agencies or states? Reflecting on these priorities helps focus energy on high-impact, achievable projects.

Prioritize easy wins. From that list, identify where you already have momentum. Existing state laws, like those authorizing longitudinal data systems, can make some efforts easier to launch. Target simpler administrative projects and willing partners to establish proof points and build a base for more ambitious efforts down the line.

Fix downstream, think upstream. Ensure consistency and accuracy in how data enters your systems across formats, agencies, and timelines. Early investment in shared standards, secure infrastructure, and clear metadata will make future efforts, like dashboards, visualizations, or cross-agency disbursements, more effective and trustworthy.

Lean into regional collaboration. Communities of practice offer resources, certifications, and shared strategies. One national initiative collects case studies and has published a playbook based on work with 150 school districts in 34 states, and a regional compact operates a multistate longitudinal data exchange for six nearby states. Certification programs are a great option for those seeking to publicly demonstrate their commitment to secure interoperability with credentials. And those not ready to join an exchange can find meaningful resources in comparative analyses of longitudinal data systems and in extensive published case studies and policy guides.

Plan for sustainable funding. Interoperability efforts often raise questions about long-term costs, especially for technical infrastructure and staffing. While pandemic-era aid sparked historic investments in data systems, many of those grants are sunsetting. Agencies can still find support across federal, state, and subject-matter lines (though funding availability is subject to change).

Within the public sphere, federal grant programs have supported data system upgrades for enrollment and case management. The Institute of Education Sciences' 10 Regional Educational Laboratories partner with state education departments on programs and research to improve student outcomes, offering both funding and technical support. In addition, the National Center for Education Statistics disseminates grants through the Statewide Longitudinal Data Systems grant program, with a focus on efficiency, resource allocation, and labor market results. And education agencies aren't the only source: the Department of Labor's Workforce Data Quality Initiative funds states' efforts to connect workforce and education data in longitudinal databases.

Private grantmakers are stepping up, too. One national foundation is investing in bolstering interoperable systems so students nationwide gain valuable postsecondary credentials, and it helps fund a Coleridge Initiative effort that supports states, postsecondary institutions, and research organizations in building accessible, secure, and scalable data infrastructure. From studying unemployment claims to teacher vacancies, the project promotes cross-stakeholder engagement. A program from Results for America also supports data-driven decision-making by local program leaders, many of whom have launched multi-agency collaborations serving students in vulnerable conditions.

Operationalizing Readiness: A Self-Assessment Tool

To move from planning to implementation and actually operationalize the building blocks, agencies and institutions need a clear sense of where they are, where the gaps lie, and what's needed to move forward.

The diagnostic tool below is designed to help agencies surface existing strengths, identify areas of growth, and prioritize next steps. It is not necessarily a checklist to complete, but a practical framework for reflection, coordination, and action. Whether launching a new project or improving an existing one, these questions can help teams align their strategies and structure their efforts.

Conclusion: Building Interoperability That Lasts

Whether an agency is examining students' paths into high-demand careers or tracking university retention, it doesn't need to start from scratch. Understanding common barriers and investing deliberately in foundational building blocks can transform interoperability from a distant ideal into a practical reality. Even where this is a new endeavor, readiness is attainable. It spurs systems to connect thoughtfully, securely, and with purpose, unlocking collective impact to better serve students and communities. In the end, interoperability's power does not stem from the technology; it stems from the people using these tools so collaboration yields results.

Editorial disclosure: This brief was supported by the Gates Foundation. The views expressed here are solely those of the authors and are not intended to reflect the views of the Gates Foundation or New America.

About the Authors

Sydney Saubestre

Senior Policy Analyst, Open Technology Institute, New America

Comfort Sampong

Google Public Policy Fellow, Open Technology Institute
