Methodological Steps

Step 1 - Definition of Indicators and Data Collection

Guidance

In short:

  • Build an SDG16 indicator framework:
      • Scoping: Check what is already being monitored in the National Development Plan and similar frameworks, and cross-check with SDG 16 targets and indicators; cluster by issues or go through the SDG 16 targets one by one.
      • Assessment: Establish consultation group(s) to build a common understanding of existing indicators and gaps; determine readiness to produce data; determine a feasible number of supplementary indicators to monitor alongside the global indicators.
      • Selection: Ensure a balance of structural, process and outcome indicators, as well as quantitative and qualitative indicators; ensure that indicators capture those “left behind”; use existing indicators and initiate the development of complementary indicators.
  • Assess SDG16 data availability:
      • Engage data producers: Hold stakeholder workshops to map available data and gaps; identify and agree on needed additional data sources (administrative data, survey data, among others).
      • Consider non-official data sources: Engage civil society, NHRIs and academia to review data; explore how non-official data can be used, including data from international commitments (OGP, UPR).
      • Inclusive validation: Generate a report from this consultation, conduct a workshop to validate the findings and produce a report on SDG 16 data availability.
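
The selection criteria above can be made concrete with a toy indicator registry that tags each indicator by kind (structural/process/outcome) and origin, and checks the balance across kinds. All identifiers below are illustrative, not taken from any country's actual framework:

```python
from collections import Counter

# Hypothetical registry: ids, kinds and origins are invented for illustration.
indicators = [
    {"id": "16.1.1", "kind": "outcome",    "origin": "global"},
    {"id": "NAT-01", "kind": "structural", "origin": "complementary"},
    {"id": "NAT-02", "kind": "process",    "origin": "proxy"},
]

def balance(framework):
    """Count indicators per kind to check structural/process/outcome balance."""
    return Counter(item["kind"] for item in framework)

print(balance(indicators))  # each kind appears once in this toy framework
```

A registry like this also makes it easy to see how many complementary indicators have been added, which helps keep their number manageable.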

Tips:

  • Securing high-level political ownership to ensure collaboration between data-producing government institutions is important.
  • Linking to VNRs, UNDAF/ CPD/ and other national initiatives sustains the effort.
  • Be aware of the pitfalls of using international indices when no national indicator source is available. The main shortcoming of international indices is that they rarely disaggregate data by population group and thus run counter to the 2030 Agenda’s emphasis on “leaving no one behind”. The elaboration of a national SDG16 monitoring system offers a valuable opportunity to identify national data gaps and to incentivize the production of new national datasets by state and civil society actors that better reflect the experiences of different population groups.
  • Complementary national indicators should only be integrated into the national framework after pilot-testing data collection, which can sometimes prove to be more challenging than anticipated.
  • It is important to recognize and meet the challenges of using administrative data.

     Two issues are common:
  1. Administrative records can be incomplete or not consistent across time or across administrative levels;
  2. Weak (or non-existent) coordination mechanisms for data collection on a given issue make it difficult to reconcile related datasets and to compute indicators that require data from more than one institution.
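
The second issue can be illustrated with a toy example: computing a per-100,000 indicator requires joining records from two institutions on a shared key, so inconsistent region coding or missing records breaks the calculation. All names and figures below are invented:

```python
# Hypothetical illustration: an indicator needing data from two institutions
# (e.g. case counts from a ministry and population figures from the NSO).
# The two datasets must share reconcilable keys (here, region names).

cases_by_region = {"North": 1200, "South": 800}              # ministry admin data
population_by_region = {"North": 400_000, "South": 250_000}  # NSO data

def rate_per_100k(region: str) -> float:
    """Compute a per-100,000 rate by joining the two datasets on region."""
    return cases_by_region[region] * 100_000 / population_by_region[region]

print(rate_per_100k("North"))  # -> 300.0
```

If one institution codes regions differently (e.g. "Nord" instead of "North"), the join fails outright, which is why coordination mechanisms for data collection matter.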
Lessons Learned From Pilot Countries

Challenges:

  • Political will (El Salvador, Georgia)
  • Lack of coordination of government institutions on their respective data, causing delays (Georgia)
  • Government hesitation regarding non-official data, including from civil society (Georgia, Mexico)
  • Data by non-state actors does not have full geographical coverage (Indonesia)
  • National relevance vs global comparability (Tunisia)
  • No meaningful consultations with non-state actors (Indonesia, Georgia)
  • Determining the ideal number of complementary indicators to measure, to avoid the risk of having an unmanageable number of indicators (Mexico)
  • Lead national institutions are encouraging cooperation among the UN agencies working in areas related to SDG 16, including UNDP, UNICEF, UNESCO, OHCHR, UN Women, UNODC, UNOWAS, IOM and UNCTAD, in order to ensure more synergy in SDG 16 efforts (Senegal)

Opportunities:

  • Engaging senior government representatives early on and continuity of focal points facilitate collaboration on data (El Salvador, Tunisia, Senegal)
  • Leveraging interlinkages with other SDGs (most of the countries)
  • Link to VNRs, UNDAF/ CPD/ and other national initiatives (Mexico, Argentina)  
  • Align with other national/ regional indicators (El Salvador)
  • Analyzing available data in the form of a training workshop with public institutions and CSOs is useful to transfer skills and to allow for future replication (South Africa, Tunisia)
  • Using a pool of consultants with proven expertise in the field in order to establish the baseline for SDG 16 monitoring, and involve the structures responsible for supporting the government in the production of data (Senegal)
  • Results of Big Data analysis and of NSO surveys can be fairly consistent (Tunisia)
  • Disaggregated data is critical for LNOB and needs to be sourced from NSO, ministries, academia and civil society (Indonesia)
  • Creating specialized technical groups/committees of the Information System of the SDGs, formed by representatives from all ministries, to define the national official indicators for each SDG (Mexico)
  • Policy inventory useful for indicator development but also in itself, e.g. to see policy linkages (Uruguay, Tunisia)
  • Policy analysis shows which issues are not sufficiently addressed i.e. where policy action may be needed (Tunisia)
  • Thematic clustering of targets/indicators allows engagement with more stakeholders and deeper discussions (Uruguay)
  • Technical workshops/meetings with government, UN and civil society help assess not just data availability but also national relevance of indicators (South Africa, Tunisia, Uruguay, El Salvador)
  • Availability and openness to complementary indicators, which can be more relevant to the national reality (Mexico)
  • Working at the local level can be a good entry point for pilot initiatives due to its manageable scale (Tunisia, Argentina)
  • Conducting surveys of direct beneficiaries of activities related to SDG 16 can be an additional source of information as they often have substantially different opinions about what is happening in their communities than the experts or implementing agencies.
  • An analytical product based on all available SDG 16 data should also cover the COVID-19 pandemic period.
  • Include the SDG 16 monitoring mechanisms within the national monitoring framework for all SDGs.

Examples From Pilot Countries

Argentina

  • Used a national methodology to identify two national indicators for target 16.3 and one for target 16.6; the technical guidance was approved by the National Council for the Coordination of Social Policies.
  • Conducted an exercise at the sub-national level to adapt SDG16.

El Salvador

  • Grouped the 12 SDG16 targets into sub-thematic areas to guide the selection of three types of indicators: (1) globally agreed; (2) proxies for globally agreed; (3) additional national indicators;
  • In total, 30 global and alternative/proxy national indicators defined and selected;
  • Data-producing government entities provided baseline data for these indicators.

Georgia

  • Identified lead agencies and sources of data for all SDG 16 indicators;
  • Identified baseline data for 31 official indicators;
  • Capacity building of relevant governmental agencies in data collection and monitoring & evaluation.

Senegal

  • Analysis with the UN Rapid Integrated Assessment (RIA) model revealed that 3 of the 12 targets (16.a, 16.5 and 16.9) were fully aligned and 9 (16.1, 16.2, 16.3, 16.4, 16.6, 16.7, 16.8, 16.10 and 16.b) were partially aligned with the targets selected for measuring SDG 16.

For more information, see full case study.

South Africa

  • Comprehensive gap analysis of available national data relevant to the global SDG16 indicators;
  • Mapping of all CSOs working on SDG 16 issues and governance with a specific focus on those that worked on data collection;
  • The NSO organised consultations with these CSOs to identify thematic areas where non-official sources of data might be used to fill gaps in official data production;
  • NSO and CSOs jointly selected 2-3 national indicators to complement each global SDG16 indicator.

Tunisia

  • Defined first indicator list combining international and national indicators;
  • Conducted a data-gap analysis specific to SDG16;
  • Piloted the forthcoming UN SDG16 Global Indicator Survey to collect data related to 11 indicators of SDG16 in the region of Medenine;
  • Conducted a national survey on Governance, Peace and Security including a series of SDG16 indicators.

For more information, see full case study.

Mexico

  • Selected indicators (globally and nationally defined);
  • Grouped the 12 SDG16 targets into sub-thematic areas;
  • Conducted data collection;
  • Provided the NSO with a set of indicators agreed as a collaboration between three independent initiatives.

INEGI (2019). Panorama estadístico en México del ODS 16: promover sociedades justas, pacíficas e inclusivas [Statistical overview in Mexico of SDG 16: promoting just, peaceful and inclusive societies]. INEGI, Sistema de Información de los Objetivos de Desarrollo Sostenible.

For more information, see full case study.

Indonesia

  • Integrated quantitative and qualitative indicators in the monitoring framework
  • SDG16 targets were aligned with the National Development Plan, identifying 34 national indicators that matched (5), were proxies for (20) or were complementary to (9) the SDG16 indicators.

Indicators and Data Mapping to Measure Sustainable Development Goals (SDGs) Targets, Indonesia

Uruguay

  • Grouped the 12 SDG16 targets into four sub-thematic areas used to guide the selection of indicators. 
  • Conducted a Policy Gap Analysis, distinguishing three types of policy provision for each issue/area: (1) normative framework; (2) guidance and information; (3) solutions addressing the issue directly
  • Conducted an Indicator Gap Analysis linking indicators with ratified international treaties and distinguishing three types of indicators: (1) globally agreed; (2) proxies for globally agreed; (3) additional national indicators
  • For global SDG16 indicators classified as Tier 2 or 3, two types of alternative indicators were proposed to stakeholders: some represented slight adjustments of global SDG16 indicators to ensure their measurability in the Uruguayan context, while others were “new”, country-specific indicators drawing attention to issues of national importance left unaddressed by the global indicator framework.

Step 2 - Stakeholder Engagement

Guidance

In short:

  • Identify engagement mechanism:
      • Which existing frameworks for multi-stakeholder consultations can be leveraged?
      • If no suitable ones exist, which national authority can lead the process in an inclusive, participatory and accountable manner?
  • Ensure quality of engagement process:
      • Inclusion: Are those most left behind involved, and is the process accessible for those who face barriers?
      • Participation: Do stakeholders have access to information and influence in decision-making?
      • Accountability: Is the process transparent and are grievances addressed?

Tips:

  • Use consultations to build trust between government and civil society
  • Remember that in this new era of public policy formulation, where a variety of state and non-state stakeholders expect to be “co-creators” of policies and their associated programmes, the policy formulation process matters as much as policy content.
  • It can be useful to replicate the multi-stakeholder consultative structures already established by the OGP, or to build on consultation processes established during the intergovernmental process to draft the 2030 Agenda.
  • It may be necessary to tailor stakeholder engagement strategies to the specific interests of certain categories of stakeholders such as political parties, local governments and business actors, all of which were underrepresented in consultations in most pilot countries. The lack of engagement by local governments, especially at municipal level, can be particularly problematic when the time comes to collect data from the local level. Outreach efforts directed at these three underrepresented constituencies need to refer more specifically to the strategic value of national SDG16 data in advancing their specific interests (e.g. to inform political party policy platforms, business investment strategies and strategies to improve local service delivery), while also highlighting their own responsibilities for advancing SDG16.
  • Several countries found it useful to work with a national and/or international expert to enhance the robustness of the initial version of their indicator framework—notably, in terms of indicator relevance to targets and data collection feasibility—before presenting it to stakeholders. This intermediate “quality check” sharpened the focus of stakeholder consultations through prior identification of certain key issues for discussion. Higher quality indicator frameworks also tended to be more positively received by civil society actors, who were then more likely to be interested in partnering with state actors on data collection and monitoring of progress.
  • Leveraging the distinct skills and comparative advantages of different “types” of civil society entities is useful. While some CSOs enjoy strong community ties that can be leveraged to spread awareness of SDG16 and to validate and disseminate SDG16 monitoring results, others, such as research oriented CSOs and think tanks, may not and so should be involved from the outset in the design of the national indicator framework and the mapping of available data sources, including non-official sources generated by civil society actors.
Lessons Learned From Pilot Countries

Challenges

  • Engaging parliamentarians, political parties and local government (El Salvador, Uruguay, Indonesia)
  • Institutionalization of consultations (El Salvador, Uruguay)
  • Financial means to hold consultations, including at local level (Tunisia)
  • Agreeing on common ground to conceptualize every national problem (Mexico)
  • Collaboration with justice system institutions (Argentina)

Opportunities

  • Sharing introduction on SDG as part of invitation reduced information imbalance (Uruguay)
  • Linking SDG monitoring with M&E of the National Development Plan makes it easier to institutionalize consultations (El Salvador)
  • Consultations helped distinguish between technical and service delivery CSOs (South Africa)
  • National Statistical Offices are important partners that can play a leading role, but the involvement of other authorities holding administrative data is equally important (Mexico, South Africa)
  • Engagement with stakeholders requires efforts beyond one-time consultations and relies heavily on clarified expectations, trust-building and mutual benefit (Mexico)
Examples From Pilot Countries

Argentina

El Salvador

  • Held two separate consultations with civil society and the private sector to introduce and receive feedback on the proposed national SDG16 indicator framework. Participants worked in thematic groups and were asked to:
      • identify “specific Salvadorian issues” related to each global SDG16 target and propose additional related national indicators;
      • discuss their ongoing or future plans to collect SDG16-related data and suggest how CSOs and the private sector could be better involved in SDG16 monitoring efforts.

Senegal

  • In the SDG 16 monitoring process, Senegal is involving the Study and Planning Units (SPUs) of the relevant ministries as the basic government structure in this process. These units are organized into five sub-groups, each with a lead responsible for ensuring coordination among its members through monthly internal multi-stakeholder consultations, periodic monitoring of progress and the formulation of specific policy recommendations for each target.

SPUs have a key role to play in grassroots and technical facilitation. They are in charge of making the necessary updates and involving the local authorities in order to ensure that the actions and measurement data cover the whole territory.

For more information, see the full case study.

South Africa

  • Consultations with government representatives, experts and civil society to discuss ways to strengthen civil society participation in monitoring the national indicator framework developed by the NSO.
  • Developed a Training Manual and Activity Book to raise awareness of SDG16 among communities and provide them with the skills to monitor SDG16 progress.

Tunisia

  • Consultations with, and support to, a technical workshop with government, UN and civil society to assess data availability and the national relevance of SDG 16 indicators. Their input was used for the contextualization of SDG16 (including goal, target and indicator formulation) and the production of the first SDG16 national baseline study through participatory analysis.
  • Elaboration of a participatory analysis of SDG16 in the region of Medenine by a group of representatives from public authorities and civil society, based on the results of the SDG16 pilot survey baseline study.
  • Efforts at the local level to engage civil society and communities in a participatory analysis through spotlight reporting and policy dialogue on SDG16 in Medenine.
  • Participatory elaboration of a first progress report on SDG16 in Tunisia through a multi-stakeholder target group.

For more information, see the full case study.

Mexico

Multi-stakeholder consultations organised for:

  • Identification of key SDG16 issues by dimension;
  • Connection between the identified issues and the selected national indicators for SDG 16;
  • Identification of new sources of information and other possible stakeholders.

For more information, see the full case study.

Indonesia

  • Convened a series of workshops and focus group discussions to analyse how participants’ existing commitments align with SDG16 targets and to develop tools for monitoring how these commitments contribute to progress on SDG16.
  • To improve private sector participation in the SDG16 monitoring and evaluation process, special attention has been focused on linking it with the Philanthropy and Business Indonesia for SDGs (FBI4SDGs) initiative, through which private sector actors hold monthly meetings to coordinate and share information on the SDGs.

Uruguay

Held a cycle of four workshops organized around the four thematic areas of its SDG16 indicator framework. These consultations were organized by the national government, UNDP and the Uruguayan Center of Information and Studies (CIESU) and gathered more than 140 representatives from all three branches of government, academic institutions and civil society.

Step 3 - Institutionalisation / Scorecards

Guidance

In short:

  • Identify a ‘home’ for the SDG 16 scorecard:
      • Is there a broader or existing data visualization effort to link with to ensure visibility and ownership?
      • Can an open-access/free platform be used to ensure sustainability?
  • Agree on process and modalities:
      • Will all data producers feed in directly? What are the incentives and barriers?
      • Is the platform publicly accessible to ensure transparency?

Tips:

  • Visualisation makes data and indicators conversation-starters. Public and intuitive platforms, portals and scorecards are useful tools to kick-start and/or deepen national discussions around SDG16 and what it means in a given national context.
  • Strong ownership and active engagement at the most senior level of government is essential in securing the high-level attention and financial resources required if national SDG16 scorecards are to influence decision-making at the highest level and have meaningful impact on people’s lives on the ground.
  • Establishing electronic data portals is challenging. Countries that invested in the development of electronic monitoring systems (EMS) struggled to design systems that were able to simultaneously meet the multiple purposes envisaged for such systems, including improving inter-agency coordination in monitoring SDG16 indicators, enabling public access to data and supporting active public participation in the monitoring process. Separate systems might be needed in future to serve these various purposes.
  • Periodic monitoring is vital. A one-off baseline-setting exercise will not go very far in triggering policy action for the implementation of SDG 16. Setting up systems, e.g. scorecards, that ensure regular reporting on progress is essential if countries are to design effective national SDG 16 strategies and track their implementation over time.
Lessons Learned From Pilot Countries

Challenges:

  • Translating results into policy action (Georgia, Uruguay, Mexico)
  • Results not fully public (Georgia and Mexico)
  • Institutionalization of the SDGs indicators monitoring systems (El Salvador)
  • Disseminating and consolidating methodological tools that allow monitoring of progress in the implementation of SDG16 at all levels of government, and ensuring their sustainability over time (Argentina)
  • Producing empirical information to facilitate strategies towards the implementation of target 16.3 (Argentina)

Opportunities:

  • Focus on indicators (yes/no format) makes responding easier (Georgia)
  • Narrative part useful as it can include responsibilities and recommendations and ensures follow-up (Indonesia)
  • Traffic light system is a strong incentive for authorities to report their data (Indonesia)
  • Linking monitoring with national budget ensures continuity (Uruguay)
  • Alignment of scorecards with existing government regulations on M&E and the monitoring of other national strategies will facilitate buy-in and the institutionalization of consultations (Indonesia, El Salvador)
  • Public accessibility of scorecard results is as important as their availability (Indonesia)
  • Social media analysis could serve as a useful methodology for real-time monitoring of selected SDG targets (Tunisia)
  • Establishing a digital monitoring system (platform) with the involvement of local government is also useful to ensure national coverage for SDGs monitoring; it also promotes inclusiveness in the process, and ensuring capacity building at all levels is essential (Senegal)

 

Examples From Pilot Countries

Argentina

  • In 2019, Buenos Aires committed, through the New York Declaration, to submit its voluntary local review annually and is currently one of the pioneer cities in the world, with three submitted VLRs. Buenos Aires' VLR systematizes the city's contribution to the 2030 Agenda for Sustainable Development, evaluates progress, engages in an annual accountability process, and sets the government's short- and long-term priorities to build a more inclusive, sustainable and resilient city. The structure of the Buenos Aires VLR follows the guidelines established by the UN for national governments and reports on the objectives and themes prioritized by the High-Level Political Forum each year. The indicators for all the SDGs, however, are included in the annex of the report.

  • There is also an Open Justice Program focused on applying the principles of transparency and access to information. The portal ensures access to information on the legal sector. It has 61 open databases containing information on topics such as access to justice, transparency and the fight against corruption, gender and justice, and the Argentine penitentiary system. The program is currently working with more than 50 national and sub-national justice institutions to make data and information available to all.

  • Adapted SDG 16.3 by developing a comprehensive methodology to measure and understand unmet legal needs.

El Salvador

The scorecard linked the national SDG16 indicator framework to the main national policies and programmes of relevance, allowing for a combination of quantitative and qualitative information.

For more information, see pilot report.

Georgia

Scorecards for all nationalized SDG16 indicators were developed which:

  • Included national benchmarks: e.g. for target 16.1 (Significantly reduce all forms of violence and related death rates everywhere), indicator 16.1.1 (Number of victims of intentional homicide per 100,000 population), multi-stakeholder consultations in Georgia determined that it would be realistic to work towards a 10–15% reduction in the number of homicide victims per 100,000 population;
  • ranked each indicator according to its national policy relevance: indicators monitored in connection with an existing national policy document or strategy had “high policy relevance”; those not monitored in connection with a national policy but important for overall policymaking and/or reporting had “medium policy relevance”; and those not currently integrated into any strategic or policy document had “low policy relevance”;
  • were clearly communicated to both data producers and data users, as well as the broader public.
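
As a rough sketch, the benchmark-setting and relevance-ranking conventions above could be expressed as follows; function and field names are hypothetical, not taken from the actual Georgian scorecards:

```python
# Illustrative sketch only: names are invented, logic follows the conventions
# described in the text (policy-relevance ranking and a 10-15% reduction benchmark).

def policy_relevance(in_policy_document: bool, important_for_policymaking: bool) -> str:
    """Rank an indicator's national policy relevance."""
    if in_policy_document:
        return "high"    # monitored under an existing national policy/strategy
    if important_for_policymaking:
        return "medium"  # not in a policy, but important for policymaking/reporting
    return "low"         # not integrated into any strategic or policy document

def benchmark_range(baseline_rate: float) -> tuple:
    """National benchmark for 16.1.1: a 10-15% reduction from the baseline rate."""
    return (baseline_rate * 0.85, baseline_rate * 0.90)

print(policy_relevance(False, True))  # -> medium
print(benchmark_range(10.0))          # target band for a baseline of 10 per 100,000
```

Making the ranking rule explicit like this is one way to apply it consistently across all nationalized indicators.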

For more information, see pilot report.

Tunisia

  • Developed a scorecard, which categorizes indicators into three groups measuring: “results” (of state efforts to improve governance); “capacities” (of state actors to implement policies, legislation and programmes); and people’s “perceptions” (of progress in tackling any given issue).
  • A policy gap analysis supplemented the scorecard and mapped existing SDG16-related national strategies and policies onto the specific targets and indicators of the Tunisian Governance Goal.
  • Developed an SDG16 scorecard at the local level (region of Medenine), which will include information on existing data (mostly from the SDG16 survey), policies being implemented related to SDG16, bottlenecks hindering implementation and recommendations for SDG16 acceleration in the selected region.

For more information, see pilot report.

Mexico

  • Elaborated a scorecard template and dissemination strategy (communication and public accessibility), and established institutional or informal mechanisms for government and stakeholder consultation/participation in public policy decision-making, taking into consideration the results of monitoring the SDG 16 indicators/scorecards.
  • Proposed a set of national indicators for SDG 16, with technical details and a source of information for each indicator.

For more information, see pilot report.

Indonesia

Developed a traffic light system (scorecard) building on its strong legacy of development monitoring. The scorecard incorporates three levels of data sources (global, international and national indicators) and both quantitative and qualitative (narrative) assessment, which together track the status of activities, measure performance and demonstrate achievement (figure 5). This type of scorecard enables more than quantitative measurement: stakeholders can also report, in a brief, simple but comprehensive manner, on factors that support or hamper the achievement of specific programmes and targets, on the parties responsible for the implementation and success of each programme, and on recommendations and follow-up to improve the programme's future success.

Developed a technical SDGs Monitoring Guideline to assist stakeholders in monitoring and evaluating their SDG16 performance and achievement at national and local levels, including indicators and narrative based on qualitative and quantitative data. The Guideline was developed through a stakeholders’ workshop, officially adopted by the Ministry of National Development Planning and stipulated in a ministerial decree. To finalize development of the monitoring tools and instruments, a trial was conducted on three indicators, selected according to the following criteria:

  • (i) the level of difficulty they present for data collection, unit of analysis and data disaggregation, especially from a government administrative point of view (up to provincial and district levels);
  • (ii) the level of public participation and monitoring and evaluation in the development process to date; and
  • (iii) the availability of data and its sustainability in the future.

The Monitoring Guideline includes a narrative part, consisting of responsibilities and recommendations, considered key by all actors to ensure follow-up.

For more information, see pilot report.

Uruguay

The template of the Scorecards included indicators and narrative based on qualitative and quantitative data.  The Uruguayan Center of Information and Studies (CIESU) designed scorecards that incorporate global indicators (i.e. official SDG16 indicators), supplementary indicators (i.e. global SDG16 indicators slightly adjusted to optimize measurement in the Uruguayan context) and complementary indicators (i.e. additional, country-specific SDG16 indicators measuring aspects not addressed by the global indicators).

Since supplementary indicators are only a “variation” of global indicators, CIESU decided to display these two types of indicators in the same table and to present complementary indicators in a separate table.

The scorecards showed trends in the evolution of indicators over time using ascending, neutral or descending arrows. The global tier classification was extended to national indicators and a colour code was used to classify indicators as Tier 1 (green), Tier 2 (yellow) and Tier 3 (red). In addition, a narrative describes the main actions taken to accelerate progress on each target and lists the responsible actors.
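
The display conventions just described (tier colour codes and trend arrows) can be sketched in a few lines. The structure and names below are illustrative only, not CIESU's actual implementation:

```python
# Illustrative mapping of the Uruguayan scorecard conventions described above:
# global tier classification -> colour code, trend -> arrow.

TIER_COLOURS = {1: "green", 2: "yellow", 3: "red"}
TREND_ARROWS = {"ascending": "↑", "neutral": "→", "descending": "↓"}

def scorecard_cell(indicator: str, tier: int, trend: str) -> str:
    """Render one scorecard row as text with the colour code and trend arrow."""
    return f"{indicator} [{TIER_COLOURS[tier]}] {TREND_ARROWS[trend]}"

print(scorecard_cell("16.3.2 Unsentenced detainees", 1, "descending"))
# -> 16.3.2 Unsentenced detainees [green] ↓
```

Keeping tier, trend and narrative as separate fields makes it straightforward to display global/supplementary indicators in one table and complementary indicators in another, as CIESU did.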

For more information, see pilot report.

Tools