Substantive Rationality and Procedural Rationality on a Collision Course
The 2019 Kenya Population and Housing Census has earned a reputation as the most technologically enabled census in Kenya’s history. Besides paperless data capture and online transmission mechanisms, the census used modern cartographic mapping enabled by a Geographic Information System (GIS) — a popular and powerful digital nervous system made up of human capital, location-based datasets and their descriptions, mapping and data-processing software, computer hardware, and organisational procedures. The first impression is that the results obtained this way are more accurate than the outputs of previous manual exercises. On the contrary, the Kenya National Bureau of Statistics (KNBS) has been on the receiving end of dissenting voices since it released the population data. The major backlash has come from politicians claiming their ethnic groups are underrepresented in the numbers. These acrimonious reactions set substantive rationality and procedural rationality on a collision course.
Substantive rationality focuses on refining goals and the techniques for achieving them rather than on the process. Procedural rationality focuses on the logic of the process leading to outcomes, especially within multi-stakeholder and multi-criteria settings of negotiation, dialogue, and debate. While the application of refined techniques and advanced technology enhances substantive rationality, participatory processes that ensure inclusiveness and ownership enhance procedural rationality. Assuming no ill intentions and sticking to matters of principle, the key question from a scientific perspective concerns the extent to which the entire exercise, from conception, planning, and implementation to the reporting of the census outcome, integrated both substantive and procedural rationality. From here, every informed debate should follow and generate major lessons on national data exercises for Kenya and the rest of Africa.
Zooming in on the Policy Details
Government’s role in institutional regulation can be divided into policymaking and policy execution. Policymaking is the conceptual element of setting the governing norms. Policy execution is the technical element of maintaining institutional functioning within the set norms. In 1965, Sir Geoffrey Vickers made a seminal contribution to the fields of organisational decision-making and the regulation of complex systems in his famous book, “The Art of Judgment”. The publication introduced three compelling dimensions to policy processes: direction, coherence, and continuity. This landmark triangle of key policy tenets informs the standard definition of policymaking as an agenda-setting process with a set of norms for giving direction, coherence, and continuity to the course of action. From this viewpoint, the magnified question is: To what extent do the guidelines and standards for the capture and management of vital national statistics ensure direction, coherence, and continuity in the framework of policy, planning, and action?
Zooming out from Details to the Big Picture
If the trending Kenyan description of political disagreements as “noisy, messy, and leaving casualties”, as one politician put it, is anything to go by, then the equivalent process in logical scientific discourse is the delicate handling of procedural rationality. Sharing their experience in 2005 through an article in the Journal of Environmental Management on designing computer-based models for integrated environmental research, B. S. McIntosh and colleagues recommended integrated models which effectively support both substantive and procedural rationality. Such models must have interfaces that accommodate both the tacit and explicit knowledge of different user groups. They argued that although computer-based models have been suitable for decision contexts focused on substantive rationality, they have been weak at handling agenda-setting processes in policy and participatory planning contexts. They also acknowledged that the latter case is typically complex, “messy”, and focused on procedural rationality. From this viewpoint, the revealed panoramic landscape raises the question: How prepared are the national agencies which are custodians of key data to ensure active public participation beyond mere formal consultations?
Zooming in on the Nature of Errors and Models
Errors are systematic if they are cumulative and expressible by mathematical equations, as is the case with faulty measuring instruments or erroneous mathematical models used for computation. Errors are gross if they are obvious blunders and mistakes, as is the case with carelessness, inexperience, or outright manipulation. But errors can also be random if they are normally distributed, eventually producing a self-compensating bias which is corrected by taking the mean value.
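The contrast between random and systematic errors can be illustrated with a minimal simulation. The sketch below uses purely hypothetical numbers (a “true value” of 1,000 and an invented instrument bias of 3) to show that averaging many readings cancels random errors but leaves a systematic bias untouched, which must instead be modelled and subtracted.

```python
import random

random.seed(42)

true_value = 1000.0

# Random errors: normally distributed around zero, so they largely
# cancel out when many measurements are averaged.
random_readings = [true_value + random.gauss(0, 5) for _ in range(10_000)]
mean_random = sum(random_readings) / len(random_readings)

# Systematic error: a fixed instrument bias that averaging cannot
# remove; it must be modelled and corrected mathematically.
bias = 3.0
biased_readings = [r + bias for r in random_readings]
mean_biased = sum(biased_readings) / len(biased_readings)

print(f"mean with random errors only: {mean_random:.2f}")   # close to 1000
print(f"mean with systematic bias:    {mean_biased:.2f}")   # close to 1003
print(f"after bias correction:        {mean_biased - bias:.2f}")
```

Gross errors, by contrast, are not amenable to such statistical treatment; they are caught by inspection and validation checks against independent evidence.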
There are, however, clear rules governing statistics and the mathematical models used for estimations and projections. Simplicity rules, since the convincing power of any explanation or model lies in minimising the number of assumptions required to achieve the stated goal. Needlessly complicated assumptions go against the scientific principle of modelling: be as simple as possible and only as complex as necessary. We learn the same from the law of parsimony (lex parsimoniae), or Occam’s razor, a maxim supporting the choice of the option with the fewest assumptions when faced with competing hypotheses or explanations. “All models are wrong; some models are useful,” so goes a famous statistical aphorism variously attributed to the American engineer and statistician William Edwards Deming (1900–1993) and the British statistician George Edward Pelham Box (1919–2013).
From this viewpoint, three simplified hypotheses emerge: 1. Gross errors in the census data will always be obvious and detectable against compelling evidence. 2. Systematic errors in the census data arising from past imperfections in measurements and base data should be cumulative and adjustable using mathematical formulae. 3. Assuming the worst case, that regimes are in the business of manipulating census data to meet short-term political objectives, then the glimmer of hope is that the randomness of political realignments over time will lead to a self-compensating outcome in the long term.
Zooming in on the Population Figures
For any country, a population and housing census is part of the critical statistics for planning and resource allocation. Errors should, therefore, be minimised as much as possible. At best, these population figures can only be estimates, not exact values, scientifically evaluated by their closeness to the unfathomable “true value”. In 2008, the development of “Nairobi Metro2030 Strategy – A World Class African Metropolis” became a key ambition of the political regime of the time, complete with a dedicated Ministry of Nairobi Metropolitan Development. Out of curiosity as the news of the census results broke, I went back to the old projections I did in 2008. In that year, as a member of a special technical taskforce, I took on the task of carrying out population growth projections up to the year 2030 for the expansive metropolitan extent. The projections, aggregated by county and based on the data sources indicated, are presented in the table. Taita Taveta County has been added to the list based on a later projection in 2018, driven by my personal research interests.
In the resulting Nairobi metropolitan growth strategy document, the population projections indicated that the southern metropolitan region (Kajiado County) would grow faster, absorbing Nairobi’s working population as influenced by land and housing market dynamics.
An objective assessment of the projected figures against the actual census results from this sample can point to the general trends.
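The projection figures themselves are in the table, but the underlying arithmetic of such county-level projections is worth making explicit. A common approach is geometric (compound) growth, P_t = P_0 × (1 + r)^t. The sketch below is illustrative only, with a hypothetical base population and growth rate, not the taskforce’s actual data or method.

```python
def project_population(base: float, annual_growth_rate: float, years: int) -> float:
    """Geometric (compound) growth projection: P_t = P_0 * (1 + r)^t."""
    return base * (1.0 + annual_growth_rate) ** years

# Hypothetical figures for illustration only: a county of 500,000
# people growing at 3% a year, projected 22 years ahead (2008 to 2030).
base_2008 = 500_000
projected_2030 = project_population(base_2008, 0.03, 22)
print(f"Projected 2030 population: {projected_2030:,.0f}")
```

Small changes in the assumed growth rate compound dramatically over two decades, which is one reason projected and actual census figures can diverge without any error in the method itself.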
Zeroing in on the Key Lessons
The strong dissent against what should otherwise be the most accurate census results, conducted in a hi-tech environment, provides several key lessons. Substantive rationality, though necessary, is insufficient on its own to defend public policy processes and persuade stakeholders to own them. Technical criteria must be married to procedural criteria to score convincingly on transparency, inclusivity, and ownership. All stakeholders, whether technical or non-technical, political or apolitical, have a crucial role to play — not least the fourth estate.
As the wave of digital transformation continues to influence changes in the modus operandi, governance models must rethink and reengineer participatory approaches towards empowering civic engagement and ensuring continuity. This lesson applies to digitalisation efforts in Africa’s land and mining sectors as well, being key examples of the sectors attracting competing multi-stakeholder interests and public attention.
In hindsight, the now-defunct Ministry of Nairobi Metropolitan Development produced a wealth of research outputs, but being a creation of Kenya’s 2007–2012 political regime, it was short-lived. This outcome offers key lessons on ensuring continuity. As foresight for Kenya’s mining sector, the Ministry of Mining, a similar creation of the priorities of a new political regime from 2013, should institute measures for continuity and enhance procedural rationality to successfully operationalise Kenya’s Mining Act of 2016 — already being referred to as Africa’s most modern and progressive mining law.
My take remains. For optimal utilisation and societal impact of advances in research and technology, African governments must heed the following keywords in their blueprints of action: coherence, continuity, direction, procedural rationality, and substantive rationality.
By Nashon Adero
The author is a youth mentor, writer, and a lecturer in the School of Mines and Engineering, Taita Taveta University, Kenya.