This article is also available in French.
An analysis of implementation experiences in three francophone African countries
The staff of Ndokovi Integrated Health Centre in Cameroon are ecstatic: they have been awarded second place in the ‘Quality Challenge’ organised by GIZ in the Bafang District, symbolised by a ‘Quality flower’ certificate. After a year of teamwork focussed on improving their health centre’s hygiene and reception of patients, all personnel, from the cleaners and guards to the nurses and midwives, are beaming and brandishing the new equipment the centre has been awarded as its prize.
Quality in healthcare would seem essential, yet all too often remains elusive. WHO defines quality of care as ‘the degree to which health services for individuals and populations increase the likelihood of desired health outcomes.’ Staff motivation and ability to uphold prescribed standards of care are clearly central requirements for ensuring quality health services. The search for quality is a priority of all health ministries – and many thought they had found the answer in Guinea in 2003.
Guinea’s Concours Qualité (CQ) aimed to stimulate motivation and teamwork
In 2003, at the request of Guinea’s Ministry of Health, German development cooperation introduced an innovative ‘Concours Qualité’ (CQ) – also known as ‘Systemic Quality Improvement’ (SQI) – among the districts and health facilities in the three regions where it was operating. The objective of this dynamic strategy, according to the former national director of the German-supported project Dr Mohamed Lamin Yansané, was ‘to create a quality culture in health structures through positive competition in order to increase utilisation.’
Modelled on Deming’s PDCA cycle (Plan, Do, Check, Act), participating health facilities, as well as district and regional teams (referred to collectively as ‘structures’), went through a series of steps: 1) self-evaluation by the staff with community participation, 2) audit and feedback (‘contre-monitorage’) by an external team, 3) development of an improvement plan, and 4) ranking among same-level entities (e.g. health centres, district hospitals) and award of prizes, before embarking on a new one-year cycle starting with implementation of the improvement plan. Evaluation and scoring guides were developed for each level of care based on the Ministry of Health’s norms and procedures, covering its six quality dimensions: technical competence, customer satisfaction, continuous improvement, co-management/community participation, functionality of the health district, and economic behaviour.
The best-ranked structures received a certificate and a monetary prize, to be spent at their discretion. The participatory self-assessment proved particularly motivating, fostering teamwork and creating pride and respect. Adherence to rules and regulations improved. The approach was popular: in the course of five quality cycles between 2003 and 2007, the number of participating structures rose from 128 to 245.
In 2006, CQ was adopted as official national strategy and transferred to the Ministry of Health. However, after the end of German support, the Guinean government managed to organise only one more quality cycle before abandoning the approach as too onerous and costly. One issue was the exhaustive nature of the assessment, with 46 to 80 criteria depending on the level of the health structure. Financing and organising the external audit represented a major bottleneck. Above all, as the country slid into a period of social and political instability, the chronic lack of resources in the health sector deprived even motivated health providers of the necessary ingredients for quality services. Just before the handover, even after four editions of the CQ and despite the positive reinforcement of teamwork, assessors noted a lack of progress in the quality dimensions ‘customer satisfaction’, ‘technical competence’ and ‘accessibility/availability/continuity’ of care.
The CQ became an international standard that continues to inspire
Inspired by the approach, by 2010 several countries, including Morocco, Cameroon, Yemen and the Democratic Republic of Congo, had introduced their own versions of the CQ, as documented in this case study. Morocco continues to this day to organise the quality contest at health centre level. Cameroon abandoned the approach after one ambitious cycle covering all health structures in the entire country.
But the core idea of promoting motivation for quality improvement through self-assessment balanced by external verification and healthy competition among participating health structures continues to fascinate, as witnessed by three pilot initiatives, all launched in 2018, that aim – each in its own way – to ‘iron out the wrinkles’ that compromised the original CQ. Implemented with support from the Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ) GmbH on behalf of the German Federal Ministry for Economic Cooperation and Development (BMZ), the ‘Quality Challenge’ in Cameroon, Togo’s ‘Quality Approach’ and Guinea’s ‘Monitorage Amélioré’ (Improved Monitoring) are current successors to the once-influential Concours Qualité.
Cameroon’s ‘Quality Challenge’: Quality Improvement can be simple and fun!
Undaunted by the fiasco of Cameroon’s earlier attempt to introduce CQ, but taking into account the lessons learned from that experience (e.g. rolling out countrywide without a pilot phase, trying to assess all structures and quality dimensions at once), the team of the Family Planning and Health System Support for Resilience Project (Pro PASSaR) and their partners chose a reduced-scale and light-hearted approach (‘gamification’) to encourage staff motivation.
Within the project’s two intervention regions, the Quality Challenge’s first one-year cycle was piloted in three districts and targeted only health centres, up to 15 per district. Rather than assessing the whole gamut of healthcare standards, the Quality Challenge retained only two priority dimensions: service to users and hygiene, resulting in just 11 criteria for health centre teams to focus on. For each district the project hired a Facilitator with a health background to provide on-site mentorship and capacity-building in the participating facilities. This eliminated costly off-site training and ensured that all staff, including support personnel, received capacity-building targeting the situation of their own health centre. The community, via the facilities’ and districts’ Health Committees, provides key support to the process, participating in the health centres’ self-assessments, in implementation of their improvement plans, and in the final jury which assesses progress made and awards prizes. Rewards are distributed to all participating facilities, proportionately to their ranking, and consist of equipment corresponding to needs the health centres have expressed to the jury.
Achille Christian Bela, Technical Advisor in charge of Quality Management at Pro PASSaR, explains: ‘The Quality Challenge aims to foster intrinsic motivation on the part of the health facility teams through their freedom to adhere to the process, the teamwork that highlights the contributions of technical staff, support staff and the community, the fun of the competition and the satisfaction of having been able to achieve change on their own.’ The validated results of the first and second self-assessments by the 41 health centre teams indeed show dramatic improvements in hygiene and reception of users in the vast majority of facilities. For example, in service quality, for the indicator ‘Access to sufficient and appropriate information’, 87.8% of health facilities were initially at quality level 1 (the lowest). At the final self-assessment, 73.17% had progressed to higher levels of quality (2, 3, 4), including 48.78% at level 4. In hygiene, for the indicator ‘Hygiene equipment available and in good condition’, 92.68% of the health facilities assessed themselves at level 1 at the start. At the final self-assessment, 60.97% had moved to higher levels of quality, including 34.1% at level 4.
The Quality Challenge concept reflects Cameroon’s current situation. The Facilitators were hired to compensate for limited availability of the district health teams in the context of Cameroon’s ongoing decentralisation reform, where Municipalities are slated to take over responsibility from district-level technical teams (who will retain an advisor role). In fact, the next cycle of the Quality Challenge (in six districts) is to be organised under the auspices of the respective Municipalities, whom the project hopes to persuade to hire their experienced district Facilitators.
Togo’s ‘Quality Approach’: Involving community and sharing good practice via WhatsApp
In Togo’s Kara region, the Project for Strengthening the Health System, Reproductive Health and Sexual Rights (ProSanté) and its partners are piloting a quality improvement approach that the Ministry of Health plans to roll out as national policy.
Mawuko Kodjo Djoko, the GIZ project’s Technical Advisor in charge of the implementation of the quality approach, relates: ‘When we started, we found the health facilities in a run-down state. We had to provide basic equipment to the 30 health centres we selected, for them to be able to implement a quality system. We also revitalised the community-based Health Committees, which are officially responsible for managing the cost-recovery funds. We trained them on their roles and responsibilities, and they got down to work.’
Taking a cue from earlier Quality Assurance approaches, Quality Circles including community members were established in the targeted health centres, as well as at district and regional level. To accelerate change, the new Quality Approach started with a quarterly rhythm, alternating between internal assessments (by the Quality Circle) and external assessments (by the next higher hierarchical level), each immediately followed by development and implementation of an improvement plan. All health providers in the targeted structures received extensive off-site training on key clinical and management topics, and, like Cameroon’s Facilitators, Quality-of-Care Assistants recruited by the project support the health facilities on-site in applying the learnings in their work environment.
As in Cameroon, the community has come to play a major role, including through monetary or in-kind contributions to implementing the facilities’ improvement plans. The assessments measure all dimensions of healthcare quality (health centres are evaluated on 217 criteria), and targeted structures at all levels have shown significant improvement between 2018 and 2021. Health centres’ average score increased from 45% to 78% between the first quarter of 2019 and the last quarter of 2020. Quality management in the region’s seven District Health Teams improved from an average of 46% (range 26%-60%) in the second semester of 2018 to 69% (56%-75%) in the second half of 2020. The regional team’s score rose from 61% in the second semester of 2018 to 69% in the first semester of 2021. A user satisfaction survey is conducted each year – and has now been extended to include service providers.
Although prizes for quality improvement are currently not awarded, the Quality Approach has been met with enthusiasm: ‘Staff have understood that they are at the centre of the approach: this leads to positive competition among health centres,’ says Mr Djoko. Using their mobile phones and the free WhatsApp application, the 30 health centres initiated a ‘Quality of Care’ platform where they share their good practices and learn from one another.
Despite the prospect of this Quality Approach being adopted as national strategy and of the pilot being extended to 50 more health centres, major challenges remain, including the low involvement of the district health teams, the lack of integrated, supportive supervision, and the destitute state of most health structures.
Guinea: ‘Improved Monitoring’ revitalises a routine quality tool even older than CQ
Guinea’s Ministry of Health, emerging from the Ebola epidemic in 2015, wanted a sustainable quality improvement strategy to accompany its brand-new National Health Development Plan (PNDS) 2015-24. Mindful of the limitations encountered by the CQ and other earlier approaches, health stakeholders agreed that the familiar and already ongoing ‘Monitorage’ (monitoring) offered the most promising basis for developing a single harmonised, cost-effective and simplified quality improvement tool for the entire country.
The ‘Monitorage amélioré’ (MA – improved monitoring) approach is being piloted in 24 districts (totalling 230 health facilities) in the 5 regions where GIZ implements the Programme de santé de la reproduction et de la famille (PSRF) / Programme d’appui au renforcement du système de santé (PASA2). Despite its different origin, the MA has added features that bring it closer to the CQ model. Yaya Souaré, Coordinator of PASA2, explains: ‘Where the original Monitorage was focussed on quantifiable results, MA is equally interested in how the results were obtained, adding a significant number of process and qualitative indicators. All five quality dimensions identified in the PNDS are assessed, but with a reduced number of questions compared to CQ. For more efficient data management we are developing an online platform, MApro, which will be interoperable with DHIS2. The pilot includes a scientific evaluation of the implementation process to build the evidence base, which is to continue during the Ministry’s planned scaling up.’
MA has introduced a cycle similar to the CQ, but over six months rather than a year, starting with self-assessment by the health structure, proposal of an improvement plan, and external audit: the ‘Contre-monitorage’. However, no contest is organised and no prizes are awarded. Nonetheless, health managers report that MA has contributed to an improved working environment for service providers and better healthcare delivery, and other districts not included in the pilot have introduced the approach at their own expense.
By adding the external audit to the classical monitoring, MA has increased its cost and logistical complexity. The partners are weighing different options to reduce that cost, e.g. by bringing in assessors from a neighbouring district or health facility rather than from farther away, and the project is negotiating with the Ministry for integration of the implementation cost of MA into the national budget.
Procedural differences among the approaches highlight different quality objectives
All three of these current quality improvement initiatives share the cyclical organisation of the original CQ and its foundational triad – self-assessment, external verification and an improvement plan – although their order and spacing differ. CQ ended its cycle with the improvement plan, while Togo and Cameroon introduce it right after the first self-assessment. In Cameroon’s model, the first external verification comes at the end of the cycle, by the awards jury; in Togo and Guinea the external verification follows hard on the heels of the initial self-assessment and contributes to refining the improvement plan or its implementation. Specific similarities and differences between the four approaches are summarised in the comparative table below.
Comparison of the original ‘Concours Qualité’ with 3 derived quality improvement approaches

|Approach||Concours Qualité (CQ) / Systemic Quality Improvement (SQI)||Challenge Qualité (CQ)||Approche Qualité||Monitorage amélioré (MA)|
|Country (regions)||Guinea (Faranah, Labé & Mamou regions)||Cameroon (Adamaoua & West regions)||Togo (Kara region)||Guinea (Faranah, Kindia, Labé, Mamou & N’Zérékoré regions)|
|Supported by||Programme Santé et Lutte contre le Sida, GTZ||Family Planning and Health System Support for Resilience Project (Pro PASSaR), GIZ||Project for Strengthening the Health System, Reproductive Health and Sexual Rights (ProSanté) GIZ||Programme de santé de la reproduction et de la famille (PSRF) / Programme d’appui au renforcement du système de santé (PASA2), GIZ|
|Objective||‘Improve the quality of management and care in health facilities to increase utilisation’||‘Contribute to improving provision of healthcare and services in the target health facilities’||‘Introduce elements of quality management in the health services of the project region’||‘Contribute to improving health system performance to achieve the health goals defined in the PNDS’|
|Model||Donabedian, PDCA||Deming (PDCA)||PDCA||Donabedian, PDCA|
|Coexists with other QM approaches/mechanisms||COPE/USAID (Client-Oriented, Provider-Efficient); facilitating supervision; monitoring; Action-Research (Faculty of Medicine, Univ. of Conakry)||Control Brigade (Regional Team); supervision (District Health Team); Performance-Based Financing (PBF)||‘Integrated supervision’ by district management teams in principle, but usually separate supervision of each of the vertical (donor-funded) programmes by ‘focal points’ at regional and district level||‘Integrated supervision’ by all levels (central, region and district) in principle; more often supervision by different vertical (donor-funded) programmes, e.g. SBM-R (Standards-Based Management & Recognition, JHPIEGO/USAID)|
|Responsible partner structure (MoH plus…)||District, Regional Health Teams (but the project seen as mainly responsible)||District, Regional Health Teams (to be transferred to Municipalities)||District, Regional Health Teams||Directorates (central level), District, Regional Health Teams, community health structures and technical and financial partners|
|Context (political, socio-economic)||Socio-political instability during and after introduction of CQ||Decentralisation in process: transfer of administrative responsibility to Municipalities||Implementation of the National Health Development Plan (2017-2022) focused on the quality of services for UHC||Socio-political instability, challenges dealing with Ebola and COVID epidemics|
|Structures targeted||Health centres, hospitals, district health teams, regional teams (from 128 total in 2003 to 245 in 2007)||Health Centres (HC): 41 in 3 districts||Health Centres (30+50), 7 District Health Teams, 1 Regional Health Team||24/38 health districts: 230/491 health facilities (HF), including HCs, hospitals|
|Voluntary or chosen?||Voluntary||Voluntary||Selected (criteria: population, human resources, service utilisation etc.)||All structures in chosen district|
|Community implication||COSAH (Comité santé-hygiène) members participate in self-assessment and in contre-monitorage (external verification)||COSA (Comité santé): material, financial, labour, planning; participate in self-evaluation of HC team||COGES (Comité de gestion): strengthen administrative and financial management in HF, communicate with population, participate in self-assessment (members of quality circle), labour||COSAH members participate in self-assessment and in contre-monitorage (external verification)|
|Technical guidance by||The Quality focal person from the MoH and 3 Technical Assistants of GTZ (Quality, Management and Informatics)||3 Facilitators with health background hired by project (1 per district)||10 Quality Care Assistants with health background hired by project (1-2 per district)||Implementing committee consisting of 5 representatives from central level, 2 experienced District Health Directors, plus the scientific team|
|Digital support||An application called ‘CQpro’ was developed using Epi Info to manage data collected. This was a stand-alone system.||Tablets for Facilitators: data collection, compilation, feedback||Tablets for surveys||Remote data collection tool MApro (Excel spreadsheet with incorporated macros). A MApro online electronic platform with features of interoperability with DHIS2 being developed.|
|Type of capacity building||Group training of service providers on Quality Management cycle, on-site training of service providers during external verification (audit) on norms and procedures and self-assessment techniques, as well as during accompaniment in the implementation of improvement plans.||On-site coaching by Facilitators: 339 HF staff members trained on-site over 12 months (each HF visited every 1-2 months)||Training (technical and administrative) by regional trainers’ pool off-site (groups of max. 15 due to COVID) of health providers and of COGES members, follow-up on-site by Quality Care Assistants||Training of trainers; training of healthcare providers / managers on quality assurance and MA; on-site training of healthcare providers during external verification and integrated supervision; and group training on specific themes identified by improvement plans.|
|Duration of Quality cycle||Yearly, later every 2 years (total 4 cycles before end of GTZ project)||Yearly||Every 3 months||Every 6 months (MA to alternate with supervision, which is more a financial control)|
|Self-evaluation (assessment grid) by||HF team including community representatives supported by COPE assistants, based on monitoring report, COPE analysis, etc.||HF team including support staff, COSA associated; after implementation of improvement plan, 2nd self-evaluation||HF team plus COGES (each semester)||HF team with community (sometimes supported by District, Regional and Central Health Teams)|
|Audit/verification/contre-monitorage (same assessment grid) by||A team made up of representatives of health insurance cooperatives, of the community-based Management Committees (COSAH) and health professionals from the central and deconcentrated levels: audit or contre-monitorage||An external jury at end of cycle double-checks HFs’ self-evaluations and ranks them based on current excellence and degree of improvement. 1 representative each from GIZ, Regional Health Team, Regional Health Promotion Fund, District Health Team, and 1 community member of the district-level COSA (+ 1 representative of the Municipality in next cycle)||Quarterly audit by next higher hierarchical level (3 months after self-evaluation); satisfaction survey yearly (of users, of health staff)||‘Contre-monitorage’ by external team composed of 1 District Health Team member, 1 ‘peer reviewer’ (e.g. from a nearby hospital, HC or health district), 2 community representatives (of the HF’s COSAH) and a partner representative (e.g. GIZ)|
|Development of quality improvement plan by HF||After each external evaluation (audit)||After self-assessment and analysis of challenges supported by facilitator||Quarterly: after each (internal or external) evaluation||After each self-assessment, adapted after external assessment (contre-monitorage)|
|No. of topics/ dimensions||6 quality dimensions, 2-10 aspects each, 4 questions per aspect||2 dimensions of quality: reception in the health facility and hygiene. Additional criteria related to human rights and prevention of COVID will be integrated into these 2 dimensions in the next cycle.||7 quality dimensions (efficacy, safety, person-centred, timely delivery, equity, integration, efficiency)||All aspects of healthcare delivery are monitored through the 5 quality dimensions of the National Health Development Plan (PNDS)|
|No. of questions/criteria total||Per type & level of structure: 59 for HC, 80 for hospital, 46 for district team, 50 for regional team||11 target indicators||217 for Health Centres, 104 for District, 104 for Region||HC: 21 quantitative and 14 qualitative indicators; hospitals: 45 quantitative and 23 qualitative indicators; DPS: 17 quantitative and 12 qualitative indicators|
|Scoring system (Indicators of Process or Result?)||Score from 0 to 4 for each question. The score obtained in audit adjusts that of self-assessment.||Measures taken to achieve and maintain desired criterion: insufficient (1), basic (2), intermediate (3), high (4) and optimal (5)||Performance, percentage||MA rates not just results but also process and input quality elements. A scale of 0 to 4 is used to rate each indicator. Health facilities are ranked according to total score accumulated.|
|Fostering competition to stimulate quality||Competition with rewards/prizes||Competition with rewards/prizes||Best practices shared in regional meetings and in WhatsApp group (Quality contest planned but not implemented)||Positive competition amongst HF – ranking. Best practices shared through financed study trips (local and international)|
|Type of reward||Money given to the head of the health structure, used at their discretion; certificate to display in HF||Health equipment given at end of cycle to all participating centres, proportionate to improvement/score and responding to needs expressed (to the jury) by the HF||(At start of cycle all chosen HF receive basic equipment essential for the implementation of a quality system.) COVID fund will be used for a QI bonus (material investment) to the HCs that progress the most||Certification by the Regional Director of Health (DRS); sharing of ranking in forums and technical health committees at district and regional level; financing of microprojects|
|Quality committees||Quality teams at regional and district level to supervise implementation of QI plan; facilitating supervision to support QI activities||District level: Quality committees (include community, Regional Health Promotion Funds): responsible for M&E of the initiative||HF: Quality circles (include community); District (& Region): Quality Management Councils (coincides with the District Health Team)||MA technical implementation team (update tools); ‘Comité de qualité’ at central & regional level|
|Other actors||Training schools, researchers, University of Conakry||Regional Health Promotion Funds, Training schools||Midwife training school (Ecole Nationale des Sages-femmes de Kara); University of Liverpool training HC staff on newborn health||Training schools, researchers (‘scientific evaluation’)|
|Financing HF improvement plan||Financial awards received by well ranking HF, local resources (HF) and external financing by devel. partners||Mobilise local resources at HF level (including from community) to finance the improvement plan||Mobilise local resources at HF level (budget from cost recovery) to finance the improvement plans||Local resources of HF, communities and technical and financial partners|
|Sharing of good practices||Through the competition and the closing ceremonies||Through the competition and the closing ceremonies||Quarterly restitution meetings organised by regional level with all HF including community; sharing via WhatsApp group of selected HFs||During Technical Health Committee (district and regional) meetings, study trips, and peer review during contre–monitorage (external verification)|
|Estimated costs||Per HF: 320 Euros, ½ day (5 h 14 min 24 s) of work in 2007||144,000 Euros from the setting up of the concept (2018) to the awards ceremony (2020): international consultant, training of the 41 health facility teams in QC, salaries of the Facilitators, acquisition of the rewards for the HF, organisation of the award ceremonies|| ||Nationwide roll-out of MA estimated to cost ca. 3,000 EUR per health facility|
|Sustainability plan||Adopted as official national strategy and transferred to MoH in 2006. 2007-8: Guinea government managed only 1 quality cycle, but found the process too onerous and costly to continue (particularly audit: transport and per-diems)||Transfer facilitation skills to district, to Municipalities’ technical team (hope they will hire the project facilitators). Transfer responsibility to Municipalities in same intervention regions, with District Health Team as advisor; introduce QM module in training schools for health personnel. Updating self-assessment grid to include new topics.||Quality Care Assistants transfer skills to district Quality Managers. The regional QM approach piloted in Kara region has been adopted as Togo’s national QM strategy, to be rolled out starting in 2022||Adopted by Guinea’s MoH, to alternate (6 months) with supervision, waiting for official roll-out decree. MoH has been requested to integrate funding of MA in national budget. Suggested to use nearby district representatives as peer reviewers to reduce contre-monitorage costs.|
All four approaches are interested not only in results but in how they were attained (the quality dimension). Yaya Souaré gives an example: ‘For skilled birth attendance, conventional monitoring only looks at the rate, but MA also looks at the use of resources such as the partogramme and at the process, e.g. quality of reception, compliance with protocols.’
Quality objectives appear slightly different among the three countries. Where Togo and Guinea take a ‘blanket approach’, including different levels of health structures and all official quality dimensions in the assessment, Cameroon’s Quality Challenge focusses on just two quality dimensions with only 11 criteria – and only on health centres. Moreover, Cameroon’s approach, like the original CQ, is based on voluntary participation by health facilities, whereas Guinea’s MA enrols all health structures in the selected districts, and in Togo’s Quality Approach the health centres are selected on the basis of established criteria. Finally, of CQ’s current descendants, Cameroon’s Quality Challenge is the only one to maintain the competition with prizes. These contrasts give the impression that, where the original CQ sought to combine motivation through constructive competition with attention to overall healthcare quality, the present initiatives lean towards one or the other: Cameroon gives priority to stimulating staff motivation through a simplified approach leading to rewards, hoping like CQ in its time to increase coverage by attracting new participants; Togo and Guinea, on the other hand, want to develop an efficient routine tool reflecting all quality dimensions of their respective Ministries of Health, to be applied in all health structures.
Is the institutional context favourable to implementation of the new quality approaches?
None of these quality improvement strategies are being implemented in a vacuum. All stand on the legacies of earlier mechanisms and structures, often introduced by external development partners – e.g. USAID’s Quality Assurance approach, which initiated the Quality Circles in Togo, or cost recovery, introduced in the mid-1990s, which provides the funds for the facilities’ improvement plans and created the community co-management structures that today play such a fundamental role in all stages of the quality improvement cycles.
Since healthcare quality has been a concern since the very beginning of health services, these new initiatives also coexist with earlier strategies aiming for the same objective. In Guinea, though MA was intended to supersede the myriad quality approaches previously implemented in the country, it coexists with the SBM-R (Standards-based management and recognition) approach promoted by USAID, and it is planned to complement ‘integrated supervision’ by the district health teams, alternating in a six-month rhythm.
In all three countries, the reality of supervision stands in stark contrast to the ideal pursued by the new quality approaches. In Cameroon, the Quality Challenge is kept completely separate from the rare and harsh supervisory visits of the regional Control Brigade and the district health team. Achille Bela reports: ‘The control “brigade” creates fear of sanctions among personnel, with consequences for their career. Our Facilitator is rather a member of your team, helping you to solve problems with your own resources.’
In Togo and Guinea, supervision is in fact rarely ‘integrated’, but targets the different development partner-funded vertical programmes, each of which has a Focal Point on district and regional level, implying a separate supervision for each disease. In Togo, as Kodjo Djoko explains, ‘There is even a “Quality” focal person in each district health team, leading these partners to think that quality is a GIZ programme – hence our battle.’
What is the role of the District Health Team?
In the classic district health model, promoted since the start of the 1990s, quality of healthcare in all facilities is ensured by an autonomous District Health Team through regular supportive supervision and coaching. Why then, in Cameroon and Togo, did GIZ resort to hiring its own Facilitators / Quality-of-Care Assistants to accompany the health facility teams on their quality improvement journey? In these two countries, the District Health Teams appear increasingly irrelevant. In Cameroon, they have only three technical members and are soon to be replaced by the Municipalities, while also being dwarfed by the GIZ-supported Regional Health Promotion Funds. In Togo, where the Regional Directorate manages the Quality Approach, district teams have their hands full with the vertical programmes and only paid attention to the new quality initiative once its impact became visible. In Guinea, district teams are assisted by regional and even central-level structures in managing the MA.
In part to compensate for the limited involvement of the District Health Teams, all three countries have set up – with varying success – ‘Quality Committees’ at district, regional and central levels to oversee the process.
Importance of the ‘external eye’: Somebody cares!
Despite these challenges, in all three countries the health facility teams have responded to these new quality initiatives with enthusiasm and improved performance. The novelty effect – feeling part of something that takes them out of their daily routine – may play a part, at least initially, but what these approaches have in common is that someone besides the health providers is paying close attention to their performance and engaging with them in a supportive dialogue. In all three countries, this is the intended effect of the external audit, ‘contre-monitorage’ by outside assessors or an awards jury, but in Cameroon and Togo the Facilitators / Quality-of-Care Assistants bring even more to this role, as they guide the facility teams in applying quality measures in their local environment.
Community involvement in the life of the health facility is key
Most important of all, though, is the involvement of the community in the life of the health facility. Long dormant with the atrophy of community co-management of cost recovery, community support for quality measures is motivated by the population’s vital interest in the healthcare services their local facility provides. The community is there to stay – far more permanent than the hired Facilitators, the external assessors, or the health staff themselves. Community support costs the system nothing; in fact, the community may even invest in its health facility. Achille Bela points out: ‘These community actors join with the health facility team in introducing the changes to be made. They rely on the dynamics of the health facility team’s commitment to the community – quite a contrast with the punitive effect of hierarchical control or supervision by the region or district!’
In Togo, the pleasure of sharing good practices with colleagues via the WhatsApp platform has also proved strongly motivating.
So is the human touch sufficient motivation to get staff to perform well? Do people just need to feel someone is paying attention to what they are doing? What about material/financial rewards, as in CQ and Performance-Based Financing?
Impact of material/monetary incentives in a resource-poor environment
Most African countries, including the three in question, are currently not much wealthier than was Guinea at the time of the CQ. In all three the health sector is heavily dependent on funding by international development partners. This contributes to deep imbalances, including the plethora of vertical programmes which preclude integrated – and therefore more efficient – healthcare management. Since technical and financial partners do not pay for salaries, many facilities are so drastically understaffed (e.g. in Cameroon health centres with only 2 out of 7 required staff) that they could not be included in the quality improvement pilots. The situation is similar in Togo, where only a third of training school graduates are hired by the Government – the others remain unemployed or fuel an increasingly unregulated private sector.
German development cooperation’s investment in minimal equipment for the selected health centres in Togo implies that without these external resources quality healthcare would not be possible. (The teams in these crumbling infrastructures then successfully lobbied the region for a ‘quality bonus’ for minor improvements such as solar energy or drinking water, to be paid out of COVID funds.)
Regular external audits – like supervision – require vehicles, fuel and per diems, and in such a destitute situation appear a luxury that national governments cannot afford. The same goes for taking over the salaries of project-funded staff dedicated to quality promotion once projects leave. Sustaining such programmes would therefore require development partner support that never ends.
It is evident that in such a resource-poor context financial or material incentives will be strongly motivating to the individual – as long as they continue. And yet there are numerous examples of what happens when the external stimulus/support ceases (e.g. with the end of generously remunerated mass vaccination campaigns): the frustration extinguishes any pre-existing motivation.
Extrinsic vs. intrinsic motivation
Material and financial incentives are extrinsic motivators, which are effective only as long as they remain in place. Other examples are a good grade or a prize for good performance; negative examples are external control and the fear of punishment. The ‘carrot and stick’ is the classic image of extrinsic motivation.
Intrinsic motivation is the anticipation of the pleasure and pride a person will derive from the activity itself. Examples from healthcare could include good teamwork, working in an improved environment, feeling appreciated by clients/patients, the moral satisfaction and pride of saving a life or of being ‘the best’, and the pleasure of managing funds, gaining autonomy, planning improvements, or realising one’s ‘dream’…
Like the Cameroon team, those promoting quality improvement hope by their initiatives (extrinsic incentives such as giving prizes) to inspire long-lasting intrinsic motivation on the part of health staff to do good work for its own sake, like a flame passed from candle to candle.
The risks of separating quality from performance
The shocking example given by the Togo team of ‘quality’ perceived by district health teams as its own vertical programme supported by GIZ illustrates the fictitious divide that has made ‘quality’ appear as an entity to be pursued in its own right. In reality, quality just means doing one’s work properly – i.e. according to agreed standards. Humans normally work well, and indeed the least one can expect from any health worker is that they perform their job as they have been taught to do (on condition of course that they have the needed resources). Quality is not a ‘what’, but a ‘how’.
Singling out healthcare quality for its own separate strategy fuels the confusion and encourages planning and funding initiatives that focus on ‘quality’ rather than on good healthcare. The questions that should be asked, according to Dr Paul Dielemans (GIZ Malawi), are: ‘How can Quality Management be integrated into health strategy? How can it be introduced in non-project mode? How can it be institutionalised?’
The four examples described above show that, when quality healthcare is not available, it is more often due to a lack of human, material and financial resources than to insufficient motivation on the part of health providers. Without the necessary resources for adequate healthcare, the most brilliant quality improvement approach will not work.
Sustainability of health services of acceptable quality requires two things: a minimum of available resources, and intrinsic staff motivation to perform well. The latter can be fostered by a benevolent ‘external eye’, including supportive supervision.
Dr Mary White-Kaba, September 2021