
News24 | What checks should SA use to rate health facilities?

  • In mid-August, the CEO of the Cecilia Makiwane Hospital in the Eastern Cape reportedly told provincial politicians that a serious staff shortage “compromises service delivery”.
  • But in May last year, the Office of Health Standards Compliance rated the facility as “good”.
  • This means that it can be part of the plan for the National Health Insurance rollout.

In mid-August, the CEO of the Cecilia Makiwane Hospital in Mdantsane, about 15km from East London, reportedly told provincial politicians that a serious staff shortage — 223 vacant posts — “compromises service delivery”.

But in May last year, the Office of Health Standards Compliance (OHSC) rated the facility as “good”.

The OHSC is like a gatekeeper for quality healthcare. It “inspects and certifies” public and private health facilities in the country against national norms and standards, looking at things like cleanliness, staffing, availability of equipment and medicines, recordkeeping, and patient care.

Only hospitals and clinics cleared by the OHSC will be able to deliver services for the country’s National Health Insurance (NHI) scheme.

Once a health establishment gets its certificate of compliance from the standards watchdog, it holds it for four years. In the case of this Eastern Cape hospital, that means until 2028 — which is, incidentally, when the rollout of the second phase of the National Health Insurance Act is supposed to end.

By that time, says the act, the NHI Fund — which will essentially work like a big, state-run medical scheme — has to be up and running. This means that hospitals and clinics must be ready to provide services the government can buy from them in order to give everyone good-quality healthcare. For this, health facilities must have passed the OHSC’s standards test.

Of course, getting to that end point also means that the health department must have the money, both to pay for the services being provided and to help clinics and hospitals do a good job.

READ | Western Cape govt challenges NHI Act in Constitutional Court over ‘flawed’ public hearings

Dave Martin, the co-founder of a rural health and education non-profit in the Eastern Cape and who for at least 20 years has relied on state hospitals and clinics, previously wrote for Bhekisisa about his own experience of what well-run government health facilities could look like.

From his viewpoint as a state services user, he said that to make universal health coverage work in South Africa, the sequence has to be to “first fix public healthcare to a decent standard” and only then start redistributing money between the public and private health sectors.

But what does the service actually look like? After all, only if you know what you’re working with can you know what to fix.

When we looked at the results of the 2022/23 OHSC inspection report — the latest report publicly available — our sums showed that the average service score across government health facilities was 67%.

If it were a child’s report card from school, we’d say “Well done!”

So where, then, is the disconnect between what the numbers show on paper and what headlines, political parties and a July report from the health ombud say about what’s happening in real life?

In this story — a follow-up to our first one about health standards compliance in South Africa — we take a deep dive into the data to get a sense of what’s afoot.

Is the big picture the real picture?

To get an idea of what health facilities’ performance looks like across South Africa, we turned to the summary results for each province’s scores given in the report.

These summaries are essentially tables made up of five columns — one for each broad category (called domains) into which the 23 standards set out in the National Health Act are organised — and a row for each district in the province, as in our example below.

The report then lists an average score over all rows in a column — in other words, an average made up of all districts’ results in a specific domain.

When we averaged all the domain scores in a province’s summary table, our sums showed that Gauteng’s health facilities met close to 82% of the requirements.

Clinics, community health centres and hospitals in the Northern Cape met only 44% of the requirements they were scored on — the only province to score under 50%.

Nationally, health facilities’ service quality sat at 67%.
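The averaging described above — a column average per domain across districts, then one figure over the domain averages — can be sketched in Python. The district names, domain labels and scores below are made up for illustration; the real report has five domains and a row for each district in the province.

```python
# Sketch of the two-step averaging described above, with made-up scores.
domains = ["A", "B", "C", "D", "E"]  # placeholder domain labels
district_scores = {
    "District 1": [70, 60, 80, 50, 75],
    "District 2": [65, 55, 85, 45, 70],
}

# Step 1: average each domain's column over all districts.
domain_averages = [
    sum(scores[i] for scores in district_scores.values()) / len(district_scores)
    for i in range(len(domains))
]

# Step 2: average the domain averages into one provincial figure.
provincial_average = sum(domain_averages) / len(domain_averages)
print(provincial_average)  # → 65.5 with these made-up numbers
```

Note that the single provincial figure treats every domain equally, regardless of how many criteria sit behind each domain — which matters for the masking effect discussed below.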

But here’s the thing: these summary values are the result of scores across 132 quality statements for each facility in the province — from local clinics to regional hospitals — reviewed that year. (For South Africa as a whole, it’s the average of these statements over 781 government health facilities.)

So the summary effectively compares apples to oranges: a small local clinic and a large regional hospital face very different demands, yet their scores are blended into one figure.

Masking effects

But there’s still more to think about.

The domains don’t all include the same number of quality statements (called criteria) to be evaluated.

For example, the domain “clinical governance and clinical care” — which covers questions about the systems a facility has in place to care for patients and handle their records — includes 80 items to be scored across seven standards, making up about 61% of the total set of quality statements.

But only six criteria fall under the four standards that make up “clinical support services” — the domain that covers things like the availability of equipment, medicines, diagnostic services and blood products — and they make up only 5% of the total set.

A list of 80 questions can yield far more specific answers than a list of six. Moreover, in a list of many items, those with poor scores can become hidden among those with good scores.

READ | Motsoaledi seeks time-out on NHI court battles as ConCourt clash looms

And if the ones on which it’s fairly easy to score well (for instance, a fairly administrative issue, such as having drawn up standard operating procedures for a task) make up more of the total than the ones on which it might be difficult to score well (such as making sure that an infection doesn’t spread through a facility), then the overall score in an area could be higher — on paper — than what’s happening in practice.
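This masking effect is easy to see with hypothetical numbers. Suppose, purely for illustration, that a facility scores 80% on each of 80 administrative criteria but only 30% on each of six clinical ones; pooling everything into one average hides the weak group.

```python
# Hypothetical illustration of the masking effect: many easy,
# high-scoring criteria dominate a pooled average.
easy_admin = [80] * 80    # e.g. "is there a written procedure?" items
hard_clinical = [30] * 6  # e.g. infection-control items

pooled = (sum(easy_admin) + sum(hard_clinical)) / (len(easy_admin) + len(hard_clinical))
per_group = [sum(easy_admin) / len(easy_admin),
             sum(hard_clinical) / len(hard_clinical)]

print(round(pooled, 1))  # → 76.5, pulled toward the big group
print(per_group)         # → [80.0, 30.0]
```

A pooled score of 76.5% looks healthy on paper, even though the clinical criteria sit at 30% — exactly the paper-versus-practice gap the story describes.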

The deep dive

Having seen that looking only at average scores gives a superficial — and possibly skewed — view of what the service level at government health facilities really might be, we put the scores for each quality statement in each province into a spreadsheet.

This gave a total dataset of 1 212 values to work with. (This total is more than the expected 1 188, which would be the product of 132 criteria x 9 provinces, because when we extracted all the data we found some additional criteria in some provinces and also some missing data points.)

By assigning a colour code — from deep red for scores of 30% or less to bright green for any scores higher than 70% — we could put this heatmap together.
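The colour binning can be sketched as a small function. The cut-offs of 30%, 50% and 70% come from the story; the name of the middle band ("amber") is our assumption, since the story only names deep red, orange and bright green.

```python
# Minimal sketch of the heatmap colour bins described above.
# Middle-band name "amber" is assumed; the story names only the extremes.
def heatmap_colour(score: float) -> str:
    if score <= 30:
        return "deep red"
    if score < 50:
        return "orange"
    if score <= 70:
        return "amber"
    return "bright green"

print(heatmap_colour(45))  # → orange
```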

You can click on the link to go to an interactive version to explore for yourself.

A bird’s eye view shows that hotspots — criteria for which scores were below 50%, and therefore shown in red or orange — cluster in the left-hand corner (the area outlined with a dashed line), and make up about a quarter (just over 26%) of the total area of the map.
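The hotspot share — the fraction of heatmap cells scoring below 50% — is a straightforward count over the dataset. The scores below are made up; the story's real dataset has 1 212 values and a hotspot share of just over 26%.

```python
# Sketch of the hotspot count: share of cells scoring below 50%.
# Made-up scores; the real dataset has 1 212 values.
scores = [20, 45, 55, 80, 35, 90, 60, 48]

hotspots = [s for s in scores if s < 50]
share = len(hotspots) / len(scores)
print(f"{share:.0%}")  # → 50% with these made-up numbers
```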

And this, we think, is something concrete to work with — especially when remembering the advice from University of Cape Town health economist Susan Cleary:

“It’s partly a matter of ‘cutting your coat according to your cloth’,” she told Bhekisisa previously.

“We have to let go of this idea that we can have everything and that it all has to be perfect otherwise it’s not good enough.”

There’s always a trade-off between quality and quantity, explains Cleary. Given a certain budget, a province has to decide whether to focus on offering more services of possibly less-than-perfect quality, or fewer services at closer-to-perfect quality.

WATCH | How xenophobia, fear and flawed policy threaten our HIV fight

Getting quality up to what Dave Martin describes as “a decent standard” calls for proper priority setting, she says.

So consider, as a thought exercise, the scores for the first 10 quality statements, which seem to be flash points in all nine provinces (the rectangle we shaded on the map).

Could that be a feasible place for decision-makers to start when thinking about how best to spend money, whether it’s to look critically at how important a criterion actually is or to really roll up the sleeves and fix what is wrong in that spot?

Says Cleary: “The hard truth is that we can’t have everything, and [policymakers] have to have the courage to make the difficult decisions of what is in and what is out.”

*This story was produced by the Bhekisisa Centre for Health Journalism.
