W2A021T
The Impact of Microbiology in Human Development

Introduction

Before the discovery of microorganisms, living objects on earth were known to include plants and animals without any intermediate organisms. However, during the 19th century it became clear that microorganisms, while combining some properties of plant and animal cells, were also markedly different from the plant and animal kingdoms and hence deserved to be classed separately. Thus, although microorganisms belonged to the same evolutionary tree as animals and plants, Haeckel proposed in 1866 that they be assigned a separate kingdom, the Protista. This kingdom includes protozoa (such as amoeba), fungi, algae and bacteria. Early in the present century, a special group of microorganisms, the viruses, were discovered. Viruses consist of either DNA or RNA molecules surrounded by a protein coat, and they require living cells of plants, animals or other cellular microorganisms for their replication. Much smaller transmissible agents composed of simple naked RNA molecules, called viroids, were discovered more recently. Viroids are associated with some plant diseases. Therefore microorganisms as they are known today include: (i) Algae, (ii) Protozoa, (iii) Fungi, (iv) Bacteria, (v) Viruses. Among these groups of microorganisms, bacteria, fungi, protozoa and viruses play the major role in the socio-economic development of man. Other organisms such as helminths and insects may have microscopic appearances, but together with protozoa they have complex cell structures similar to plants and animals, and many of them are also visible to the naked eye. These bigger organisms are now covered in the fields of parasitology and entomology, which are outside the scope of this presentation. In discussing the impact of microbiology in human development, it is necessary to consider the existing interrelationships between man and the microbial environment. The science of microbial ecology, dealing with the relationships between man, microorganisms and the environment, can readily explain the resultant negative and positive contributions of microorganisms to human development. Microorganisms are a cause of a wide variety of human diseases, many of which result in death or permanent deformity. The management and control of such microbial diseases often require utilisation of major health care resources including manpower, drugs, medical equipment and finances. Loss of productivity by peasants and workers due to sick leave or premature death from infections is an additional negative economic effect of microorganisms on human development. Negative effects of microorganisms on the environment include putrefaction and other destructive effects on buildings, machinery, clothing and other objects of importance to man. Microorganisms also constitute major causes of disease of animals and plants, many of which are important sources of food and raw materials for various products of socio-economic importance to man. The major positive contribution of microbiology towards human development depends on the exploitation of microorganisms in various industrial processes. The microorganisms used for such purposes are either naturally occurring or have been modified by various genetic engineering processes whereby genes for specific biological activities are introduced into the chromosome or plasmid(s) of a bacterium or yeast in order to produce a defined protein product.
Naturally occurring or genetically engineered microorganisms are utilised in the production of hormones such as insulin, vaccines, antibiotics, alcoholic beverages and other products including acetone-butanol, a raw material of cordite used for war purposes. Microorganisms are also utilised in the treatment of sewage and in the breakdown of other environmental waste products, resulting in plant manure and various gas fuels such as methane. In this presentation, I will concentrate on the socio-economic aspects of the harmful effects of microorganisms, with emphasis on human disease causation. Microorganisms are at present the commonest cause of ill health and death among Tanzanians; hence the desired goal of achieving health for all by the year 2000 heavily depends on the success to be achieved in the control of microbial infections. It is encouraging to note that the technology to control the major communicable diseases of man is accessible, cheap and is continuously being made more easily available. The major constraint preventing greater success of the microbial diseases control effort is the failure of the community to adopt such measures. It is hoped that this year's slogan of the WHO's 40th anniversary, "Health for All, All for Health", will attract more people to adopting the right route towards health for all. The vicious cycle of poverty, ignorance and disease which is prevalent in Tanzania needs to be broken if quick advances in the control of microbial diseases are to be achieved. It is disturbing to note that while a lot of knowledge and expertise is now available worldwide on effective control measures against the major microbial diseases, the prevalence of such diseases in Tanzania now is similar to the situation which prevailed in Europe more than 50 years ago. Obviously, poverty and ignorance have been playing a leading role in determining the prevalence of such infections. Efforts to utilise modern technology such as vaccines and antibiotics in disease control have to go hand in hand with improvement of the economy, as well as providing education to the people. Tied up in the broad concept of socio-economic development is the role of human behaviour in determining the mode of transmission of communicable diseases. Diseases such as diarrhoea, sexually transmitted diseases including the Acquired Immune Deficiency Syndrome (AIDS), skin infections, urinary tract infections, plague, anthrax, tetanus and tuberculosis are among many infections whose dissemination depends heavily on harmful human behaviour. Positive behavioural change alone may significantly reduce the impact of such communicable diseases in a community, and part of this presentation will attempt to highlight how positive behavioural changes can enhance efforts in the control of communicable diseases.

Microorganisms in Health and Disease in Tanzania

Microbial diseases due to bacteria, viruses and fungi are at present the commonest cause of ill health and death among Tanzanians of all age groups, accounting for more than 50% of all causes of ill health and deaths. As shown in Table 1, the major ten causes of ill health among Tanzanians attending outpatient treatment health care facilities include respiratory tract infections, acute diarrhoea, and skin and eye infections, while major causes of death in hospital wards include pneumonia, measles, acute diarrhoea, tuberculosis and tetanus.
In some parts of Tanzania, such as in Bukoba rural district, sexually transmitted diseases (STDs) were until 1985 second only to malaria as the commonest reason for attending outpatient health care facilities in the district. Recent unpublished and unconfirmed reports have suggested a drastic fall in the incidence of STDs in Kagera region during 1986 and 1987, following an intensive health education campaign in the region against AIDS. Many of the major microbial infections causing most disease and death in the country are easily preventable by readily available technology. For instance, measles and tetanus can be prevented through childhood immunization, while most deaths from acute diarrhoea can be prevented by use of oral rehydration salt solutions. While socio-economic factors are a major determinant of such infections, the hot and humid tropical climate, which allows for easy multiplication and survival of harmful microorganisms and their vectors such as mosquitoes and fleas in the environment, is an additional constraint. Malnutrition, which is common in Tanzania and results from the interactions of poverty and ignorance, is a further contribution to the problem of microbial diseases. Two microbial disease outbreaks of great public health significance, cholera and AIDS, will be used to illustrate the important role of human behaviour in communicable disease transmission. While cholera will be seen as a disease of the relatively poor and more ignorant members of the society, AIDS is currently a disease of everybody, but it is very likely that in future the poor and ignorant will also be left behind unless appropriate measures are taken now.

Cholera

While the most frequent means by which Vibrio cholerae serogroup O1, the causative agent of cholera, has been transmitted in Tanzania is through consumption of contaminated water, we have been able to establish by epidemiological and bacteriological methods that person to person transmission through close contact is also important. This is especially so among communities living under overcrowded and unhygienic conditions. We have defined person to person transmission to mean spread of the causative vibrio through close and direct physical contact, or indirectly through contaminated household food or household water. The organism may also be transmitted orally through contact with clothes soaked with faeces containing V. cholerae O1, or through other household articles, as opposed to transmission through public sources of water or public food supplies. Many cholera outbreaks have therefore been associated with intrafamily contacts, burials, overcrowded hospital wards, overcrowded prison camps and other institutions where people have been living closely together under poor hygienic conditions. It is now known that cholera epidemics in Tanzania have a seasonal pattern with peaks of disease occurrence during the two rainy seasons of October to December and March to May. During these rainy seasons, public sources of water, unless sand-filtered and disinfected or boiled before being drunk, are usually grossly muddy and often contaminated with surface water passing through faecally soiled ground in areas where communities do not have proper human excreta disposal facilities. Water drawn from such environmental sources in cholera affected areas has often been shown to contain heavy counts of V. cholerae O1 and other faecal organisms.
Cholera epidemics in Tanzania have been particularly serious among populations living in low lying plain areas with a high ground water table, especially where the water is also salty. These conditions favour multiplication and survival of V. cholerae O1. Early in the ongoing fourth ever recorded cholera epidemic in Tanzania, which started in Rufiji in October 1977 and which by the end of 1986 had involved 40,185 patients with 3,538 (8.8%) deaths, it was for the first time demonstrated that under the selective pressure of wide scale use of antibiotics in attempting to control the spread of cholera, strains of V. cholerae O1 resistant to antibiotics quickly emerge and spread in the community. During the first six months of the cholera outbreak in 1977/78, 1,975 kg of tetracycline were used, while 4,357 kg (4.36 metric tons) of the same antibiotic were used in cholera control during 1978. The percentage of tetracycline resistant V. cholerae O1 isolates quickly rose from 0% in October 1977 to 76% in April 1978, meaning that most of the tetracycline used for cholera control during 1978 was wasted. During 1978, 13,300 patients with cholera were notified to have received cholera treatment in Tanzania at an estimated cost of 20,000,000 shillings. Environmental sanitation measures could have utilised a similar sum of money, while loss of production due to ill health by patients and the cost of other public health measures such as quarantine and closure of markets, restaurants and bars could have resulted in loss of the same or greater amounts of money, meaning that up to 60 million shillings was spent in one year alone on a single, easily preventable disease. The large amounts of money spent on emergency cholera control measures, often utilising inappropriate measures such as quarantine and cholera immunization, are a clear testimony of the socio-economic impact of microbial diseases on human development. The remaining few countries in the world still demanding cholera vaccination for international travel are doing this against the recommendations of the W.H.O. and are not utilising cholera control resources appropriately. Cholera continues to be a public health problem in Sub-Saharan Africa and in the Asian Sub-Continent, where suitable conditions for survival and transmission of the causative vibrio still abound. Contaminated water and foods, including raw or inadequately cooked fish, and person to person close contacts are the major means of transmission of V. cholerae O1. The organism can traverse district, provincial and national boundaries in asymptomatic carriers. Moreover, moving pools of contaminated water in rivers, lakes and oceans easily occur, making quarantine of people a useless and very costly exercise in containing cholera outbreaks. It is most unlikely that cholera, a disease of the socio-economically poor, can be eradicated from the world, and particularly from Tanzania, until the socio-economic conditions of the population attain a level high enough to provide for safe water and for sanitary excreta disposal facilities. Alternatively, eradication will require availability of a highly effective cholera vaccine, which is not yet in sight. In the meantime, major emphasis in cholera control should be put on continuous disease monitoring, on the institution of immediate treatment of affected patients and on implementation of other environmental disease control measures.
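The cost figures quoted earlier in this section can be laid out explicitly. The short Python sketch below is only a back-of-envelope check using the numbers given in the text; the split of the roughly 60 million shilling total into three approximately equal components (treatment, foregone sanitation spending, and lost production with other control measures) is an assumption made for illustration, not a figure from the original source.

```python
# Back-of-envelope figures for the 1977/78 Tanzanian cholera epidemic,
# using the numbers quoted in the text above.

cases_1977_86 = 40_185        # patients notified, October 1977 to end of 1986
deaths_1977_86 = 3_538        # deaths over the same period
print(f"Case fatality rate: {deaths_1977_86 / cases_1977_86:.1%}")   # ~8.8%, as quoted

patients_1978 = 13_300        # patients treated during 1978
treatment_cost = 20_000_000   # shillings, estimated treatment cost in 1978
print(f"Cost per treated patient: {treatment_cost / patients_1978:,.0f} shillings")  # ~1,500

# The text adds a similar sum for sanitation measures and the same or more for
# lost production, quarantine and closures; treating these as three roughly
# equal components (an assumption) gives the quoted total of about 60 million.
print(f"Approximate total annual burden: {3 * treatment_cost:,} shillings")
```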
The use of oral rehydration salt solutions, supplemented with intravenous fluid therapy when necessary, as part of the national diarrhoeal diseases control programme should be able to save up to 99% of all patients with cholera, compared to the present situation in which about 9% of cholera victims succumb.

Acquired Immune Deficiency Syndrome

The Acquired Immune Deficiency Syndrome (AIDS) is today the most challenging public health problem in the world and in Tanzania. Never in the history of public health has a single disease posed such a great risk to the survival of the human race, and AIDS has attracted the attention of many government ministries, health scientists and the general public. The AIDS pandemic has broad implications affecting nearly every aspect of human life including politics, economics, demography, morality, health care delivery, family life and community development. For the first time, the most secret aspect of human relations, sex, is being widely discussed in public outside the precincts of the locked bedroom. It is unbelievable that the most intimate and sweetest aspect of human life should also be associated with the most deadly sexually transmitted disease. In the same spirit of free public discussion on this deadly disease, I hope you will bear with me if in presenting current thoughts on AIDS I tend to become too explicit. I believe being frank with each other is the only way we can address the problem and hopefully come up with sound solutions. AIDS is caused by the human immunodeficiency virus (HIV), of which two types are so far recognized. More serotypes may emerge in future. HIV type 1 (HIV-1) is the major cause of AIDS in the world, including in the U.S.A., South America, Europe, East, Central and Southern Africa and parts of West Africa. HIV type 2 (HIV-2) is the cause of AIDS in parts of West Africa, and a few cases have been reported in Europe. The first cases of AIDS were described in the U.S.A. in 1981 and soon after, cases were reported in Europe. Cases of AIDS were first reported from Central Africa in 1982, and the first cases from Tanzania were those seen in 1983. Where in the world, and when, the HIV first emerged is not known and will most probably never be known. The widely publicised suggestion that the HIV was man-made in a laboratory has absolutely no scientific basis and may have been propounded in order to divert human attention from the importance of person to person transmission of the virus. It is possible that the HIV may have emerged from mutation of a hitherto harmless retrovirus, and its recent wide dissemination may have resulted from an increase in the multiplicity of human sexual relations. From 1983 until the end of October 1987, 3,255 cases of AIDS, among whom 1,192 deaths occurred, were notified to the Tanzania Ministry of Health and Social Welfare (Table 2). The world total of notified AIDS cases was 85,273 as of 1 April 1988, with 55,167 of the cases being from the U.S.A. Of the notified AIDS cases in the country, 61% have been male while 39% have been female. Eight percent of all notified AIDS patients have been children below 13 years, but the majority of these children (89%) were between 0-4 years and none of them was in the age group of 10-13 years. The age group 16 to 40 years is the most affected for both sexes, with the peak age group for males being 26-40 years while for females the peak has been between 21 and 30 years.
The age group of 16 to 40 years comprises about 40% of the present Tanzanian population and is the age when individuals are most energetic and hence most economically productive. It is also the age of reproduction of the next generation, hence the rising prevalence of young children with HIV infection in the country. Between October 1985 and the end of December 1987, the Department of Microbiology and Immunology laboratory and the blood bank, both at Muhimbili Medical Centre in Dar es Salaam, had tested 21,682 serum specimens from blood donors and from AIDS suspect patients for serological evidence of HIV infection. 655 (5.5%) of the 12,004 male blood donors and 6 (2.8%) of the 211 female blood donors were HIV-1 seropositive, while 3,245 (40.8%) of the 5,742 male clinical AIDS suspects and 1,316 (35.3%) of the 2,723 female AIDS suspects were HIV-1 positive. AIDS is therefore a quietly smouldering biological bomb whose full impact is yet to be fully understood by the general population. A review of the available seroepidemiological research data on infection rates in the country indicates the seriousness of the problem, calling for immediate and appropriate intervention measures. The AIDS patients being diagnosed now include many of those who were infected two to eight years ago, and even if further spread of the HIV could be stopped now, an ever increasing number of AIDS cases will continue to appear during the next 5-10 years. From the seroepidemiological data summarised in Table 3 it is estimated that on average about 2% of the whole Tanzanian adult population is currently HIV infected. Assuming that for every notified AIDS case there are between 50 and 100 other HIV infected individuals in the population, it can be deduced that there could be between 200,000 and 400,000 HIV infected people in Tanzania now. If we assume that 20-30% of those currently infected with the HIV will develop AIDS in the next five years, then there may be between 40,000 and 120,000 AIDS cases in 1993. For individuals engaging in high risk behaviour and for communities in the country where current HIV infection rates are between 30-60%, the impact of the infection on such risk groups is going to be particularly catastrophic. Luckily enough, children between 6 and 14 years are still largely spared from this great disaster because of the recent introduction of the HIV into the country and because of their non-involvement in sexual relations. No effort should therefore be spared in protecting this future generation. It is, however, disheartening to note that the trend of young childhood AIDS in Tanzania is rising rapidly, in direct proportion to the increase in heterosexual HIV transmission. Urgent steps are therefore needed to prevent those adult females already infected with the HIV from getting pregnant. It is now established that the more advanced the stage of HIV disease in the pregnant mother, the more likely it is for the baby to acquire the infection. As shown in Table 4, the major contributory factors to the transmission of the AIDS virus in the country include heterosexual contact, which is by far the commonest route, accounting for more than 85% of the current cases in Tanzania. This is followed by transmission through unscreened blood transfusion and through mother to child spread during pregnancy or soon after birth. Homosexuality accounts for less than 2% of the AIDS cases, while transmission through intravenous drug abuse is so far unheard of.
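The projection given above can be reproduced step by step. The short Python sketch below simply follows the chain of assumptions stated in the text (50-100 infected persons per notified case; 20-30% progressing to AIDS within five years); the rounded 200,000-400,000 range is the text's own, and nothing here is additional data.

```python
# The projection chain used in the text, with its own stated assumptions.

notified_aids_cases = 3_255                 # cases notified by end of October 1987
multiplier_low, multiplier_high = 50, 100   # assumed HIV-infected persons per notified case

infected_low = notified_aids_cases * multiplier_low     # ~163,000
infected_high = notified_aids_cases * multiplier_high   # ~326,000
print(f"Estimated HIV-infected people: {infected_low:,} to {infected_high:,}")
# The text works with the rounded range of 200,000 to 400,000.

progress_low, progress_high = 0.20, 0.30    # assumed fraction developing AIDS within 5 years
cases_1993_low = int(progress_low * 200_000)
cases_1993_high = int(progress_high * 400_000)
print(f"Projected AIDS cases by 1993: {cases_1993_low:,} to {cases_1993_high:,}")  # 40,000 to 120,000
```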
There also exist possibilities of transmission through unsterile needles, syringes and scarification instruments, but such modes of transmission may be relatively rare. It is now clear that the main risk factor for acquisition of the HIV is the degree of sexual activity with multiple partners and not sexual orientation. While males can transmit the HIV to females and other males through contact with blood resulting from bruises or through semen, females can transmit the HIV when infected vaginal and cervical secretions and excretions come in contact with raw areas on the male genitalia. It is highly doubtful that dry kissing can transmit the HIV, but deep wet kissing, including oral sex where chances of bruising and trauma exist, may transmit the HIV if one of the partners is infected. Moreover, some possible co-factors for sexual transmission, including genital ulcers, C. trachomatis cervicitis, the uncircumcised state in the male and oral contraceptives, have been shown to alter susceptibility to HIV-1. Contraceptive pills could facilitate HIV transmission, probably through an increase in the presence of susceptible T4 lymphocytes and macrophages in the female genitalia. The close association between AIDS and other established STDs raises the possibility that STDs may disrupt the epithelial surfaces of the external genitalia, thus favouring transmission of HIV. The issue of whether the techniques or practices of sex in different cultural settings influence the transmission of HIV has been debated, but no clear-cut answers have emerged.

W2A022T
BOOK REVIEW (Prof. Karim F. Hirji on Stuart B. Levy's book The Antibiotic Paradox: How Miracle Drugs Are Destroying the Miracle)

The recent outbreak of the plague in India is noteworthy in that not many deaths have been attributed to it. In the past, the plague was a killer of monumental proportions. For example, in the year 1355 it decimated the population of England, reducing it from 2.5 million to 1.5 million. In India itself, it claimed over a million lives at the turn of this century. But not so in the present era. The plague is caused by a bacterium transmitted by fleas. Now effective drugs to counteract this organism are available. A patient at an early stage of the infection has a high chance of full recovery if treated with these drugs.

Antibiotics

Modern medicine has a wide array of drugs for treatment of a broad spectrum of infectious diseases. Among them are those known as antibiotics, which have come into use only over the past fifty years or so. Strictly speaking, the term antibiotic refers to a naturally occurring substance that inhibits the growth of, or kills, bacteria. For example, the commonly used antibiotic penicillin is produced by a fungus. Other man-made substances are also used against harmful bacteria. In the current discussion, I use the term antibiotic to refer to both natural and synthetic anti-bacterial chemicals. A large variety of micro-organisms exist in the human body and in nature. Most of them do not cause disease, and many in fact perform useful functions such as assisting the digestive process in the intestines. A disorder arises when harmful germs enter an organ where they are not normally present, and multiply without control. For example, when TB germs invade the lung tissue, and are not inactivated by the body's defense mechanisms, the condition known as tuberculosis develops.
Antibiotics are used for treatment of diarrheal and respiratory problems, for typhoid fever and tuberculosis, for skin and blood infections, for sexually transmitted diseases and other conditions. Many diseases that earlier had a high fatality rate can now be cured with antibiotics.

Diminishing Efficacy

However, one disturbing consequence of antibiotic usage has not been given the attention it deserves. I am referring to the declining effectiveness of many antibiotics. For example, thirty years ago, the sexually transmitted infection gonorrhea could be cured with penicillin or tetracycline. Today, many cases of this infection do not respond to either of these antibiotics, and need more powerful, more expensive, and potentially more toxic drugs. Why is that the case, and what can be done about it? Stuart B. Levy, after spending many years researching this issue, has now brought forth a book that lays out all the facets of this ever growing problem. One would consider such a text to be within the exclusive purview of doctors, microbiologists, and other experts. Indeed, there are many complex and technical issues involved in the consideration of diminishing antibiotic efficacy. But this is a book directed at the general public. The problem it tackles is a public health problem which has been magnified by how the producers, distributors and consumers have used, and misused, these drugs. It is thus of concern to all. Dr. Levy has performed the remarkable feat of presenting the technical material in a simplified manner without sacrificing accuracy. His style makes the text accessible to anyone with a reasonable command of English, and an appreciation of secondary school level biology. The first three chapters present historical material; we get a glimpse into the maturation of the science of microbiology, and into how antibiotics were discovered and came to be used. The ever present tendency for self-medication, which is connected with the problem of antibiotic misuse, is also discussed. We are led through this material, as through other chapters, with the help of anecdotes and cartoons. Often one feels that one is cruising through an adventure narrative instead of a serious treatise.

Resistance to Antibiotics

Chapter 4, which outlines the biological mechanism of antibiotic resistance, is the most technical chapter in the book. However, that too is not beyond the grasp of a determined reader. The basic idea is that by killing bacteria that are susceptible to it, an antibiotic enables those that resist its action to flourish. Resistant strains that were rare prior to antibiotic usage now become the common ones. This scenario is enhanced by antibiotic abuse through underdosage, excessive consumption, and intake when not warranted. For example, when a patient with TB does not take the complete course of treatment just because he feels somewhat better, this fosters the growth of the remaining organisms, which tend to be more resistant to treatment. We learn from Dr. Levy that the ability of bacteria to resist the action of drugs that they were previously susceptible to arises from genetic mutation and from transfer of resistance factors between bacteria. The resistant germs can be passed on from one person to another, and in fact, the mechanism of resistance itself can also be transmitted from one type of bacteria to another. This means that even the harmless organisms in the body can acquire these resistance factors and may pass them on to other disease causing organisms.
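The selection argument outlined above can be made concrete with a toy calculation. The Python sketch below is purely illustrative: the kill rates, regrowth rule and starting numbers are invented for the example and are not taken from Dr. Levy's book; the point is only that repeated exposure shifts a mixed population towards the resistant strain.

```python
# Toy illustration of selection for resistance: an antibiotic kills
# susceptible bacteria far more effectively than resistant ones, so
# repeated exposure shifts the population towards resistance.
# All rates here are invented for illustration only.

susceptible, resistant = 1_000_000, 10        # start: resistance is rare

for course in range(1, 6):                    # five successive antibiotic exposures
    susceptible = int(susceptible * 0.001)    # assume 99.9% of susceptible cells killed
    resistant = int(resistant * 0.8)          # assume only 20% of resistant cells killed
    # survivors regrow to roughly the original population size,
    # keeping the post-treatment proportions
    total = susceptible + resistant
    susceptible = round(1_000_000 * susceptible / total)
    resistant = 1_000_000 - susceptible
    print(f"after course {course}: resistant fraction = {resistant / 1_000_000:.1%}")
```

Within three simulated courses the once-rare resistant strain dominates, which is the qualitative behaviour the chapter describes.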
The Consequences

Many examples of the catastrophic consequences of the emergence of drug resistant germs are given in the book. Cholera was not too long ago eminently treatable with tetracycline; now, after years of inappropriate use, more than 50% of cholera germs in Africa are not affected by it (p. 114). TB is once again menacing communities the world over, from New York to Bombay. Some multi-resistant forms of this disease seen these days do not respond to any drug. A similar situation prevails with sexually transmitted conditions such as gonorrhea. Another major problem is bacterial dysentery; reduced efficacy of commonly used drugs has been reported from places as diverse as Bulgaria, Vietnam, Africa and some Native American reservations in the USA. The problem was vividly brought home when outbreaks of cholera and dysentery were reported in the Rwandan refugee camps this year. Not only were drugs not available in sufficient quantities to treat the patients, but even those that were there were not that effective. Which aid agency would have brought in the exorbitantly expensive substitutes for this forsaken segment of the human race? The problem of resistance is not confined to bacterial diseases; it also occurs in parasitic infections. In Africa, malaria is resurgent and continues to take the lives of millions of children; overuse has led to a precipitous decline in the efficacy of the one-time drug of choice, chloroquine.

Behavioral Basis

The problem of drug resistance has come about due to excessive use in some circumstances, underuse in other conditions, and inappropriate use. Let us consider the various uses of antibiotics, and the problems associated with them, as described in this book. Anti-infective medicines are often used for disease prevention. In the US, between 30 and 50% of human use of antibiotics is for a prophylactic purpose. There are situations where such a use is medically necessary; however, abuse is also not uncommon. For instance, studies have shown that the administration of antibiotics for prevention of post-surgical infections in the US tends to be quite in excess of what can be medically justified. Another egregious example given by Dr. Levy is the weekly supply of tetracycline given to the 100,000 or so pilgrims going from Indonesia to Mecca annually. This practice, followed in the early 1980s, was a sure recipe for turning tetracycline into an ineffective drug. Scientific studies looking into the origin of the current worldwide problem of penicillin resistance in gonorrhea bacteria traced the source to the brothels of South Vietnam. In the 1960s and the early part of the 1970s, thousands of prostitutes servicing the US soldiers stationed there were regularly given penicillin to keep them disease free. Excessive use of antibiotics for treatment is also common. In Japan, per capita use of the cephalosporin group of antibiotics was found to be ten times higher than that in Sweden or Australia; the difference was connected with the financial gain to Japanese doctors prescribing the drugs (p. 244). In the US, about $500 million is spent annually on antibiotics for treatment of ear infections in children. However, many such episodes resolve spontaneously, and the extent of use of antibiotics for this condition has been the cause of intense controversy in the medical literature. To quote Dr.
Levy again: "Infectious disease experts estimate that at least half of the human use of antibiotics in the US, whether in the community at large or in hospitals, is unnecessary." (p. 225). In Third World countries, excess antibiotic use is found among affluent patients who consider antibiotics a panacea for all problems. Physicians may prescribe antibiotics without confirming the diagnosis. Further, the availability of such medicines without prescription in many countries compounds the problem. As Dr. Levy puts it: "In these areas, death is a preponderant part of life. Poor sanitation leads to spread of infectious agents while antibiotic misuse propagates resistant strains. To make matters worse, these are the very countries where money is so scarce that the government cannot afford to buy currently useful but expensive antibiotics." (p. 200) The inadequate nature of medical services and the inability of the many to pay for such services encourage self-medication, misuse and use of lower than needed doses of antibiotics; all these practices enhance the spread of resistant bacterial strains.

Profits and Resistance

Antibiotics are also used for purposes other than the prevention and treatment of disease in humans. Antibiotics used for humans are also used for treatment of diseases in pets, food animals, plants, fish, and even honey bees. Also, small doses of these drugs are used for growth promotion in poultry, cattle and other food animals. These practices also promote the spread of resistant bacteria in the animals, into the environment, and then to humans. Recognizing such hazards, the European countries, the U.K. and Canada banned the use of antimicrobial drugs for growth promotion in food animals two decades ago. However, in the USA, the strong business lobby has blocked legislative action on this matter. Cases of farm workers in the USA getting harder to treat infections have been documented (p. 145). In Third World countries, though such use of antibiotics is not as pronounced, the absence of rules or of enforcement mechanisms has the potential to generate similar problems. The pharmaceutical corporate giants that dominate the worldwide production and marketing of antibiotics owe their current financial stature in no small part to the profits generated from aggressive marketing of antibiotics over the past half century. Dr. Levy recognizes, to some extent, corporate responsibility for the current epidemic of antibiotic resistance. However, in my view, that responsibility is much larger than what he cares to admit. In the Third World countries, many instances of highly unethical marketing practices by these companies have come to light. For example, the September 1994 issue of Africa Health mentions the case of a French company exporting to Africa an antimicrobial preparation to be used for diarrhea. This preparation is not licensed for use in France, and according to the WHO, no evidence for its efficacy exists.

What is to be Done?

Dr. Levy gives several recommendations to promote rational antibiotic use. I will focus on those that are of particular importance to the Third World. Some of these are not given the prominence they deserve in this otherwise commendable text. First let us consider the history of infectious diseases in the industrialized nations. Hardly a century ago, many of them were also ravaged by outbreaks of one epidemic after another; cholera, dysentery, yellow fever, tuberculosis, and meningitis were among the commonly encountered ailments.
Examining the pattern of change over time in Europe or the USA, it is seen that a rapid decline in the number of cases of such infections occurred prior to the discovery of modern antibiotic drugs. It is the betterment of living conditions that reduced transmission of germs from one person to another, and prevented a high number of people being afflicted in any outbreak. When antibiotics appeared on the scene, a further reduction in the case load of such diseases was observed. In Third World countries, and in Africa in particular, the cart is being put before the horse. Influenced by the "a pill for every ill" ideology now dominant in the industrialized nations, we forget the material foundation necessary for control of infections. Even though the World Health Organization, the World Bank, and various donor and governmental agencies talk of disease prevention and improvement of environmental sanitation and nutrition, the actual policies being funded, what is actually implemented, and how that is done belie their words. The record of health policies over the past thirty-five years since the winds of change blew on this continent is abysmal. The WHO inspired chemical assault on infectious organisms, mosquitoes, and other bugs has, apart from a few marginal exceptions, not worked. The consequence has been a resurgence of the same diseases but in more virulent forms with harder to kill germs. It is now time to go back to basics, and deal with the water supply, nutrition and general living conditions. These improvements cannot be undertaken in the absence of sound economic policies that will stress food production for local consumption, development of internal markets and of nationally integrated industrial and agricultural sectors. Above all, it is important to avoid those economic strategies that will further engulf Africa in the vicious cycle of debt, dependency and poverty. In my view, sound usage of antibiotics and antiparasitic medications can only be envisaged in the context of a coherent and comprehensive public health policy. Further, in African countries today, an implementation of such a policy is difficult to visualize in the absence of concurrent progress in the general economic health of the nation. And for the latter to come into being, a government responsive to the needs of the majority is a prerequisite. Thus, from a consideration of antibiotic resistance we are led into a consideration of democracy. It is not that sound policies for the health sector and the general economy are unknown. There is no need to undertake further donor funded research and produce intricately phrased tomes for medical journals. The knowledge is there in any comprehensive text on community health. What is absent is a determination on the part of the governmental authorities to carry such policies out. These authorities are mired in corruption and are increasingly in the pockets of local and foreign business tycoons bent on pillaging the nation. Thus a fundamental question is that of having democratic governance where the interests of the people are put first. The October 22, 1994 issue of the Daily News reports the Deputy Minister of Health describing the response of her ministry to the recent outbreak of cholera in Shinyanga region. It had shipped consignments of tetracycline and oral rehydration salts to the region. The Deputy Minister was also urging the people to boil drinking water and maintain cleanliness.
Moreover, she talked of the need to adopt an "inter-sectoral" approach in order to devise a long term solution to the problem. What is of interest is that the health establishment in Tanzania has been talking about an "inter-sectoral" approach in one form or another for about twenty years now. But what have they done about it? Can you show us some initiatives, actions and results in this regard, Madam Deputy Minister? Or is this just a public relations exercise? Many times I have been asked why I like to bring politics into everything. If "politics" means the basic philosophy underlying the major goals pursued in society, and the nature of interests expressed through state policies, then I submit that it is not a question of what I like but of what is there. Consider a few illustrations. A concern regarding the proliferation of substandard dispensaries and pharmaceutical stores in the country was raised at the annual meeting of the Medical Association of Tanzania in September 1994. The problem of misuse of antibiotics can only be aggravated in this situation. If this trend is not nipped in the bud by effective regulatory mechanisms, we may soon catch up with Nigeria, where peddling of fake or expired medicines has reached epidemic proportions. A major shortage of medicines, including commonly used antibiotics, was reported at the Muhimbili Medical Centre in Dar es Salaam in October 1994. This is the major referral hospital in the country. The shortage was exacerbated by a mysterious disappearance of a large consignment of drugs from the hospital pharmacy. Included were many different types of antimicrobial preparations. Where will these medications end up? In fact, not too long after the disappearance was reported, I saw an enterprising vendor set up a stall near that of a seller of oranges just adjacent to the main entrance of Muhimbili Hospital. This gentleman was selling several varieties of medicines, including some antibiotics! In my view, the question of having proper management in the hospital to ensure proper control over available drugs is not just a medical problem. It is a problem of good governance, that is, one of ensuring that those who manage public resources do so properly and are accountable to the public. The policy of trade liberalization, adopted by the government under pressure from the International Monetary Fund, is being implemented in a haphazard fashion. Local industries are being forced into bankruptcy by traders who import goods without paying taxes. This impacts the health sector too. Just last month a local manufacturer of pharmaceutical products complained of lack of markets for his products; the October 22, 1994 issue of the Daily News reports that several local manufacturers of mosquito coils have been forced to close under similar circumstances. In the long run such a destruction of the little local manufacturing capacity that we possess can only have a negative impact on public health. These examples should suffice to illustrate the centrality of "politics" in dealing with issues of proper usage of medications, including antibiotics. Without appropriate political policies pursued in practice, piecemeal and narrowly focused attempts at resolution of problems are unlikely to yield lasting improvements.

Conclusion

I conclude with the issue of self-medication. Dr. Levy addresses this issue at some length in his book, giving many examples of the hazards of use of antibiotics by individuals without proper medical supervision.
He stresses the need for consumer education. However, persistence of self-medication is not simply an issue of lack of knowledge. This problem exists in affluent societies like the US and in poor nations like Tanzania. In both cases, the root cause perpetuating the practice is the failure of the health establishment to, so to say, deliver the goods. In the US, medical treatment tends to be quite expensive, and a large portion of the population lacks the necessary insurance coverage to pay for the cost of treatment. The uninsured or underinsured individuals may then be inclined to practice self-medication for conditions perceived to be minor ones. Often a poor patient ends up in a hospital emergency room as a result of lack of appropriate medical care at the initial stage of what was then a minor problem. Medical treatment in the US is based on advanced technology, inducing a feeling of helplessness and loss of control on the part of the patient. When illness implies going through a plethora of tests and procedures, and when the doctor has hardly any time to talk to the patient, a feeling of alienation is bound to arise. It is such a situation that has spawned continued interest in alternative therapeutic forms such as homeopathy and acupuncture, as well as contributed to the persistence of self-medication. The latter is also influenced by greater health consciousness among the population. Emphasis on disease prevention through proper nutrition, adequate physical exercise, stress avoidance, reduction of harmful habits such as smoking, etc., is also influential in turning people away from doctors and pills as a remedy to all ills. In Tanzania, self-medication and consultation of traditional healers are influenced by the poor level of medical services, lack of proper communication between doctors and patients, unregulated sale of medicines, as well as by the low level of education. It is found among affluent and poor communities alike. However, one does not have to view self-medication as an inherently harmful practice. Also, if it is found in industrialized societies like the US, the prospects of reducing it in an economically backward place like Tanzania are not promising. I think if self-medication is viewed in positive terms as an aspect of patient self-empowerment, and if the practice is channeled along lines that will encourage proper use of medications including anti-infectives, it can help to partly remedy the low level of medical services in society. Consider for example the notable work by David Werner entitled Where There Is No Doctor. This provides practical and down to earth advice on disease prevention, on dealing with minor ailments, and on recognizing when expert advice ought to be sought. Its availability in many local languages including Swahili is an additional advantage. What we need is similar health education disseminated to the public through mass media that will stress preventive practices, improvement of sanitation and hygiene, community development, and local control. This should be combined with strengthening the capacity to provide treatment for basic problems at village and district levels. In this context, adequate provision of antibiotics and antiparasitic preparations, with appropriate guidelines for their proper use, is imperative. The issue of misuse of such drugs and the potential adverse consequences for the community should be clearly spelled out in the educational campaign.
Of course, this has to be done within the context of a coherent and viable national economic agenda to improve general incomes and the standard of living. Otherwise education by itself will have little impact. In the 1970s, an impressive campaign for health education, called "Mtu ni Afya" (Man is Health), was mounted over the radio. But it failed to achieve any notable results. At that time the government was also implementing an unplanned and coercive strategy of herding people into so-called "development villages". The consequence of bringing many people together in an environment lacking basic sanitary services was just an increase in communicable disease levels. The roots of the cholera epidemics we see breaking out in Tanzania from time to time can be traced back to those ill conceived experiments of Dr. Julius Nyerere, which were backed and funded by the World Bank and the Scandinavian countries. Even though newer drugs for diseases like malaria and TB are being discovered, resistance to them tends to develop after a few years of use. Most of the new drugs are also very expensive. Thus if we do not address the basic roots of the proliferation of infectious disease in Africa, we will soon be faced with the nightmarish situation of having an environment conducive to their spread. Let us learn from history.

W2A023T
MALARIA

Of the communicable diseases of man in Tanzania, malaria is indisputably the most important. It occupies the number one position for hospital attendances, is the second highest cause of hospital admissions, and is among the chief causes of hospital deaths, especially in children. The most prevalent malarial species in Tanzania is Plasmodium falciparum, which accounts for around 90% of all malaria infections at all altitudes. Because of this very high proportion, and its devastating effects, malaria in the Tanzanian context generally refers to P. falciparum, and that assumption will be maintained throughout this presentation. Aspects relating to malaria in pregnancy, in the newborn, in infancy, childhood and adulthood, and indeed the entire clinical spectrum, will not be discussed as these are usually adequately presented in standard medical textbooks. I should however quickly stress that apart from the usual morbidity and mortality directly attributable to malaria, the disease is also a major chronic impediment to the health of the affected communities. It may account for much absenteeism from school and from work, it often greatly lowers labour output and has a general debilitating effect on our national economy. Malaria often leads to increased deaths from other causes and even to impairment of physical and mental activity. As malaria accounts for such a high proportion of dispensary, health centre and hospital attendances and admissions, it frequently overwhelms the already overburdened health services. All this leads to a vicious circle of disease and economic stagnation. It must be recalled that malaria in much of Tanzania is at its peak in May and June, and these months coincide with the harvesting of most food and cash crops: a malaria patient might leave the crops unattended, with dire consequences not only for the household but probably for the national foreign exchange earnings. In some situations malaria transmission is intense during the early rains and might therefore interfere with seeding and planting. In order to illustrate the devastating effects of malaria, I shall, for lack of a good local example, cite experience from the Sudan.
A major malaria epidemic in the Gezira Irrigated Area attacked the majority of the labour force, and a third of the cotton crop could not be harvested, leading to a reported loss of US $10 million. It has been calculated that had intensive efforts been invested in malaria control, they would have amounted to only a tenth of the economic loss! Malaria also adversely affects other aspects of our economy. Tourism is a case in point. With the recent resurgence of malaria in Tanzania, and the development of drug resistance, our fledgling tourist industry and even our recruitment of expatriate staff are threatened. The old ideas of Africa as "the white man's grave" are being revived. The situation is so serious that some schools of tropical medicine in the north have undertaken the establishment of specialist bureaux to provide free advice on malaria to would-be travellers to tropical countries. For certain situations, for example that presented by pregnant tourists or potential expatriate women, there is no ready advice; the best one nowadays is probably "Don't go". And for women of child bearing age the advice is, "Don't get pregnant while you are there". Indeed one very friendly country is giving this as the advice, as there is no ideal chemoprophylactic drug for pregnant expatriate women in an area where chloroquine-resistant malaria is being transmitted.

FILARIASIS

Bancroftian filariasis due to Wuchereria bancrofti is highly endemic on the Indian Ocean coast, penetrating inland along the river valleys. It is also highly endemic in the south and southeast of Lake Victoria, and to the north of Lake Nyasa. Smaller foci exist elsewhere in Tanzania (Menu and Kilama, 1974). According to Wijers (1983), who has the greatest clinical experience with the disease in East Africa, "All clinical signs of Bancroftian filariasis described in textbooks may also be found in Africa, but some occur more, others less often than in other parts of the world". In Tanzania, the most common signs are funiculitis, epididymo-orchitis and hydrocele, which attain rates of over 25% in many coastal areas. These signs may develop fast. Thus, Abaru et al. (1980) reported a man who developed a hydrocele within one year of arriving in an endemic area; there were ten who developed hydroceles within five years of arrival. Although elephantiasis is the most spectacular sign of lymphatic filariasis, and popularly thought to be the most common, it usually attains a low rate of only about 4 per cent in a highly endemic area, although Wijers once found a rate of 12.5 per cent in a Kenyan coastal village (Wijers, 1983). Microfilarial rates are very high in some endemic areas; they increase rapidly with age, but usually stabilize in persons over 20 years of age. Hydrocele rates too increase with advancing age. In some coastal communities they may attain 80% in the oldest age groups. Very little is known about the public health importance of lymphatic filariasis in East Africa. Jordan (1959) thought that this disease affects man's reproductive capacity. In a study in a Tanga coastal village, Mosha (1980) found that hydroceles were a social embarrassment, particularly to young men. This is in very clear contrast to the popular idea that they are venerated around the coast. Indeed, even old men were requesting the filariasis research team to arrange for hydrocelectomies, which are the number one surgical problem in certain coastal hospitals.
Besides the cost to the family, in transport, visits and lost work days during the operation, the cost to the national economy is also extremely high.

THE WAR AGAINST MALARIA

The war against malaria in Tanzania, as elsewhere, has adopted the following strategies: chemotherapy, i.e. treating the sick; chemoprophylaxis, i.e. protecting the susceptible person by using drugs; vector mosquito control; and impeding man-vector contact. For the sake of this discussion the first two strategies will be combined under the general heading of "War Against the Parasite" and the last two under "War Against the Mosquitoes".

THE WAR AGAINST PARASITES

The war against human malaria parasites started when man developed the immune response and certain innate genetic mechanisms. The battles and strategies involved are as old as or even older than the human species itself, and are still going on today. Although they form very important and extremely intriguing defence mechanisms, some of which may point the way to the development of antimalarial vaccines, they will not be discussed further. The details are presented, for example, in Cohen (1982) and Luzzato et al. (1983).

Chemotherapy and Chemoprophylaxis

The conscious war against malaria parasites involves chemotherapy and chemoprophylaxis, and must have started long before Tanzania's colonial history: our forefathers, for example, produced certain herbal remedies for use against malaria attacks. The first "modern" drug to be brought into Tanzania by the colonizing Germans was quinine, whose curative value Western Europe had learned from the Amerindians during the 17th century. In Tanzania the initial wide use of quinine was very much resented by all racial groups, especially in chemoprophylaxis (Clyde, 1967). Before World War II the Germans introduced the large scale growing of cinchona in Tanzania. World War II added chloroquine (and later related drugs) and proguanil (later pyrimethamine) to the antimalarial chest. The Vietnam War brought the discovery of the potentiating effect when pyrimethamine or another antifolate was used in combination with certain sulfonamides, thus leading to the introduction of such drugs as Fansidar, Metakelfin and Maloprim, which were active against chloroquine-resistant P. falciparum. The Vietnam War also led to the discovery of mefloquine, a drug still undergoing global clinical and field testing under the auspices of WHO's Special Programme for Research and Training in Tropical Diseases. The available antimalarials may be grouped as follows. Blood schizontocides: fast-acting (chloroquine, amodiaquine, quinine, mefloquine) and slow-acting (proguanil, pyrimethamine, sulphonamides, dapsone). Gametocytocides: primaquine. Sporontocides: proguanil, pyrimethamine, sulphonamides. Tissue schizontocides: primaquine. This lecture will be restricted to the blood schizontocides; the other uses have yet to be reached in our national struggle against malaria. I must however quickly point out that the above classification of blood schizontocides does not follow hard and fast rules: the combination of two slow-acting drugs, for example, may form a fast-acting drug. In the 1950s the World Health Organization embarked on a global programme for the eradication of malaria, mainly based on vector control. Although valuable gains were made elsewhere, especially in temperate climate countries, experience from pilot control schemes in much of tropical Africa revealed malaria as a most formidable enemy.
During the late 1970s, after it was realized that time-limited global malaria eradication was impracticable, the World Health Organization (1979) advocated that a malaria control strategy suitable for rural Africa should rely on chemotherapy and chemoprophylaxis using chloroquine, a drug that was unsurpassed both for treatment of acute malaria attacks and for their prevention. Since the strategy had the objective of preventing death due to malaria, the chloroquine was to be made readily available to all suspected malaria cases. Chemotherapy on demand has the following major limitations: the inadequacy of our outreach to remote areas, the difficulty in getting universal acceptance, an inadequate infrastructure, the risk of under- or overdosing, and the enhanced potential for developing drug resistance. Today, even with our thousands of dispensaries and village health posts spread throughout Tanzania, it is still not possible to deliver the life-saving drug promptly and therefore to avoid death and prolonged suffering. Primary Health Care and the Essential Drugs Programme are expected to alleviate much of this problem. There are however many other limitations still hindering the use of this life-saving drug, and these include: improper dosing; insufficient labelling; the use of nonstandard measures; vomiting; its bitter taste; pruritus; the varying base contents in different brands; human beliefs, etc. Some workers have objected to the wide and almost uncontrolled use of chloroquine on the premise that its unlimited availability would constitute a selection pressure that would enhance the development of drug resistance. I shall return to this most topical aspect later. To the above objective of prompt treatment in order to prevent death has been added, since the mid-1970s, the objective of offering chemosuppression to the most vulnerable groups, who under our circumstances constitute pregnant mothers and children under five years of age. One may be proud to point out that Tanzania was probably among the very first countries in the WHO African Region to adopt this noble sounding strategy. The policy here was to distribute the chemosuppression through MCH clinics, which at that time were mobile. However, certain observations made on the distribution of the chemosuppression through MCH clinics suggested that the drug was probably not reaching the target group. In order to overcome the delivery problem, a programme was envisaged in North Mara whereby all available resources, including ten cell leaders, were included in the distribution system. The programme, which had a good supply of the drug, good transport and backing at all levels, met with early spectacular success. But through time it started to falter, and a study by McCormark and Lwihula (1983) had to conclude that even where there was excellent planning, good organization and assured drug supplies, there were still major problems in compliance. These arose from poor transport arrangements, waning interest, and omission of some children on social grounds or where parents were not convinced of the value to be derived from the chemosuppression and therefore saved the chloroquine for treatment of fever in all family members. Major contributing factors to noncompliance included vomiting and itching. In a similar study in Tanga Region, Matola and Male (1984) revealed that only about a quarter of the women and children attending MCH clinics were actually taking the chloroquine, as shown by a urine test.
In a similar study by Kihamia and Lema (1984) at Kibongoto Hospital, Kilimanjaro, similar data were obtained at the antenatal clinic. The above studies might suggest that some of the chloroquine is indeed saved for malaria attacks in the family, or that it is never used at all. This might partly account for the very high consumption of the drug. In the five-year period 1975/76 to 1980/81, the number of chloroquine tablets issued through government stores jumped threefold from 102m to 300m (Chiduo, 1982). The World Health Organization (WHO 1983), in a comparative survey of chloroquine issued in six Anglophone countries, has put Tanzania's per capita consumption of chloroquine per year at 25 tablets, followed by Zimbabwe and Ethiopia at 4.6 and 4.4 tablets respectively. Kenya had a total (i.e. government and private) per capita consumption of 1.9 tablets and the Gambia 1.2. The very high figure for Tanzania is based only on issues by the Central Medical Stores; those by the National Pharmaceutical Company, some voluntary agency establishments, and certain donor agencies are not included. Drug Resistance in P. falciparum in Tanzania A current and very major threat to both the chemotherapeutic and chemoprophylactic programmes has been the development of drug resistance by the local Plasmodium falciparum. The development of resistance in P. falciparum to an antimalarial is not a new phenomenon in Tanzania, where resistance to pyrimethamine was aptly reviewed by Clyde (1967). This drug had to be removed from use in the 1960s, yet this resistance has unexpectedly persisted and even spread in the absence of drug pressure (Kouznetsov et al., 1980). It therefore seems that the resistance confers a biological advantage. Chloroquine was introduced into malaria chemotherapy in 1945 (Peters, 1970). By 1961 reports of chloroquine resistance were received from southeast Asia and South America. In Africa, however, early incontrovertible evidence of P. falciparum resistance to chloroquine was lacking (Bruce-Chwatt, 1970). The possible occurrence of P. falciparum resistant to chloroquine in Tanzania has been suspected for many years. Clyde (1961, 1966) reported that large quantities of chloroquine occasionally failed to clear all trophozoites. The decline in the efficacy of previously effective small doses (2.5 mg/kg) of chloroquine in clearing P. falciparum was confirmed by Pringle and Lane (1966). A later study by Lilijveld and Mzoo (1970) in Handeni found that 5 mg/kg was needed to effect full parasite clearance. Later on, Olsen and Spencer (1974) reported on a nonimmune case that could not be cleared by the maximum course of chloroquine but cleared on quinine. Goosen (1975), working with semi-immune subjects in Muheza, revealed that not all asexual parasites could be cleared by 10 mg/kg chloroquine base. In a 1979 study Kouznetsov and his associates, working in Bagamoyo, observed that a single-dose treatment with chloroquine at 10 mg/kg failed to clear the asexual parasites within seven days in 16.0% of parasite carriers harbouring more than 1,000 P. falciparum trophozoites per mm3 of blood, and in 10% of the whole group; they therefore doubted the efficacy of a single dose as recommended for use in semi-immune subjects (Bruce-Chwatt et al. 1981, Clyde 1961). Kouznetsov et al. (1980) therefore suggested 20 mg/kg, since in their tests 15 mg/kg gave full parasite clearance. 
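The scope for under- or overdosing created by weight-based regimens and brands of differing base content can be shown with a simple calculation. The short sketch below is purely illustrative and is not drawn from any of the studies cited: the tablet strengths of 150 mg and 100 mg base and the child's weight of 12 kg are assumed values.

# Illustrative only: shows how the same prescribed dose of chloroquine base
# per kg of body weight translates into different tablet counts, and hence
# different received doses, for brands of different base content.

def tablets_for_dose(weight_kg, dose_mg_base_per_kg, base_mg_per_tablet):
    """Number of whole tablets closest to the prescribed dose of chloroquine base."""
    required_mg_base = weight_kg * dose_mg_base_per_kg
    return round(required_mg_base / base_mg_per_tablet)

child_weight = 12.0          # kg, assumed
single_dose = 10.0           # mg base/kg, as in the single-dose regimens discussed above

for brand_strength in (150.0, 100.0):   # mg base per tablet, assumed brand strengths
    n = tablets_for_dose(child_weight, single_dose, brand_strength)
    actual = n * brand_strength / child_weight
    print(f"{brand_strength:.0f} mg base tablets: give {n} tablet(s) "
          f"= {actual:.1f} mg base/kg actually received")

With whole tablets, the received dose in this example ranges from about 8 to about 13 mg base per kg for the same prescription, before any of the labelling, measurement or compliance problems noted earlier are taken into account.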
Clear-cut cases of resistance to 25 mg/kg chloroquine base in indigenes were reported by Kihamia and Gill (1982) in two residents of Dar es Salaam and by Schwartz et al. (1983) in Zanzibar school children. Later extensive studies on the Tanzania mainland have uncovered chloroquine resistance in both hospital patients and in communities. In the following pages, I shall review some of these very recent findings. The alarming rate of chloroquine resistance in infants and young children, who constitute a very vulnerable group, is a matter of grave concern, and the management of malaria in this group therefore calls for immediate review. To explain the alarmingly high rates of resistance in the very young age groups, Mutabingwa and his associates hypothesized that these children lacked an effective acquired immunity to complement drug treatment. The report also suggested that the hospital case studies were recrudescences arising from repeated weeding out of sensitive parasites by home and dispensary courses of chloroquine. A leading local ophthalmologist has viewed with concern the current situation, which calls for intermittent treatment with chloroquine and which, at frequent high dosages, may eventually lead to chloroquine-induced retinopathy. The ophthalmologist's fears must be taken seriously. Whereas previously about three malaria attacks per annum were the average, now there are cases who claim, for lack of a better term, to have "chronic malaria", which they subject to frequent year-round chloroquine intake. Other studies have been those of Kilimali and his associates from the Amani Research Centre, undertaken mostly in school children at various localities on the Tanzania mainland. I shall cite those at the Tanganyika Planting Company (TPC), Moshi, in more detail. The in vivo studies revealed a resistance rate of 14.3%, all classified as RII, whereas 17 of the 42 isolates tested in vitro were resistant. An analysis of the rate of inhibition of schizont maturation showed the sensitive isolates to be homogeneous in their response, whereas the resistant isolates exhibited a bimodal curve, segregating equally into sensitive and resistant parasites. Some of the resistant isolates did not clear at the highest chloroquine concentration tested. This in vitro study demonstrates that an increase in the chloroquine dosage will not fully clear the parasitaemia; if anything, it will more effectively select for enhanced resistance levels. The lesson to draw is that when a case shows chloroquine resistance, one should change to an alternative drug. Other recent in vivo and in vitro studies by Kilimali and others elsewhere in Tanzania have given the following results. As with Onori et al. (1982) working at Mto wa Mbu, these data reveal that even where in vivo resistance has not been reported, there is in vitro resistance, which is now almost ubiquitous. The in vitro test has the advantage of unveiling RI resistance, as it is not much affected by the host's immune response, which may play an important protecting role in semi-immune subjects. Longitudinal studies by Dr. C. Draper (personal communication) in North Mara have clearly shown an abrupt deterioration in the efficacy of chloroquine in children 0-4 years, in an area previously under a well supervised chemoprophylactic programme. Whereas a single 10 mg/kg dose gave clearance rates of 73 to 82% in 1979/80 and 1981 respectively, the rate dropped to 50% in 1982, and was just over 20% in 1983. 
In this area in 1983 two doses each of 10 mg/kg over two days gave about 25% clearance and three doses each of 10 mg/kg reached around 35% clearance. These data and those of Kouznetsov et al. (1980b) and Mutabingwa et al. (1984b) clearly cast grave doubts on the efficacy of the often advocated single dose (i.e. 10 mg/kg), to say nothing of the 5 mg/kg which is used in chemoprophylactic programmes. Surely such a low dosage used for chemosuppression cannot and should not be expected to significantly suppress parasitaemia. These low chemoprophylactic dosages are most probably weeding out the susceptible parasites and therefore selecting for the resistant ones, which are also known to possess a biological advantage. The North Mara results are in clear contrast with those of Bunda District (South Mara in Table 2), where the chemoprophylactic programme was weak and the parasite clearance rate was higher. These data strongly suggest that the mass chemoprophylactic programme has led to an irruption of resistance. W2A024T CURRENT SURGICAL PRACTICE Information on the present state of Tanzania's Surgical Services has had to be obtained from a multiplicity of sources (16, 17, 18, 19, 20, 21, 22), mainly because the Ministry of Health has not been able to publish comprehensive annual reports since the abolition of the post of Chief Medical Officer over 10 years ago. However, the most important source has been the data compiled from responses to the 1982 National Survey Questionnaire on the country's surgical services, which was sent out to all the Teaching and Regional Hospitals on Mainland Tanzania and to the National Hospital in Zanzibar (Appendix I). Further information has been obtained directly from the relevant officials of the Ministries of Health of both Zanzibar and the Mainland, as well as from several of the hospitals that were visited by either the author or his research assistant on this project. The questionnaire was designed to obtain information covering a wide range of aspects of surgical practice including the situation of beds and inpatients, staffing, clinical and operative surgical problems, ancillary facilities, the pattern of referrals, and the state of surgery in the surrounding districts and other hospitals within the regional or zonal area, in addition to any other relevant information. In all, 20 such questionnaires were sent out and complete responses were obtained from 6 (30%) of the hospitals, which included all the 3 teaching hospitals as well as Arusha, Kigoma and Zanzibar (Fig. 3). The author considers these data to be satisfactory because they cover the areas where (except for Mbeya) most of the surgery in Tanzania is concentrated. No questionnaires were sent to the district and other sub-regional hospitals because many of them do not perform a significant amount of surgery and, given the present shortage of surgeons in the country, are unlikely to do so in the foreseeable future. The pre-existing state of health facilities in the country by 1972 is shown in Fig. 4, and the total numbers and types of medical specialists by 1979 are shown in Appendix II. The main findings of the National Survey are summarised in Tables 14-20. Firstly, it will be noted that a considerable proportion of beds is allocated to Surgery (24-38%) but that this has little or no correlation with the proportion of surgical admissions (8-30%), with Zanzibar having the least unsatisfactory correlation (29-30%). 
Secondly, with the exception of Kigoma (for reasons that are unclear), the proportion of both medical (29-40%) and nursing (22-39%) staff in Surgery is also quite high, with KCMC (38/39) having the best correlation. However, the number of Doctor-Anaesthetists is quite small compared to the surgical (medical) staff, and in the case of both Bugando and Arusha there are none! Thirdly, there is considerable disparity in the operative load shared between the surgeons of the various hospitals, even allowing for the number of available anaesthetists. In this regard Bugando's highest score is difficult to explain except possibly by the latitude with which the term 'major' operation is defined (Table 18). The fourth observation to make on the results of the Survey is that there is a wide variation in the terminology used in reporting on surgical operations, with KCMC being the least explicit (Urological, Abdominal, etc.) while Bugando is quite vague on the type and magnitude of at least two of the groups of operations reported. It would be helpful, at least for comparative purposes, if the actual operations were stated and enumerated prior to grouping them into the various organ-systems or anatomical regions. Caesarean section is recorded as the commonest operation in Kigoma, probably because one of the "surgeons" there is an obstetrician! Fifthly, it is interesting to note the (annual) incidence and pattern of Referrals Abroad from the two leading hospitals in the country, as the national data on this subject were not available from the Ministry of Health headquarters in Dar es Salaam. The large numbers of Cardiac and Neurological problems that have to be referred abroad annually for surgical treatment - at great expense, in foreign currency, to the nation - make it imperative that urgent priority should be given to these areas of specialisation in the consolidation and expansion of our surgical services at the national level. We cannot hope to achieve Surgical Self-Reliance in the foreseeable future if this prevailing situation is allowed to continue indefinitely. Finally, a glance at Table 20 shows that there is a very uneven distribution of surgical services in the country as well as very variable clinical and operative loads on the surgeons in the various hospitals. We are also certainly nowhere near the 'ideal' Surgeon/Population ratio of 1:6,000 which Roy advocates (5), and, predictably, Muhimbili does better than the rest of the country! SURGICAL TRAINING The only way we can hope to achieve Surgical Self-Reliance (which is definitely a desirable goal) in the foreseeable future - and I propose that the year 2000 A.D. would be a reasonable target date - is by, among other things, training an adequate number of Tanzanian surgeons for the regional and higher hospitals, and incorporating into the M.D. curriculum, as well as the curricula of all the medical sub-professional cadres, a sufficient and relevant amount of practical surgical teaching which will enable them to perform certain specified surgical tasks at their designated posts both appropriately and competently. The desirable designations for both health units and surgical personnel, and the recommended grading of surgical tasks at the various levels of the Health Service Pyramid, to be achieved by the end of the century, are shown in Tables 21 & 22. The proposed 'pyramid' by the year 2000 A.D. is illustrated in Fig. 5. 
Although the history of University Medical Education in Mainland Tanzania (6) has been a long and chequered one (Table 23), it is only comparatively recently that Tanzanian surgeons have been trained locally, the first one, William Mahalu, qualifying in 1976 and the first woman surgeon, Adela Materu, qualifying in 1981. Those that have qualified so far (Table 24) will be supplemented by those that have qualified or are qualifying abroad (Table 25). It may be interesting to note that after the first Tanzanian surgeon qualified in the U.K. in 1962, only five other Tanzanians had followed suit before the author himself qualified, also in the U.K., in 1971 (Appendix III). There is only one other Tanzanian Surgeon known to the author who qualified elsewhere during the same period. Apart from the small number of about 30 Tanzanian Surgeons that have qualified so far, the surgical service has been beset by a number of problems, not least of which is the considerable loss or wastage of personnel from the surgical ranks for various reasons. At least five surgeons, including the one who died in 1981 (JYM)*, have been lost from the public service in this way in the past few years. The other problem is that although there is a large number of expatriate surgeons in the country - mostly through various international bilateral agreements rather than individual or academic appointments - the manner of their recruitment, terms of service, and subsequent deployment leaves a lot to be desired, despite the fact that many of them are trying to do a good job. I believe that there is now a sufficiently high calibre of indigenous surgical experience in this country to be able to provide the necessary advice to the Ministry of Health and the Government on the recruitment and deployment of the most suitable expatriate surgical staff for both academic and non-academic sectors. There is yet another, and perhaps more serious, problem that has an important bearing on surgical training and its aftermath. Both Tables 24 and 25 demonstrate very clearly how Anaesthesiology and Pathology are still very much the "Cinderellas" of our postgraduate medical education both inside and outside the country. It goes without saying that unless the output of Anaesthetists and Pathologists is markedly increased by attracting (not forcing) more candidates to these disciplines as a matter of the utmost urgency, it is unlikely that the multiplication of surgeons alone will bring about the recipe for Surgical Self-Reliance either now or in the future. One aspect of Surgical training at the Muhimbili Medical Centre in Dar es Salaam is illustrated in Fig. 6. THE FUTURE At the present time, with the help of expatriate surgeons - most of whom are on short-term contracts - only a few regional hospitals (four) are without surgeons, but if Tanzanian surgeons were to be left on their own, about half (ten) of the regional hospitals would be without surgeons and most of the remaining centres would become seriously depleted. This is a measure of the herculean task that lies ahead if Tanzania is to reach surgical self-sufficiency within less than two decades from now on the personnel aspect alone - and there are other and weightier considerations too, e.g. finance, equipment, and supporting facilities. 
The other pertinent problem, of course, is that even her existing teaching/zonal hospitals of Muhimbili, KCMC, and Bugando do not yet have a 'full' complement of the specialist surgical services that they need in order to offer a comprehensive service (Table 26). However, the future is not necessarily as bleak as some of the 'prophets of doom' would like to have us believe. Following the considerable strides made since the local postgraduate surgical training programme was launched in 1973, and provided that the present tempo of 4-5 (or even 3-4) candidates qualifying each year is maintained, or even surpassed, the desirable projection of an output of 60-70 or more by the year 2000 A.D. should be easily realised (Table 24). Other things being equal, such an output should provide enough surgeons for the National, Zonal, and Regional hospitals of the country, and yet leave a sizeable number for the various surgical specialisations that will also be needed by then, including Cardiac Surgery and Neurosurgery. The emphasis on greatly increasing the number of indigenous surgeons by the end of the century notwithstanding, by the sheer internationalism of Surgery, expatriate surgeons should continue to be welcomed to visit and work in Tanzania, if only to maintain useful links with the surgical 'fraternity' elsewhere in the world, and also to supplement our own inevitable deficiencies in at least some of the more highly specialised and finer forms of surgery (e.g. Transplant Surgery, Microsurgery, and Cryosurgery) which we may well wish to embark upon in the 21st century, having established a firm and durable base of a general surgical service throughout the country. In order to improve surgical access to the people, and provide a more equitable standard of surgery throughout the country, better transportation and communication facilities will be essential, as well as the institution of new Zonal Hospitals at Mbeya, Mtwara and Dodoma (or Tabora), so that surgical expertise will be within easy reach of the regional and district hospitals (Fig. 7). At the same time, Muhimbili will have to 'resume' - indeed enhance - its special status as the National Reference Hospital at which the more highly specialised, technical, or expensive forms of surgery will be performed for the benefit of the whole nation. Considering the inevitable but undesirable downward trend of the Health Budgetary expenditure on hospitals over the past 10 years (Table 27), the provision of such a National Hospital will be a more sensible and economic way of utilising our limited human and material resources in the delivery of good, modern surgical care than would otherwise be the case if duplication of special facilities were encouraged. DISCUSSION Any modern surgical service is by nature an expensive undertaking. For example, at the Muhimbili Medical Centre the total cost of running the entire 'Surgical Service', along with its clinical and non-clinical ancillaries, amounts to about 15-20% of the 1982/83 budgetary estimates of 102 million shillings (23). However, as Weston (24) has reported from his experiences in Mbeya and the whole of the Southern Highlands Zone, covering Mbeya, Iringa and Rukwa Regions, there are many modifications of equipment and appliances, as well as a judicious trimming of the drug list, which can considerably reduce the running costs of a regional or district surgical service. 
But the mere fact that costs must be high in Surgery means that every effort must be made to maximise the output of such a service to the community in all possible ways, including cutting down any wastage of materials, time or resources. Despite the considerable data presented in this lecture, pertaining to what one might euphemistically call "Surgery for All by 2000 A.D.", the fact of the matter remains that quality in surgery is as important as - if not more important than - other aspects of the provision of surgical care. And high or good quality surgery implies high technical skill, consistency, integrity and strict discipline, none of which is a cheap "commodity" of human nature. This means that for a surgeon to attain a level of excellence, considerable training, practice and self-reliance are required, and it is therefore reasonable to expect our patients and the public at large to appreciate such excellence. Hence, a person who has chosen this most difficult of careers cannot be expected to indulge in time-consuming extra-curricular or business activities, however essential they may seem to be to his physical or pecuniary well-being, without incurring a progressive erosion of his skills or judgement or both. I believe that there is an analogy between a surgeon performing a major or complex operation and an airline captain in flight. For both, sobriety, skill, concentration and manual dexterity are crucial, and every delicate move must be precise or else instant disaster might ensue! The only difference is that while the pilot is handling a flying machine in which he himself is situated, the surgeon is (usually) firmly on the ground while handling a living human being who expects to wake up minus his disease! Hence, as the captains of larger and faster aircraft are better regarded and better paid than other pilots because of their greater responsibilities, so also should surgeons be, because they have such awesome responsibilities and they need to devote themselves entirely to their profession. Their work must be made as smooth as possible by according them the best possible working conditions, and they must not be tied to administrative or other irrelevant responsibilities wherever they may be in the National Health Service. A further boost to the morale, and indeed efficiency, of the indigenous surgeons would be generated by a positive attitude by the Ministry of Health (AFYA) towards the inevitable need for a planned, rational and realistic surgical specialisation policy spearheaded by Muhimbili, where the mainspring of higher surgical training will continue to be for the foreseeable future. This would dispel the prevailing gloom and frustration experienced by some of our brightest young surgeons, who are being denied the opportunity either of further specialist training at the optimum time or of pursuing an academic career for which they are urgently needed to enhance the nation's capability of strengthening its own surgical training institution. There is a definite need for the nation to re-examine its health priorities in the training and curative service sectors in order to enhance the quality of surgery, improve regional surgical services, and achieve surgical self-reliance by the year 2000 A.D. 
The Association of Surgeons of East Africa (ASEA), in which most of our own surgeons participate actively, has recently conducted several symposia on the issues of "surgery at the regional and district hospital levels", "specialisation in surgery", and "appropriate surgical technology and training" for East Africa, and has stressed that research at all levels of surgical training and practice should be an integral part of all strategies for surgical advancement. We would do well to emulate this learned and far-sighted philosophy and ensure that each of our health institutions and units has a functioning and well-staffed Records Unit and that up-to-date and comprehensive Annual Reports are compiled at the national and all other levels. There can be no meaningful research and no significant advancement in the health sector of our national development unless and until there are accurate and up-to-date records. As of now, it would be reasonable to suggest that our long and challenging march towards "surgical self-sufficiency" by the year 2000 A.D. should begin by achieving an acceptable level and balance of surgical services at the country's three leading hospitals (Table 26) before we embark on establishing Cardiac Surgery and Neurosurgery at Muhimbili in the near future. CONCLUSIONS In conclusion, what I have expressed in this lecture are only some of the many pertinent facts and my own opinions on the vital but extensive subject of Surgery in Tanzania, which it would be impossible to exhaust in the limited time available. However, I do hope that I have made the point that, given certain conditions and the will to succeed, Surgery - which is undoubtedly the backbone of the curative services of our National Health Service - can spearhead the drive towards better health care for all by the year 2000 A.D., and that the achievement of surgical self-sufficiency by then is not only desirable but possible. All the same, this is a very challenging task which will call upon the concerted efforts of all concerned and the maximisation of our available human and material resources, including the provision of adequate basic facilities and financial support, as well as good and attractive working conditions for the surgical and other hospital staff. In order for Tanzania to achieve what might be called "Surgery for All by the Year 2000 A.D.", in the spirit of the 1978 ALMA-ATA Declaration of the World Health Organisation, the nation should, through the Ministry of Health, undertake to fulfil the following recommendations: (1) Re-organisation of the hospital service and its ancillaries to give it a streamlined structure and grading scale that would be reasonably constant, and accord well-defined tasks and staffing criteria to each level of the hospital service (Tables 21 & 22). (2) Designation of Muhimbili as the National Reference Hospital with a wide range of specialist surgical and other services and as the main focus of postgraduate surgical and other training facilities for the whole country. W2A025T 5.0 IMMUNIZATION COVERAGE PATTERNS Full immunization coverage for 12-23 month old children averages 72 percent nationally, varying between 92 percent (Mbeya) and 52 percent (Shinyanga) (1989 figures). The Singida regional under-five coverage was 78 percent in December 1990, varying between 82 percent in Singida Rural and 51.4 percent in Iramba District (EPI figures). These two districts were chosen for the study because of their relative homogeneity and contrasting coverage levels. 
One objective of the research was to compare present coverage levels with the results of the most recent EPI survey (1989). The standard WHO cluster sample technique was used, based on a stratified sample of 30 villages per district and seven randomly selected index children (12-23 month olds) per village. The total sample of index children for the two districts (and mothers/guardians) was therefore (30 x 7 x 2 =) 420. Table 3 gives the current immunization coverage levels for the two districts compared with national and regional (Singida) coverage for 1989. With the exception of BCG and DPT1, coverage is significantly lower in Iramba than in Singida Rural District, though by a much smaller margin than expected. Drop-out rates between DPT2 and DPT3 and between OPV2 and OPV3 were higher in Iramba than in Singida Rural, and these drop-outs largely account for the lower Iramba coverage figure. The Iramba District figure of only 51 percent coverage in 1990 was most probably a substantial underestimate, perhaps based on an overestimation of the number of immunizable children in the district. Although smaller than expected, differences in coverage levels between districts were significant. The explanation for these variations seems to lie more on the side of service delivery than in reluctance on the part of mothers to make use of immunization services. Before examining this issue in more detail, further information on other antigens and on the important issue of timing and spacing of vaccinations is presented. Notes: 1. Coverage data are based on information collected from immunization cards, with the exception of BCG, which is based on scars. 95.1 percent of children vaccinated with BCG had scars. 2. The average age of index children was 17.2 months in Singida Rural and 17.4 months in Iramba District. In both districts the average index child weighed 9.2 kgs. 5.1 OPV0 Instead of three oral polio vaccinations (OPV1-3), EPI is now recommending four (OPV0-3), with OPV0 given at birth. According to mothers, only 11 percent of Singida Rural and 23 percent of Iramba District index children had received OPV0. Since most children are born at home, it is difficult to immunize at birth, and, as explained below, there are strong factors preventing mothers from bringing their children for vaccination less than 4 to 6 weeks after birth. 5.2 Tetanus Toxoid Only one in five mothers had an antenatal card in her possession. This card, which records TT vaccination, is supposed to remain at the health facility where the mother delivered or is normally treated, although there is no TT register kept there. The newly introduced TT cards, which mothers are supposed to retain, were not found with any of the women in the survey sample. The breakdown of TT vaccination (oral information from mothers) is as follows: more than two-thirds of Singida Rural mothers said they had received at least two TT vaccinations during pregnancy, compared with less than a half in Iramba. New EPI regulations recommend four weeks between TT1 and TT2, and six months between TT2 and TT3. Mothers who fail to complete the three vaccinations during their pregnancy should receive boosters during subsequent pregnancies, up to a total of five vaccinations. Spacing was analysed for mothers who had retained their antenatal cards. TT1-TT2 spacing averaged 36 days for both districts (N=51). For TT2-TT3 the average was only 5 days (N=32) for both districts, and half the mothers in this small sample received TT3 only three days after TT2. 
The sample is too small to be able to draw any firm conclusions from this apparently aberrant pattern. 5.3 Immunization card loss and retention Over ninety-one percent of index children were found to have immunization cards. Seven percent of index children's mothers reported that their child's immunization card had been lost or stolen. Thirteen mothers (3 percent of the total sample) said they stopped taking their child for immunization as a result of losing the card. Only four mothers reported that their children were denied vaccination as a result of losing cards, however. Card loss is examined in more detail below. 5.4 Summary of coverage data Overall, immunization coverage of 12-23 month olds is impressive in both districts, and corresponds closely to the national and regional figures for 1989. Coverage in Iramba is much higher than recent EPI figures would seem to indicate. Excluding OPV0 and TT, the lowest coverage in both districts was for measles, the most common and dangerous of the immunizable diseases. Partially immunized children constituted 17.1 percent of the Singida Rural and 22.8 percent of the Iramba samples. Thirty percent of all partially immunized index children had received seven out of the eight antigens, but a quarter of all partially immunized children had received only one. Since, as shown below, children are being vaccinated late, the eventual coverage of the cohort studied is likely to rise by a few percentage points. Mothers reported that more than nine out of ten of their second youngest children (n=147) were fully immunized. Interestingly, there was no significant tendency in either district for coverage to be higher among older index children. Indeed, in Iramba the trend was for younger children (12-15 month olds) to have higher coverage rates than older children, which would seem to indicate an improved performance on the immunization service supply side in the last year or so. This is corroborated by the information obtained from health officials and focus group discussions. 5.5 Timing of immunization Dates of immunization were recorded from the index children's cards. A detailed analysis of the timing of vaccination is therefore possible, which yields valuable information on periods of exposure to the various immunizable diseases. A summary of the major findings is given below. The average child receives DPT3 and OPV3 during its seventh month rather than at age three months as laid down in the EPI guidelines. Of the eight vaccines, measles is the closest to the prescribed regimen: four out of five immunized children are vaccinated between the ninth and twelfth month. But 70 percent of children receive their BCG at least one month late, and nine out of ten receive their first OPV vaccine at two months of age or later, i.e. also at least one month late. No less than ninety-seven percent of children received DPT2 at least a month after the prescribed time (age eight weeks). Only two percent of children received DPT3 during the third month as recommended. The trends for OPV were similar in direction and magnitude. Since a second vaccination is necessary to provide adequate immunity against diphtheria and tetanus, the average child is not protected against these diseases for almost three months more than necessary. For polio, the figure is over three months, and for TB it is well over two months. 
If a third vaccination is necessary to achieve full immunization against whooping cough, then the average immunized child is only partially protected for a full 15 weeks more than necessary. Seventeen children (4.6 percent) received their DPT1 vaccination before the prescribed 28 days. For OPV1 the figure is 7.8 percent, but OPV0 is now being administered at birth. Eleven and 23 percent of index children were reported to have received OPV0 in Singida Rural and Iramba Districts respectively. The above aggregates conceal very large differences between Singida Rural and Iramba Districts. For all antigens, children in Iramba District were vaccinated much later than those from Singida Rural. For example, DPT2 is given on average more than six weeks later in Iramba, and OPV2 more than seven weeks later. Taking a second vaccination as affording full immunity against diphtheria and tetanus, Iramba children are at risk on average for about four months longer than necessary, compared with about two months in Singida. For whooping cough the figures are 5 and 2.5 months for Iramba and Singida respectively (DPT3). In Iramba, children receive their measles vaccination more than nine weeks later than recommended, and more than six weeks later in Singida. The overall loss in protection resulting from delayed vaccination is approximately 78 child-years for the fully immunized group. The majority of mothers bring their children to get their BCG vaccination late. It is common for neonates to be kept from public view during their first weeks for fear of the evil eye ("jicho baya"). Withdrawal of mother and child from public view, amulets, smearing powder on eyebrows, and hanging protective twigs, roots, leaves, and shells outside the compound are some of the protective measures taken to ward off the evil eye. After about 40 days, mother and child can appear in public with less risk, and mothers resume their normal activities, including seeking immunization. It was hypothesised that further late vaccination might result from health personnel adhering rigidly to spacing recommendations even though they have been taught to give BCG and OPV/DPT1 together if the child's first contact with immunization services is four weeks after birth or later. Spacing of vaccinations between BCG and DPT1 was analysed to test the hypothesis. In each district there were similar numbers of children (16 and 17) who received BCG at least a month after birth but were not given DPT1 at the same time (8 percent of the sample). But there were six times as many cases (100 + 99 = 199, or 47 percent of the sample) where both BCG and DPT1 were given on the same day, at least one month after birth. Moreover, it was found that in 48 cases (11 percent of the sample) DPT1 was given before BCG. The health workers' KAP survey discussed below shows that immunization providers know the timing and spacing of immunization, and that there is no contraindication regarding the joint administration of BCG and DPT/OPV. Thus, it is likely that the 8 percent of cases where the two vaccines were given separately were the result of lack of vaccine rather than lack of knowledge on spacing on the part of health workers. It may be that a few health workers were not following EPI instructions on opening the 20-dose vials of BCG even when there are only one or two children to immunize, but this is unlikely. 
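The timing and spacing analysis described above rests on simple date arithmetic applied to the immunization cards. The sketch below is a minimal illustration of such a computation, not the survey's actual procedure; the child's record is invented, and the prescribed ages (BCG at birth; DPT and OPV at 4, 8 and 12 weeks; measles at about nine months) are taken from the EPI schedule quoted above.

# Illustrative sketch: how late each vaccination was given relative to the
# EPI schedule described in the text. The example record is invented; a real
# analysis would read the dates from the index children's cards.
from datetime import date

# Prescribed age at vaccination, in days (assumed from the schedule above).
SCHEDULE_DAYS = {
    "BCG": 0,
    "DPT1": 28, "DPT2": 56, "DPT3": 84,
    "OPV1": 28, "OPV2": 56, "OPV3": 84,
    "MEASLES": 270,
}

def delays_in_days(birth_date, card_dates):
    """Days late (positive) or early (negative) for each antigen on the card."""
    out = {}
    for antigen, given_on in card_dates.items():
        age_at_dose = (given_on - birth_date).days
        out[antigen] = age_at_dose - SCHEDULE_DAYS[antigen]
    return out

# Hypothetical index child and card entries.
birth = date(1990, 3, 10)
card = {"BCG": date(1990, 4, 25), "DPT1": date(1990, 5, 30),
        "DPT2": date(1990, 7, 18), "DPT3": date(1990, 10, 2),
        "MEASLES": date(1991, 1, 15)}

for antigen, late in delays_in_days(birth, card).items():
    print(f"{antigen}: {late:+d} days relative to schedule")

Averaging such per-child delays over the sample, and converting the excess exposure time into child-years, gives figures of the kind quoted above for the two districts.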
The further hypothesis that the Iramba health workers (mostly NAs) were less competent in immunization than their Singida Rural counterparts (mostly MCHAs) was not confirmed, since there was no difference between districts concerning the above patterns. An interesting related finding was that more Iramba than Singida Rural children were being immunized at or just after birth. BCG was administered to only 11 Singida Rural children at age 0-2 days, compared with 39 in Iramba. On the day after birth no less than 23 Iramba neonates were immunized, compared to only 2 in Singida. The number of hospital deliveries among index children was not known, but there is no reason to believe that they were more numerous in Iramba. This pattern may be a result of health education received under the ongoing World Bank/Unicef UCI programme. 6.0 VARIATIONS IN IMMUNIZATION COVERAGE BETWEEN DISTRICTS Immunization coverage levels depend on a number of factors. On the service delivery side, the number and location of health facilities are of major importance, as well as their all-year accessibility. As a result of past health policy, there is a good network of over 3,000 health centres and dispensaries throughout the country, and the majority of villagers (over 70 percent) are said to live within 5 kms of a health facility. In both districts surveyed there is one health facility providing immunization services for every three villages. It is difficult to say how many people live within 5 kms of a health facility, since village boundaries extend over such large areas. Nevertheless, the coverage of health facilities in both districts is similar and reasonably uniform. With certain exceptions, we would not expect distance to the immunization centre to pose a major problem to many mothers. Equally important on the service supply side is the availability of vaccines in the immunization centres, which depends on the efficiency of the EPI vaccine supply and the national cold chain system. The organisation of immunization services and the competence and motivation of health workers are the other major determinants of immunization coverage on the service supply side. These issues are discussed below. On the demand side, the main determinants of immunization coverage include the attitudes of the population towards the formal health system and health workers, and towards immunization itself. Mothers' generally heavy workload limits the time available to have their children vaccinated, a factor related to the closeness of immunization services. W2A026T 2.1 Introduction Different models currently available for flood forecasting can be broadly classified into physical and empirical models. Physical models are those which are developed by applying the known physical laws operating in the catchment to the input data. These models have the advantage of providing a sound description of the hydrological processes occurring in the catchment. Their parameter values may be determined by field or laboratory investigations. In such models, changes in the catchment produced by human activities like deforestation or irrigation can be represented by changes in the parameter values, and the model can be used to predict the hydrological effects of such changes. The other type is the empirical model, whose development depends on evidence of the relationships existing in the historical records of the input variables. Either category of these models can be classified as deterministic or stochastic. 
A deterministic model is one in which the input data determine the output uniquely as a function of time, not merely as a frequency distribution, while a stochastic model is one which produces an output that develops in time in a probabilistic manner. 2.2 Approach for Model Development Two distinct approaches are employed for model development. One approach is based on representing the known physical processes in the catchment by a series of stages in a simplified manner, which gives rise to the conceptual models. The other approach is systems analysis. In this case a general flexible relationship is assumed and, by applying the methods of systems analysis to the historical records, an expression for the input-output transformation may be obtained. In both cases the parameter values are dependent on the historical records for their determination. Some examples developed on the basis of the systems analysis approach are the Unit Hydrograph method, the Simple Linear Model, the Linear Difference Equation Model (Box and Jenkins, 1976), the Constrained Linear Systems (Natale and Todini, 1976) and the Linear Perturbation Model (Nash and Barsi, 1983). The two approaches differ in that a model developed by the systems analysis approach may be used on its own or, alternatively, incorporated as part of a conceptual model. 2.3 Linear Hydrological Systems The use of the Simple Linear Model assumes that the hydrological system is linear and time-invariant in its operation for rainfall-runoff transformations. In adopting such a simplification of the actual operation of the system, hydrologists try to simplify the analysis and avoid the difficulties which may arise in the model formulation and parameter estimation. However, changes in land use and man's activities in the basin may change its characteristics and hence the response of the basin to rainfall. The basin may also change with time or with seasons within the year owing to changes in the ground surface and underground conditions. In reality the system is non-linear and therefore the linear model is inadequate for rainfall-runoff modelling. It is therefore necessary to adopt non-linear models which may take the variation of the system's characteristics into account for adequate rainfall-runoff modelling. In an attempt to relax the constraint of linearity, Amorocho (1963) suggested that the operation of the system might be represented by a series of functionals. Diskin and Boneh (1973) and Papazafiriou (1976) advocated the claim that nonlinear hydrological analysis is superior to the classical linear approximations, yielding runoff predictions that are much closer to the observed values. The same idea has been supported by Moore (1981) and O'Connell and Clarke (1981) in dealing with rainfall-runoff transfer function models, in which nonlinear characteristics are handled through auxiliary variables that reflect antecedent basin conditions. Nash (1976) suggested that the problem of rainfall-runoff transformation is much less one of nonlinearity than one of time-variance caused by seasonal variations in moisture conditions and evapotranspiration rates. The same view has been supported by Patry and Marino (1982) from their study of a time-variant non-linear functional model. Also, Kachroo and Natale (1992) advocate the same claim from their study of the Multi Linear Model, in which time-varying weighting factors are included. The weights vary depending on the observed flow. 
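To make the Simple Linear Model concrete, the sketch below estimates a pulse response by ordinary least squares from a rainfall series and a discharge series and evaluates it with the model efficiency criterion R2 discussed in Sections 2.4.2 and 2.6. It is a minimal illustration only: the data are synthetic, the memory length of ten intervals is an assumption, and the sketch is not the calibration procedure actually used in this study.

# Minimal sketch of the Simple Linear Model: discharge is modelled as the
# convolution of rainfall with a pulse response of assumed memory length m,
# the ordinates being estimated by ordinary least squares. Synthetic data.
import numpy as np

rng = np.random.default_rng(0)

def convolution_matrix(rain, m):
    """n x m matrix whose i-th row holds rain[i], rain[i-1], ..., rain[i-m+1]."""
    n = len(rain)
    X = np.zeros((n, m))
    for lag in range(m):
        X[lag:, lag] = rain[:n - lag]
    return X

def model_efficiency(observed, estimated, calibration_mean):
    """R2 = 1 - F/F0, with F0 taken about the calibration-period mean."""
    F = np.sum((observed - estimated) ** 2)
    F0 = np.sum((observed - calibration_mean) ** 2)
    return 1.0 - F / F0

# Synthetic rainfall and a "true" response used only to generate discharge.
n, m = 400, 10
rain = rng.gamma(shape=0.8, scale=5.0, size=n)
true_h = np.exp(-np.arange(m) / 3.0); true_h /= true_h.sum()
flow = convolution_matrix(rain, m) @ true_h + rng.normal(0.0, 0.5, n)

# Split into calibration and verification periods, as in Section 2.4.2.
split = 300
X = convolution_matrix(rain, m)
h, *_ = np.linalg.lstsq(X[:split], flow[:split], rcond=None)   # pulse response
cal_mean = flow[:split].mean()

print("R2 (calibration): ", model_efficiency(flow[:split], X[:split] @ h, cal_mean))
print("R2 (verification):", model_efficiency(flow[split:], X[split:] @ h, cal_mean))

Because F0 in the verification period is still computed about the calibration-period mean, R2 there can fall below zero whenever the fitted model forecasts worse than that mean, which is the behaviour described in Section 2.6.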
In this study the Simple Linear Model is modified such that the non-linearity effects in the operation of the hydrological system are taken into account in rainfall-runoff modelling. 2.4 Model Requirements It is necessary that the model should represent accurately the transformation of the input function into the output function, which is an indication that the model represents as closely as possible the actual processes occurring in the catchment. In order to satisfy the above objective, the model should be accurate, consistent and versatile. 2.4.1 Model Accuracy A model is said to be accurate if it produces estimates which are close to the observed values; that is, it satisfies the practical objective of minimizing the sum of squares of the differences between the observed and estimated values. 2.4.2 Model Consistency Model consistency may be reflected in the model's ability to maintain the same level of accuracy and the same estimates of the parameter values when applied to different samples of data. Persistence of forecast accuracy through different samples of data is generally achieved automatically in parsimonious models, which require only the smallest number of parameters and model parts for adequate representation of the observed phenomena. Inconsistency or divergence occurs when the estimates of the model parameters are unstable. This may be caused by some model parts having a non-significant effect on the output of the model, or by two or more such parts having a similar effect on the output of the model. In order to test for consistency of the model, that is, stability of the parameter values and significance of the model parts, the available records are divided into two periods. Normally the tendency has been to use most of the available data for calibration of the model and to confine the verification of the model to the remaining one or two years. This is based on the UCG experience and WMO (1975). From the model efficiency criterion, it is easy to decide whether the model is consistent or not. This can be done by comparing R2 in the calibration and in the verification period. For a consistent model it is necessary that the values of R2 should be the same in both the calibration and verification periods. However, the ratio of the residual variance to the initial variance may vary depending on whether the calibration period is wet while the verification period is dry. Therefore, even for a consistent model the model efficiency criterion may vary greatly between the calibration and verification periods. To overcome this drawback, Kachroo and Natale (1989) suggested that the residual variance should be compared as a function of the observed flow in the calibration as well as in the verification period. From their study it was reported that this approach has better performance. However, in this study the former approach will be used. 2.4.3 Versatility A versatile model is one which is accurate and consistent when subjected to diverse applications. 2.5 Stages in Model Development The stages involved in the model development are as follows: (a) Choice of the form of the model or model structure. (b) Determination of the parameter values. (c) Determination of the significance of the several selected parts of the model. (d) Analysis of residual errors. 2.5.1 Model Structure The expression for the model is obtained by employing the principle of progressive modification (Nash and Sutcliffe, 1970). 
Starting from a very simple form of the model, a more complex form may be adopted only if a sufficiently better fit justifies the additional complexity introduced by an extra element in the model. 2.5.2 Determination of the Model Parameter Values When the model structure has been identified, it is necessary to determine the parameter values appropriate to a particular physical process. This may be done by: (a) the trial and error method, or (b) automatic parameter optimization. Trial and error parameter estimation is based on the selection of the model parameters by trial; adjustments are then carried out on each parameter and its effect on the model performance is noted. The acceptance of a given parameter value depends on the significance of its effect on the model performance. In automatic parameter optimization, model parameters are estimated automatically; the most commonly used methods include least squares and the Rosenbrock method (Rosenbrock, 1960). 2.5.3 Significance of the Selected Model Parts Model parts may be tested for their significance by running the model with and without those parts. If they have a significant effect on the performance of the model, their inclusion is justified; otherwise the part under test may be omitted with little loss of fitting ability. The model efficiency criterion may be used as a measure of the significance of the model parts, whereby the proportionate reduction in the sum of squares of the differences between the observed and estimated flows is used as an index of significance. 2.5.4 Residual Error Analysis After identification of the model structure and estimation of the parameter values, the residual errors are analysed to determine the adequacy of the model in representing the rainfall-runoff transformation. This is achieved by the application of diagnostic checks to the residuals. Such checks may reveal evidence of serious inadequacy, for example through the use of a linear model for a highly non-linear system, in which case the model must be modified. Apart from identifying inadequacy in the model structure, diagnostic checks may also suggest desirable modifications of the model. Normally, they are applied to the residuals of model forecasts to test for: (a) evidence of time variance, indicating the need for incorporating a time-varying or seasonally varying component; (b) a functional relationship between the residuals and the independent input or output functions, indicating the necessity for a more complex model form; (c) persistence in the residuals, introducing systematic errors. The existence of any of the above factors indicates some model inadequacy. The existence of a seasonal component in the residuals indicates the need for further investigation and modification, by incorporating a seasonal correction in accordance with the observed seasonality of the residuals. A functional relationship between the residuals and the input or output function may be corrected by the inclusion of a non-linear component in a linear model. A non-linear relationship would be indicated only if the forecast errors are related systematically to the input or output variables. Persistence is the tendency of a time series to have high values followed by high values, or low values followed by low values. The residuals can be tested for persistence by using autocorrelation analysis. If persistence is detected, an autoregressive function may be developed and incorporated into the model. 
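A minimal sketch of such a persistence check, and of the autoregressive correction that follows from it, is given below. The residual series is synthetic and the restriction to a single lag is an assumption made only to keep the illustration short; in practice the full autocorrelation function of the residuals would be examined.

# Sketch of residual diagnostics for persistence: estimate the lag-1
# autocorrelation of the forecast errors and, if it is appreciable, use an
# AR(1) error model to update each forecast from the previous error.
import numpy as np

rng = np.random.default_rng(1)

def lag1_autocorrelation(e):
    e = e - e.mean()
    return np.dot(e[:-1], e[1:]) / np.dot(e, e)

# Synthetic persistent residuals (errors of some primary model's forecasts).
n = 500
residuals = np.zeros(n)
for t in range(1, n):
    residuals[t] = 0.6 * residuals[t - 1] + rng.normal(0.0, 1.0)

rho = lag1_autocorrelation(residuals)
print("lag-1 autocorrelation:", round(rho, 2))

# AR(1) correction: before issuing the forecast for time t, add rho times
# the error observed at time t-1 (which is known when the forecast is made).
corrected_error = residuals[1:] - rho * residuals[:-1]
print("residual variance before correction:", round(residuals.var(), 2))
print("residual variance after correction: ", round(corrected_error.var(), 2))

The drop in residual variance after the correction indicates how much the autoregressive updating contributes to the forecasts.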
That is, before a forecast is issued, corrections are made based on the errors of previous forecasts up to the time at which the forecast is being made. 2.6 Model Performance Evaluation A commonly used objective function to express model accuracy is the sum of squares of the differences between the observed and estimated discharges, with the summation taken over the whole calibration period. This can be expressed by equation (2.1): F = Σ(yi - ŷi)², where yi denotes the observed discharge and ŷi the estimated discharge. F is an index of residual error which reflects the extent to which a model is successful in reproducing the observed phenomenon. As a criterion of model performance it is open to the objection that it is a dimensioned quantity, adequate for comparing the accuracy of various forecasting models on the same catchment, but unsuitable for comparing different models on different catchments or with different record lengths. It is therefore necessary that the residual error variance be standardised such that its expected value does not change with the length of the record or the scale of the discharges. According to Nash and Sutcliffe (1970), the model efficiency criterion is determined by comparing the residual variance F with the initial variance F0 = Σ(yi - ȳ)², where ȳ is the mean of the outflows during the calibration period and n is the number of data points in that period. The criterion, denoted by R2, is analogous to the coefficient of determination in linear regression and is given as one minus the ratio of the residual variance to the initial variance: R2 = 1 - F/F0. In application to the calibration period, R2 as defined above is similar to the coefficient of determination and varies between zero and one, because the initial variance F0 and the residual variance F are both obtained within that period, ȳ being the long-term mean of the discharge series in the calibration period. In the verification period the efficiency criterion R2 differs from the coefficient of determination. The reason for using the mean of the flow during the calibration period in estimating R2 is that the criterion attempts to compare the sum of squares of the model errors with the sum of squares of the errors which would occur in the absence of any model (the "no model" situation): the only forecast which could then be made for the verification period would be the long-term mean value of the discharge records as estimated from the calibration period. As a result, R2 may take negative values in the verification period when the model under test produces forecasts which are worse estimators than the calibration mean. Also, the model efficiency criterion R2 can vary greatly if the calibration period is unusually wet and the verification period unusually dry (Kachroo and Natale, 1989). 2.7 Estimation of Memory Length The memory length is the period between the occurrence of rainfall and the time when its effect on the stream flow ceases. Its value is usually found by trial and error procedures. If the value of the memory length chosen is greater than the actual value, the estimated pulse response ordinates beyond the true memory will be very small or equal to zero; this has very little effect on the usefulness of the pulse response derived in this way. However, if the memory length chosen is less than the actual value, the pulse response function will have ordinates which fail to decrease monotonically to zero. The use of such a pulse response is likely to lead to serious errors. 
The standard errors of the last few ordinates of the pulse response may be used as a guide in choosing the correct value of the memory length. If these ordinates of the pulse response are less than their respective standard errors, then they may be taken to be equal to zero. W2AO27T CONTROL OF RODENT FLEAS USING SYSTEMIC INSECTICIDES Abstract - Rodent flea control to prevent the spread of diseases of which rodents are reservoirs has often been carried out using insecticides. The insecticides are either dusted or sprayed in the entire area affected or placed in baiting boxes, sometimes with rodenticides, to control rodents and fleas simultaneously. Systemic insecticides to control rodent fleas, though found efficient, easy to apply and less hazardous to the environment, have only been demonstrated in the laboratory and field and have not been commercially available. It is suggested that the use of novel pesticide formulation techniques such as microencapsulation, tableting, the employment of bait attractants and additives, and the synthesis of metabolites of effective systemic insecticides could lead to the use of baits containing systemic insecticides to control rodents and fleas simultaneously and avoid undue expense and hazards. INTRODUCTION The discovery of the role played by fleas as vectors of disease organisms, especially from rodents to man or other animals (Ogata, 1897), led to the development of the flea control strategies practised today. Conventionally, control of rodent/flea-associated diseases involves control of fleas to a certain level (Hirst, 1927) prior to the control of rodents using chemical pesticides. However, the environmental contamination hazard of the insecticides, and the fact that rodents continue to cause damage while the fleas are being controlled, have been of concern with this approach (Miller et al., 1975; Love and Smith, 1960). Alternative approaches have been investigated, aimed at reducing costs, using fewer materials, posing fewer hazards to the environment and being acceptable to the public (Kartman, 1958). The baiting box approach to rodent flea control (Kartman, 1958) had to depend entirely on contact insecticides, mostly chlorinated hydrocarbons, but sooner or later gave rise to resistant flea populations (Patel et al., 1960). Moreover, the chlorinated hydrocarbons are more persistent in the environment. The use of organophosphorus insecticides for the control of rodent fleas was adopted (Bushland et al., 1963) because they were relatively less persistent and environmentally better accepted. It appears that although investigations revealed organophosphorus compounds to have potential in the control of rodent fleas systemically, no bait is available for this purpose to date. This paper is a review of the potential use of systemic insecticides for the control of rodent fleas, and it explores the techniques which, if thoroughly exploited, could enable the formulation of systemic insecticides for this purpose. SEARCH FOR SUITABLE ORGANOPHOSPHORUS SYSTEMIC INSECTICIDES Research work for the development of systemic insecticides was initiated in the United States of America, at the Kerrville, Texas, Livestock Insects Laboratory. This followed the discovery by Lindquist et al. (1944) that bed bugs, Cimex lectularius (Linnaeus) and C. hemipterus (Fabricius), died after feeding on rabbits treated orally with DDT or pyrethrins. The success achieved in developing systemic insecticides for livestock ectoparasites (Khan, 1969) prompted other workers to research systemic insecticides for the control of rodent fleas. 
Harvey (1960) investigated whether the oriental rat flea, Xenopsylla cheopis (Roths.), was killed when exposed to white Norway rats, Rattus norvegicus (Erxleben), given oral treatments of ronnel, Dowco 109 and dimethoate. Dimethoate was significantly better than Dowco 109 and ronnel. However, in another test by Bennington (1960), ronnel was highly effective against rodent fleas when offered to Norway rats in a bait containing a rodenticide. Clark and Cole (1968) reported four systemic insecticides that were successful in a bait against rat fleas on hooded white rats. The lowest effective dose of each insecticide, expressed in mg insecticide eaten per kg body weight per day, was diazinon 14.4, fenthion 4.1, mirex 6.1 and trichlorfon 38.0. Though dimethoate had been shown to be an effective systemic insecticide by Harvey (1960), studies by Clark and Cole (1968) found it ineffective. Systemic insecticides, particularly the organophosphorus compounds, have been shown to be quite effective against certain species of fleas in the field. Miller et al. (1975, 1977b, c) carried out extensive investigations of the effectiveness of phoxim on fleas of cotton rats, Sigmodon hispidus, and kangaroo rats. On cotton rats, a significant reduction in fleas per rat was noticed within 36 hours of an application of phoxim through a 0.24% phoxim-treated bait. The seasonal increase of fleas on cotton rats was prevented by applying 0.24% phoxim; in the treated area the flea index was 0.7 while in the control it was 11.2. Miller et al. (1977a) evaluated chlorphoxim in the field against fleas of kangaroo rats and cotton rats. The flea index on kangaroo rats was reduced to zero within 3 days of application. Similar results were observed on cotton rats when technical chlorphoxim was applied at the rate of 56.3 g/ha. With many species of rodents in open field tests, application of 123.1 g/ha of the technical material gave effective flea control 7 days after application. In southeastern New Mexico, an evaluation of seven organophosphorus systemic insecticides was conducted in the field for the control of fleas on native rodents and rabbits (Miller et al., 1978). The most effective control was achieved with ronnel and dimethoate on Dipodomys spp. Some degree of control was achieved on D. spectabilis with fenthion and acephate. Effective flea control was achieved on Sigmodon hispidus with dimethoate, and some degree of control was achieved on this species with trichlorfon. Coumaphos and pirimiphos-methyl were ineffective. However, none of the seven insecticides achieved an effectiveness comparable to that of phoxim and chlorphoxim reported above. SYSTEMIC INSECTICIDE BAITS To date there is hardly a single rodent bait formulation intended for rodent flea control using systemic insecticides, despite the immense amount of research done on the subject in the past. Harvey (1960) had pointed out that it would be a novel idea to mix a potential systemic insecticide with a slow-killing rodenticide like warfarin, such that the fleas died before the rats. Bennington (1960) found that a bait containing 10% sugar and a 1:49 fumarin (rodenticide) : cornmeal mixture, containing 12 g of ronnel and 20 ml of liquid smoke per pound, was attractive to rats. It killed oriental rat fleas before the rats died, within a feeding period of 5 days. Why this product was not made available is not known; most probably it was due to the cost of production, as it would have been an expensive bait.
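The dose and flea index figures quoted in this section follow from simple arithmetic. The short Python sketch below is a minimal illustration, not part of the cited studies; the bait intake and body weight figures in the example are assumed values chosen only to show how a figure expressed in mg insecticide per kg body weight per day, or a flea index such as 0.7, is obtained.

```python
def flea_index(total_fleas: int, hosts_examined: int) -> float:
    """Flea index: fleas recovered per host animal examined."""
    return total_fleas / hosts_examined


def daily_dose_mg_per_kg(bait_concentration_pct: float,
                         bait_eaten_g: float,
                         body_weight_g: float) -> float:
    """Systemic dose in mg insecticide per kg body weight per day.

    bait_concentration_pct -- active ingredient in the bait, % w/w
    bait_eaten_g           -- bait consumed by one animal per day, g
    body_weight_g          -- body weight of the animal, g
    """
    insecticide_mg = bait_eaten_g * 1000.0 * bait_concentration_pct / 100.0
    return insecticide_mg / (body_weight_g / 1000.0)


# 7 fleas recovered from 10 examined rats gives a flea index of 0.7.
print(flea_index(7, 10))

# A 250 g rat eating 15 g of 0.24% bait per day (assumed figures)
# would receive about 144 mg of insecticide per kg body weight per day.
print(round(daily_dose_mg_per_kg(0.24, 15.0, 250.0), 1))
```

Comparing such a figure with the lowest effective doses reported by Clark and Cole (1968) gives a quick check of whether a candidate bait concentration can deliver a systemically effective dose.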
Dohany et al. (1977) tested a bait consisting of a 1:1 mixture of crushed corn and milo treated with dimethoate at concentrations of 0.1, 0.2, 0.3 and 0.4% on the hispid cotton rat. They observed good acceptance of the bait. Guinea pigs also accepted a 0.2% dimethoate bait formulated with a 1:1 mixture of cracked corn and ground milo (Dohany et al., 1980). Miller et al. (1977c) applied phoxim as a surface coating on milo bait at a concentration of 0.24% in the field to control natural seasonal increases of fleas on cotton rats. Poor performance of baits containing systemic insecticides for the control of rodent fleas has in some cases been due to the bait not being accepted. Clark and Cole (1968) observed that baits containing dimethoate, menazon and thiocron were not readily consumed. These insecticides were among those found to be ineffective in controlling oriental rat fleas on hooded white rats. Most likely, systemic insecticide baits were not readily consumed by rodents because of their taste and smell. The sense of taste in rodents is highly developed, and therefore the addition of an insecticide to a bait often leads to rats disliking it. The chemical senses of rodents are connected with an exceptionally efficient learning mechanism, which enables rodents to associate the taste or flavour of a specific food with its beneficial or adverse effects on their health (Greaves, 1982). This ability enables rodents to select a balanced diet and helps them avoid being poisoned. Like taste, the sense of smell in rodents is highly developed. This sense is used in guiding rodent movements around their living areas. They also use odour in tracking sexually receptive females. The sense of smell is important in the detection and selection of bait/food (Greaves, 1982). Most organophosphorus insecticides have an objectionable smell. In establishing the effectiveness of ronnel against ectoparasites of small animals, the odour produced by this pesticide was a serious problem (Burch, 1960). POTENTIAL FOR FORMULATING SYSTEMIC INSECTICIDE BAITS It appears that no concerted effort has been devoted to the formulation of systemic insecticide baits specifically for rodent flea control. Rodent flea control is in most cases carried out when there is a disease outbreak or fear of an outbreak, and the control operation is usually carried out by public health staff. However, if a bait that would control both rodents and fleas simultaneously were available, the risk of a disease outbreak following rodent control operations would be reduced considerably. In the absence of such a bait, the game of "you make the fire and I put it out" between pest control personnel and public health personnel will continue: I consider the controlling of rodents as making the fire, and the flea and disease control operations carried out by public health officers as putting out the fire. The differences between rodent control personnel and public health flea control personnel explained by Gerhardt (1960) could easily be resolved if a systemic insecticide bait were made available. Formulation technologies available today could overcome some of the constraints faced in the past, one of which is the odour of most insecticides. Since some odours are attractive to rodents, e.g. the garlic smell of zinc phosphide (Marsh and Howard, 1969; Brooks, 1973), it should be possible to modify an insecticide's smell to suit that preferred by rodents. For example, formulation techniques made it possible to subdue the smell of Ectoral used for controlling ectoparasites of small animals (Burch, 1960).
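As an illustration of what per cent bait concentrations such as the 0.1-0.4% dimethoate mixtures discussed above mean in practice, the following Python sketch computes the amount of technical-grade insecticide needed to reach a target concentration by weight. The batch size and the 50% purity figure are assumptions for the example, not values from the cited work.

```python
def technical_needed_g(batch_kg: float,
                       target_pct_ai: float,
                       purity_pct: float) -> float:
    """Grams of technical product to blend into a bait batch.

    batch_kg      -- weight of the bait carrier, kg
    target_pct_ai -- desired active ingredient in the bait, % w/w
    purity_pct    -- active ingredient content of the technical product, % w/w

    The small weight contribution of the additive itself is neglected.
    """
    ai_needed_g = batch_kg * 1000.0 * target_pct_ai / 100.0
    return ai_needed_g / (purity_pct / 100.0)


# A 10 kg batch of 0.2% dimethoate bait using a 50%-pure technical
# product (both figures assumed) would need about 40 g of the product.
print(technical_needed_g(10.0, 0.2, 50.0))
```

The same arithmetic applies to any carrier or active ingredient; only the purity and the target concentration change.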
Utilization of rodent pheromones to attract rodents to baits or traps, or to mask an objectionable smell or taste, is a field which has not been fully exploited to enhance bait acceptance. Marsh (1986) pointed out that research on rodent pheromones could probably result in universal attractants/odours. However, recent findings have shown that some odours attracted rodents to the bait but did not influence palatability (Marsh, 1988). With regard to attractive odours, it is desirable to add some additives to the bait to enhance acceptability. This subject has been dealt with widely (Marsh, 1986, 1988; Kusano et al., 1975), but requires species-specific studies of particular additives. Microencapsulation may offer yet another possibility of masking the smell and taste of a systemic insecticide in a bait to be used in rodent flea control (Cornwell, 1977). The advantages of microencapsulation were summarized by Phillips (1968). Bitter substances like chemosterilants have been offered to rodents quite successfully by using the microencapsulation technique. Ericson et al. (1977), for example, mixed encapsulated chlorohydrin (a chemosterilant) with laboratory rat food for the control of wild rats. Rodenticides have been encapsulated, particularly to facilitate acceptability and to delay the onset of poisoning symptoms (Jackson, 1974). Abraham and Hinkes (1974) observed that encapsulated technical warfarin was more acceptable to rats in the laboratory than identical baits containing unencapsulated warfarin. Tablet formulation is another technique available for using systemic insecticides in rodent flea control. Bullard (1970) compared the acceptability of tableted and surface-coated formulations and found that toxic concentrations could be increased by tableting without causing bait rejection by the rodents. Synthesis of the metabolites of systemic insecticides found in the host has also been shown to be useful. This approach exploits the fact that metabolites formed in the animal were more toxic to the ectoparasites than the parent compound (Roberts et al., 1958). It does not, however, guarantee that the synthesized product is more acceptable to the animal than the original product, but it nevertheless provides a useful compound which can then be reformulated to suit rodent preferences. Ruelene, for example, was synthesized and used to control cattle grubs systemically after it was discovered in studies on the metabolism of Dowco 109. A similar approach in rodents could yield a compound to be synthesized and used in the formulation of systemic insecticide baits. Other studies on systemic insecticides for the control of rodent fleas further suggest that insecticides found to be effective against fleas but also lethal to the rodents could be formulated in a bait that would control both rodents and fleas simultaneously.