
History of medicine : Part 8

2000 and beyond: 21st century medicine

It is impossible to say how medicine will develop over the next hundred years but undoubtedly our knowledge of genetics will be of great importance.

Human Genome Project

The task was to find the sequence of DNA for every single gene in a complete set of human chromosomes. We call this sequence the human genome. The project started in 1990 and saw unprecedented scientific collaboration between research laboratories in the United States, Europe, Asia and Australia. Thanks to highly automated analytical techniques, the task was completed in 2003, barely 50 years after Watson and Crick first described the double helix structure of DNA.

Some surprising discoveries were made. The instructions for an entire human being are held in only 30,000-40,000 different genes, and all but a few per cent of these are shared with our nearest relatives, the chimpanzees.

 

Genetics and medicine

The risk of developing many disorders, such as Alzheimer’s, diabetes and heart disease, may well be influenced by our genetic make-up. Greater understanding of the human genome will direct the development of medicines to help treat and prevent diseases over the next hundred years. Advances in genetics will allow treatments to target the genes or specific proteins that cause disease. Gene therapies are being developed that aim to replace faulty genes and so reverse the effects of inherited disorders such as cystic fibrosis.

 

Ethics and medicine

Advances in medical science will not come without controversy.

  • Technology has made genetic fingerprinting a routine task but how should this information be used?
  • Individuals can be warned of diseases that they are likely to develop in older age but this profile could also be used to assess a person’s suitability for insurance or employment. How can we protect the rights of the individual?
  • Genetic engineering and stem cell therapies may provide cures for diseases such as cancer, leukaemia and Parkinson’s but are these experiments ethical or not?
  • Will the benefits outweigh the risks?

Throughout the ages, medicine has been influenced by the spiritual and superstitious beliefs of the day. Knowledge built up over the centuries has led to modern treatments that are based on a molecular understanding of how the body works. The ethical and moral concerns of modern society will continue to shape the development of new medicines and treatments.

 

New challenges

It is impossible to prevent all diseases. Bacteria are evolving resistance to antibiotics and viruses mutate to cause new infections such as the recent outbreak of Severe Acute Respiratory Syndrome (SARS). As life expectancy rises, fresh challenges will emerge in the treatment of the elderly. One issue that medicine alone cannot tackle is raising the living conditions of people throughout the world so that they do not suffer from the diseases of poverty. As always, modern medicine will continue to face fresh challenges and find new solutions for the 21st century.

 

Source:  http://www.schoolscience.co.uk

 

History of medicine : Part 7

1900 – 2000: The 20th century

In 1901, the average life expectancy in the United Kingdom was 47 years. By the year 2000 it had risen to 77 years. New medicines, improved air quality and better public hygiene have all contributed to this 64% increase in life expectancy. The twentieth century saw some major advances in healthcare. These included the development of:

  • penicillin: the discovery and development of antibiotics by Fleming, Florey and Chain.
  • insulin: Banting and Best’s work to show that insulin can be used to treat diabetes.
  • other medicines: pharmaceutical laboratories around the world are constantly producing new treatments for diseases.

 

Other developments

Vaccination: although vaccination was first described by Edward Jenner in the 18th century, it was in the 20th century that mass vaccination programmes were undertaken to prevent deaths from diseases such as yellow fever, poliomyelitis, measles, mumps and rubella. In 1980 the World Health Organisation announced that the deadly smallpox virus had been completely eradicated.

Medical imaging: physicians can now call on a range of techniques to see inside the bodies of their patients. X-rays, discovered by Roentgen, were the first, but sophisticated computer technology now allows surgeons to plan operations and radiologists to target tumours with pinpoint accuracy. Ultrasound, magnetic resonance imaging (MRI) and computed tomography (CT) scans are all part of the doctor's diagnostic armoury.

Technology: advances in bioengineering, computing power, materials technology and many other areas of science have led to the development of many medical devices. During heart surgery, a heart-lung machine keeps the patient alive. Kidney failure can quickly kill, but renal dialysis can keep patients alive even though their kidneys have failed. Hearing aids and cochlear implants bring sound to the hard of hearing. Biotechnology allows pure drugs, such as human insulin, to be produced in large quantities.

DNA: the human genome project is unlocking the secrets held within our DNA. It will lead to a much better understanding of the genetic basis for many diseases and may enable the development of new cures in the 21st Century.

The past half century has seen tremendous advances in medicine. The first heart transplant was performed by Dr Christiaan Barnard in 1967, and on July 25th 1978 Louise Brown became the first person to be born after in vitro fertilisation. The research and development of modern medicines has made a massive contribution to the improvement in health and life expectancy.

 

Two Worlds

Sadly, it is not all good news for medicine in the 20th century. Many diseases can be controlled and treated, but this takes money. In places such as Africa, South America and Asia, levels of healthcare are below those found in the wealthier Western nations. Diseases like HIV/AIDS, cholera, tuberculosis, pneumonia and malaria remain major killers in these regions. The challenge of medicine in the 21st century is to make high quality healthcare available to all.

Insulin

In 1922, the Canadian physiologists Fred Banting and Charles Best announced to the world that they had discovered insulin and successfully used it to treat diabetes in a human patient. Until then there was no successful treatment: diabetics would struggle to grow, waste away to walking skeletons and die prematurely from severe weight loss.

 

An ancient problem

Diabetes mellitus had been known since ancient times. Egyptian writings from as early as 1500 BC described a wasting disease in which the sufferer produced sweet-tasting urine. From the 1850s onwards, autopsies of people who had died from diabetes suggested that the problem arose when the pancreas did not function properly. Many physicians speculated that specialised cells, called the islets of Langerhans, produced a chemical that allowed the body to regulate its blood sugar level, and that diabetes occurred when this chemical was not produced.

 

Animal experiments show the solution

Banting and Best worked at the University of Toronto. They removed the pancreas from dogs, which then developed diabetes. Their experiments may seem cruel today, but without them insulin would never have been found as the treatment for diabetes.

New methods of testing blood sugar levels allowed Banting and Best to determine accurately the effects of their treatments. They struggled to purify the chemical hormone produced by the pancreas and extracted many compounds from the islets of Langerhans. These were injected into the diabetic dogs to try to find the hormone that would reverse their diabetes.

Initially the injections were very impure and often had fatal side-effects. A team of researchers was recruited and eventually they were able to make an extract from the islets of Langerhans that was pure enough to try on a human patient. In January 1922, fourteen-year-old Leonard Thompson was successfully treated in Toronto General Hospital with the extract, which they called insulin. In 1928, Oskar Wintersteiner proved that insulin was a protein.

We now know that insulin allows the cells of the body to take in sugar from a digested meal. The liver is especially important in the process of regulating the body’s blood sugar level. Insulin enables the liver to take in sugar (glucose) after a meal and store it as glycogen. This is used later to return glucose to the blood when blood sugar levels begin to fall.
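The regulatory loop described above can be sketched as a toy simulation. The numbers below are purely illustrative, not physiological values; the point is only the feedback: excess glucose is stored as glycogen, and glycogen is released when glucose falls.

```python
# Toy model of blood-glucose regulation by insulin and the liver.
# All numbers are illustrative, not real physiological values or units.

def simulate(meal_glucose=40.0, steps=20, setpoint=5.0):
    """Return the blood-glucose trace after a meal adds `meal_glucose`."""
    glucose = setpoint + meal_glucose   # sugar absorbed from a digested meal
    glycogen = 0.0                      # the liver's glycogen store
    trace = [glucose]
    for _ in range(steps):
        if glucose > setpoint:
            # High glucose -> insulin released -> liver stores the excess
            stored = 0.5 * (glucose - setpoint)
            glucose -= stored
            glycogen += stored
        elif glycogen > 0:
            # Falling glucose -> liver converts glycogen back to glucose
            released = min(glycogen, setpoint - glucose)
            glucose += released
            glycogen -= released
        trace.append(glucose)
    return trace

trace = simulate()
print(f"start: {trace[0]:.1f}, end: {trace[-1]:.1f}")  # glucose returns toward the setpoint
```

Running the sketch shows the glucose level decaying back towards its setpoint after the meal, which is the essence of the feedback loop the paragraph describes.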

 

Meeting the demand for insulin

News of Banting and Best’s success spread quickly and soon their laboratory was unable to meet the demand for the new wonder drug.

Commercial preparation of insulin began by extracting it from the pancreases of slaughtered cows and pigs. This is still an important source of insulin for medical use. Chemical modifications tailor the insulin to mimic the human hormone and also give it properties that make it convenient to administer. Early insulins had to be injected three or four times a day, just before each meal. Longer-acting insulins have since been developed, reducing the need to inject so often.

 

Humulin

In 1955 the Nobel Prize-winner Frederick Sanger determined the amino acid sequence of human insulin. This allowed a human insulin gene to be made, which was then used to genetically engineer bacteria that could produce large amounts of highly pure human insulin. Currently 1.4 million people in the UK successfully control their diabetes with insulin injections, and much of that insulin is made by genetically engineered bacteria.

 

Penicillin

At the start of the 20th century, many people still died from infectious diseases that today are easily cured. It was a discovery by Alexander Fleming in 1928 that would lead to the range of modern antibiotics we know today.

 

First Antibiotic

In 1871, Joseph Lister noticed that some moulds could weaken the growth of other microbes. He did not realise the potential of this observation and did not pursue it further. It was over fifty years later, in 1928, that Alexander Fleming made a similar observation.

Fleming was trying to find ways of killing the bacteria that caused cuts and wounds to become infected and turn septic. This was a serious condition and could cause death if the infection spread to the blood. He noticed that the growth of bacteria had been inhibited on a petri dish that had been accidentally contaminated with the mould Penicillium notatum. He immediately realised that the mould must be producing a chemical that prevented the bacteria from growing. He cultivated the mould and investigated its effect on bacteria that caused diseases such as anthrax, meningitis and diphtheria.

 

War effort

Fleming’s discovery was not fully exploited until the outbreak of the Second World War in 1939. Infected wounds had caused many deaths in previous wars and two researchers based in Oxford University, Howard Florey and Ernst Chain, were given the task of finding new medicines to treat wounded soldiers. They realised the importance of Fleming’s work and had the resources to grow large amounts of the Penicillium mould. This allowed them to isolate the active antibiotic in sufficient quantities to try it on patients suffering from severe infections.

Before antibiotics, a simple throat infection could easily spread to the lungs and throughout the body. There was little that could be done for these patients and many died from complications of what we would now think of as a trivial infection. Florey and Chain showed that penicillin could be used to save lives.

The production of penicillin became a wartime priority, and pharmaceutical factories in the USA, United Kingdom and Russia manufactured large quantities of penicillin, which was used to save the lives of wounded soldiers.

 

Superbugs fight back

There are now many different types of antibiotics, specialised to treat a wide range of bacterial infections. However, the widespread, and sometimes unnecessary, use of antibiotics is leading to the evolution of strains of bacteria that can survive all but the most powerful antibiotics. These so-called superbugs can cause real problems, especially in hospitals, where patients may become infected after surgery if the highest standards of hygiene are not maintained.

 

The development of a modern medicine

 

The pharmaceutical industry is constantly developing new medicines. It invests about £8 million a day in Research and Development (R&D). This is more than any other manufacturing sector and accounts for about a quarter of the money spent on R&D in the UK.

 

Discovering new medicines

Teams of chemists, pharmacologists and biologists search for molecules with medicinal properties. Molecular structures are altered to optimise activity and minimise unwanted side effects.

Promising compounds then pass into pre-clinical development, where additional tests are carried out. These include animal tests to check that the chemical compound is not poisonous and chemical tests to show that it is stable enough to be used as a medicine.

Phase I trials use healthy volunteers to test medicines and gather information on how the body reacts to them. This pharmacokinetic information tells researchers the best way to give the medicine and how it behaves once inside the body. Once this data is available from volunteers, an application can be made to start clinical trials in patients.

Phase II trials see the new medicine tested in a small number of patients. If it appears to have beneficial effects, the medicine goes on to Phase III clinical trials, which include a much larger number of patients to generate statistically significant data.

 

Placebo

During clinical trials the use of a placebo is vital. The patients are divided into two groups: one group is given the medicine under trial and the other is given a control medication. This control does not contain any of the active ingredients found in the new medicine and is called the placebo. Neither the patients nor their doctors know who is receiving the medicine and who is receiving the placebo. This acts as a control, ensuring that any beneficial effects are due to the medicine itself and not simply to the patient or doctor believing they are using something new and better than previous treatments.
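The logic of a placebo-controlled comparison can be sketched in a few lines of code. The effect sizes and patient numbers below are invented for illustration, and real trials use proper statistical tests rather than a simple difference of means:

```python
# Sketch of a placebo-controlled trial: patients are randomly assigned
# to drug or placebo, and the groups' mean improvements are compared.
# The effect sizes here are made up purely for illustration.
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def run_trial(n_patients=200, drug_effect=2.0, placebo_effect=0.5):
    """Randomly assign patients and return the drug-vs-placebo difference."""
    drug_scores, placebo_scores = [], []
    for _ in range(n_patients):
        baseline = random.gauss(0, 1)      # natural patient-to-patient variation
        if random.random() < 0.5:          # random assignment is the key control
            drug_scores.append(baseline + drug_effect)
        else:
            placebo_scores.append(baseline + placebo_effect)
    mean = lambda xs: sum(xs) / len(xs)
    return mean(drug_scores) - mean(placebo_scores)

diff = run_trial()
print(f"improvement attributable to the medicine: {diff:.2f}")
```

Because both groups share the same baseline variation and expectations, any systematic difference between them can be attributed to the active ingredient, which is exactly the point of the placebo control.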

 

A massive task

For every new medicine that passes all the trials, over 5,000 compounds need to be screened. Each year the UK pharmaceutical industry markets around twenty new medicines. On average it takes eleven years of development and £500 million for each new medicine that reaches the patient.

A technique called high-throughput screening has automated many of the initial tests and pharmaceutical laboratories may now screen thousands of compounds per week. Research chemists can use computers to model designer molecules and using the latest equipment, large pharmaceutical companies may synthesise and screen 300,000 molecules a year.

The development of salbutamol to treat asthma is a typical example of how a medicine is designed, tested and produced.

 

Formulating medicine

The formulation of a medicine is how it’s made up. For example, tablets and ointment are both types of formulation. The formulation depends on several factors including:

  • how easy it is to take or use
  • how quickly a medicine needs to get into the body
  • where it has to work in the body

 

 

History of medicine : Part 6

1700 – 1900: 18th and 19th centuries

 

The industrial revolution of the 18th and 19th centuries saw a massive change in the way people lived, with profound effects on their health. People moved from small villages and an agricultural lifestyle to the towns and cities that sprang up around the new factories where they could work. They lived in dirty, overcrowded conditions with poor sanitation and dirty drinking water. Many died from diseases such as cholera, tuberculosis, measles and pneumonia – infections that could spread quickly and easily in these conditions.

Two of the big medical advances of this time were:

  • vaccinations
  • X-rays.

 

A microscopic revolution

People’s understanding of the body increased tremendously, finally dispelling ideas that had persisted since ancient Greek times. Scientific knowledge spread rapidly because scientists began to publish their work in books. The Dutch draper Anton van Leeuwenhoek made one of the earliest microscopes to use a glass lens. The detail these revolutionary microscopes revealed allowed the English scientist Robert Hooke to observe cells for the first time. In 1661 the Italian scientist Marcello Malpighi identified capillaries, finally showing the link between arteries and veins and proving Harvey’s theory of the circulation of blood.

 

Medicine comes of age

Medicine also made great advances during this time. Edward Jenner pioneered the earliest vaccinations and discoveries by Louis Pasteur and Robert Koch led to the understanding that infections were caused by certain bacteria or germs. The study of microbes, or microbiology, was born and the increased knowledge of pathogenic microbes led to the development of new medicines to tackle infectious diseases. The pharmaceutical industry was born.

The ideas of an earlier physician, Thomas Sydenham, were applied, and this led to a great advance in the treatment of patients. He had recognised the importance of detailed observation, record-keeping and the influence of the environment on the health of the patient.

The importance of hygiene

Perhaps the most famous nurse ever, Florence Nightingale, worked in a military hospital during the Crimean war. Conditions were poor and 80% of soldiers died from infections they caught in the hospital rather than their original wounds. Florence Nightingale improved standards of hygiene and sanitation which dramatically reduced the infections in her hospital. When she returned from the war, Florence Nightingale embarked on a campaign to modernise and improve hospitals. She set the foundations of hospital design and nursing practice that are still seen today.

 

Under the surgeon’s knife

Surgery also made great advances. Industry could produce better surgical instruments, and operations were often performed in open theatres with interested members of the public invited to view them. There were no anaesthetics, and surgeons prided themselves on the speed with which they operated: just a few minutes for a leg amputation. From the 1840s onwards, the discovery of the anaesthetics ether, chloroform and cocaine allowed surgeons to take more time and care over operations. Modern anaesthetics mean that operations lasting several hours are now commonplace.

 

Antiseptics

Joseph Lister realised that infections caught during an operation often led to death by septicaemia. He pioneered the use of carbolic acid as the first antiseptic, to clean wounds and surgical instruments. Operations were performed with a fine spray of carbolic acid passing over the patient to kill any microbes in the air. In one Newcastle hospital, use of Lister’s antiseptic technique reduced deaths from infection from nearly 60% to just 4%.

 

Vaccination

 

Smallpox was a killer disease in the 18th century. Infected people would become covered in horrible skin sores and often die a painful death. Those who recovered were left with terrible scars, or pockmarks, on their skin. We now know that it is caused by a virus (the variola virus), which infects the internal organs, causes severe blistering of the skin, and leads to death from blood poisoning or secondary infections.

 

Kill or Cure

Edward Jenner is credited with the development of vaccination, but in fact the practice of inoculation was first introduced into England by Lady Mary Wortley Montagu in 1721. She tried a method used in Turkey, where people deliberately infected themselves with a mild form of smallpox. Sadly, many people died from the smallpox they were using to protect themselves. Clearly something different needed to be done.

 

Observation and vaccination

Jenner was a doctor who worked in Gloucestershire, and the great advance he made was to notice that individuals who had contracted cowpox (the cow’s equivalent of smallpox) rarely caught the deadly human version. In 1796 he deliberately infected an eight-year-old boy called James Phipps with the pus from a cowpox sore. The boy became ill with cowpox but recovered. Jenner then infected him with the normally deadly smallpox. As Jenner had predicted, the earlier infection with cowpox protected the boy, who never caught smallpox. The practice of modern vaccination was born.

 

Cowpox vs smallpox

After many more successful vaccinations, Jenner published his results in 1798. However, they were met with scepticism, and many doctors still carried out the more dangerous practice of inoculation with live smallpox pus. It was not until 1840 that this dangerous practice was banned, and in 1853 vaccination by Jenner’s method was made compulsory. Protestors argued against compulsory vaccination, saying that it limited their personal choice; a similar debate to the one raging today over the MMR vaccine.

 

Understanding the immune system

When Jenner tried his first vaccinations, the way microbes cause infectious diseases was not understood and he did not know about the immune system. We now understand how vaccination works.

Pathogens are microbes that cause disease; they can be viruses, like smallpox, or bacteria. If a small amount of a weakened or inactive form of the microbe is introduced into the body, it stimulates the immune system to produce antibodies to fight off the disease. The immune system remembers the microbe and can defend the body against any live form of it that it may encounter in the future. The person is said to be immune to the disease.
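The idea of immune memory can be illustrated with a toy calculation. The numbers below are made up for illustration, and real immune responses are vastly more complex, but the sketch captures why a second exposure is cleared so much faster:

```python
# Toy model of immune memory (illustrative numbers only): a first
# exposure is slow to clear, but it leaves memory behind that makes
# any later exposure to the same microbe much faster to fight off.

def days_to_clear(pathogen_load, memory_cells):
    """Count the days needed to clear an infection of the given size."""
    # Response strength grows with immune memory of this microbe.
    response_per_day = 1 + 5 * memory_cells
    days = 0
    while pathogen_load > 0:
        pathogen_load -= response_per_day
        days += 1
    return days

memory = 0
first = days_to_clear(20, memory)    # vaccination with a weakened microbe
memory = 1                           # the immune system now "remembers" it
second = days_to_clear(20, memory)   # later encounter with the live microbe

print(f"first exposure: {first} days, after vaccination: {second} days")
```

The second infection is the same size as the first, yet it is cleared in a fraction of the time, which is exactly the protection that vaccination provides.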

 

Eradication of smallpox

Nearly 200 years after Jenner’s discovery, the World Health Organisation (WHO) started a programme of vaccination with the aim of completely eradicating the smallpox virus. It is estimated that smallpox killed 500 million people worldwide during the last century. The last case of naturally transmitted smallpox was reported in Africa in 1977, and in 1980 the WHO officially announced the end of smallpox. Two highly guarded stocks of the virus remain, in laboratories in the USA and Russia, preserved for research purposes. Some authorities speculate that other laboratories hold stocks that could be used in germ warfare, but these claims are yet to be proven.

Jenner’s discovery has led to a greater understanding of the human immune system and vaccination programmes against diseases such as measles, mumps, polio and tuberculosis have improved the health of millions who need not fear these killer diseases.

 

X-rays – the start of medical imaging

 

X-rays were discovered in 1895 by the German physicist Wilhelm Roentgen. He was studying cathode rays produced by a recently invented piece of equipment called a Crookes tube when he noticed that a fluorescent screen across the room started to glow.

 

The start of medical imaging

The glow appeared to be caused by some unknown rays coming from the Crookes tube. He tried to block these rays with cardboard but found that they passed straight through it. Amazingly, if he put his hand between the tube and the screen, he could see an image of the bones in his hand. He named this new radiation X-rays and immediately realised how important his discovery would be to the world of medicine.

We are now familiar with atoms, protons, neutrons and electrons, but when Roentgen discovered his mysterious rays the structure of the atom was unknown. For this fantastic discovery, Roentgen received the first Nobel Prize in Physics in 1901. Today, the use of X-rays is common in hospitals and dental surgeries all over the world.

 

X-rays

We now understand X-rays to be part of the electromagnetic spectrum with a very high energy and short wavelength.

Their high energy means that X-rays can pass through skin and muscle, but they are absorbed by dense tissue such as bone. This produces an image on photographic film which can reveal damage hidden from the naked eye.
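The contrast between soft tissue and bone follows an exponential law (the Beer-Lambert law): the transmitted intensity falls as exp(-mu * thickness), where mu is the material's attenuation coefficient. A small sketch, using illustrative round-number coefficients rather than measured values:

```python
# Why bone shows up on an X-ray film: exponential (Beer-Lambert)
# attenuation. The coefficients below are illustrative round numbers,
# not measured attenuation values for real tissue.
import math

def transmitted_fraction(mu_per_cm, thickness_cm):
    """Fraction of X-ray intensity left after passing through a material."""
    return math.exp(-mu_per_cm * thickness_cm)

soft_tissue = transmitted_fraction(0.2, 10)  # ~10 cm path through soft tissue
bone = transmitted_fraction(0.5, 10)         # same path length, but denser bone

print(f"through soft tissue: {soft_tissue:.3f}")  # more X-rays reach the film
print(f"through bone:        {bone:.4f}")         # far fewer: bone appears white
```

Because far fewer X-rays get through bone than through the surrounding tissue, the film stays unexposed behind the bone, which is why the skeleton appears as a bright silhouette.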

 

Modern Scanners

A modern development of the X-ray is found in CT scanners. Computed tomography (CT) scanners use several beams of X-rays simultaneously, from different angles. Detectors measure how much of each beam is absorbed in the body, and the data is fed into a computer that builds up a virtual, three-dimensional model of the area being examined. CT scans are very sophisticated and can be used to look for damage in much more than bone. They are often used to plan complex surgery and to locate tumours before radiotherapy.

 

From cancers to airport check-in

X-rays can also be used to treat disorders such as cancers. In this case, a radiotherapist carefully calculates the dose of X-rays to be targeted at the tumour inside the body. These X-rays are over 150 times stronger than those used in diagnostic X-ray images and kill the cancer cells in the tumour.

Low energy X-rays are now widely used to see inside baggage in airports and other areas where security is important.