The Royal Navy of World War Two prided itself on the superior psychiatric health of its men. Because it had its pick of recruits, it assumed that it always got the best and most emotionally stable men.
This rather complacent view was to be challenged as the war went on and even the toughest of sailors began to show unmistakable signs of battle fatigue and stress.
At first these were dismissed as nothing more than a state of anxiety. Only from 1943 did the Royal Navy admit that its seamen might be suffering from “fatigue”. This term was chosen deliberately to avoid the stigma of mental breakdown and to suggest that once a man had rested he could rapidly return to duty.
Depth charges explode astern of HMS Starling of the 2nd Escort Group in the Atlantic, January 1944 (Credit: Parnall, C H, Admiralty Official Collection).
Stress related disorders were especially common among crews serving in the Arctic convoys. Many ships’ surgeons noted that the prolonged and repeated stress and strain of daily bombing attacks led to an increase in the number of men attending sick parade, accompanied by increased apathy and listlessness among the crew in general.
There were even occasions when men were too stressed to abandon a sinking ship even though they were physically capable of saving themselves.
Morale was even more difficult to maintain among survivors of a sinking. This was mainly because they had nothing to do on the rescue ship but indulge in self-pity and criticism of their opposite numbers on the new ship.
Part of the problem was that from having been members of a small, tightly disciplined and closely organised community they had lost the comradeship of the mess and a sense of purpose.
The Royal Navy heavy cruisers HMS Dorsetshire and HMS Cornwall under heavy air attack by Japanese carrier aircraft on 5 April 1942 (Credit: Imperial Japanese Navy).
Men of the Royal Navy generally dealt with this challenge much better than merchant seamen who were under looser discipline and had less of a sense of social cohesion.
Even so it was important for officers to ensure that steps were taken quickly to deal with the trauma of Royal Naval survivors by stressing routine and a sense of normality.
Doctors were warned against asking leading questions about the mental health of the men they were examining to minimise self-pity and promote increased confidence.
After 14 days’ survivors’ leave, survivors were reintegrated into a new ship’s company. In many instances, these were made up of other survivors, with the unfortunate result that men suffering from “anxiety” were concentrated in one ship and battle fatigue and dissatisfaction could become ingrained in a company.
A convoy of Landing Craft Infantry (Large) sails across the English Channel toward the Normandy Invasion beaches on “D-Day”, 6 June 1944 (Credit: U.S. National Archives).
Wartime conditions, with long periods at sea, exacerbated neurotic illnesses in the navy caused by a cold and damp environment, the vibration and excessive noise of the ship, sleeplessness, long periods without shore leave, boredom resulting from lack of recreation, and the want of outlets for sexual frustration.
Men worried about their families ashore, especially those living in the heavily bombed naval ports. Many men reported to the sick bay with minor complaints after receiving a worrying letter from home.
Medical officers at sea were advised to watch for such signs of neurosis as unreliability, slipshod work, a slovenly appearance, excessive consumption of alcohol and cigarettes, and surliness.
Minor illnesses could also be signs of psychiatric disturbance, including headaches, indigestion, dizziness, palpitations, tremor, diarrhoea and excessive micturition.
Bombs falling astern of HMS Ark Royal during an attack by Italian aircraft during the Battle of Cape Spartivento (Credit: Priest, L C, Imperial War Museum).
It was one thing to observe the signs of battle fatigue; it was another to offer effective treatment.
The fighting efficiency of the ship came first and could not be compromised. Disturbed men were kept on duty not only so that they would be too busy to think about their problems but so that the work of the ship could be carried on with a full complement of crew.
Only in advanced cases of neurosis was a man to be admitted to the sick bay for treatment or, if in port, to a hospital. In most cases shore leave was seen as the solution to all problems.
Ralph Ransome Wallis, the surgeon on HMS London, recognised the limits of what he could do despite being aware that most of his shipmates displayed psychiatric symptoms to some extent.
His view was that there was no option but for them to cope with their own problems and get on with their jobs. In his experience:
a few sharp words from the sick berth chief petty officer accompanied by a No. 9 pill containing a powerful purgative worked wonders.
Even Desmond Curran, the chief psychiatric consultant to the Royal Navy, was reluctant to admit that ‘operational strain’ might be a cause of neurosis and mental breakdown.
It was easier to blame the inherent psychological weaknesses of the men themselves. It was his belief that hypochondria and psychosomatic disorders could be encouraged by acknowledging them as a problem.
Before the war the Royal Navy had not employed any specialists in psychiatric health. By 1943 it had 36 psychiatrists, all but three of them based in the United Kingdom, compared with 227 in the army.
Whereas the army could offer forward-deployed psychiatry, treating men with combat neuroses in forward areas as quickly as possible after battle, this was impossible for the Navy, whose psychiatrists were all shore-based and far from naval action.
There was a feeling of insouciance among naval psychiatrists concerning the good state of naval mental health, yet psychiatric casualties were far from negligible.
The number of officers and ratings referred to psychiatrists from warships rose from 5,000 in 1940 to 6,141 in 1943, representing one per cent of all naval personnel.
George VI greeting the Flag Officers of the Home Fleet on board the flagship HMS Duke of York, August 1943 (Credit: Mason, H A, Admiralty Official Collection).
The true numbers of men suffering from battle fatigue may have been much higher, as many naval doctors believed that referral to a psychiatrist would only make the man’s condition worse by branding him as a “loony” and could result in his being invalided out of the service.
It was widely believed that the best way of helping a man recover from battle stress was to ignore the illness and its psychosomatic origins, avoiding the stigma of mental illness and getting the man back to efficient duty and the task of winning the war.
If telling ratings to “take a grip on themselves” was the best way of doing this, then that was how the doctor approached the problem. The efficiency of the ship was all that really mattered.
Kevin Brown has written and lectured widely on the history of medicine, especially naval medicine. He is Trust Archivist to Imperial College Healthcare NHS Trust and Alexander Fleming Laboratory Museum Curator at St Mary’s Hospital, London, a museum and archives which he set up. Fittest of the Fit is his latest book published by Pen and Sword.
By the end of the First World War, almost one million British soldiers, sailors and airmen had been killed. However, nearly another two million had been permanently disabled - over 40,000 had lost legs or arms. All these people needed medical treatment, ongoing care and work or financial support in order to survive in peacetime.
Artificial limbs were urgently required but the ones on offer were heavy and made of wood. The Disabled Society campaigned for light aluminium limbs instead, and Queen Mary's Hospital in Roehampton, London - the main English limb-fitting hospital for ex-servicemen - fitted more and more each year.
Plastic surgeon Sir Harold Gillies pioneered facial surgery at Queen’s Hospital in Sidcup, Kent. The artist Francis Derwent Wood worked there with him, creating masks for burned patients whose faces could not be fully restored by surgery.
Meanwhile, exercise and sport were increasingly used to help men recover. At the Croydon Union Workhouse Infirmary in Surrey, renamed the Mayday Hospital in the 1920s, Colonel Deane set up a gymnastic exercise centre for disabled ex-servicemen.
Public Health And The State
According to most historians, in the 19th century and first three decades of the 20th, the United States was a weak and fragmented nation-state, hobbled by divided sovereignty, laissez-faire ideology, and low tax revenues, unable to cope with the new conditions of industrial modernity and the rise of great cities.  See Robert Wiebe, The Search for Order, 1877-1920 (New York, 1990); Theda Skocpol, Protecting Soldiers and Mothers: The Political Origins of Social Policy in the United States (Cambridge, MA, 1995). That assessment is largely derived from the writings of the era's progressive reformers. From the 1880s to the 1940s, public health advocates, political and social activists, and geopolitical strategists saw themselves as social critics, reformers and nation-builders. Their shared ideal was of a centralized American state with a capable and effective infrastructure that married the force, legitimacy, and resources of the nation to the progressive advance of science. Only vigilant and thoroughly modern bureaucracies, under the stewardship of scientifically educated officials and their academic and philanthropic allies, could study, prevent, and eradicate social and medical pathologies.  Reformers differed, according to the historical moment and their political bent, on which state bureaucracies were the best model for the United States to follow: France, Great Britain, Wilhelmine Germany, Fascist Italy, the Soviet Union, Sweden and Denmark.
The health progressives had some successes. Between 1880 and 1920, public health bureaucracies took root in state and city governments, and national non-profit health advocacy organizations flourished. World War I especially spurred the expansion of the federal government's role in public health, and in public health film production, in the name of the war effort. After the war, many of the programs introduced during the war were discontinued in an effort to reduce expenditures to pre-war levels. During the 1920s, the United States Public Health Service had no budget for film production, though some health films were produced by the Department of Agriculture and the Children's Bureau of the U.S. Department of Labor.  Nichtenhauser, "History of Motion Pictures in Medicine", III: 67-70. In those years, the most dynamic area of public health activity took place in some of the more progressive states (especially New York and Wisconsin), in philanthropic and advocacy organizations like the National Tuberculosis Association, and in quasi-governmental organizations like the American Social Hygiene Association, although by the end of the 1920s these were hampered by the Great Depression and the overall contraction of the U.S. economy. With the landslide election of Franklin Roosevelt in 1932, there came a new burst of activity on the federal level. The New Deal agenda called for an enlarged, progressive federal government; the federal government began increasing its support for public health bureaus and activities, and many state governments followed suit. The onset of World War II (along with the return of economic prosperity, increased tax revenues and greater tolerance for budget deficits) spurred an even greater expansion of public health bureaus and programs, military and civilian.
This was predicated on a rising tide of popular support for government programs, including those related to public health. Public health programs, in turn, were designed to foster, mobilize and consolidate popular support, as well as fight disease. In the 1930s and '40s, the American public was served an intoxicating brew of rationalism, professionalism and democratic ideology. A patriotic belief in activist democracy became fused with faith in the power of science and technology. For progressive reformers, a key part of the agenda was to create and nurture an "enlightened" or "intelligent" citizenry.  The idea, in some form, gained currency in the early 20th century. For an influential discussion, see John Dewey, Democracy and Education (New York, 1916). U.S. Public Health Service public relations experts Elizabeth G. Pritchard, Joseph Hirsh, and Margaret T. Prince, in a typical formulation of the late New Deal period, argued that "intelligent citizenship" was "a prerequisite for the full enjoyment of our democratic privileges".
Armed with the facts, the public would demand action. Governmental efforts to inform the public and mobilize public support, and the increasing pace of scientific discovery and technological invention, would in turn lead to an increased role for science in an expanding and increasingly effective government. An informed and activist citizenry, led by a cadre of trained professionals in possession of the latest scientific advances, would remake society. Neglected or intractable problems would finally be remedied through "the rapid advance of scientific medicine, improvements in public health and medical practice, the increased speed with which new and better measures for the prevention and cure of diseases are applied, and a growing acceptance and employment of the knowledge and skills of other professions both by public health and medicine".
For health officials and advocates, mobilization was crucial -- and education and technology were the keys to mobilization. The health of the public could only be secured by an informed and aroused populace working energetically and collectively to prevent the contamination of the water and food supply, accidents, and the spread of disease-bearing micro-organisms and insects. And among all the technologies of mobilization, the motion picture was seen as the most modern and most powerful.  For an influential statement of the usefulness of film in creating "intelligent, operative, civic-minded citizens," see Thomas Baird, "Civic Education and the Motion Picture," Journal of Educational Sociology 11.3 (11-1937): 142-48.
The New Deal-inspired revival of enthusiasm for activist progressive government shifted the locus of health education and propaganda from private philanthropies and commercial companies to the public sector. Public health officials renewed their efforts, producing and distributing short motion pictures for use in combination with other public health campaign components: posters, pamphlets, lectures, glass slide shows, exhibitions and displays, magazine advertisements and articles, radio programs and announcements. Many of these productions showed an increasing sophistication in the use of media. But film was not a central component of the campaigns: motion pictures required an infrastructure of film projectors in schools, community centers or "health-mobiles". They were also costly to produce and required special expertise. Most public health films still suffered from poor production values, bad acting, and amateurish scripts.
With the onset of World War II, federal, state, and local government greatly expanded in size and scope, along with the American economy, and so did expenditures on public health. The long sought-after dream of a powerful and effective national government, guided by scientifically trained professionals -- the public health holy grail -- seemed finally at hand. Media specialists, filmmakers, actors, writers and professional experts were inducted into the military or civilian government, or granted government contracts. Projects long deferred or starved for money suddenly got funding, if they could be justified in the name of the war effort: in the last few years of the war, the U.S. military and information services' combined budget for "visual education" (mainly instructional and documentary films) amounted to about $50,000,000, a considerable sum.  Mary Losey, A Report on the Outlook for the Profitable Production of Documentary Films for the Non-Theatrical Market (Sugar Research Foundation Film Program Services, 1948), 2 [mimeograph, Nichtenhauser Papers]. This dollar figure is for all films produced for "visual education", not just health films. And with this increased funding, public health advocates were able to make more films and better films -- more competently scripted, edited, acted, and photographed -- and better equipped to make use of sound.
The advent of synchronized sound motion pictures in the late 1920s made film more than just a visually kinetic medium: the motion picture became a hybrid of the visual and the aural. During the 1930s filmmakers in Hollywood and elsewhere created and explored new ways to juxtapose sound and images. The addition of sound made film viewing into a more powerful experience. Sound film, it was believed, could better educate and motivate film audiences, orchestrate their emotions, and shape their views. Public health professionals began to enthuse anew about the potential uses of motion pictures. However, the transition from silent pictures to sound did not occur instantaneously or evenly or as fast as it did in Hollywood. In the 1930s, while some medical and public health motion pictures did employ sound, many did not: producers lacked the budgets, skills and equipment to make sound films. Silent medical films continued to be produced throughout the '30s and '40s.
World War II had long-lasting consequences for continental Europeans. Living in a war-torn country increased the likelihood of a number of physical and mental problems later in life, according to a paper by economists.
World War II was one of the transformative events of the 20th century, causing the death of 3 percent of the world's population, up to 39 million of those in Europe, half of them civilians. Six years of ground battles and bombing resulted in widespread destruction of homes and physical capital. Discrimination and persecution were widespread, with the Holocaust as the most horrific example. Many people were forced to give up or abandon their property and periods of hunger became common, even in relatively prosperous Western Europe. Families were separated for long periods of time, and many children lost their fathers and witnessed the horrors of battle.
Experiencing the war was associated with a greater chance of suffering from diabetes, depression and heart disease as older adults, according to the analysis. Because so many men died during the conflict, the war also lowered the probability that women would marry and left many children to grow up without fathers -- a key factor in lower levels of education among those who lived through the war.
The results come from detailed information gathered from older people surveyed across 12 European nations about their experiences during the war, as well as their economic status and health later in life.
"While an event of the magnitude of World War II affected all social classes across Europe, our evidence suggests that the more-severe effects over the past decades were on the middle class, with the lower class right behind them in terms of the size of the impact," said James P. Smith, one of the study's authors and Distinguished Chair in Labor Markets and Demographic Studies at the RAND Corporation, a nonprofit research organization. Other authors of the study are Iris Kesternich, Bettina Siflinger and Joachim K. Winter of the University of Munich.
While much attention has been given to studying the battles of war, less effort has been devoted to understanding how a conflict of this magnitude affects civilians decades afterwards. The study, conducted by scholars in the United States and Germany, examines how war can influence the lives of survivors long after the fighting ends.
"Given the scale of World War II and the ways it fundamentally changed the world, the existing economic literature about its long-term impact is remarkably thin," Winter said. "Studies of this type are important to help society better understand the many long-term consequences of military conflict."
The new study investigates the long-term effects of the war on health, education, economic attainment and marriage among people who live in continental Western Europe. Researchers analyzed information collected from the Survey of Health, Ageing and Retirement in Europe (SHARE), which was conducted in 2008. The survey provides information from a representative sample of 20,000 people aged 50 and older from 13 countries, including Austria, Belgium, Czech Republic, Denmark, France, Germany, Greece, Italy, Netherlands, Poland, Sweden and Switzerland.
Researchers examined salient war-related factors: exposure to periods of hunger, persecution, and loss of property such as a home. Experiences were contrasted between respondents who did and did not experience the war, and between regions within countries where fighting was centered and those where there was little military activity.
The study found that living in a war-torn country during World War II was consistently associated with having poorer health later in life. Those respondents who experienced war were 3 percentage points more likely to have diabetes as adults and 5.8 percentage points more likely to have depression. In addition, people exposed to the war had lower education levels as adults, took more years to acquire that education, were less likely to marry, and were less satisfied with their lives as older adults.
Researchers say future economic growth was not a primary reason for long-term war effects.
"What appears to be essential in the long term in terms of economic growth was not whether countries were on the winning or losing side of the war, but whether they were able eventually to transit to democracy and open-market economies," Smith said.
People were more likely to report health problems and lower wealth in their older ages if they were from families in the middle or lower economic classes during the war, with the association strongest among those who belonged to the middle class.
Respondents from regions with heavy combat action showed adverse long-term effects, but these were not much stronger than for respondents who experienced the war without direct exposure to heavy combat in their region.
Instead, poor mental and physical health later in life appears to be linked to lower education, changing gender ratios caused by high rates of deaths among men, wartime hunger and long-term stress leading to adult depression and lower marriage rates. The one notable exception is depression, which is significantly higher for those respondents who lived in regions with heavy combat action.
"War has many noticeable consequences, but it also takes a toll on the health and well-being of survivors over the course of their lives," Kesternich said.
"It is important that we seek out this sort of information from the survivors of battle so we can better understand this long-term suffering," added Siflinger.
"Looking only at the costs of war during a war or immediately afterwards significantly understates the complete costs of war," Smith concluded.
WWII And Its Impact On Psychology
World War II was a turning point for the field of psychology. Up until that time, psychology was largely seen as an academic and philosophical discipline with little practical utility. With the advent of psychological warfare and military screening assessments, governments found the need to use psychology as an applied science during the war. Additionally, the war created a need for the clinical treatment of soldiers with resulting mental health issues. After the war, federal funding toward psychology caused the field to grow exponentially. Let’s take a closer look at how WWII changed the study of psychology.
Psychology During WWII
The foundation for post-war psychology efforts was built during the war. Psychology began to take a clinical foothold through its involvement with the following WWII practices.
Building on methods first introduced in World War I, psychologists implemented screening processes which they hoped would delineate which soldiers exhibited appropriate mental fitness to cope with the stress of war. The military wanted to avoid the incidence of shellshock, which had affected so many soldiers during WWI. They believed that, through psychological testing, they could screen out the men who were most susceptible to breaking down. Although these measures were found to be largely unsuccessful in preventing mental health issues, the psychometric testing that was developed set the stage for the growth in psychological assessment that occurred after the war.
At the beginning of WWII, military officials had hoped that screening measures would eliminate the psychological issues that soldiers experienced during WWI. Of course, that logic proved to be faulty and many war-related mental health issues developed. Wanting to return soldiers to the front lines, some clinicians implemented psychiatric treatment in order to help soldiers cope successfully with the trauma of war. For example, psychiatrists Roy G. Grinker and John P. Spiegel found success by introducing a treatment where they administered sodium pentothal to soldiers and asked them to re-experience traumatic events. The use of psychiatric treatment during the war paved the way for the growing popularity of clinical interventions seen in its aftermath.
The Effects of Trauma
After WWI, it was largely believed that the mental health issues experienced by certain soldiers were due to individual weaknesses in coping with the war. After WWII screening measures were largely unsuccessful in preventing psychological issues, however, a new belief arose: anyone could be negatively affected by the stressors of war. In other words, you did not need to be “abnormal” to develop mental health issues as a result of trauma. This was an important shift in thinking and set the stage for future PTSD research and treatment.
The Emergence of Social Psychology
The importance of environmental factors was brought to the forefront during WWII. Not only did the effects of trauma point toward the essential role of a person’s surroundings, but social scientists began to recognize the protective function of social interaction. Specifically, psychiatrists and psychologists pointed to how soldiers’ motivation and morale were affected by social support among their comrades. These findings would fuel the emergence of social psychology upon the post-WWII landscape.
Although the practice was somewhat controversial, both Allied and Axis forces used psychological means to either boost or hurt morale during WWII. Psychological warfare preys upon the vulnerabilities of soldiers in order to gain an advantage. Spreading propaganda and utilizing deception were found to be useful tools in gaining a strategic and tactical edge. Psychologists, touting their expertise in the human condition, were used to develop these techniques. In addition to its effectiveness, psychological warfare served as another indication of how psychological principles could have practical applications.
Federal Assistance After WWII
After the completion of WWII, there was a great need for mental health services for combat veterans. Many of them suffered from war-related “neuroses” and required treatment. As a result, there was pressure on the federal government to establish mental health resources to address their needs. This emphasis on mental health fueled the creation and solidification of resources that were essential to the rise of psychology after WWII.
The GI Bill
The GI Bill, instituted after WWII, allowed the study of psychology to flourish by increasing the number of people who could obtain a college education. Before the GI Bill, very few people sought higher education due to its cost. The money afforded by the bill allowed thousands of veterans to seek degrees in various professions, including psychology. Many veterans had a desire to help their fellow soldiers with their trauma symptoms, which fueled their interest in becoming therapists, contributing to the explosion in the field of clinical psychology. It is fair to say that the establishment of clinical psychology owes a great debt to the GI Bill.
The Veterans Administration (VA)
The VA was integral in broadening the scope of clinical psychology. After the war, numerous VA hospitals and clinics were created. These hospitals provided medical and mental health treatment for thousands of veterans. The VA encouraged psychologists to be therapists and provided training opportunities within their hospitals and outpatient clinics. These training programs eventually led the American Psychological Association to set up accreditation procedures for training in clinical psychology. In an effort to measure the efficacy of its treatment programs, the VA became a hotbed for the development of assessment measures. In addition, clinical psychologists affiliated with the VA helped to run research studies which established the efficacy of the first psychotropic medications. Further, psychologists working within the VA popularized the use of group therapy to treat psychological disorders.
The National Institute of Mental Health (NIMH)
The National Institute of Mental Health was created in 1949 and provided a source of funding for psychological experimentation and training. Coming on the heels of the New Deal, the institute reflected a belief that the government should play an important role in the well-being of its citizens. With a booming economy and increased interest in psychology, NIMH had access to a large pool of funding to meet its goals. In the first 15 years, 17 million dollars was spent on training clinical psychologists alone. The money afforded by NIMH also helped expand the scope of psychological research into emerging fields of study, such as social psychology. Further, NIMH was responsible for much of the growth of psychology in education; it funded positions within university psychology departments and helped encourage the study of psychology across higher education.
Unifying and Expanding Psychology
Federal funding after WWII enabled the field of psychology to grow exponentially. The money provided by the federal government was able to fund psychology education, training, and research. With its broadened scope, the need to unify the disparate factions within the field of psychology was brought to light. The American Psychological Association (APA) had existed for 50 years and was by far the largest organization of psychologists, but it primarily represented the academic side of psychology. The applied side of psychology was growing at a fast clip, and the APA needed to evolve to encompass those changes. Women and minorities, whose role in the field was increasing, also wanted a place at the table. In 1943, the Intersociety Constitutional Convention of Psychologists was held to unify the factions of psychology into one organization. Although the union was not initially without some conflict, the smaller organizations recognized the expansion within psychology and saw that the APA could provide an overall organizing body. Thus, they acknowledged that, as a unified whole, they were better able to promote, expand, and legitimize the interests of the field of psychology. Between 1946 and 1960, APA membership increased by approximately 300 percent. Largely as a result of WWII, psychology had gained a foothold as a stable presence within academic and clinical practice.
Social impact of the Blitz
The Blitz, combined with expanded governmental regulatory power, had a profound social impact on Britain. For example, historian Arthur Marwick explained that, due to the war, an emphasis was placed upon social equality.
Marwick referred to the hardened resolve of British citizens in the face of the bombings and the collective fear that citizens felt when taking refuge in bomb shelters.
Policies such as the Treachery Act (1940) resulted in the imprisonment of those considered a threat to security. This demonstrated how citizens became bound together against the possibility of security threats, as evidenced by the ransacking of houses belonging to German residents.
Legislative acts were introduced as a reaction to the war. British determination throughout the conflict demonstrated that the war’s effect on civilians encouraged contributions to the war effort.
The Impact of the Blitz on London
The impact of the Blitz on London was devastating. Sixty per cent of the 2,000,000 people made homeless were in London, and many historic and famous buildings were damaged, including St. Paul’s Cathedral, the City of London Library, the British Museum, the Houses of Parliament, and St. James’s Palace. Hitler hoped that a direct attack on civilians would terrorise Britain into submission; however, despite the devastation caused by the Blitz, the British people did not lose morale.
On 24 August 1940, German bombers targeted oil depots in the East End, but some homes were hit when they missed their targets. Hitler did not intend to attack the civilian population at this point. On 25 August, Bomber Command flew a retaliatory raid on Berlin on the orders of Winston Churchill. Hitler retaliated by announcing a planned attack on London’s civilian population. The first raid took place on 7 September.
Known as ‘Black Saturday’, the first German air raid was unexpected and resulted in a large number of casualties. The attack started at 16:43 and lasted for 12 hours. The ‘all clear’ was sounded at 05:00 on 8 September – 420 people were killed and over 1,600 seriously wounded.
From this point, there were air raids every day for two months. Realising that flying in daylight was more dangerous, the Luftwaffe altered its approach. All air raids were carried out at night, when it was almost impossible for Fighter Command to intercept.
London was defended by only 93 anti-aircraft guns on Black Saturday, prompting Churchill to order more defences. After only four days the number of AA guns had doubled, and it was ordered that they be fired continuously during a raid, even when they were not aimed at a plane. It was hoped that this would boost morale.
In time, the Luftwaffe began to drop more destructive bombs. A Heinkel He 111 could carry four SC 500 bombs, each of which carried 250 kg of TNT. As the Blitz continued, SC 500s were used together with incendiary bombs.
Londoners took shelter in the Underground stations. At the start of the war, the government did not open the stations to civilians, fearing they would develop a ‘deep shelter mentality’ and refuse to leave the Underground. However, after immense pressure it allowed Londoners to shelter in the stations. By October 1940, 250,000 Londoners were homeless.
When it was clear that Londoners would not give in to the Luftwaffe, Hitler ordered an expansion of bombing. In November 1940, other British cities were included in the raids, such as Coventry, Plymouth and Liverpool.
Christmas Day 1940 saw the only respite from the continuous bombing. However, the raids resumed on Boxing Day 1940, and the Luftwaffe now focused more on incendiary bombs than on high-explosive bombs.
On 29 December 1940 Hitler ordered a huge raid on London. On this date, the River Thames was at its lowest level, making it more difficult for firefighters to deal with the fires caused by incendiary bombs.
The number of deaths caused by the Blitz was actually far lower than the government had feared. Some 22,000 people died, but a report in 1938 had predicted that there would be as many as two million deaths. There are a number of reasons why the death toll was lower than the government was expecting. For one, the shelter policy was very successful, with both Anderson shelters and the London Underground saving many lives.
Families with an income of less than £250 could get a free Anderson shelter. The government issued more than three million Anderson shelters. If they were built properly they offered good protection from falling bombs.
By February 1941, the Blitz had wrought severe damage on British cities, but to Hitler’s frustration, morale among British people was still high. As a result, the Luftwaffe began to target ports to starve the country into submission. Targeted cities included Plymouth, Liverpool and Belfast.
A retaliatory raid against Bremen and Hamburg was made on 8 May 1941 in an attempt to raise morale. Hitler retaliated against the raid by launching one last major attack on London. Shortly after this attack Hitler began his attack on the Soviet Union, marking an end to the Blitz.
The Tensions of War
One impact of war not typically discussed is the emotional cost of loss and worry felt by the tens of millions of women who saw family members, men and women alike, travel abroad to fight and come close to the combat. By the war’s close in 1918, France had 600,000 war widows and Germany half a million.
During the war, women also came under suspicion from more conservative elements of society and government. Women who took new jobs also had more freedom and were thought to be prey to moral decay, since they lacked a male presence to sustain them. Women were accused of drinking and smoking more, and in public; of premarital or adulterous sex; and of using “male” language and more provocative dress. Governments were paranoid about the spread of venereal disease, which they feared would undermine the troops. Targeted media campaigns accused women of being the cause of such spread in blunt terms. While men were subjected only to media campaigns about avoiding “immorality”, in Britain Regulation 40D of the Defence of the Realm Act made it illegal for a woman with a venereal disease to have, or try to have, sex with a soldier; a small number of women were actually imprisoned as a result.
Many women were refugees who fled ahead of invading armies, or who remained in their homes and found themselves in occupied territories, where they almost always suffered reduced living conditions. Germany may not have used much formalized female labor, but it did force occupied men and women into laboring jobs as the war progressed. In France the fear of German soldiers raping French women—and rapes did occur—stimulated an argument over loosening abortion laws to deal with any resultant offspring; in the end, no action was taken.
Open-air sewing class
Girls from St George's Church of England School in Battersea, London, take part in an open-air sewing class whilst evacuees in Pembrokeshire, Wales, in 1940. During the war, many school buildings were either damaged or requisitioned for war use, causing a shortage of suitable places to conduct school lessons. Lessons were held in unusual places such as chapels, pubs and church crypts. During the warmer months lessons could even be held outdoors.
Children’s education suffered during the war. One in five of the country’s schools were damaged by bombing and many others were requisitioned by the government. Children were crammed into large classes and stationery and books were often in short supply. Young male teachers were called up to the forces and older teachers brought out of retirement to replace them. After the war a significant number of children failed to reach the required levels of literacy and numeracy.
Children of all ages could get involved in the war effort. Older boys and girls joined the Boy Scouts and Girl Guides. They supported Air Raid Precautions by acting as messengers or fire-watchers. Younger children helped salvage war materials, raised money for munitions or knitted comforts for troops.
How did the Second World War affect British society?
The Second World War was a very important turning point for the British, a cornerstone in how the public and its elite perceived the future of the British Isles as a country and political regime. WW2 was the moment of utmost importance that brought people of all classes – the home front – together. They had a common purpose: to support their fellow countrymen who were on the front defending them and their liberal beliefs. Yet this seemingly simple act of supporting the army, and the extraordinary one of appearing unaffected by the Nazi-unleashed Blitz, had long-term effects. Everybody knew that when the war ended, nothing was going to be the same, politically as well as socially.
During the war, the British lived with the everyday fear of a violent death at any moment. This led the authorities to demand that all house and street lights be turned off at night. In November 1939, a poll by Mass Observation, an organization founded in 1937, found that the so-called blackout was the single most hated inconvenience of the war.
Thus, Londoners were unable to celebrate Guy Fawkes Night (Bonfire Night) or to decorate the capital with festive lights. Besides this, people were afraid of being robbed (although thieves avoided breaking into houses during the blackout, because they had no idea whether the occupants had left or were still living there), women feared being raped, and when winter set in, everybody suffered psychologically, because they had to reduce their already very restricted social activity. Yet they did not give up. Since the Government had suspended the BBC’s television broadcasts, the English chose to listen to the radio. It was a very cheap alternative to going out, and it eased the psychological discomfort of living in complete darkness after dusk.
“There is no panic, no fear, no despair in London Town…London can take it”
The Spirit of the Blitz – Quentin Reynolds, American columnist, Collier’s Weekly Magazine
Although the war brought jobs for only 1.9 million women (in 1943, 6.7 million women had a job, compared with 4.8 million in 1939), it very much affected the way people perceived societal roles: it meant that women had to take on jobs previously considered to be only for men. Women could now be considered for positions such as engineering, or for work in the metallurgical, chemical and transport fields. Never again were women put in the position of giving up their new-found independence and freedom. Although it still persisted in spirit, crass discrimination was no longer possible, and employers had to take on women for jobs previously assigned only to men.
So, even though WW2 could have effectively destroyed British society and its entire political existence, it actually transformed it into something new. The experience of war very much changed how people perceived the state and its involvement in their lives. If, at the beginning, the Government followed the pattern of the classical liberal state, during the war it had to take a more hands-on approach. And this meant developments that ran against the very liberal traditions of the British Empire, such as compulsory military service, or against what was thought of as normal – for instance, women being viewed as the only ones to take care of children and the household.
The Battle of Britain is about to begin… We shall fight with growing confidence and growing strength in the air, we shall defend our Island, whatever the cost may be. We shall fight on the beaches, we shall fight on the landing grounds, we shall fight in the fields and in the streets, we shall fight in the hills; we shall never surrender!
Prime Minister Winston Churchill, 4 June 1940
However, the morale of the population was very high, and even when defeat was visible – such as at Dunkirk – people did not stop believing that they would get through it. No matter how high the price of defeating Hitler, the British would pay it, and the authorities had to pay it too. They had to change policies and integrate those who had been ignored until then but who made an important contribution to the war effort. Prime Minister Churchill tried to prohibit every discussion concerning the future of the country, but he was unable to control people’s minds. And everybody knew that at the end of the Second World War there would be a new Great Britain. And it was new in more than one way: London gradually lost its empire, the British state came to comprise only the British Isles, and it also became a welfare state. Society was, in a way, rewarded for the great contribution it had made to winning the battle against Nazi totalitarianism and for all the deprivations it had suffered during the war.
“Our working men and women have responded magnificently to any and every call made upon them. Their reward must be a New Britain. Never again must the unemployed become the forgotten men of peace”