Sports writers dubbed BYU as a Quarterback U from the mid-1980s through the 1990s,[3] during which time it produced Jim McMahon and Steve Young.

In the 1960s, Purdue became known as 'Quarterback U' and the 'Cradle of Quarterbacks' by media and rivals such as Ohio State and Notre Dame due to its prominent QBs. Between 1967 and 1974, Purdue QBs Len Dawson and Bob Griese started five Super Bowls.

As early as 1975, the term Quarterback U had been applied to Stanford University, which had produced such players as Frankie Albert.

In 2005, a Sporting News writer described Texas Tech as Quarterback U in an article that bestowed several positional "U" monikers, with the criterion being college performance since 2000. Its author cited head coach Mike Leach's numerous 4,000-yard-plus passers.[22]

During the 1980s, the term was also applied to Maryland, which produced several NFL-caliber quarterbacks during the tenures of head coach Bobby Ross and offensive coordinator and quarterbacks/receivers coach Joe Krivak. These Maryland quarterbacks included Boomer Esiason. In the 1980s, the term was often applied to the University of Miami.[27] In 2003, Tracy Gale (...) since 1988.

In recent years, sportswriters have mentioned several schools as being appropriate for the designation. A 2005 ESPN article cited that (...). After the 2008 season, The Los Angeles Times claimed that USC, more traditionally known as "Tailback U", (...).

Washington quarterbacks have consistently advanced to the NFL, with 16 of the previous 18 starting quarterbacks as of 2012 having started in the NFL.[39][40] Washington's Pro Football Hall of Fame member Warren Moon is fourth in all-time passing yardage, reflecting his CFL and NFL careers. Other notable alumni include Chris Chandler.
What is the poverty line in the UK?
Household income below 60 percent of median income.
Despite being a developed country, those at the lower end of the income distribution in the United Kingdom have a relatively low standard of living. Income data published in 2016 by the Department for Work and Pensions (DWP) show that, after housing costs are taken into consideration, 13.44 million people in the UK (21% of the population) were living in relative poverty.[1] In 2015, a report by the Institute for Fiscal Studies found that 21.6% of Britons were in relative poverty. The report showed that poverty had fallen in the first few years of the twenty-first century, but that the rate had remained broadly flat since 2004/5.[2]
The Poverty and Social Exclusion project at Bristol University found in 2014[3] that the proportion of households lacking three items or activities deemed necessary for life in the UK at that time (as defined by a survey of the wider population) had increased from 14% in 1983 to 33% in 2012.[4][5][6]
By the end of the 19th century, more than 25% of the population was living at or below the subsistence level due to low wages.[7] Only 75 per cent of the population had enough money for food, clothes, rent and fuel.[7] In 1900, millions of people lived in terrible conditions such as damp and badly built houses.[8] At the same time, overcrowding led to the spread of disease. Things greatly improved after the First World War, and although poverty had not completely disappeared by the 1930s, it was much less than ever before.[7]
In the early 1950s, it was believed by numerous people that poverty had been all but abolished from Britain, with only a few isolated pockets of deprivation still remaining.[9] Much of this assumption was derived from a 1951 study which showed that in 1950 only 1.5% of the survey population lived in poverty, compared with 18% in 1936, when Rowntree had conducted a previous study of the same town (York). A leader in The Times spoke positively of this 'remarkable improvement, no less than the virtual abolition of the sheerest want.'[10]
Over the course of the 1950s and 1960s, however, a "rediscovery" of poverty took place, with various surveys showing that a substantial proportion of Britons were impoverished: between 4% and 12% of the population were estimated to be living below the Supplementary Benefits scales. In 1969, Professor A. Atkinson offered a definition along these lines. According to this definition, between 2 and 5 million Britons were trapped in poverty. In addition, some 2.6 million people were in receipt of Supplementary Benefits and therefore living on the poverty line. This meant that at least 10% of the population were in poverty at this time. Bad housing conditions also constituted a major cause of poverty in the postwar era. In the early Sixties, it was estimated that three million families lived in "slums, near slums or grossly overcrowded conditions," while a 1967 housing survey of England and Wales found that 11.7% of all dwellings were unfit.[11]
In their 1965 study on poverty, "The Poor and the Poorest," Professors Peter Townsend and Brian Abel-Smith decided to measure poverty on the basis of the National Assistance levels of living and estimated that some 14% (around 7.5 million) of Britons lived in poverty.[9] Townsend and Abel-Smith also estimated that since the mid-1950s the percentage of the population living in poverty had risen from 8% to 14%.[12]
The continued existence of poverty in the 1960s was also characterised by differences in health between different social classes. In 1964-65, the incidence of infant deaths was more than 50 per cent higher in the two lowest social classes than in the two highest. In 1961-62, 28% of all men recorded at least one spell of sickness of four days or more; among the lowest social classes, however, the figure was 35%, compared with 18% of men in the highest social classes.[11] There is evidence that in large families the height of children was below average, while families with three or more children were more likely to be inadequately nourished.[13]
In his 1979 work "Poverty in the UK", Townsend suggested that 15 million people lived in or on the margins of poverty. He also argued that to get a proper measure of relative deprivation, there was a need to take into account factors other than income, such as people's environment, employment, and housing standards.[9]
According to one study, 365,000 families in Britain (excluding Northern Ireland) in 1966 were in poverty by an old assistance standard, and 450,000 families by a new standard.[13] In another study on poverty, Wilfred Beckerman estimated that 9.9% of the British population lived below a standardised poverty line in 1973, compared with 6.1% of the population of Belgium.[14]
Low pay was also a major cause of poverty,[15][16] with a report by the TUC in 1968 finding that about 5 million females and about 2.5 million males earned less than £15 a week.[10] According to one study, around 20% to 23% of employees in the late 1960s had low hourly wages.[17] In 1974, a quarter of adult employees in Britain earned £27 a week or less before tax, only slightly above the officially defined poverty line for an average family.[18] Regional differences in pay also remained pronounced during the post-war period.[19] Slum housing also remained a problem, with 12% of British households living in houses or flats considered to be unfit for human habitation in 1972.[20] In 1975, government statistics estimated that 1,800,000 children lived in poverty.[19]
Nevertheless, the number of people estimated to be living in poor housing conditions was lower at the start of the 1970s than at the start of the 1960s. In 1961, 4,700,000 households lived in unfit or substandard homes, compared with 2,846,000 in 1971.[21]
During the late 1960s and 1970s, progress was made in reducing the level of post-war poverty and inequality,[22] with 3 million families in Britain in poverty in 1977, compared with 5 million in 1961.[23] According to the 1971 Supplementary Benefits scale, the percentage of individuals living in poverty fell from 9.4% in 1963 to 2.3% in 1973.[24] Low pay remained a major problem at the end of the 1970s, however, particularly amongst manual workers.[25]
Based on various measurements, however, the number of Britons living in poverty rose significantly from 1979 to 1985. The number of Britons living in poverty (when defined as living below the Supplementary Benefit level) rose from 2,090,000 to 2,420,000 during that period, while the number of people living in poverty when defined as living on or below the Supplementary Benefit level rose from 6,070,000 to 9,380,000. Using a poverty measurement of living at 140% of the Supplementary Benefit level or below, the rise was from 11,570,000 to 15,420,000.[26]
From 1979 to 1987, the number of Britons living in poverty (defined as living on less than half the national average income) doubled, from roughly 10% to 20% of the whole population. In 1989, almost 6 million full-time workers, representing 37% of the total full-time workforce, earned less than the "decency threshold" defined by the Council of Europe as 68% of average full-time earnings.[27] In 1994, 76.7% of all part-time workers earned less than this threshold.[28]
Figures from the European Commission estimated that from 1975 to 1985 the number of people living in poverty had doubled in Britain, from just over 3 million to 6.5 million. In 1975, the United Kingdom had fewer people living in poverty than Germany, Italy, Belgium, and Luxembourg. By 1989, Britain had a higher poverty rate than each of these four countries. In 1989, 12% of the UK population was estimated to be living in poverty, compared with 11.7% in Italy, 8.5% in Germany, 7.9% in Luxembourg, 7.4% in the Netherlands, and 7.2% in Belgium.[26]
Using the 60% of median income after housing costs poverty line, the percentage of the British population living in poverty rose from 13.7% in 1979 to 25.3% in 1996/97. From the late 1990s onwards, however, poverty began to fall steadily, helped by policies such as big increases in National Insurance benefits[29] and the National Minimum Wage Act 1998.[30]
From 1997/98 to 2004/05 (using the same 60% of median income after housing costs measurement), the percentage of the population living in poverty fell from 24.4% to 20.5%.[31] A 2000 report by the Joseph Rowntree Foundation estimated that 4 million people lacked access to a healthy diet,[32] while a review of EU food and health policies estimated that food poverty was far higher in the UK than any other EU member state.[33]
See also Hunger in the UK in the 21st century
Rates of poverty fell just before the turn of the century and continued to do so until 2004/5. The Institute for Fiscal Studies counted the number of people in absolute poverty as falling from c. 37% in 1996/7 (21.8m people) to c. 22% in 2004/5 (13.2m), a rate that remained the same in 2014/5 (14.1m once population growth is taken into account).[31][34]
The trend for relative poverty is the same (a fall in the number of poor from 1997/8 until 2004/5 and a relatively stable level since then), although the earlier numbers are lower. Alternatively, it has been suggested that poverty rose from about 2008 to 2012 but has remained stable since then.[35] Socially excluded people are ten times more likely to die early than the general population.[36]
Changes to the benefit system from April 2017, such as not allowing some claimants to claim for more than two children, are predicted to increase the number of families in poverty and to push a quarter of a million additional children into poverty. Policy in Practice estimates the two-child limit will increase child poverty by 10% during this parliament. The Child Poverty Action Group (CPAG) charity claims it will reduce children's life chances. Alison Thewliss said, "When Theresa May stood on the steps of Downing Street last year, she said that her new government would strive to help the just about managing in society. It appears that this was just empty rhetoric. (...) The reality is that two-thirds of those affected are already in work."[37][38]

A doctor claims it is not unusual for up to seven people to live in a one-bedroom flat. Parents sometimes go without food themselves in order to care for children, and others cannot afford clothes, toothbrushes, or toothpaste. Basic hygiene products like shampoo and sanitary towels are sometimes hard for poor people to afford, and some must choose between buying hygiene products and buying food. Just under one in five UK children under 15 suffers food insecurity, meaning that sufficient safe, nutritious food cannot be guaranteed.[39][40] The minimum wage is insufficient to cover basic living expenses.[41] Nearly half of schools provide anti-poverty services like food banks, clothes banks or emergency loans to families. Alison Garnham of the Child Poverty Action Group said, "With nine children in every classroom of 30 falling below the official poverty line, it is time to rebuild the safety net for struggling families."[42]
Children become sick because they cannot keep warm at home; overcrowding and damp worsen respiratory conditions. Alison Garnham of CPAG said, "Day in, day out, doctors see the damage rising poverty does to children's health. Low family incomes, inadequate housing and cuts to support services are jeopardising the health of our most vulnerable children. (...) Re-instating the UK's poverty-reduction targets would be an obvious place to start."[43]
Eurostat figures show that the number of Britons at risk of poverty fell to 15.9% in 2014, down from 17.1% in 2010 and 19% in 2005 (after social transfers were taken into account).[44] However, the Joseph Rowntree Foundation (JRF) fears that people who are "just about managing" could fall into poverty, as it forecasts that wages and benefits for people on low incomes might not keep pace with inflation.[35] One third of UK households are living below what is considered an adequate income, according to JRF research.[45] Campbell Robb of the JRF said, "Millions of families across the country are teetering on a precipice, with 400,000 pensioners and over one million more children likely to fall into poverty and suffer the very real and awful consequences that brings if things do not change. One of the biggest drivers of the rise in child poverty is policy choices, which is why it is essential that the Prime Minister and Chancellor use the upcoming Budget to put in place measures to stop this happening. An excellent start would be to ensure families can keep more of their earnings under the Universal Credit."[46]
Poverty exists in rural communities as well as in urban areas. Rural poverty is frequently overlooked.[47]
The most common form of child poverty today is poverty in working families. Roughly 30% of British children are now classed as poor, and of those, two-thirds are from working families. Analysts claim cuts to working-age benefits would likely increase poverty rates greatly during the three years following 2017. Campbell Robb said, "These troubling figures are warning signs we could be at the beginning of a sharp rise in poverty, with forecasts suggesting child poverty could rise further by 2021."[48][49]
In-work poverty can be compounded when employees do not get the pay that they are entitled to. A 2017 report by Middlesex University and Trust for London revealed that at least 2 million workers in Britain are losing an estimated £3 billion in unpaid holiday pay and wages per year. It suggested that withholding holiday pay, not paying wages and docking a couple of hours' pay per week are some of the deliberate strategies used by employers to improve their profits.[50]
Food Standards Agency (FSA) research suggests some poor people miss meals or go without healthy food due to financial pressure. One third of unemployed people have cut out meals or reduced the quality of their diet due to lack of cash. 8% of respondents to a survey have low or very low food security, implying just under four million adults regularly struggle to get enough to eat. Other studies showed benefit freezes together with rising food prices are major factors in food insecurity. Rachel Loopstra, who lectures in nutrition at King's College London, said: "These robust survey data confirm how serious the scale of the problem of people not having enough money for food to eat is in the UK, and are consistent with reports of increasing food bank usage." Anna Taylor of the Food Foundation thinktank said: "To take so many British people off the breadline the government must drive uptake of the Healthy Start programme for young and low-income mothers, tackle gaps in food provision during school holidays, and review our welfare policies to protect the diets of society's most vulnerable." Campaigners and MPs have urged the UK government to monitor food insecurity. Ministers have so far refused, but the Scottish government agreed to enact a food insecurity measure. Women and young people are more likely to live in food-insecure households.[51]
The Institute for Fiscal Studies says the benefit rate freeze and child tax credit cuts, together with the rollout of universal credit, which is less generous due to changes in work allowances, mean large losses for low-income households. John McDonnell said the IFS analysis showed a clear threat to working people's living standards, while the Liberal Democrats said that the savage cuts would make millions of households poorer. Projected benefit cuts will lead to the poorest working-age households losing between 4% and 10% of their income a year, according to the IFS.[52] Fewer than one in ten British people believe all work is fair and decent, and 75% think more should be done to make work fairer. Many British people suffer insecure work on zero-hours contracts.[53] Nearly half of workers are anxious over basic household expenses like food, transport and energy. One in six workers had left the heating off despite it being cold to save on fuel bills, and similar numbers had pawned possessions in the previous year because they were short of money.[54]

Rents are rising and housing benefit is not rising to match. Families are forced into increasing poverty, some facing a daily struggle to pay their rent and put food on the table. Some risk homelessness. Families with children are most affected, and two-thirds of affected families are in work.[55] Homelessness has risen over the last six years, and the National Audit Office thinks welfare reforms and a freeze in housing benefit are a likely cause.[56] Over a million vulnerable people with low incomes are experiencing worse poverty because they have to rent in the private rental sector, since social accommodation is in very short supply. A shortage of social housing caused the private rented sector to double over 25 years. That forced more households, sizable numbers of them on benefits with dependent children or a disabled person, to pay appreciably more for inappropriate housing. Benefit sanctions drive tenants into rent arrears, which can lead to evictions and homelessness. Dr Julie Rugg of the Centre for Housing Policy at the University of York said: "Because of sanctions you're more likely to fall into arrears and to be asked to leave because you are in arrears. The welfare system change has created vulnerability. It didn't used to be the case 10 years ago but it is now. People know the benefits system is tightening up but they might not realise that if you're at the bottom end and receiving benefits then your situation can be pretty precarious indeed." 38% of the private rented sector today is low-income households classed as vulnerable, and 90% of these are either in poverty or living in overcrowded conditions. The short supply of social housing enables private landlords to charge more than housing associations, frequently for worse accommodation.[57]
The All-Party Parliamentary Group on Hunger warned that too many poorer UK children are hungry or malnourished during school holidays. Some subsist on a diet of crisps or stodgy food. One million children who receive free school meals during term time are at risk, as are two million more from working poor families. For both groups, school holidays add to financial pressure on families through the need to pay for childcare, food and fuel. These children return to school in bad physical shape, learn less well and fall behind children who were better fed during the holidays. The life chances of underfed children are damaged.[58][59]
When housing benefit does not fully cover rent, people can struggle to pay it and buy other necessities as well, which can lead to increasing debt. Anne Baxendale of Shelter said: "We are deeply concerned that the current freeze on housing benefit is piling a huge amount of pressure on to thousands of private renters who are already teetering on the brink of homelessness." People are forced out of their homes because they cannot pay their rent and all their other bills.[60][61]
As of 2017, 20% of UK people live in poverty including 8 million working-age adults, 4 million children and 1.9 million pensioners. Research by the JRF found nearly 400,000 more UK children and 300,000 more UK pensioners were in poverty in 2016-17 compared with 2012-13.[62]
From April 2018, child benefit will be limited to the first two children, which will affect 150,000 families. Withdrawal of the family element from new Universal Credit and tax credit claims for families with children will affect 400,000 families.[63]
Single parents are particularly heavily affected by benefit sanctions. A 2018 report from Gingerbread and Trust for London showed that three times as many single parents were sanctioned under JSA in 2016/17 as in 2005/06. These sanctions can compound the financial hardship of those already on a low income.[64]
In 2018, Citizens Advice stated that up to 140,000 households went without power because they could not afford to top up prepayment meters, and most such households included children or someone with a long-term health problem. The Living Wage Foundation stated that many of the poorest parents went without meals; a third of parents on low incomes do this regularly through lack of money. Roughly half of those families are behind with household bills.[65]
TUC-sponsored research indicates that 3.1 million children in working families will be below the official breadline in 2018, a million more than in 2010. About 600,000 children with working parents became poor due to the government's benefit cuts and public sector pay limits, according to the report by the consultancy Landman Economics. The research found that the biggest increase in child poverty among working families will be in the East Midlands, followed by the West Midlands and Northern Ireland.[66] Teachers and teaching assistants bring items into schools, like food, sanitary products and toilet paper, for children from families that are short of these things.[67]
Inflation has been rising while the level of many benefits has remained fixed in money terms. This is causing hardship to low-income families, and there are calls for the level of benefits to be increased.[68] Over 14 million people, including 4.5 million children, live below the breadline, and over half are trapped in poverty for years. Poverty is particularly frequent in families with a disabled person, single-parent families, and households where no one works or that depend for income on irregular or zero-hours jobs. 12% of the UK population have spent the bulk or all of the four years to 2018 below the breadline. Alison Garnham of the Child Poverty Action Group said, "What we now need is for government to move on from its denial of the problem, set targets for reducing and eradicating child poverty, and implement policies to support low-income families." The Institute for Fiscal Studies predicted that children living in poverty will reach a record 5.2 million over the five years from 2018 as government welfare cuts take effect, more than reversing all the progress made over the previous 20 years.[69]
Many poor people live in areas where there is no large supermarket nearby and must rely on corner shops, where food is less healthy. Poor people in these areas cannot easily afford to buy fresh fruit and vegetables or to travel to large supermarkets where there is healthier food. Such areas include Marfleet in Hull, Hartcliffe in Bristol, Hattersley in Greater Manchester, Everton in Liverpool and Sparkbrook in Birmingham. Eight of the ten most deprived areas in Scotland are in Glasgow, and three of the nine worst in Wales are in Cardiff. Poor people, older people and disabled people are most affected when fresh food is not available locally. Nearly 4 million UK children are judged to live in households that would find it difficult to afford enough fruit, vegetables and other healthy foods to meet official guidelines, the Food Foundation maintains. Food prices increased by 7.7% from 2002 to 2016, while the poorest families' incomes fell by 7.1%.[70]
The Office for National Statistics has estimated that in 2011, 14 million people were at risk of poverty or social exclusion, and that one person in 20 (5.1%) was experiencing "severe material deprivation".[citation needed] Poverty among young people increased by 3.9% from 2007 to 2010.[71] In assessing social inequality in Britain, Danny Dorling has noted that "people in different parts of Britain and people living within different quarters of its cities are living in different worlds with different norms and expectations. This was not the case a few decades ago. This is not the case to the same extent in the majority of affluent nations in the world."[72]
A new term has appeared: 'Just About Managing' or 'JAM'. It applies to people who can put food on the table and pay rent or mortgage at least part of the time, but who have problems if their income falls or if there are unexpected bills. JAMs are typically families where at least one person works. JAMs may suffer social exclusion, being unable to afford holidays or evenings out.[73]
The Resolution Foundation claims that the incomes of the poorest 10% in the UK will fall by 3% in real terms by 2020 due to government policies on tax and welfare. The lowest third of incomes will suffer falls in income over the coming years. Incomes will fall because many welfare benefits that poorer people receive have been frozen in cash terms, and with inflation that cash will be worth steadily less.[74]
The Resolution Foundation reckons that in 2017-18 the official poverty rate increased from 22.1% to 23.2%, while the child poverty rate rose from 30.3% to 33.4%. Cuts to benefits and inflation are blamed for the rise: benefit levels have remained unchanged in money terms while inflation erodes their real value.[75]
The Institute for Fiscal Studies reported that the number of poor United Kingdom children in wage-earning families increased from 2009 to 2014, and that more poor children currently live in working families than in families on benefits. The IFS reported: "Recent falls in inequality are likely to prove temporary. Stronger earnings growth and the Conservatives' planned income tax cuts would do most for incomes towards the top of the distribution, while planned benefit cuts will hit low-income households [both in and out of work] hardest."[76][77]
Anne Longfield, Children's Commissioner for England, wrote: "The majority of children living in poverty have at least one parent who is working. Employment is important but if wages do not rise substantially in relation to living costs it will not provide a route out of poverty alone. The Joseph Rowntree Foundation has today published a report stating that families with children working full-time on the National Minimum Wage are now 15% short of the Minimum Income Standard that people believe offers an acceptable standard of living. Today's announcement will effectively confine to history any figures on the millions of children being raised in families who experience in-work poverty, denying them necessities such as adequate food, clothing and heating."[78]
Julia Unwin of the Joseph Rowntree Foundation said: "A strong economy and rising employment have masked the growing problem of in-work poverty, as years of below-inflation wage rises have taken their toll on people's incomes. The upcoming minimum wage rise will help, but many low-income working families will still find themselves worse off due to tax-credit changes. Boosting productivity and creating more jobs which offer progression at work is vital to make work a reliable route out of poverty."[76]
Campbell Robb of Shelter said: "It's heart-breaking to think that so many people are having to make a choice between paying the rent and putting food on the table, or living in fear that any drop in income would leave them unable to cover their housing costs. The sad truth is that far too many people in Britain right now are living in homes that just aren't up to scratch - from the thousands of families forced to cope with poor conditions, to a generation of renters forking out most of their income on housing each month and unable to save for the future."[79]
As of 2015, there is actual hunger in the United Kingdom, and significant numbers of UK citizens are driven to use food banks. There is also significant malnutrition. Poorer people are frequently forced to buy and eat cheaper, less healthy food. The BMJ, a UK peer-reviewed medical journal, published:
For the poorest in our society, up to 35% of disposable income will now be needed for food, compared to less than 9% for the more wealthy. This will increase reliance on cheap, highly processed, high fat, high sugar, high salt, and calorie-dense, unhealthy foods. Re-emerging problems of poor public health nutrition such as rickets and malnutrition in the elderly are also causes for concern.
In 2016, 10% of UK households lived in fuel poverty. Fuel poverty is calculated by gauging whether a household's income would fall below the official poverty line after spending the actual amount needed to heat the home. The average fuel poverty gap of these households, that is, the amount needed to escape fuel poverty, is £371 a year, the latest figures indicate, with those in privately rented properties hit hardest.[81]
In a 2013 report commissioned by the Joseph Rowntree Foundation,[82] poverty and participation are analysed as a social phenomenon characterising UK society, following the tradition initiated several decades ago by Peter Townsend. Participation in society is measured in terms of social relationships, membership of organisations, trust in other people, ownership of possessions and purchase of services. The study finds that all these dimensions of participation are lower among people with low incomes. While participation generally drops as income declines, it stops falling among the 30 per cent or so of people with the lowest incomes, creating a participation 'floor'. The 30 per cent of people with the lowest incomes are forced to choose between the basic necessities of modern life; they must decide which needs to neglect.
For people affected by the floor, additional income may well be spent on upgrading the quality of necessary goods and services rather than adding to them. Averages mask important variation: the participation floor for benefit recipients is lower than for other groups on the same income. Most minority ethnic groups experience greater material deprivation than the white majority, but their social participation is, on average, higher. Children's engagement in school life and friends is not directly affected by household income. However, parents on low incomes, on average, play less often with their children and spend less on activities, which is associated with poorer educational outcomes as judged by teachers. Low-income parents frequently spend more time than affluent ones assisting children with their school work, because their children have fallen behind their classmates.
Poverty and economic insecurity increase the risk that a person will commit suicide. The Samaritans claim that British economic conditions, including low incomes, job insecurity, zero-hours contracts, unmanageable debts and poor housing, all add to suicide risk. A report titled Dying from Inequality describes overwhelming evidence of a link between socioeconomic disadvantage and suicidal behaviour. Men in the lowest social class, living in the most deprived areas, are up to 10 times more at risk of suicide than those in the highest social class living in the most affluent areas, the report says. Unemployed people are more at risk of suicide than people in work, and people with low education and people living in deprived areas are also at increased risk.[83]
The persistence of high poverty rates in the UK is associated with the relatively low generosity of the welfare state. The UK social security system is characterised by a residual welfare state model based on the notion of market dominance and private provision.[citation needed] The state only intervenes to moderate extreme poverty and provide for basic needs, largely on a means-tested basis.[84][85]
In 2017, inequality was forecast to return to the levels of the Thatcher years. Torsten Bell of the Resolution Foundation said: "[A] boom is slowing rapidly as inflation rises, productivity flatlines and employment growth slows. ... This time around it's low- and middle-income families with kids who are set to be worst affected. This could leave Britain with the worst of both worlds on living standards: the weak income growth of the last parliament and rising inequality from the time Margaret Thatcher was in Downing Street. The prime minister's focus on supporting just managing families is absolutely right."[86]
Poverty within the UK is particularly concentrated in Wales. While the relative income-poverty rate for the UK stood at 16.8% in 2014, the rate for Wales stood at 23% in the same year.[87][88] Poverty in Wales has remained around 25%, with only small dips throughout the last decade.[88] While these trends correlate with overall reductions in less impoverished areas of the UK, they do not correlate with Scotland, which in the 1990s had a poverty trend similar to that of Wales.[87] Conservative attitudes began to grow during Labour's years in government in the 2000s, culminating in an overall negative opinion towards public spending increases beginning in the 2010s.[89]
Data published in 2017 by the New Policy Institute and the Trust for London found that 27% of Londoners live in poverty, six percentage points higher than in the rest of England. This represents 2.3 million Londoners, 58% of whom are in a working family.[90] Further research published by Trust for London, carried out by Loughborough University, found that two in five Londoners cannot afford what the public regard as a decent standard of living: one that allows them to meet their basic needs and participate in society at a minimum level. This is significantly higher than the 30% that fall below the standard in the UK as a whole, and represents 3.3 million Londoners.[91]
The table below shows the percentage of the population in poverty derived from three different measures: relative poverty (earning less than 60% of the median), the National Assistance scale and the Supplementary Benefits scale. Estimates are from the National Institute of Economic and Social Research.[92]
Estimates of poverty in the United Kingdom, 1950-1975 (percentage of population)[93]

Year      Estimate   Study (data source)             Unit
1953-54   1.2%       Abel-Smith and Townsend (FES)   Household
1954      12.3%      Gough and Stark (IR)            Tax unit
1959      8.8%       Gough and Stark (IR)            Tax unit
1960      3.8%       Abel-Smith and Townsend (FES)   Household
1963      9.4%       Gough and Stark (IR)            Tax unit
1967      3.5%       Atkinson (FES)                  Household
1968-69   6.4%       Townsend (Survey)               Household
1969      3.4%       Atkinson (FES)                  Household
1971      4.9%       Fiegehen et al. (FES)           Household
1975      11.3%      Berthoud and Brown (GHS)        Household
The most common measure of poverty, as used in the Child Poverty Act 2010, is household income below 60 percent of median income. The median is the income level at which exactly half of households earn more and half earn less.[94]
In 2014/5, the median household income in the UK was £473 per week (£24,596 a year). Those earning less than 60% of this figure (£284 a week, or £14,758 a year) were considered to be in the low income bracket.
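As a rough illustration of that arithmetic, the short Python sketch below computes the 60%-of-median threshold from the 2014/5 figures quoted above. It is a minimal sketch for illustration only, not an official DWP calculation, and the function name is invented here.

def poverty_line(median_income, fraction=0.6):
    # Poverty threshold as a fraction of median household income.
    return fraction * median_income

median_weekly = 473      # UK median weekly household income, 2014/5 (£)
median_annual = 24596    # the same median expressed per year (£)

print(round(poverty_line(median_weekly)))   # 284   -> £284 a week
print(round(poverty_line(median_annual)))   # 14758 -> £14,758 a year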
This is the definition used by the UK government's Department for Work and Pensions in its yearly survey, Households below average income.[95] However, its reports expressly avoid the word poverty, using low income instead. Reports from other agencies, such as the Institute for Fiscal Studies' Living Standards, Poverty and Inequality in the UK, use the same methodology but specifically use the word poverty.[34][96]
This measure can be further divided.
Those who live in absolute poverty have a household income below 60 percent of median income as measured against a baseline fixed in 2010/11, which changes only in line with inflation.
Those who live in relative poverty have a household income below 60 percent of median income as measured against all other incomes in the same year.
Absolute poverty is better at judging poverty in the short term, whereas relative poverty is better at seeing long-term trends. This is because general concepts of poverty change with time, and relative poverty reflects this better.[34]
Reports on poverty also tend to take housing costs into account, distinguishing between before housing costs (BHC, where housing costs such as rent and mortgage interest payments have not been deducted) and after housing costs (AHC). Different social groups in the UK tend to have vastly different costs for housing, affecting available income.[34]
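These distinctions can be made concrete in a short sketch. This is a minimal illustration assuming a plain list of household incomes; the real HBAI statistics use equivalised incomes and survey weights, which are omitted here, and the helper names are invented for the example.

from statistics import median

def poverty_rate(incomes, threshold):
    # Share of households with income below the threshold.
    return sum(income < threshold for income in incomes) / len(incomes)

def relative_poverty_rate(incomes):
    # Relative poverty: below 60% of the median income in the same year.
    return poverty_rate(incomes, 0.6 * median(incomes))

def absolute_poverty_rate(incomes, base_median, inflation_factor):
    # Absolute poverty: below 60% of a fixed base-year (2010/11) median,
    # uprated only in line with inflation.
    return poverty_rate(incomes, 0.6 * base_median * inflation_factor)

def ahc_incomes(incomes, housing_costs):
    # After housing costs (AHC): run the same calculations on income
    # net of housing costs such as rent or mortgage interest.
    return [income - cost for income, cost in zip(incomes, housing_costs)]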
The concept of relative poverty was in use before its formal adoption. In the early 1980s, Tony Byrne and Colin F. Padfield defined relative poverty in Britain as a situation in which people are able to survive adequately, but are either less well off than they used to be (such as when they retire from paid employment) or at a serious disadvantage "in their ability to experience or enjoy the standard of life of most other people, for example, not being able to afford an annual holiday."[9]
In 2011, there was some discussion of the measurement for poverty being changed (from households earning less than 60% of median income) to a broader analysis of poverty.[97]
As opposed to measuring income, the Consensual Method examines which necessities (e.g. food, clothing, access to healthcare, involvement in social and leisure activities) are thought by the general public to be essential for living in contemporary UK society.[98] Those families or individuals who lack a number of these necessities are considered poor. In the 2012 Poverty and Social Exclusion (PSE) survey on Living Standards, the three necessities most often deemed essential to a good standard of living were the ability 'to warm living areas of the home', a 'damp-free home' and 'two meals a day'.[99]
Six specific surveys of low standards of living in the UK have made use of this method.
Water poverty is defined by the government as spending more than 3% of disposable income on water bills. Nationally, in 2006, nearly 10% of households were in water poverty.[100]
Fuel poverty: a fuel-poor household is one that struggles to keep adequately warm at reasonable cost. The most widely accepted definition is a household which needs to spend more than 10% of its income on all fuel use to heat its home to an adequate standard of warmth, generally defined as 21°C in the living room and 18°C in the other occupied rooms.[101][102] Fuel poverty affects over a million British working households and over 2.3 million households in total, and increases in energy prices affect poor people severely.[103]
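Put into code, the two thresholds just given might be checked as in the sketch below. The parameter names are invented for the example; note that the 10% fuel-poverty definition uses the required fuel spend for adequate warmth (21°C/18°C), not the actual spend.

def in_water_poverty(disposable_income, water_bill):
    # Water poverty: more than 3% of disposable income spent on water.
    return water_bill > 0.03 * disposable_income

def in_fuel_poverty(income, required_fuel_spend):
    # Fuel poverty (10% definition): needing to spend more than 10% of
    # income on all fuel use to keep the home adequately warm.
    return required_fuel_spend > 0.10 * income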
If the poverty line is defined as those individuals and households with incomes less than 60% of their respective medians, then "nearly 60%" of those in poverty are homeowners.[108]
Seebohm Rowntree chose a basic 'shopping basket' of foods (identical to the rations given in the local workhouse), clothing and housing needs; anyone unable to afford them was deemed to be in poverty. By 1950, with the founding of the modern welfare state, the 'shopping basket' measurement had been abandoned.
The vast majority of people who meet the government's current criteria for poverty status (see above) have goods unimaginable to those in poverty in 1900. Poverty in the developed world is often a matter of perception: people compare their wealth with neighbours and wider society, not with their ancestors or those in foreign countries. Indeed, this is formalised in the government's measure of poverty. A number of studies have shown that though prosperity in the UK has greatly increased, the level of happiness people report has remained the same or even decreased since the 1950s.[109][110][111]
People enter poverty due to problems at the individual or family level and problems with the economy as a whole. Problems at the individual level include race, gender, sexual orientation, drug use, and level of education. Problems with the economy can include low labour participation and high levels of unemployment.[112] Welfare is financial support given by the government to people in need. There are pressures on the welfare state because welfare must be justified in terms of its contribution to economic success. Welfare must contribute positively to the economy, otherwise there is a risk of damaging currency values; damage to currency values will harm trading positions and investment, which will in turn hurt the economy overall.[113]
The Department of Health and Social Security (DHSS) is responsible for welfare services in the United Kingdom. Income maintenance is centrally administered through DHSS offices at regional and local level.[114] Those who earn 39 pounds a week or more (except some married women) must contribute to the National Insurance Scheme. The National Health Service (NHS) provides virtually free healthcare for all residents; this too is centrally administered.
Persistent poverty refers to the effects of experiencing low income for long periods of time. In 2014, 6.5% of the United Kingdom's population was classified as being in persistent poverty, equating to approximately 3.9 million people. The UK's overall poverty rate in 2014 was the 12th highest amongst all European nations, at 16.8%; however, the UK has the third-lowest persistent poverty rate.[115] Income tends to be measured before or after housing costs are accounted for (BHC or AHC).[116] Poverty levels tend to be higher after housing costs are accounted for, because poorer households need to spend a higher percentage of their income on housing. In 2014-2015, 13.5 million people were in relative low income AHC (an increase of 300,000 from the year before) and 12.9 million people were in absolute low income AHC (a decrease of 700,000 from the year before). Relative low income means that people live in households with income below 60% of the median in a specified year. Absolute low income means that people live in households with income below 60% of the median income in some base year.[116]
In 2016, the incomes of poor households were extremely sensitive to activity in the labour market: when any downturn in the labour market occurs, the poorest people in the UK are more vulnerable and at greater risk.[117] Overall median income has moved 2% above pre-crisis (2007-2008) levels. During the recovery period, inequality in workers' earnings has decreased; strong employment growth along with weak earnings growth has kept inequality low for several years.[117]
In 1999, Tony Blair, the then Prime Minister of the United Kingdom, pledged that child poverty in the United Kingdom would end within a generation, with the goal of completely eradicating child poverty by 2020. Poverty results from several different factors, some of which include a lack of education and training, low participation in the labour market, poor working conditions and a lack of affordable housing.[118]
The UK's strategy to fight poverty has several key components.
One of the most crucial ways to reduce poverty is to increase benefit take-up. In 2009-10, almost a third of those who were eligible for a means-tested benefit did not claim it. In 2011-2012, 15% of those eligible for Child Tax Credit did not claim, and neither did 35% of those eligible for Working Tax Credit.[112] Improving these numbers and getting those people to claim their entitlements would significantly help reduce poverty.
Several measures would help to increase benefit take-up.
A decrease in poverty would mean a more active economy because more people would have the ability to purchase more consumer goods.[119]
For the UK General Election of 2015, research was undertaken to analyse the commitment of the UK's political parties in addressing poverty. It demonstrated that "poverty has been overlooked as an issue in the General Election campaign" and that only the Green Party had an effective policy to deal with poverty. Analysis of other parties' policies and how they are used to deal with poverty ended in negative conclusions: "The Conservatives and UKIP both performed fairly badly". Labour performed better in some specific policy areas when compared to the Conservatives, but "there is not very much difference between them." Overall, the audit noted that views towards poverty were affected by specific views for those receiving social security benefits: "there was a general tendency to come down hard on welfare recipients, with a shift towards means-testing and victim-blaming across the board. This can be seen particularly in the context of Immigration and Housing."[120][non-primary source needed]
While leader of the Labour Government, Tony Blair vowed in 1999 to cut child poverty 25% by 2005, 50% by 2010 and to eradicate child poverty completely by 2020. The Labour Party website states:
"In 1997 Labour inherited one of the highest rates of child poverty in Europe ÿ with one in three children living in poverty. Our mission to abolish child poverty is grounded both in our determination to secure social justice, and to tackle the problems that the social exclusion of children builds up for the long-term. Work is the best route out of poverty and our successful welfare to work measures have lifted millions out of poverty including disabled people, who have too often previously been consigned to a life on benefits. At the same time, millions of families are benefiting from the Child tax credit, the Working tax credit, and record rises in Child benefit."[121]
Their 2005 manifesto[122] states:
"[Since the Labour government came to power in 1997] there are two million fewer children and nearly two million fewer pensioners living in absolute poverty."
In late November 2006, the Conservative Party garnered headlines across the press when a senior member spoke out on poverty, invoking the name of Polly Toynbee. The headlines began when David Cameron's policy advisor and shadow minister Greg Clark wrote:
"The traditional Conservative vision of welfare as a safety net encompasses another outdated Tory nostrum ÿ that poverty is absolute, not relative. Churchill's safety net is at the bottom: holding people at subsistence level, just above the abyss of hunger and homelessness. It is the social commentator Polly Toynbee who supplies imagery that is more appropriate for Conservative social policy in the twenty first century."[123][124]
This approach generated much comment and analysis.[125] It was followed two days later by Cameron saying poverty should be seen in relative terms to the rest of society, where people lack those things which others in society take for granted, "those who think otherwise are wrong [...] I believe that poverty is an economic waste, a moral disgrace. [...] We will only tackle the causes of poverty if we give a bigger role to society, tackling poverty is a social responsibility [...] Labour rely too heavily on redistributing money, and on the large, clunking mechanisms of the state."[126]
Most people's ability to sustain their lifestyle and to participate socially comes under threat at around the bottom 30% of the income distribution, creating a sort of 'participation floor' that seems to demarcate a major divide in British society.[127] The floor begins around the point in the income distribution at which the benefit system starts to contribute substantially to people's incomes, but it is not entirely rigid; for example, it is lower for recipients of social security benefits, mainly on account of the greater material deprivation that they experience. For those on the floor, participation is severely constrained, with people negotiating a zero-sum world in which spending on one area means reduction in another. Whereas for those above the floor additional income translates into more evident consumption, greater social participation and trust, for those on the floor it means a slight easing of pressure but no major change in lifestyle sufficient to be identified in survey evidence.

The implications for policy and our understanding of society are profound. Much policy, notably the new Universal Credit that was the flagship policy of the past Coalition Government, seeks to maximise work incentives, premised on the notion that additional income brings rewards for individuals in terms of higher living standards and benefits society through greater consumption and a shared work ethic. Similarly, as emphasised by Lansley and Mack,[128] New Labour during the period 1997-2010 (despite trying to tackle child poverty) intervened mainly through more generous and wide-ranging tax credits rather than fighting poverty and inequality at source.[129]
The Joseph Rowntree Foundation is one of the largest social policy research and development charities in the UK and takes particular interest in the issue of poverty, with over 2,000 reports on poverty and disadvantage available on its website.
The Child Poverty Action Group campaigns for the elimination of poverty amongst children.
End Child Poverty coalition also seeks the eradication of child poverty.
The Oxfam UK Poverty Programme[130] works with people and policy makers to tackle the causes of poverty.
In July 2013, Freedom from Torture published its report "The Poverty Barrier: The Right to Rehabilitation for Survivors of Torture in the UK",[131] which highlights the failings of the UK asylum system in its handling of torture survivors arriving in the UK. The evidence included in the report comes from the testimony of over 100 survivors of torture and eighteen members of Freedom from Torture's clinical department. The report highlights financial insecurity, social exclusion and hopelessness, and how poverty prevents the rehabilitation process. One survivor stated: "... Our current living conditions keep our torture trauma still alive. We can't move on."
The French and Indian War was fought over disputed land. What was the land?
The confluence of the Allegheny and Monongahela rivers called the Forks of the Ohio, the site of the French Fort Duquesne within present-day Pittsburgh, Pennsylvania.

Result: British victory. Belligerents: Great Britain and the Iroquois Confederacy against France and the Wabanaki Confederacy.
The French and Indian War (1754-63) comprised the North American theater of the worldwide Seven Years' War of 1756-63. It pitted the colonies of British America against those of New France. Both sides were supported by military units from their parent countries of Great Britain and France, as well as by American Indian allies. At the start of the war, the French North American colonies had a population of roughly 60,000 settlers, compared with 2 million in the British North American colonies.[3] The outnumbered French particularly depended on the Indians. The European nations declared war on one another in 1756 following months of localized conflict, escalating the war from a regional affair into an intercontinental conflict.
The name French and Indian War is used mainly in the United States. It refers to the two main enemies of the British colonists: the royal French forces and the various American Indian forces allied with them. The British colonists were supported at various times by the Iroquois, Catawba, and Cherokee, and the French colonists were supported by Wabanaki Confederacy members Abenaki and Mi'kmaq, and Algonquin, Lenape, Ojibwa, Ottawa, Shawnee, and Wyandot.
British and other European historians use the term the Seven Years' War, as do English-speaking Canadians.[4] French Canadians call it La guerre de la Conquête (the War of the Conquest)[5][6] or (rarely) the Fourth Intercolonial War.[7][not in citation given]
Fighting took place primarily along the frontiers between New France and the British colonies, from Virginia in the south to Newfoundland in the north. It began with a dispute over control of the confluence of the Allegheny and Monongahela rivers called the Forks of the Ohio, and the site of the French Fort Duquesne within present-day Pittsburgh, Pennsylvania. The dispute erupted into violence in the Battle of Jumonville Glen in May 1754, during which Virginia militiamen under the command of 22-year-old George Washington ambushed a French patrol.
In 1755, six colonial governors in North America met with General Edward Braddock, the newly arrived British Army commander, and planned a four-way attack on the French. None succeeded, and the main effort by Braddock proved a disaster; he lost the Battle of the Monongahela on July 9, 1755 and died a few days later. British operations failed in the frontier areas of Pennsylvania and New York during 1755-57 due to a combination of poor management, internal divisions, effective Canadian scouts, French regular forces, and Indian warrior allies. In 1755, the British captured Fort Beauséjour on the border separating Nova Scotia from Acadia, and they ordered the expulsion of the Acadians (1755-64) soon afterwards. Orders for the deportation were given by William Shirley, Commander-in-Chief, North America, without direction from Great Britain. The Acadians were expelled, both those captured in arms and those who had sworn the loyalty oath to His Britannic Majesty. Indians likewise were driven off the land to make way for settlers from New England.[8]
The British colonial government fell in the region of modern Nova Scotia after several disastrous campaigns in 1757, including a failed expedition against Louisbourg and the Siege of Fort William Henry; this last was followed by Indians torturing and massacring their British victims. William Pitt came to power and significantly increased British military resources in the colonies at a time when France was unwilling to risk large convoys to aid the limited forces that they had in New France, preferring to concentrate their forces against Prussia and its allies in the European theater of the war. Between 1758 and 1760, the British military launched a campaign to capture the Colony of Canada (part of New France). They succeeded in capturing territory in surrounding colonies and ultimately the city of Quebec (1759). The British later lost the Battle of Sainte-Foy west of Quebec (1760), but the French ceded Canada in accordance with the Treaty of Paris (1763).
The outcome was one of the most significant developments in a century of Anglo-French conflict. France ceded to Great Britain its territory east of the Mississippi. It ceded French Louisiana west of the Mississippi River (including New Orleans) to its ally Spain in compensation for Spain's loss to Britain of Florida. (Spain had ceded Florida to Britain in exchange for the return of Havana, Cuba.) France's colonial presence north of the Caribbean was reduced to the islands of Saint Pierre and Miquelon, confirming Great Britain's position as the dominant colonial power in eastern North America.
The conflict is known by multiple names. In British America, wars were often named after the sitting British monarch, such as King William's War or Queen Anne's War. There had already been a King George's War in the 1740s during King George's reign, so British colonists named this conflict after their opponents and it became known as the French and Indian War.[9] This traditional name continues as the standard in the United States, but it obscures the fact that Indians fought on both sides of the conflict and that this was part of the Seven Years' War, a much larger conflict between France and Great Britain.[10] American historians generally use the traditional name or sometimes the Seven Years' War. Less frequently used names for the war include the Fourth Intercolonial War and the Great War for the Empire.[9]
In Europe, the North American theater of the Seven Years' War usually is not given a separate name. The entire international conflict is known as the Seven Years' War. "Seven Years" refers to events in Europe, from the official declaration of war in 1756 to the signing of the peace treaty in 1763. These dates do not correspond with the fighting on mainland North America which was largely concluded in six years, from the Battle of Jumonville Glen in 1754 to the capture of Montreal in 1760.[9]
Canadians refer to both the European and North American conflicts as the Seven Years' War (Guerre de Sept Ans).[11][12] French Canadians also use the term "War of Conquest" (Guerre de la Conquête), since it is the war in which Canada was conquered by the British and became part of the British Empire.
At this time, North America east of the Mississippi River was largely claimed by either Great Britain or France. Large areas had no colonial settlements. The French population numbered about 75,000 and was heavily concentrated along the St. Lawrence River valley, with some also in Acadia (present-day New Brunswick and parts of Nova Scotia), including Île Royale (present-day Cape Breton Island). Fewer lived in New Orleans, Biloxi, Mississippi, Mobile, Alabama, and small settlements in the Illinois Country, hugging the east side of the Mississippi River and its tributaries. French fur traders and trappers traveled throughout the St. Lawrence and Mississippi watersheds, did business with local Indian tribes, and often married Indian women.[13] Traders married daughters of chiefs, creating high-ranking unions.
British settlers outnumbered the French 20 to 1[14] with a population of about 1.5 million ranged along the eastern coast of the continent from Nova Scotia and Newfoundland in the north to Georgia in the south.[15] Many of the older colonies had land claims that extended arbitrarily far to the west, as the extent of the continent was unknown at the time when their provincial charters were granted. Their population centers were along the coast, yet the settlements were growing into the interior. Nova Scotia had been captured from France in 1713, and it still had a significant French-speaking population. Britain also claimed Rupert's Land where the Hudson's Bay Company traded for furs with local Indian tribes.
In between the French and British colonists, large areas were dominated by Indian tribes. To the north, the Mi'kmaqs and the Abenakis were engaged in Father Le Loutre's War and still held sway in parts of Nova Scotia, Acadia, and the eastern portions of the province of Canada, as well as much of Maine.[16] The Iroquois Confederation dominated much of Upstate New York and the Ohio Country, although Ohio also included Algonquian-speaking populations of Delaware and Shawnee, as well as Iroquoian-speaking Mingos. These tribes were formally under Iroquois rule and were limited by them in their authority to make agreements.[17]
The Southeast interior was dominated by Siouan-speaking Catawbas, Muskogee-speaking Creeks and Choctaw, and the Iroquoian-speaking Cherokee tribes.[18] When war broke out, the French colonists used their trading connections to recruit fighters from tribes in western portions of the Great Lakes region, which was not directly subject to the conflict between the French and British; these included the Hurons, Mississaugas, Ojibwas, Winnebagos, and Potawatomi. The British colonists were supported in the war by the Iroquois Six Nations and also by the Cherokees, until differences sparked the Anglo-Cherokee War in 1758. In 1758, the Pennsylvania government successfully negotiated the Treaty of Easton, in which a number of tribes in the Ohio Country promised neutrality in exchange for land concessions and other considerations. Most of the other northern tribes sided with the French, their primary trading partner and supplier of arms. The Creeks and Cherokees were subject to diplomatic efforts by both the French and British to gain either their support or neutrality in the conflict.
By this time, Spain claimed only the province of Florida in eastern North America; it controlled Cuba and other territories in the West Indies that became military objectives in the Seven Years' War. Florida's European population was a few hundred, concentrated in St. Augustine and Pensacola.
At the start of the war, no French regular army troops were stationed in North America, and few British troops. New France was defended by about 3,000 troupes de la marine, companies of colonial regulars (some of whom had significant woodland combat experience). The colonial government recruited militia support when needed. Most British colonies mustered local militia companies to deal with Indian threats, generally ill trained and available only for short periods, but they did not have any standing forces. Virginia, by contrast, had a large frontier with several companies of British regulars.
The colonial governments were used to operating independently of one another and of the government in London, a situation that complicated negotiations with Indian tribes, whose territories often encompassed land claimed by multiple colonies. After the war began, the leaders of the British Army establishment tried to impose constraints and demands on the colonial administrations.
New France's Governor-General Roland-Michel Barrin de La Galissonnière was concerned about the incursion and expanding influence in the Ohio Country of British colonial traders such as George Croghan. In June 1747, he ordered Pierre-Joseph Céloron to lead a military expedition through the area. Its objectives were to assert the French claim to the territory and to warn off the British traders operating there.
Céloron's expedition force consisted of about 200 Troupes de la marine and 30 Indians, and they covered about 3,000 miles (4,800 km) between June and November 1749. They went up the St. Lawrence, continued along the northern shore of Lake Ontario, crossed the portage at Niagara, and followed the southern shore of Lake Erie. At the Chautauqua Portage near Barcelona, New York, the expedition moved inland to the Allegheny River, which it followed to the site of Pittsburgh. There Céloron buried lead plates engraved with the French claim to the Ohio Country.[19] Whenever he encountered British colonial merchants or fur-traders, he informed them of the French claims on the territory and told them to leave.[19]
Céloron's expedition arrived at Logstown, where the Indians in the area informed him that they owned the Ohio Country and that they would trade with the British colonists regardless of the French.[20] He continued south until his expedition reached the confluence of the Ohio and the Miami rivers, which lay just south of the village of Pickawillany, the home of the Miami chief known as "Old Briton". Céloron threatened Old Briton with severe consequences if he continued to trade with British colonists, but Old Briton ignored the warning. Céloron returned, disappointed, to Montreal in November 1749.[21]
Céloron wrote a detailed report. "All I can say is that the Natives of these localities are very badly disposed towards the French," he wrote, "and are entirely devoted to the English. I don't know in what way they could be brought back."[20] Even before his return to Montreal, reports on the situation in the Ohio Country were making their way to London and Paris, each side proposing that action be taken. Massachusetts governor William Shirley was particularly forceful, stating that British colonists would not be safe as long as the French were present.[22]
In 1749, the British government gave land to the Ohio Company of Virginia for the purpose of developing trade and settlements in the Ohio Country.[23] The grant required that it settle 100 families in the territory and construct a fort for their protection. But the territory was also claimed by Pennsylvania, and both colonies began pushing for action to improve their respective claims.[24] In 1750, Christopher Gist explored the Ohio territory, acting on behalf of both Virginia and the company, and he opened negotiations with the Indian tribes at Logstown.[25] He completed the 1752 Treaty of Logstown in which the local Indians agreed to terms through their "Half-King" Tanacharison and an Iroquois representative. These terms included permission to build a strong house at the mouth of the Monongahela River on the modern site of Pittsburgh, Pennsylvania.[26] By the late 17th century, the Iroquois had pushed many tribes out of the Ohio Valley, and they laid claim to it as their hunting ground by right of conquest.
The War of the Austrian Succession (better known as King George's War) formally ended in 1748 with the signing of the Treaty of Aix-la-Chapelle which was primarily focused on resolving issues in Europe. The issues of conflicting territorial claims between British and French colonies were turned over to a commission, but it reached no decision. Frontier areas were claimed by both sides, from Nova Scotia and Acadia in the north to the Ohio Country in the south. The disputes also extended into the Atlantic Ocean, where both powers wanted access to the rich fisheries of the Grand Banks off Newfoundland.
Governor-General of New France Marquis de la Jonquière died on March 17, 1752, and he was temporarily replaced by Charles le Moyne de Longueuil. His permanent replacement was to be the Marquis Duquesne, but he did not arrive in New France until 1752 to take over the post.[27] The continuing British activity in the Ohio territories prompted Longueuil to dispatch another expedition to the area under the command of Charles Michel de Langlade, an officer in the Troupes de la Marine. Langlade was given 300 men, including French-Canadians and warriors of the Ottawa tribe. His objective was to punish the Miami people of Pickawillany for not following Céloron's orders to cease trading with the British. On June 21, the French war party attacked the trading centre at Pickawillany, capturing three traders[21] and killing 14 Miami Indians, including Old Briton, who was reportedly ritually cannibalized by some Indians in the expedition party.
In the spring of 1753, Paul Marin de la Malgue was given command of a 2,000-man force of Troupes de la Marine and Indians. His orders were to protect the King's land in the Ohio Valley from the British. Marin followed the route that Céloron had mapped out four years earlier. Céloron, however, had limited the record of French claims to the burial of lead plates, whereas Marin constructed and garrisoned forts. He first constructed Fort Presque Isle on Lake Erie's south shore near Erie, Pennsylvania, and he had a road built to the headwaters of LeBoeuf Creek. He then constructed a second fort, Fort Le Boeuf, in Waterford, Pennsylvania, designed to guard the headwaters of LeBoeuf Creek. As he moved south, he drove off or captured British traders, alarming both the British and the Iroquois. Tanaghrisson was a chief of the Mingo Indians, who were remnants of Iroquois and other tribes who had been driven west by colonial expansion. He intensely disliked the French, whom he accused of killing and eating his father. He traveled to Fort Le Boeuf and threatened the French with military action, which Marin contemptuously dismissed.[28]
The Iroquois sent runners to the manor of William Johnson in upstate New York, who was the British Superintendent for Indian Affairs in the New York region and beyond. Johnson was known to the Iroquois as Warraghiggey, meaning "he who does great things." He spoke their languages and had become a respected honorary member of the Iroquois Confederacy in the area, and he was made a colonel of the Iroquois in 1746; he was later commissioned as a colonel of the Western New York Militia.
The Indian representatives and Johnson met with Governor Clinton and officials from some of the other American colonies at Albany, New York. Mohawk Chief Hendrick was the speaker of their tribal council, and he insisted that the British abide by their obligations[which?] and block French expansion. Clinton did not respond to his satisfaction, and Hendrick said that the "Covenant Chain" was broken, a long-standing friendly relationship between the Iroquois Confederacy and the British Crown.
Governor Robert Dinwiddie of Virginia was an investor in the Ohio Company, which stood to lose money if the French held their claim.[29] In October 1753, he ordered 21-year-old Major George Washington (whose brother was another Ohio Company investor) of the Virginia Regiment to warn the French to leave Virginia territory.[30] Washington left with a small party, picking up Jacob Van Braam as an interpreter, Christopher Gist (a company surveyor working in the area), and a few Mingos led by Tanaghrisson. On December 12, Washington and his men reached Fort Le Boeuf.[31][32]
Jacques Legardeur de Saint-Pierre succeeded Marin as commander of the French forces after Marin died on October 29, and he invited Washington to dine with him. Over dinner, Washington presented Saint-Pierre with the letter from Dinwiddie demanding an immediate French withdrawal from the Ohio Country. Saint-Pierre said, "As to the Summons you send me to retire, I do not think myself obliged to obey it."[33] He told Washington that France's claim to the region was superior to that of the British, since René-Robert Cavelier, Sieur de La Salle had explored the Ohio Country nearly a century earlier.[34]
Washington's party left Fort Le Boeuf early on December 16 and arrived in Williamsburg on January 16, 1754. He stated in his report, "The French had swept south",[35] detailing the steps which they had taken to fortify the area, and their intention to fortify the confluence of the Allegheny and Monongahela rivers.[36]
Even before Washington returned, Dinwiddie had sent a company of 40 men under William Trent to that point, where they began construction of a small stockaded fort in the early months of 1754.[37] Governor Duquesne sent additional French forces under Claude-Pierre Pecaudy de Contrecœur to relieve Saint-Pierre during the same period, and Contrecœur led 500 men south from Fort Venango on April 5, 1754.[38] These forces arrived at the fort on April 16, but Contrecœur generously allowed Trent's small company to withdraw. He purchased their construction tools to continue building what became Fort Duquesne.[39]
Dinwiddie had ordered Washington to lead a larger force to assist Trent in his work, and Washington learned of Trent's retreat while he was en route.[40] Mingo sachem Tanaghrisson had promised support to the British, so Washington continued toward Fort Duquesne and met with him. He then learned of a French scouting party in the area, so he combined Tanaghrisson's force with his own and surprised the Canadians on May 28 in what became known as the Battle of Jumonville Glen. They killed many of the Canadians, including their commanding officer Joseph Coulon de Jumonville, whose head was reportedly split open by Tanaghrisson with a tomahawk. Historian Fred Anderson suggests that Tanaghrisson was acting to gain the support of the British and to regain authority over his own people. They had been inclined to support the French, with whom they had long trading relationships. One of Tanaghrisson's men told Contrecœur that Jumonville had been killed by British musket fire.[41]
Historians generally consider the Battle of Jumonville Glen as the opening battle of the French and Indian War in North America, and the start of hostilities in the Ohio valley.
Following the battle, Washington pulled back several miles and established Fort Necessity, which the Canadians attacked under the command of Jumonville's brother at the Battle of Fort Necessity on July 3. Washington surrendered and negotiated a withdrawal under arms. One of his men reported that the Canadian force was accompanied by Shawnee, Delaware, and Mingo warriors, precisely those whom Tanaghrisson was seeking to influence.[42]
News of the two battles reached England in August. After several months of negotiations, the government of the Duke of Newcastle decided to send an army expedition the following year to dislodge the French.[43] They chose Major General Edward Braddock to lead the expedition.[44] Word of the British military plans leaked to France well before Braddock's departure for North America. In response, King Louis XV dispatched six regiments to New France under the command of Baron Dieskau in 1755.[45] The British sent out their fleet in February 1755, intending to blockade French ports, but the French fleet had already sailed. Admiral Edward Hawke detached a fast squadron to North America in an attempt to intercept them.
In a second British action, Admiral Edward Boscawen fired on the French ship Alcide on June 8, 1755, capturing her and two troop ships.[46] The British harassed French shipping throughout 1755, seizing ships and capturing seamen. These actions contributed to the eventual formal declarations of war in spring 1756.[47]
An early important political response to the opening of hostilities was the convening of the Albany Congress in June and July 1754. The goal of the congress was to formalize a unified front in trade and negotiations with various Indians, since the allegiance of the various tribes and nations was seen to be pivotal in the war that was unfolding. The plan that the delegates agreed to was neither ratified by the colonial legislatures nor approved by the Crown. Nevertheless, the format of the congress and many specifics of the plan became the prototype for confederation during the War of Independence.
The British formed an aggressive plan of operations for 1755. General Braddock was to lead the expedition to Fort Duquesne,[48] while Massachusetts governor William Shirley was given the task of fortifying Fort Oswego and attacking Fort Niagara, and Sir William Johnson was to capture Fort St. Frédéric at present-day Crown Point, New York.[49] Lieutenant Colonel Robert Monckton was to capture Fort Beauséjour to the east, on the frontier between Nova Scotia and Acadia.[50]
Braddock led about 1,500 army troops and provincial militia on an expedition in June 1755 to take Fort Duquesne, with George Washington as one of his aides. The expedition was a disaster. It was attacked by French soldiers and Indian warriors firing from the trees and from behind logs, and Braddock called for a retreat. He was mortally wounded, and approximately 1,000 British soldiers were killed or injured.[48] The remaining 500 British troops retreated to Virginia, led by George Washington. Two future opponents in the American Revolutionary War played key roles in organizing the retreat: Washington and Thomas Gage.
The French acquired a copy of the British war plans, including the activities of Shirley and Johnson. Shirley's efforts to fortify Oswego were bogged down in logistical difficulties, exacerbated by his inexperience in managing large expeditions. Shirley also learned that the French were massing for an attack on Fort Oswego in his absence while he planned to attack Fort Niagara. In response, he left garrisons at Oswego, Fort Bull, and Fort Williams, the last two located on the Oneida Carry between the Mohawk River and Wood Creek at present-day Rome, New York. Supplies were cached at Fort Bull for use in the projected attack on Niagara.
Johnson's expedition was better organized than Shirley's, which was noticed by New France's governor the Marquis de Vaudreuil. He had primarily been concerned about the extended supply line to the forts on the Ohio, and he had sent Baron Dieskau to lead the defenses at Frontenac against Shirley's expected attack. Vaudreuil saw Johnson as the larger threat and sent Dieskau to Fort St. Frédéric to meet that threat. Dieskau planned to attack the British encampment at Fort Edward at the upper end of navigation on the Hudson River, but Johnson had strongly fortified it, and Dieskau's Indian support was reluctant to attack. The two forces finally met in the bloody Battle of Lake George between Fort Edward and Fort William Henry. The battle ended inconclusively, with both sides withdrawing from the field. Johnson's advance stopped at Fort William Henry, and the French withdrew to Ticonderoga Point, where they began the construction of Fort Carillon (later renamed Fort Ticonderoga after British capture in 1759).
Colonel Monckton captured Fort Beauséjour in June 1755 in the sole British success that year, cutting off the French fortress at Louisbourg from land-based reinforcements. To cut vital supplies to Louisbourg, Nova Scotia's Governor Charles Lawrence ordered the deportation of the French-speaking Acadian population from the area. Monckton's forces, including companies of Rogers' Rangers, forcibly removed thousands of Acadians, chasing down many who resisted and sometimes committing atrocities. More than any other factor, cutting off supplies to Louisbourg led to its demise.[51] The Acadian resistance was sometimes quite stiff, in concert with Indian allies including the Mi'kmaq, with ongoing frontier raids against Dartmouth and Lunenburg, among others. Other than the campaigns to expel the Acadians ranging around the Bay of Fundy, on the Petitcodiac and St. John rivers, and on Île Saint-Jean, the only clashes of any size were at Petitcodiac in 1755 and at Bloody Creek near Annapolis Royal in 1757.
Following the death of Braddock, William Shirley assumed command of British forces in North America, and he laid out his plans for 1756 at a meeting in Albany in December 1755. He proposed renewing the efforts to capture Niagara, Crown Point, and Duquesne, with attacks on Fort Frontenac on the north shore of Lake Ontario and an expedition through the wilderness of the Maine district and down the Chaudière River to attack the city of Quebec. His plan, however, got bogged down by disagreements and disputes with others, including William Johnson and New York's Governor Sir Charles Hardy, and consequently gained little support.
Newcastle replaced him in January 1756 with Lord Loudoun, with Major General James Abercrombie as his second in command. Neither of these men had as much campaign experience as the trio of officers whom France sent to North America.[47] French regular army reinforcements arrived in New France in May 1756, led by Major General Louis-Joseph de Montcalm and seconded by the Chevalier de Lévis and Colonel François-Charles de Bourlamaque, all experienced veterans from the War of the Austrian Succession. On May 18, 1756, Great Britain formally declared war on France, which expanded the war into Europe and came to be known as the Seven Years' War.
Governor Vaudreuil had ambitions to become the French commander in chief, in addition to his role as governor, and he acted during the winter of 1756 before those reinforcements arrived. Scouts had reported the weakness of the British supply chain, so he ordered an attack against the forts which Shirley had erected at the Oneida Carry. In the Battle of Fort Bull, French forces destroyed the fort and large quantities of supplies, including 45,000 pounds of gunpowder. They set back any British hopes for campaigns on Lake Ontario and endangered the Oswego garrison, already short on supplies. French forces in the Ohio valley also continued to intrigue with Indians throughout the area, encouraging them to raid frontier settlements. This led to ongoing alarms along the western frontiers, with streams of refugees returning east to get away from the action.
The new British command was not in place until July. Abercrombie arrived in Albany but refused to take any significant actions until Loudoun approved them, and Montcalm took bold action against his inertia. He built on Vaudreuil's work harassing the Oswego garrison and executed a strategic feint by moving his headquarters to Ticonderoga, as if to presage another attack along Lake George. With Abercrombie pinned down at Albany, Montcalm slipped away and led the successful attack on Oswego in August. In the aftermath, Montcalm and the Indians under his command disagreed about the disposition of prisoners' personal effects. The Europeans did not consider them prizes and prevented the Indians from stripping the prisoners of their valuables, which angered the Indians.
Loudoun was a capable administrator but a cautious field commander, and he planned one major operation for 1757: an attack on New France's capital of Quebec. He left a sizable force at Fort William Henry to distract Montcalm and began organizing for the expedition to Quebec. He was then ordered to attack Louisbourg first by William Pitt, the Secretary of State responsible for the colonies. The expedition was beset by delays of all kinds but was finally ready to sail from Halifax, Nova Scotia in early August. In the meantime, French ships had escaped the British blockade of the French coast, and a fleet awaited Loudoun at Louisbourg which outnumbered the British fleet. Faced with this strength, Loudoun returned to New York amid news that a massacre had occurred at Fort William Henry.
French irregular forces (Canadian scouts and Indians) harassed Fort William Henry throughout the first half of 1757. In January, they ambushed British rangers near Ticonderoga. In February, they launched a raid against the position across the frozen Lake George, destroying storehouses and buildings outside the main fortification. In early August, Montcalm and 7,000 troops besieged the fort, which capitulated with an agreement to withdraw under parole. When the withdrawal began, some of Montcalm's Indian allies attacked the British column because they were angry about the lost opportunity for loot, killing and capturing several hundred men, women, children, and slaves. The aftermath of the siege may have contributed to the transmission of smallpox into remote Indian populations, as some Indians were reported to have traveled from beyond the Mississippi to participate in the campaign and returned afterward. Modern writer William Nester believes that the Indians might have been exposed to European carriers, although no proof exists.[52]
Vaudreuil and Montcalm were minimally resupplied in 1758, as the British blockade of the French coastline limited French shipping. The situation in New France was further exacerbated by a poor harvest in 1757, a difficult winter, and the allegedly corrupt machinations of François Bigot, the intendant of the territory. His schemes to supply the colony inflated prices and were believed by Montcalm to line his pockets and those of his associates. A massive outbreak of smallpox among western Indian tribes led many of them to stay away from trading in 1758. The disease probably spread through the crowded conditions at William Henry after the battle;[53] yet the Indians blamed the French for bringing "bad medicine" as well as denying them prizes at Fort William Henry.
Montcalm focused his meager resources on the defense of the St. Lawrence, with primary defenses at Carillon, Quebec, and Louisbourg, while Vaudreuil argued unsuccessfully for a continuation of the raiding tactics that had worked quite effectively in previous years.[54] The British failures in North America, combined with other failures in the European theater, led to Newcastle's fall from power along with the Duke of Cumberland, his principal military advisor.
Newcastle and Pitt joined in an uneasy coalition in which Pitt dominated the military planning. He embarked on a plan for the 1758 campaign that was largely developed by Loudoun, who had been replaced by Abercrombie as commander in chief after the failures of 1757. Pitt's plan called for three major offensive actions involving large numbers of regular troops supported by the provincial militias, aimed at capturing the heartlands of New France. Two of the expeditions were successful, with Fort Duquesne and Louisbourg falling to sizable British forces.
The Forbes Expedition was a British campaign in September–October 1758, with 6,000 troops led by General John Forbes sent to drive out the French from the contested Ohio Country. The French withdrew from Fort Duquesne and left the British in control of the Ohio River Valley.[55] The great French fortress at Louisbourg in Nova Scotia was captured after a siege.[56]
The third invasion was stopped with the improbable French victory in the Battle of Carillon, in which 3,600 Frenchmen defeated Abercrombie's force of 18,000 regulars, militia, and Indian allies outside the fort which the French called Carillon and the British called Ticonderoga. Abercrombie saved something from the disaster when he sent John Bradstreet on an expedition that successfully destroyed Fort Frontenac, including caches of supplies destined for New France's western forts and furs destined for Europe. Abercrombie was recalled and replaced by Jeffery Amherst, victor at Louisbourg.
The French had generally poor results in 1758 in most theaters of the war. The new foreign minister was the duc de Choiseul, and he decided to focus on an invasion of Britain to draw British resources away from North America and the European mainland. The invasion failed both militarily and politically, as Pitt again planned significant campaigns against New France and sent funds to Britain's mainland ally of Prussia, while the French Navy failed in the 1759 naval battles at Lagos and Quiberon Bay. In one piece of good fortune, some French supply ships did manage to depart France and elude the British blockade of the French coast.
British victories continued in all theaters in the Annus Mirabilis of 1759: the British captured Ticonderoga, James Wolfe defeated Montcalm at Quebec in a battle that claimed the lives of both commanders, and a British victory at Fort Niagara cut off the French frontier forts to the west and south. The victory was made complete in 1760; the British did suffer a defeat outside Quebec City in the Battle of Sainte-Foy, but they prevented the arrival of French relief ships in the naval Battle of the Restigouche while armies marched on Montreal from three sides.
Governor Vaudreuil in Montreal negotiated a capitulation with General Amherst in September 1760. Amherst granted his requests that any French residents who chose to remain in the colony would be given freedom to continue worshiping in their Roman Catholic tradition, to own property, and to remain undisturbed in their homes. The British provided medical treatment for the sick and wounded French soldiers, and French regular troops were returned to France aboard British ships with an agreement that they were not to serve again in the present war.[57]
Most of the fighting ended in continental North America in 1760, although it continued in Europe between France and Britain. The notable exception was the French seizure of St. John's, Newfoundland. General Amherst heard of this surprise action and immediately dispatched troops under his nephew William Amherst, who regained control of Newfoundland after the Battle of Signal Hill in September 1762.[58] Many troops from North America were reassigned to participate in further British actions in the West Indies, including the capture of Spanish Havana when Spain belatedly entered the conflict on the side of France, and a British expedition against French Martinique in 1762 led by Major General Robert Monckton.[59]
General Amherst also oversaw the transition of French forts to British control in the western lands. The policies which he introduced in those lands disturbed large numbers of Indians and contributed to Pontiac's Rebellion in 1763.[60] This series of attacks on frontier forts and settlements required the continued deployment of British troops, and it was not resolved until 1766.[61]
The war in North America officially ended with the signing of the Treaty of Paris on 10 February 1763, and war in the European theater was settled by the Treaty of Hubertusburg on 15 February 1763. The British offered France the choice of surrendering either its continental North American possessions east of the Mississippi or the Caribbean islands of Guadeloupe and Martinique, which had been occupied by the British. France chose to cede the former but was able to negotiate the retention of Saint Pierre and Miquelon, two small islands in the Gulf of St. Lawrence, along with fishing rights in the area. The French viewed the economic value of the Caribbean islands' sugar cane to be greater and easier to defend than the furs from the continent. French philosopher Voltaire referred to Canada disparagingly as nothing more than a few acres of snow. The British, however, were happy to take New France, as defence of their North American colonies would no longer be an issue; they also had ample places from which to obtain sugar. Spain traded Florida to Britain in order to regain Cuba, and it also gained Louisiana from France, including New Orleans, in compensation for its losses. Great Britain and Spain also agreed that navigation on the Mississippi River was to be open to vessels of all nations.[62]
The war changed economic, political, governmental, and social relations among the three European powers, their colonies, and the people who inhabited those territories. France and Britain both suffered financially because of the war, with significant long-term consequences.
Britain gained control of French Canada and Acadia, colonies containing approximately 80,000 primarily French-speaking Roman Catholic residents. The deportation of Acadians beginning in 1755 made land available to immigrants from Europe and migrants from the colonies to the south. The British resettled many Acadians throughout its North American provinces, but many went to France, and some went to New Orleans, which they had expected to remain French. Some were sent to colonize places as diverse as French Guiana and the Falkland Islands, but these efforts were unsuccessful. Others migrated to places such as Saint-Domingue or fled to New Orleans after the Haitian Revolution. The Louisiana population contributed to the founding of the modern Cajun population. (The French word "Acadien" changed to "Cadien" then to "Cajun".)[63]
Following the treaty, King George III issued the Royal Proclamation of 1763 on October 7, 1763, which outlined the division and administration of the newly conquered territory, and it continues to govern relations to some extent between the government of modern Canada and the First Nations. Included in its provisions was the reservation of lands west of the Appalachian Mountains to its Indian population,[64] a demarcation that was only a temporary impediment to a rising tide of westward-bound settlers.[65] The proclamation also contained provisions that prevented civic participation by the Roman Catholic Canadians.[66] The Quebec Act addressed this and other issues in 1774, raising concerns in the largely Protestant Thirteen Colonies over the advance of "popery." The Act maintained French civil law, including the seigneurial system, a medieval code that would be removed from France within a generation by the French Revolution.
The Seven Years' War nearly doubled Great Britain's national debt. The Crown sought sources of revenue to pay it off and attempted to impose new taxes on its colonies. These attempts were met with increasingly stiff resistance, until troops were called in to enforce the Crown's authority. These acts ultimately led to the start of the American Revolutionary War.[67]
France attached comparatively little value to its North American possessions, apart from the highly profitable sugar-producing Antilles islands which it retained. Minister Choiseul considered that he had made a good deal at the Treaty of Paris, and Voltaire wrote that Louis XV had lost "a few acres of snow".[68] For France, however, the military defeat and the financial burden of the war weakened the monarchy and contributed to the advent of the French Revolution in 1789.[69]
For some of the Indian tribes, the elimination of French power in North America meant the disappearance of a strong ally, although other tribes were not so affected.[69] The Ohio Country was now more available to colonial settlement, due to the construction of military roads by Braddock and Forbes.[70] The Spanish takeover of the Louisiana territory was not completed until 1769, and it had modest repercussions. The British takeover of Spanish Florida resulted in the westward migration of Indian tribes who did not want to do business with them. This migration also caused a rise in tensions between the Choctaw and the Creek, historic enemies who were now competing for land.[71] The change of control in Florida also prompted most of its Spanish Catholic population to leave. Most went to Cuba, including the entire governmental records from St. Augustine, although some Christianized Yamasee were resettled to the coast of Mexico.[72]
France returned to North America in 1778 with the establishment of a Franco-American alliance against Great Britain in the American War of Independence. This time, France succeeded in prevailing over Great Britain in what historian Alfred A. Cave describes as "French revenge for Montcalm's death".[73]
Who is the current sheriff of maricopa county arizona?
Paul Penzone🚨The Maricopa County Sheriff's Office (MCSO) is a local law enforcement agency that serves Maricopa County, Arizona. It has its headquarters at 550 West Jackson Street, Phoenix.[3] It is the largest sheriff's office in Arizona, providing general-service and specialized law enforcement to the unincorporated areas of Maricopa County as well as to incorporated cities within the county that have contracted with the agency for law-enforcement services (known as "contract cities"). It also operates the county jail system. Paul Penzone is the current Sheriff of Maricopa County. The office's practices have been highly controversial, drawing national and international media coverage as well as a number of federal investigations and other controversies under former sheriff Joe Arpaio.
The MCSO does not possess a legal identity separate from Maricopa County. Deputy Sheriffs of the Maricopa County Sheriff's Office are delegated their law enforcement authority by the Sheriff of Maricopa County.[4]
Maricopa County is the fourth largest county in the United States, with a total area of 9,224 square miles (23,900 km2). The county is currently divided into six geographical areas, referred to as Districts: District 1, District 2, District 3, District 4, District 6, and District 7. Districts are generally staffed by a District Commander (Captain), a Deputy Commander (Lieutenant), uniformed sergeants and patrol deputies, detectives, and administrative staff. Districts overlap city agencies, as the Sheriff's Office has concurrent jurisdiction in these areas.
District 1 – covers an area of approximately 1,053 square miles (2,730 km2) in the southeast quadrant of the county. District One encompasses the cities of Chandler, Gilbert, Mesa, and Tempe, along with the Town of Guadalupe and CDP of Sun Lakes. District One also includes portions of the Town of Queen Creek, and the cities of Apache Junction, Scottsdale, and Phoenix, including the Ahwatukee Foothills. While District One is not the largest district in size, it is historically the busiest, averaging approximately 40% more calls for service than any of the other districts.
District 2 – covers an area of approximately 5,200 square miles (13,000 km2) in the southwest quadrant of the county. District Two provides service to the rural areas of Buckeye, Laveen, Mobile, Rainbow Valley, and Tonopah, as well as to the contract cities of Gila Bend and Litchfield Park. District Two also includes portions of Avondale, Glendale, Goodyear, and Phoenix.
District 3 – covers an area of approximately 1,600 square miles (4,100 km2), bordered by Northern Avenue on the south and I-17 on the east, extending to the northern and western borders of the county. District Three includes the areas of Sun City and Sun City West, the communities of Wittmann, Waddell, Circle City, Morristown, Whispering Ranch, Aguila, Gladden, and the unincorporated neighborhoods surrounding Peoria, Surprise, and Wickenburg.
District 4 – covers the unincorporated areas of Anthem, Desert Foothills, New River, Cave Creek, Carefree and Tonto Hills. District Four also provides law enforcement to the contract Towns of Cave Creek and Carefree.
District 6 – covers, on a contract basis, the Town of Queen Creek. The Town of Queen Creek was previously a part of District One, but was officially designated as its own district in 2008, with its own complement of deputies and command staff. There continue to be, however, several unincorporated neighborhoods in and around Queen Creek that are still serviced by District One. Portions of the corporate town limits of Queen Creek fall within the jurisdictional boundaries of Pinal County, but are provided law enforcement services by Maricopa County.
District 7 – covers the unincorporated areas of Fountain Hills, Tonto Verde and Rio Verde. It also provides law enforcement services on a contract basis to the Town of Fountain Hills.
The following insignia are prescribed by the Maricopa County Sheriff's Office Uniform Specification policy GC-20. Detectives and deputies hold the same rank; within MCSO, detective is not a promotion but a different assignment.[5]
Lake Patrol
In addition to patrolling the unincorporated areas, the Maricopa County Sheriff's Office is responsible for patrolling the lakes and waterways in the recreational areas within the county. The Lake Patrol Division is responsible for law enforcement services in the recreational areas of Tonto National Forest and Lake Pleasant Regional Park. This area includes Saguaro, Canyon, Apache, Bartlett and Horseshoe Lakes as well as the Lower Salt and Verde River recreational areas, and the Four Peaks, Superstition, Mazatzal, Camp Creek and Seven Springs recreational and wilderness areas. The total area of responsibility is over 1,000 square miles (2,600 km2), which are visited by an estimated 1.5 million people every year.
Lake Patrol Division deputy sheriffs operate four-wheel-drive vehicles, all-terrain vehicles, patrol boats, jet skis and an air boat. All are certified Emergency Medical Technicians and several are Paramedics. The division also has a detective section which investigates crimes, deaths and boating accidents on the lakes and rivers.
Trails Division
The Trails Division has the responsibility for law enforcement services in the recreational and wilderness areas of the Maricopa County Parks. The total area of responsibility consists of over 120,000 acres (490 km2) that are visited by approximately 1.8 million persons each year. The Maricopa County parks system is the largest regional parks system in the nation, and includes the areas of Buckeye Hills Recreation Area, Cave Creek Regional Park, Estrella Mountain Regional Park, Lake Pleasant Regional Park, McDowell Mountain Regional Park, San Tan Mountain Regional Park, Spur Cross Ranch Conservation Area, The Desert Outdoor Center at Lake Pleasant, Usery Mountain Regional Park, and White Tank Mountain Regional Park.
The Trails Division deputies operate four-wheel-drive vehicles and all-terrain vehicles; the division also uses bicycle, mounted, and foot patrols.
Aviation Division
The Aviation Division provides airborne law enforcement support to uniformed patrol, Lake Patrol, Search and Rescue operations, narcotics enforcement, extraditions and SWAT operations.
The Aviation Division is staffed with a Commander, a Helicopter Flight Operations Supervisor, a Fixed Wing Supervisor, a Director of Maintenance, fourteen sworn Deputies, five civilians, two instructor pilots, and an Administrative Coordinator. The Division operates 24 hours a day, 7 days a week, and employs four helicopters and two fixed-wing aircraft.
The flagship of the division is a Bell 407 helicopter with a call sign of "FOX 1." This helicopter is equipped with FLIR (Forward Looking Infra Red), stabilized binoculars, Ultichart moving map and an SX-5 night sun spotlight. This helicopter performs direct patrol, search and rescue operations, narcotics surveillance and photo missions.
In addition "FOX-1", the Aviation Division operates "FOX 4," a Bell Military OH-58 helicopter acquired in 1996 from the Defense Reutilization Program. It was completely rebuilt and placed into service in July 1998, and is also used for direct patrol, search and rescue operations, narcotics surveillance and photo missions.
"FOX 5" is a Schweitzer TH-55 helicopter acquired from the Defense Reutilization Program. "FOX-5" has been fully restored and is the primary training aircraft for MCSO Aviation personnel. This is the same type of aircraft used by the military to train pilots since the 1960s.
For fixed wing, the division utilizes a single engine Cessna 206 and a twin engine PA-31 Piper Navajo. Both aircraft are housed at Glendale Airport and are used primarily for extraditing fugitives from other states. Fixed wing aircraft are also used for narcotics and smuggling surveillance missions.
K9 Unit
The Sheriff's Canine Unit includes 25 canines with various specialties, including narcotics, explosive ordnance, cadaver, and patrol. The unit's staff consists of one Sergeant, ten Deputies, five Detectives and four Detention Officers.
All canines trained in narcotic detection are capable of finding and aggressively alerting on cocaine, marijuana, methamphetamines, heroin, and their derivatives. Collectively, they have assisted in the seizure of over $5.3 million in narcotic-tainted monies. All canines trained in explosive ordnance detection are trained to detect and passively alert on 13 different odors. The tobacco canine is trained to find and aggressively alert on all forms of tobacco.
Patrol dogs are trained in building searches, area searches, officer protection, crowd control, and trailing, and provide a strong psychological deterrent to certain types of criminal misconduct. The unit's cadaver canine is trained to find and passively alert on decaying human tissues, bones, and fluids. Its bloodhounds are used to track down suspects and locate missing or lost individuals.
Canine team members typically work patrol operations during peak activity hours, usually from about 6 PM to 4 AM. They also augment SWAT operations; provide contractual narcotic-detection services at several local schools; and provide narcotic and explosive ordnance detection not only for the Sheriff's Office but also for other local, state, and federal agencies. They are on call 24 hours a day, 7 days a week, and conduct over 100 public relations demonstrations annually.
The use of police canines provides law enforcement with a non-lethal means of apprehending dangerous criminal offenders; detecting intruders and alerting handlers to their presence; pursuing, attacking, and holding criminal offenders who resist apprehension; searching and clearing buildings and large open areas for criminals; tracking lost children or other persons; detecting the presence of certain narcotics, explosives, and tobacco products; locating deceased subjects, crime scenes, and minute physical evidence; and deterring certain types of criminal misconduct.
There is a permanent, organized Sheriff's Posse that provides civilian volunteer support to the sworn deputies of the Sheriff's Office. Dozens of individual posse units can be called upon for various needs, such as a Jeep posse, equine mounted units in various cities, a ham radio operator posse, a diver's posse, and an air posse of licensed pilots.
Since the establishment of the Maricopa County Sheriff's Office, 17 officers have died in the line of duty.[6]
The Maricopa County Sheriff's Office has been involved in many controversial acts, lawsuits, and other operations that have been called into question, from alleged racial profiling to jail conditions. Former Maricopa County Sheriff Joe Arpaio has also been criticized over a number of incidents and policies.
The United States Department of Justice Civil Rights Division is investigating the Maricopa County Sheriff's Office in relation to alleged racism and abuse of power, as well as refusing to cooperate with a federal Justice Department investigation.[7]
Maricopa County Sheriff's Office is also featured on TLC's television program Police Women of Maricopa County (2010).[8]
Who was the lead singer of the damned?
Dave Vanian (born David Lett)🚨
Dave Vanian (born David Lett, 12 October 1956) is a rock musician and the lead singer of the punk rock band The Damned. Formed in 1976 in London, The Damned were the first British punk band to release a single and an album, to have a record hit the UK charts, and to tour the United States. The band's line-up has been fluid since its founding, and Vanian has been the only ever-present member.
Born in Hemel Hempstead, Vanian changed his name from Lett to Vanian early in life, after a stint as a gravedigger – "Vanian" being a play on "Transylvanian". He remains one of the early influencers of gothic fashion, wearing dark and otherworldly clothing both on stage and off. He is known to be a fan of Renaissance art, film noir and horror movies, all of which manifest in his stage appearance.[citation needed] In November 1976, the British music magazine NME stated that Vanian "resembles a runaway from the Addams Family".[1]
In 1978, he guested on the song "Don't Panic England" by the band Doctors of Madness. In 2004, he and Captain Sensible turned on the Christmas lights in Cambridge, causing some controversy.[2]
Vanian sang "Looking at You" with the MC5 for their 40th anniversary; the recording was released as part of Revolution: A Celebration of the MC5. In 2008, with the band the Throb, he contributed "Let's Get Lost (Sailor Jerry's Story)" to the compilation The Original Sailor Jerry Rum – Music To ........... To. Outside of the Damned he has led the rockabilly band David Vanian & the Phantom Chords, hosted Dave Vanian's Dark Screen on the UK-based television channel Rockworld TV, and composed the soundtrack for the 2009 film The Perfect Sleep.[3]
Vanian has kept his personal life out of the limelight, even opting out of any input towards The Damned biography The Light at the End of the Tunnel by Carol Clerk. Vanian joined The Damned in 1976. He married his first wife, Laurie, in 1977, but they separated in the mid-1990s.
He married Patricia Morrison in Las Vegas in 1996, after The Damned had performed an Australian tour. The couple have one child and live in Islington, London.[4]
What type of gas is produced during fermentation?
hydrogen gas (H2)🚨Fermentation is a metabolic process that consumes sugar in the absence of oxygen. The products are organic acids, gases, or alcohol. It occurs in yeast and bacteria, and also in oxygen-starved muscle cells, as in the case of lactic acid fermentation. The science of fermentation is known as zymology.
In microorganisms, fermentation is the primary means of producing ATP by the degradation of organic nutrients anaerobically.[1] Humans have used fermentation to produce foodstuffs and beverages since the Neolithic age. For example, fermentation is used for preservation in a process that produces lactic acid as found in such sour foods as pickled cucumbers, kimchi and yogurt (see fermentation in food processing), as well as for producing alcoholic beverages such as wine (see fermentation in winemaking) and beer. Fermentation occurs within the gastrointestinal tracts of all animals, including humans.[2]
Definitions of fermentation range from informal, general usages to more scientific ones.[3]
Along with photosynthesis and aerobic respiration, fermentation is a way of extracting energy from molecules, but it is the only one common to all bacteria and eukaryotes. It is therefore considered the oldest metabolic pathway, suitable for an environment that did not yet have oxygen.[4]:389 Yeast, a form of fungus, occurs in almost any environment capable of supporting microbes, from the skins of fruits to the guts of insects and mammals and the deep ocean, and they harvest sugar-rich materials to produce ethanol and carbon dioxide.[5][6]
The basic mechanism for fermentation remains present in all cells of higher organisms. Mammalian muscle carries out the fermentation that occurs during periods of intense exercise where oxygen supply becomes limited, resulting in the creation of lactic acid.[7]:63 In invertebrates, fermentation also produces succinate and alanine.[8]:141
Fermentative bacteria play an essential role in the production of methane in habitats ranging from the rumens of cattle to sewage digesters and freshwater sediments. They produce hydrogen, carbon dioxide, formate, acetate, and other carboxylic acids; consortia of microbes then convert the carbon dioxide and acetate to methane. Acetogenic bacteria oxidize the acids, obtaining more acetate and either hydrogen or formate. Finally, methanogens (which are in the domain Archaea) convert acetate to methane.[9]
Fermentation reacts NADH with an endogenous, organic electron acceptor.[1] Usually this is pyruvate formed from sugar through glycolysis. The reaction produces NAD+ and an organic product, typical examples being ethanol, lactic acid, carbon dioxide, and hydrogen gas (H2). However, more exotic compounds can be produced by fermentation, such as butyric acid and acetone. Fermentation products contain chemical energy (they are not fully oxidized), but are considered waste products, since they cannot be metabolized further without the use of oxygen.
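As a concrete sketch of this redox step (an illustration consistent with the text, not a quotation from it), the lactic acid branch reduces pyruvate directly, regenerating the NAD+ that glycolysis needs:

\[ \mathrm{CH_3COCOO^-} + \mathrm{NADH} + \mathrm{H^+} \longrightarrow \mathrm{CH_3CH(OH)COO^-} + \mathrm{NAD^+} \]

Because the electron acceptor (pyruvate) is generated internally from the sugar itself, no external oxidant such as oxygen is required.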
Fermentation normally occurs in an anaerobic environment. In the presence of O2, NADH and pyruvate are used to generate ATP in respiration. This is called oxidative phosphorylation, and it generates much more ATP than glycolysis alone. For that reason, fermentation is rarely utilized when oxygen is available. However, even in the presence of abundant oxygen, some strains of yeast such as Saccharomyces cerevisiae prefer fermentation to aerobic respiration as long as there is an adequate supply of sugars (a phenomenon known as the Crabtree effect).[11] Some fermentation processes involve obligate anaerobes, which cannot tolerate oxygen.
Although yeast carries out the fermentation in the production of ethanol in beers, wines, and other alcoholic drinks, this is not the only possible agent: bacteria carry out the fermentation in the production of xanthan gum.
In ethanol fermentation, one glucose molecule is converted into two ethanol molecules and two carbon dioxide molecules.[12][13] It is used to make bread dough rise: the carbon dioxide forms bubbles, expanding the dough into a foam.[14][15] The ethanol is the intoxicating agent in alcoholic beverages such as wine, beer and liquor.[16] Fermentation of feedstocks including sugarcane, corn and sugar beets produces ethanol that is added to gasoline.[17] In some species of fish, including goldfish and carp, it provides energy when oxygen is scarce (along with lactic acid fermentation).[18]
Before fermentation, a glucose molecule breaks down into two pyruvate molecules. The energy from this exothermic reaction is used to bind inorganic phosphates to ADP, forming ATP, and to convert NAD+ to NADH. The pyruvates then break down into two acetaldehyde molecules and give off two carbon dioxide molecules as a waste product. The acetaldehyde is reduced into ethanol using the energy and hydrogen from NADH, and the NADH is oxidized into NAD+ so that the cycle may repeat. The reaction is catalysed by the enzymes pyruvate decarboxylase and alcohol dehydrogenase.[12]
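The same steps can be written out as balanced equations (a sketch assembled from the description above; NAD+/NADH are treated as simple carriers and ATP/ADP are omitted except where noted):

\[ \mathrm{C_6H_{12}O_6} + 2\,\mathrm{NAD^+} \longrightarrow 2\,\mathrm{CH_3COCOOH} + 2\,\mathrm{NADH} + 2\,\mathrm{H^+} \quad \text{(glycolysis, also yielding 2 ATP)} \]
\[ \mathrm{CH_3COCOOH} \longrightarrow \mathrm{CH_3CHO} + \mathrm{CO_2} \quad \text{(pyruvate decarboxylase)} \]
\[ \mathrm{CH_3CHO} + \mathrm{NADH} + \mathrm{H^+} \longrightarrow \mathrm{C_2H_5OH} + \mathrm{NAD^+} \quad \text{(alcohol dehydrogenase)} \]

Summing the three steps recovers the overall conversion stated above: one glucose molecule yields two ethanol and two carbon dioxide molecules.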
Homolactic fermentation (producing only lactic acid) is the simplest type of fermentation. The pyruvate from glycolysis[19] undergoes a simple redox reaction, forming lactic acid.[20][21] It is unique because it is one of the only respiration processes that does not produce a gas as a byproduct. Overall, one molecule of glucose (or any six-carbon sugar) is converted to two molecules of lactic acid:
C6H12O6 → 2 CH3CH(OH)COOH
It occurs in the muscles of animals when they need energy faster than the blood can supply oxygen. It also occurs in some kinds of bacteria (such as lactobacilli) and some fungi. Bacteria of this kind convert lactose into lactic acid in yogurt, giving it its sour taste. These lactic acid bacteria can carry out either homolactic fermentation, where the end-product is mostly lactic acid, or
Heterolactic fermentation, where some lactate is further metabolized, resulting in ethanol and carbon dioxide[20] (via the phosphoketolase pathway), acetate, or other metabolic products, e.g.:
C6H12O6 → CH3CH(OH)COOH + C2H5OH + CO2
If lactose is fermented (as in yogurts and cheeses), it is first converted into glucose and galactose (both six-carbon sugars with the same atomic formula):
C12H22O11 + H2O → 2 C6H12O6
Heterolactic fermentation is in a sense intermediate between lactic acid fermentation, and other types, e.g. alcoholic fermentation (see below). The reasons to go further and convert lactic acid into anything else are:
Hydrogen gas is produced in many types of fermentation (mixed acid fermentation, butyric acid fermentation, caproate fermentation, butanol fermentation, glyoxylate fermentation), as a way to regenerate NAD+ from NADH. Electrons are transferred to ferredoxin, which in turn is oxidized by hydrogenase, producing H2.[12] Hydrogen gas is a substrate for methanogens and sulfate reducers, which keep the concentration of hydrogen low and favor the production of such an energy-rich compound,[22] but hydrogen gas at a fairly high concentration can nevertheless be formed, as in flatus.
As an example of mixed acid fermentation, bacteria such as Clostridium pasteurianum ferment glucose producing butyrate, acetate, carbon dioxide and hydrogen gas.[23] The reaction leading to acetate is:
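C6H12O6 + 2 H2O → 2 CH3COOH + 2 CO2 + 4 H2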
Glucose could theoretically be converted into just CO2 and H2, but the overall reaction releases little energy.
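That theoretical complete conversion would be:

C6H12O6 + 6 H2O → 6 CO2 + 12 H2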
Most industrial fermentation uses batch or fed-batch procedures, although continuous fermentation can be more economical if various challenges, particularly the difficulty of maintaining sterility, can be met.[24]
In a batch process, all the ingredients are combined and the reactions proceed without any further input. Batch fermentation has been used for millennia to make bread and alcoholic beverages, and it is still a common method, especially when the process is not well understood.[25]:1 However, it can be expensive because the fermentor must be sterilized using high-pressure steam between batches.[24] Strictly speaking, small quantities of chemicals are often added during the run to control the pH or suppress foaming.[25]:25
Batch fermentation goes through a series of phases. There is a lag phase in which cells adjust to their environment; then a phase in which exponential growth occurs. Once many of the nutrients have been consumed, the growth slows and becomes non-exponential, but production of secondary metabolites (including commercially important antibiotics and enzymes) accelerates. This continues through a stationary phase after most of the nutrients have been consumed, and then the cells die.[25]:25
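As a rough illustration of these phases, the sketch below simulates a batch culture with a lag, exponential growth that saturates as nutrients run out, and a final death phase. It is illustrative only; the logistic model and all parameter values are assumptions, not taken from the source.

```python
# Illustrative sketch of batch-fermentation growth phases (all parameters assumed).
import math

def batch_biomass(t, lag=2.0, mu=0.8, x0=0.05, x_max=5.0,
                  death_rate=0.05, t_death=24.0):
    """Biomass (g/L) at time t (hours): logistic growth after a lag,
    followed by first-order decline once nutrients are exhausted."""
    if t < lag:
        return x0                      # lag phase: cells adapt, no net growth
    # exponential growth that levels off as the carrying capacity is reached
    x = x_max / (1 + (x_max / x0 - 1) * math.exp(-mu * (t - lag)))
    if t > t_death:                    # death phase after nutrient exhaustion
        x *= math.exp(-death_rate * (t - t_death))
    return x

for t in range(0, 37, 6):
    print(f"t = {t:2d} h  biomass = {batch_biomass(t):.2f} g/L")
```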
Fed-batch fermentation is a variation of batch fermentation where some of the ingredients are added during the fermentation. This allows greater control over the stages of the process. In particular, production of secondary metabolites can be increased by adding a limited quantity of nutrients during the non-exponential growth phase. Fed-batch operations are often sandwiched between batch operations.[25]:1[26]
The high cost of sterilizing the fermentor between batches can be avoided using various open fermentation approaches that are able to resist contamination. One is to use a naturally evolved mixed culture. This is particularly favored in wastewater treatment, since mixed populations can adapt to a wide variety of wastes. Thermophilic bacteria can produce lactic acid at temperatures of around 50 °C, sufficient to discourage microbial contamination; and ethanol has been produced at a temperature of 70 °C. This is just below its boiling point (78 °C), making it easy to extract. Halophilic bacteria can produce bioplastics in hypersaline conditions. Solid-state fermentation adds a small amount of water to a solid substrate; it is widely used in the food industry to produce flavors, enzymes and organic acids.[24]
In continuous fermentation, substrates are added and final products removed continuously.[24] There are three varieties: chemostats, which hold nutrient levels constant; turbidostats, which keep cell mass constant; and plug flow reactors in which the culture medium flows steadily through a tube while the cells are recycled from the outlet to the inlet.[26] If the process works well, there is a steady flow of feed and effluent and the costs of repeatedly setting up a batch are avoided. Also, it can prolong the exponential growth phase and avoid byproducts that inhibit the reactions by continuously removing them. However, it is difficult to maintain a steady state and avoid contamination, and the design tends to be complex.[24] Typically the fermentor must run for over 500 hours to be more economical than batch processors.[26]
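A minimal sketch of the chemostat case may help: at steady state, the dilution rate (feed flow divided by vessel volume) fixes the cells' specific growth rate. The Monod kinetics and all parameter values below are assumptions chosen for illustration, not figures from the source.

```python
# Illustrative chemostat steady state (Monod kinetics; parameters assumed).
mu_max = 0.9   # 1/h, assumed maximum specific growth rate
K_s    = 0.2   # g/L, assumed half-saturation constant
Y      = 0.5   # g biomass per g substrate, assumed yield
S_in   = 10.0  # g/L, assumed substrate concentration in the feed

def steady_state(D):
    """Residual substrate S and biomass X for dilution rate D (1/h)."""
    if D >= mu_max * S_in / (K_s + S_in):
        return None                    # washout: cells leave faster than they grow
    S = K_s * D / (mu_max - D)         # from setting mu(S) = D at steady state
    X = Y * (S_in - S)                 # biomass formed from the substrate consumed
    return S, X

for D in (0.2, 0.5, 0.8):
    print(f"D = {D} 1/h ->", steady_state(D))
```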
The use of fermentation, particularly for beverages, has existed since the Neolithic and has been documented dating from 7000–6600 BCE in Jiahu, China,[27] 5000 BCE in India (Ayurveda mentions many medicated wines), 6000 BCE in Georgia,[28] 3150 BCE in ancient Egypt,[29] 3000 BCE in Babylon,[30] 2000 BCE in pre-Hispanic Mexico,[30] and 1500 BCE in Sudan.[31] Fermented foods have a religious significance in Judaism and Christianity. The Baltic god Rugutis was worshiped as the agent of fermentation.[32][33]
In 1837, Charles Cagniard de la Tour, Theodor Schwann and Friedrich Traugott Kützing independently published papers concluding, as a result of microscopic investigations, that yeast is a living organism that reproduces by budding.[34][35]:6 Schwann boiled grape juice to kill the yeast and found that no fermentation would occur until new yeast was added. However, many chemists, including Antoine Lavoisier, continued to view fermentation as a simple chemical reaction and rejected the notion that living organisms could be involved. This was seen as a reversion to vitalism, and was lampooned in an anonymous publication by Justus von Liebig and Friedrich Wöhler.[4]:108–109
The turning point came when Louis Pasteur (1822–1895), during the 1850s and 1860s, repeated Schwann's experiments and, in a series of investigations, showed that fermentation is initiated by living organisms.[21][35]:6 In 1857, Pasteur showed that lactic acid fermentation is caused by living organisms.[36] In 1860, he demonstrated that bacteria cause souring in milk, a process formerly thought to be merely a chemical change, and his work in identifying the role of microorganisms in food spoilage led to the process of pasteurization.[37] In 1877, working to improve the French brewing industry, Pasteur published his famous paper on fermentation, "Études sur la Bière", which was translated into English in 1879 as "Studies on fermentation".[38] He defined fermentation (incorrectly) as "Life without air",[39] but correctly showed that specific types of microorganisms cause specific types of fermentations and specific end-products.
Although showing fermentation to be the result of the action of living microorganisms was a breakthrough, it did not explain the basic nature of the fermentation process, or prove that it is caused by the microorganisms that appear to be always present. Many scientists, including Pasteur, had unsuccessfully attempted to extract the fermentation enzyme from yeast.[39] Success came in 1897 when the German chemist Eduard Buchner ground up yeast, extracted a juice from it, and found to his amazement that this "dead" liquid would ferment a sugar solution, forming carbon dioxide and alcohol much like living yeasts.[40] Buchner's results are considered to mark the birth of biochemistry. The "unorganized ferments" behaved just like the organized ones. From that time on, the term enzyme came to be applied to all ferments. It was then understood that fermentation is caused by enzymes that are produced by microorganisms.[41] In 1907, Buchner won the Nobel Prize in Chemistry for his work.[42]
Advances in microbiology and fermentation technology have continued steadily up until the present. For example, in the 1930s, it was discovered that microorganisms could be mutated with physical and chemical treatments to be higher-yielding, faster-growing, tolerant of less oxygen, and able to use a more concentrated medium.[43] Strain selection and hybridization developed as well, affecting most modern food fermentations.
The word "ferment" is derived from the Latin verb fervere, which means to boil. It is thought to have been first used in the late 14th century in alchemy, but only in a broad sense. It was not used in the modern scientific sense until around 1600.
What crop was known as virginia's gold and silver?
tobacco🚨Jamestown was the first settlement of the Virginia Colony, founded in 1607, and served as capital of Virginia until 1699, when the seat of government was moved to Williamsburg. This article covers the history of the fort and town at Jamestown proper, as well as colony-wide trends resulting from and affecting the town during the time period in which it was capital.
The Virginia Company of London sent an expedition to establish a settlement in the Virginia Colony in December 1606. The expedition consisted of three ships, Susan Constant (sometimes known as Sarah Constant), Godspeed, and Discovery. The Discovery was the smallest ship; the largest ship, the Susan Constant, was captained by Christopher Newport. The ships left Blackwall, now part of London, with 105 men and boys and 39 crew-members.[1][2]
By April 6, 1607, Godspeed, Susan Constant and Discovery arrived at the Spanish colony of Puerto Rico, where they stopped for provisions before continuing their journey. In April 1607, the expedition reached the southern edge of the mouth of what is now known as the Chesapeake Bay. After an unusually long journey of more than four months, the 104 men and boys (one passenger of the original 105 died during the journey) arrived at their chosen settlement spot in Virginia.[3] There were no women on the first ships.[4]
Arriving at the entrance to the Chesapeake Bay in late April, they named the Virginia capes after the sons of their king, the southern Cape Henry, for Henry Frederick, Prince of Wales, and the northern Cape Charles, for his younger brother, Charles, Duke of York. On April 26, 1607, upon landing at Cape Henry, Chaplain Robert Hunt offered a prayer, and they set up a cross near the site of the current Cape Henry Memorial. This site came to be known as the "first landing." A party of the men explored the area and had a minor conflict with some Virginia Indians.[5]
After the expedition arrived in what is now Virginia, sealed orders from the Virginia Company were opened. These orders named Captain John Smith as a member of the governing Council. Smith had been arrested for mutiny during the voyage and was incarcerated aboard one of the ships. He had been scheduled to be hanged upon arrival, but was freed by Captain Newport after the opening of the orders. The same orders also directed the expedition to seek an inland site for their settlement, which would afford protection from enemy ships.
Obedient to their orders, the settlers and crewmembers re-boarded their three ships and proceeded into the Chesapeake Bay. They landed again at what is now called Old Point Comfort in the City of Hampton. In the following days, seeking a suitable location for their settlement, the ships ventured upstream along the James River. Both the James River and the settlement they sought to establish, Jamestown (originally called "James His Towne") were named in honor of King James I.
On May 14, 1607, the colonists chose Jamestown Island for their settlement largely because the Virginia Company advised them to select a location that could be easily defended from attacks by other European states that were also establishing New World colonies and were periodically at war with England, notably the Dutch Republic, France, and Spain.
The island fit the criteria as it had excellent visibility up and down the James River, and it was far enough inland to minimize the potential of contact and conflict with enemy ships. The water immediately adjacent to the land was deep enough to permit the colonists to anchor their ships, yet have an easy and quick departure if necessary. An additional benefit of the site was that the land was not occupied by the Virginia Indians, most of whom were affiliated with the Powhatan Confederacy. Largely cut off from the mainland, the shallow harbor afforded the earliest settlers docking of their ships. This was its greatest attraction, but it also created a number of challenging problems for the settlers.
The settlers came ashore and quickly set about constructing their initial fort. Despite the immediate area of Jamestown being uninhabited, the settlers were attacked less than two weeks after their arrival on May 14, by Paspahegh Indians who succeeded in killing one of the settlers and wounding eleven more. Within a month, James Fort covered an acre on Jamestown Island. The wooden palisaded walls formed a triangle around a storehouse, church, and a number of houses. The fort burned down the following year.[6]
It soon became apparent why the Virginia Indians did not occupy the site: Jamestown Island, then a peninsula, is a swampy area, and its isolation from the mainland meant that there was limited hunting available, as most game animals required larger foraging areas. The settlers quickly hunted and killed off all the large and smaller game animals that were found on the tiny peninsula. In addition, the low, marshy area was infested with airborne pests, including mosquitoes, which carried malaria, and the brackish water of the tidal James River was not a good source of water. Over 135 settlers died from malaria, and drinking the salinated and contaminated water caused many to suffer from saltwater poisoning, fevers, and dysentery.
King James I had outlined the members of the Council to govern the settlement in the sealed orders which left London with the colonists in 1606.[7]
Those named for the initial Council were:
The Council received additional members from the First and Second Supply missions brought by Captain Newport. These were:
Also notable among the first settlers was:
Many of the settlers who came over on the initial three ships were not well-equipped for the life they found in Jamestown. A number of the original settlers were upper-class gentlemen who were not accustomed to manual labor; the group included very few farmers or skilled laborers.[8] The climate, location, and makeup of the settlement resulted in many settlers dying of disease and starvation.
By June 15, the settlers finished building the triangular James Fort. A week later, Newport sailed back for London on Susan Constant with a load of pyrite ("fool's gold") and other supposedly precious minerals, leaving behind 104 colonists and Discovery. Newport returned twice from England with additional supplies in the following 18 months, leading what were termed the First and Second Supply missions.
The "First Supply" arrived on January 2, 1608. It contained insufficient provisions and more than 70 new colonists.[9] Despite original intentions to grow food and trade with the Virginia Indians, the barely surviving colonists became dependent upon supply missions.
On October 1, 1608, 70 new settlers arrived aboard the English "Mary and Margaret" with the Second Supply, following a journey of approximately three months. Included in the Second Supply were Thomas Graves, Thomas Forrest, Esq and "Mistress Forrest and Anne Burras her maide." Mistress Forrest and Anne Burras were the first two women known to have come to the Jamestown Colony. Remains unearthed at Jamestown in 1997 may be those of Mistress Forrest.[10]
Also included on the Second Supply were the first non-English settlers. The company recruited these as skilled craftsmen and industry specialists: soap-ash, glass, lumber milling (wainscot, clapboard, and deal planks, especially soft wood planks) and naval stores (pitch, turpentine, and tar).[11][12][13][14][15][16] Among these additional settlers were eight "Dutch-men" (probably meaning Germans or German-speakers, consisting of unnamed craftsmen and three who were probably the wood-mill-men Adam, Franz and Samuel),[17] and Polish and Slovak craftsmen,[11][12][13][14][15][16] who had been hired by the Virginia Company of London's leaders to help develop and manufacture profitable export products. There has been debate about the nationality of the specific craftsmen, and both the Germans and Poles claim the glassmaker for one of their own, but the evidence is insufficient.[18] Ethnicity is further complicated by the fact that the German minority in Royal Prussia lived under Polish control during this period. These workers staged the first recorded strike in Colonial America for the right to vote in the colony's 1619 election.
William Volday/Wilhelm Waldi, a Swiss German mineral prospector, was also among those who arrived in 1608. His mission was to seek a silver deposit that was believed to lie in the proximity of Jamestown.[19] Some of the settlers were artisans who built a glass furnace which became the first proto-factory in British North America. Additional craftsmen produced soap, pitch, and wood building supplies. These were among the first made-in-America products to be exported to Europe.[20] However, despite all these efforts, profits from exports were not sufficient to meet the expenses and expectations of the investors back in England, and no silver or gold had been discovered, as earlier hoped.
The investors of the Virginia Company of London expected to reap rewards from their speculative investments. With the Second Supply, they expressed their frustrations and made demands upon the leaders of Jamestown in written form.
It fell to the third president of the Council to deliver a reply. By this time, Wingfield and Ratcliffe had been replaced by John Smith. Ever bold, Smith delivered what must have been a wake-up call to the investors in London. In what has been termed "Smith's Rude Answer", he composed a letter, writing (in part):
Smith did begin his letter with something of an apology, saying "I humbly intreat your Pardons if I offend you with my rude Answer...",[21] although at the time, the word rude was acknowledged to mean unfinished or rural, in the same way modern English uses rustic.
There are strong indications that those in London comprehended and embraced Smith's message. Their Third Supply mission was by far the largest and best equipped. They even had a new purpose-built flagship constructed, Sea Venture, placed in the most experienced of hands, Christopher Newport. With a fleet of no fewer than eight ships, the Third Supply, led by Sea Venture, left Plymouth in June 1609.
Throughout the Virginia Company's existence, Sir Edwin Sandys was a leading force. He, of course, also hoped for profits, but his goals also included a permanent colony which would enlarge English territory, relieve the nation's overpopulation, and expand the market for English goods. He is closely identified with a faction of the company led by Henry Wriothesley, 3rd Earl of Southampton. Although profits proved elusive for their investors, the visions of Sir Edwin Sandys and the Earl of Southampton for the Colony were eventually accomplished.
In the months before becoming president of the colony for a year in September 1608, Captain John Smith did considerable exploration up the Chesapeake Bay and along the various rivers. He is credited by legend with naming Stingray Point (near present-day Deltaville in Middlesex County) for an incident there.
Smith was always seeking a supply of food for the colonists, and he successfully traded for food with the Nansemonds, who were located along the Nansemond River in the modern-day City of Suffolk, and several other groups. However, while leading one food-gathering expedition in December 1607 (before his term as colony president), this time up the Chickahominy River west of Jamestown, his men were set upon by Powhatan Indians. As his party was being slaughtered around him, Smith strapped his Indian guide in front of him as a shield and escaped with his life but was captured by Opechancanough, the Powhatan chief's half-brother. Smith gave him a compass which pleased the warrior and made him decide to let Smith live.
Smith was taken before Wahunsunacock, who was commonly referred to as Chief Powhatan, at the Powhatan Confederacy's seat of government at Werowocomoco on the York River. However, 17 years later, in 1624, Smith first related that when the chief decided to execute him, this course of action was stopped by the pleas of Chief Powhatan's young daughter, Pocahontas, who was originally named "Matoaka" but whose nickname meant "Playful Mischief". Many historians today find this account dubious, especially as it was omitted in all his previous versions. Smith returned to Jamestown just in time for the First Supply, in January 1608.
In September 1609, Smith was wounded in an accident: while he was on the river, the gunpowder he carried in a pouch on his belt exploded. In October, he was sent back to England for medical treatment.
While back in England, Smith wrote A True Relation and The Proceedings of the English Colony of Virginia about his experiences in Jamestown. These books, whose accuracy has been questioned by some historians due to some extent by Smith's boastful prose, were to generate public interest and new investment for the colony.
Although the life of Chief Powhatan's young daughter, Pocahontas, would be largely tied to the English after legend credits her with saving John Smith's life after his capture by Opechancanough, her contacts with Smith himself were minimal. However, records indicate that she became something of an emissary to the colonists at Jamestown Island. During their first winter, following an almost complete destruction of their fort by a fire in January 1608, Pocahontas brought food and clothing to the colonists. She later negotiated with Smith for the release of Virginia Indians who had been captured by the colonists during a raid to gain English weaponry.
During the next several years, the relationship between the Virginia Indians and the colonists became more strained, never more so than during the period of poor crops for both the natives and colonists which became known as the Starving Time in late 1609 and early 1610. Chief Powhatan relocated his principal capital from Werowocomoco, which was relatively close to Jamestown along the north shore of the York River, to a point more inland and secure along the upper reaches of the Chickahominy River.
In April 1613, Pocahontas and her husband, Kocoum, were residing at Passapatanzy, a village of the Patawomecks, a Powhatan Confederacy tribe which did some trading with the Powhatans. They lived in present-day Stafford County on the Potomac River near Fredericksburg, about 65 miles (105 km) from Werowocomoco. She was abducted by Englishmen whose leader was Samuel Argall, and transported about 90 miles (140 km) south to the English settlement at Henricus on the James River. There, Pocahontas converted to Christianity and took the name "Rebecca" under the tutelage of Reverend Alexander Whitaker, who had arrived in Jamestown in 1611. Her marriage to prominent planter John Rolfe, who had lost his first wife and child in the journey from England several years earlier, served to greatly improve relations between the Virginia Native Americans and the colonists for several years. However, when she and John Rolfe took their young son Thomas Rolfe on a public relations trip to England to help raise more investment money for the Virginia Company, she became ill and died just as they were leaving to return to Virginia. Her interment was at St George's Church in Gravesend.
What became known as the "Starving Time" in the Virginia Colony occurred during the winter of 1609–10. Only 60 of 500 English colonists survived.[22][23][24]
The colonists, the first group of whom had originally arrived at Jamestown on May 14, 1607, had never planned to grow all of their own food. Instead, their plans also depended upon trade with the local Virginia Indians to supply them with enough food between the arrival of periodic supply ships from England, upon which they also relied.
This period of extreme hardship for the colonists began in 1609 with a drought which caused their already limited farming activities to produce even fewer crops than usual. Then, there were problems with both of their other sources for food.
An unexpected delay occurred during the Virginia Company of London's Third Supply mission from England due to a major hurricane in the Atlantic Ocean. A large portion of the food and supplies had been aboard the new flagship of the Virginia Company, Sea Venture, which became shipwrecked at Bermuda and separated from the other ships, seven of which arrived at the colony with even more new colonists to feed, and few supplies, most of which had been aboard the larger flagship.
The impending hardship was further compounded by the loss of their most skillful leader in dealing with the Powhatan Confederacy in trading for food: Captain John Smith. He became injured in August 1609 in a gunpowder accident, and was forced to return to England for medical attention in October 1609. After Smith left, Chief Powhatan severely curtailed trading with the colonists for food. Neither the missing Sea Venture nor any other supply ship arrived as winter set upon the inhabitants of the young colony in late 1609.
When the survivors of the shipwreck of the Third Supply mission's flagship Sea Venture finally arrived at Jamestown the following May 23 in two makeshift ships they had constructed while stranded on Bermuda for nine months, they found fewer than 100 colonists still alive, many of whom were sick. Worse yet, the Bermuda survivors had brought few supplies and only a small amount of food with them, expecting to find a thriving colony at Jamestown.
Thus, even with the arrival of the two small ships from Bermuda under Captain Christopher Newport, they were faced with abandoning Jamestown and returning to England. On June 7, 1610, both groups of survivors (from Jamestown and Bermuda) boarded ships, and they all set sail down the James River toward the Chesapeake Bay and the Atlantic Ocean.
Shortly after they had abandoned Jamestown, they came upon a fleet of three supply ships arriving from England, commanded by a new governor, The 3rd Baron De La Warr. The two groups met on the James River on June 9, 1610 near Mulberry Island (adjacent to present-day Fort Eustis in Newport News).
With the new supply mission, Lord De La Warr, the new Governor, often known in modern times as "Lord Delaware", brought additional colonists, a doctor, food, and much-needed supplies. He also was of a strong determination that Jamestown and the colony were not to be abandoned. He turned the departing ships around and brought the entire group back to Jamestown. This was certainly not a popular decision at the time with at least some of the group, but Lord Delaware was to prove a new kind of leader for Virginia.
Included in those returning to Jamestown was a colonist whose wife and child had died during the shipwreck of the Sea Venture and the time at Bermuda. A businessman, he had with him some seeds for a new strain of tobacco and also some untried marketing ideas. That colonist was John Rolfe. Despite his misfortune to that point, history records that he would change the future of the colony as much as Lord Delaware's timely arrival had.
Sea Venture was the new flagship of the Virginia Company. Leaving England in 1609 and leading this Third Supply to Jamestown as "Vice Admiral", commanding Sea Venture, Christopher Newport was in charge of a nine-vessel fleet. Aboard the flagship Sea Venture were the Admiral of the Company, Sir George Somers, Lieutenant-General Sir Thomas Gates, William Strachey and other notable personages in the early history of English colonization in North America.
While at sea, the fleet encountered a strong storm, perhaps a hurricane, which lasted for three days. Sea Venture and one other ship were separated from the seven other vessels of the fleet. Sea Venture was deliberately driven onto the reefs of Bermuda to prevent her sinking. The 150 passengers and crew members were all landed safely but the ship was now permanently damaged.[25]
Sea Venture's longboat was fitted with a mast and sent to find Virginia but it and its crew were never seen again. The remaining survivors spent nine months on Bermuda building two smaller ships, Deliverance and Patience from Bermuda cedar and materials salvaged from Sea Venture.
Leaving two men at Bermuda to maintain England's claim to the archipelago, the remainder sailed to Jamestown, finally arriving on May 23, 1610. They found the Virginia Colony in ruins and practically abandoned. Of 500 settlers who had preceded them to Jamestown, they found fewer than 100 survivors, many of whom were sick or dying. It was decided to abandon the colony and on June 7, everyone was placed aboard the ships to return to England.
During the same period that Sea Venture suffered its misfortune and its survivors were struggling in Bermuda to continue on to Virginia, back in England, the publication of Captain John Smith's books of his adventures in Virginia sparked a resurgence in interest in the colony. This helped lead to the dispatch in early 1610 of additional colonists, more supplies, and a new governor, Thomas West, Baron De La Warr.
On June 9, 1610, Lord De La Warr and his party arrived on the James River shortly after Deliverance and Patience had abandoned Jamestown. Intercepting them about 10 miles (16 km) downstream from Jamestown near Mulberry Island, the new governor forced the remaining 90 settlers to return, thwarting their plans to abandon the colony. Deliverance and Patience turned back, and all the settlers were landed again at Jamestown.[26]
Then, Sir George Somers returned to Bermuda with Patience to obtain more food supplies, but he died on the island that summer. His nephew, Matthew Somers, Captain of Patience, took the ship back to Lyme Regis, England instead of Virginia (leaving a third man behind). The Third Charter of the Virginia Company was then extended far enough across the Atlantic to include Bermuda in 1612. (Although a separate company, the Somers Isles Company, would be spun off to administer Bermuda from 1615, the first two successful English colonies would retain close ties for many more generations, as was demonstrated when Virginian general George Washington called upon the people of Bermuda for aid during the American War of Independence). In 1613, Sir Thomas Dale founded the settlement of Bermuda Hundred on the James River, which, a year later, became the first incorporated town in Virginia.
By 1611, a majority of the colonists who had arrived at the Jamestown settlement had died and its economic value was negligible with no active exports to England and very little internal economic activity. Only financial incentives including a promise of more land to the west from King James I to investors financing the new colony kept the project afloat.
The Anglo-Powhatan Wars were three wars fought between English settlers of the Virginia Colony, and Indians of the Powhatan Confederacy in the early seventeenth century. The First War started in 1610, and ended in a peace settlement in 1614.
In 1610, John Rolfe, whose wife and a child had died in Bermuda during passage in the Third Supply to Virginia, was just one of the settlers who had arrived in Jamestown following the shipwreck of Sea Venture. However, his major contribution is that he was the first man to successfully raise export tobacco in the Colony (although the colonists had begun to make glass artifacts to export immediately after their arrival). The native tobacco raised in Virginia prior to that time, Nicotiana rustica, was not to the liking of the Europeans but Rolfe had brought some seed for Nicotiana tabacum with him from Bermuda.
Although most people "wouldn't touch" the crop, Rolfe was able to make his fortune farming it, successfully exporting beginning in 1612. Soon almost all other colonists followed suit, as windfall profits in tobacco briefly lent Jamestown something like a gold rush atmosphere. Among others, Rolfe quickly became both a wealthy and prominent man. He married the young Virginia Indian woman Pocahontas on April 24, 1614. They lived first across the river from Jamestown, and later at his Varina Farms plantation near Henricus. Their son, Thomas Rolfe, was born in 1615.
In 1611, the Virginia Company of London sent Sir Thomas Dale to act as deputy-governor or as high marshall for the Virginia Colony under the authority of Thomas West, 3rd Baron De La Warr (Lord Delaware). He arrived at Jamestown on May 19 with three ships, additional men, cattle, and provisions. Finding the conditions unhealthy and greatly in need of improvement, he immediately called for a meeting of the Jamestown Council, and established crews to rebuild Jamestown.
He served as Governor for 3 months in 1611, and again for a two-year period between 1614 and 1616. It was during his administration that the first code of laws of Virginia, nominally in force from 1611 to 1619, was effectively tested. This code, entitled "Articles, Lawes, and Orders Divine, Politique, and Martiall" (popularly known as Dale's Code), was notable for its pitiless severity, and seems to have been prepared in large part by Dale himself.
Seeking a better site than Jamestown with the thought of possibly relocating the capital, Thomas Dale sailed up the James River (also named after King James) to the area now known as Chesterfield County. He was apparently impressed with the possibilities of the general area where the Appomattox River joins the James River, until then occupied by the Appomattoc Indians, and there are published references to the name "New Bermudas" although it apparently was never formalized. (Far from the mainland of North America, the archipelago of Bermuda had been established as part of the Virginia Colony in 1612 following the shipwreck of Sea Venture in 1609).
A short distance further up the James, in 1611, he began the construction of a progressive development at Henricus on and about what was later known as Farrars Island. Henricus was envisioned as possible replacement capital for Jamestown, though it was eventually destroyed during the Indian Massacre of 1622, during which a third of the colonists were killed.
In 1616, Governor Dale joined John Rolfe and Pocahontas and their young son Thomas as they left their Varina Farms plantation for a public relations mission to England, where Pocahontas was received and treated as a form of visiting royalty by Queen Anne. This stimulated more interest in investments in the Virginia Company, the desired effect. However, as the couple prepared to return to Virginia, Pocahontas died of an illness at Gravesend on March 17, 1617, where she was buried. John Rolfe returned to Virginia alone once again, leaving their son Thomas Rolfe, then a small child, in England to obtain an education.
Once back in Virginia, Rolfe married Jane Pierce and continued to improve the quality of his tobacco with the result that by the time of his death in 1622, the Colony was thriving as a producer of tobacco.
Orphaned by the age of 8, young Thomas later returned to Virginia, and settled across the James River not far from his parents' farm at Varina, where he married Jane Poythress. They had one daughter, Jane Rolfe, who was born in 1650. Many of the First Families of Virginia trace their lineage through Thomas Rolfe to both Pocahontas and John Rolfe, joining English and Virginia Indian heritage.
Virginia's population grew rapidly from 1618 until 1622, rising from a few hundred to nearly 1,400 people. Wheat was also grown in Virginia starting in 1618.
On June 30, 1619 Slovak and Polish artisans conducted the first labor strike (first "in American history"[27][28]) for democratic rights ("No Vote, No Work")[27][29] in Jamestown.[29][30][31] The British Crown overturned the legislation in the Virginia House of Burgesses in its first meeting[32] and granted the workers equal voting rights on July 21, 1619.[33] Afterwards, the labor strike was ended and the artisans resumed their work.[30][31][34][35] The House of Burgesses, the first legislature of elected representatives in America, met in the Jamestown Church. One of their first laws was to set a minimum price for the sale of tobacco and set forth plans for the creation of the first ironworks of the colony. This legislative group was the predecessor of the modern Virginia General Assembly.
In August 1619 "20 and odd Blacks" arrived on the Dutch Man-of-War ship at Jamestown colony. This is the earliest record of Black people in colonial America [36] These colonists were freemen and indentured servants.[37][38][39] At this time the slave trade between Africa and the English colonies had not yet been established.
Records from 1623 and 1624 listed the African inhabitants of the colony as servants, not slaves. In the case of William Tucker, the first Black person born in the colonies, freedom was his birthright.[40] He was the son of "Antony and Isabell", a married couple from Angola who worked as indentured servants for Captain William Tucker, after whom he was named. Yet court records show that at least one African had been declared a slave by 1640: John Punch. He was an indentured servant who ran away along with two White indentured servants, and he was sentenced by the governing council to lifelong servitude. This action officially marked the institution of slavery in Jamestown and the future United States.
By 1620, more German settlers from Hamburg, Germany, recruited by the Virginia Company, set up and operated one of the first sawmills in the region.[41] Among the Germans were several other skilled craftsmen: carpenters and pitch/tar/soap-ash makers, who produced some of the colony's first exports of these products. The Italians included a team of glass makers.[42]
During 1621 fifty-seven unmarried women sailed to Virginia under the auspices of the Virginia Company, who paid for their transport and provided them with a small bundle of clothing and other goods to take with them. A colonist who married one of the women would be responsible for repaying the Virginia Company for his wife's transport and provisions. The women traveled on three ships, The Marmaduke, The Warwick, and The Tyger.
Many of the women were not "maids" but widows. Some others were children, for example Priscilla, the eight-year-old daughter of Joanne Palmer, who travelled with her mother and her new stepfather, Thomas Palmer, on The Tyger. Some were women who were traveling with family or relatives: Ursula Clawson, "kinswoman" of ancient planter Richard Pace, traveled with Pace and his wife on the Marmaduke. Ann Jackson also came on the Marmaduke, in the company of her brother John Jackson, both of them bound for Martin's Hundred. Ann Jackson was one of the women taken captive by the Powhatans during the Indian Massacre of 1622. She was not returned until 1630. The Council ordered that she should be sent back to England on the first available ship, perhaps because she was suffering from the consequences of her long captivity.[43]
Some of the women sent to Virginia did marry. But most disappeared from the records: perhaps killed in the massacre, perhaps dead from other causes, perhaps returned to England. In other words, they shared the fate of most of their fellow colonists.[44]
The relations with the Natives took a turn for the worse after the death of Pocahontas in England and the return of John Rolfe and other colonial leaders in May 1617. Disease, poor harvests and the growing demand for tobacco lands caused hostilities to escalate.
After Wahunsunacock's death in 1618, his younger brother, Opitchapam, briefly became chief. However, he was soon succeeded by his own younger brother, Opechancanough. Opechancanough was not interested in attempting peaceful coexistence with the English settlers. Instead, he was determined to eradicate the colonists from what he considered to be Indian lands.
Chief Opechancanough organized and led a well-coordinated series of surprise attacks on multiple English settlements along both sides of a 50-mile (80 km) long stretch of the James River, which took place early on the morning of March 22, 1622. This event came to be known as the Indian Massacre of 1622, and resulted in the deaths of 347 colonists (including men, women, and children) and the abduction of many others. Some say that this massacre was revenge. The Massacre caught most of the Virginia Colony by surprise and virtually wiped out several entire communities, including Henricus and Wolstenholme Town at Martin's Hundred.
However, Jamestown was spared from destruction due to a Virginia Indian boy named Chanco, who, after learning of the planned attacks from his brother, gave warning to colonist Richard Pace, with whom he lived. Pace, after securing himself and his neighbors on the south side of the James River, took a canoe across the river to warn Jamestown, which narrowly escaped destruction, although there was no time to warn the other settlements. Apparently, Opechancanough was subsequently unaware of Chanco's actions, as the young man continued to serve as his courier for some time after.
A letter by Richard Frethorne, written in 1623, reports, "we live in fear of the enemy every hour."[45]
As a result, another war between the two powers lasted from 1622 to 1632.
Some historians have noted that, as the settlers of the Virginia Colony were allowed some representative government, and they prospered, King James I was reluctant to lose either power or future financial potential. In any case, in 1624, the Virginia Company lost its charter and Virginia became a crown colony.
In 1634, the English Crown created eight shires (i.e., counties) in the colony of Virginia, which had a total population of approximately 5,000 inhabitants. James City Shire was established and included Jamestown. Around 1642–43, the name of the James City Shire was changed to James City County.
The original Jamestown fort seems to have existed into the middle of the 1620s, but as Jamestown grew into a "New Town" to the east, written references to the original fort disappear. By 1634, a palisade (stockade) was completed across the Virginia Peninsula, which was about 6 miles (9.7 km) wide at that point between Queen's Creek, which fed into the York River, and Archer's Hope Creek (since renamed College Creek), which fed into the James River. The new palisade provided some security from attacks by the Virginia Indians for colonists farming and fishing lower on the Peninsula from that point.
On April 18, 1644, Opechancanough again tried to force the colonists to abandon the region with another series of coordinated attacks, killing almost 500 colonists. However, this toll was a much smaller proportion of the growing population than had been the case in the 1622 attacks. Furthermore, the forces of Royal Governor of Virginia William Berkeley captured the old warrior in 1646,[46] variously thought to be between 90 and 100 years old. In October, while a prisoner, Opechancanough was killed by a soldier (shot in the back) assigned to guard him. Opechancanough was succeeded as Weroance (Chief) by Nectowance, then by Totopotomoi, and later by his daughter Cockacoeske.
In 1646, the first treaties were signed between the Virginia Indians and the English. The treaties set up reservations, some of the oldest in America, for the surviving Powhatan. They also required the Virginia Indians to make yearly tribute payments to the English.[47]
That war resulted in a boundary being defined between the Indians and English lands that could only be crossed for official business with a special pass. This situation would last until 1677 and the Treaty of Middle Plantation, which established Indian reservations following Bacon's Rebellion.
On October 20, 1698, the statehouse (capitol building) in Jamestown burned for the fourth time. Once again removing itself to a familiar alternate location, the legislature met at Middle Plantation, this time in the new College Building at the College of William and Mary, which had begun meeting there in temporary quarters in 1694.
While meeting there, a group of five students from the College submitted a well-presented and logical proposal to the legislators outlining a plan and good reasons to move the capital permanently to Middle Plantation.
Despite the periodic need to relocate the legislature from Jamestown due to contingencies such as fires (usually to Middle Plantation), throughout the seventeenth century Virginians had been reluctant to permanently move the capital from its "ancient and accustomed place." After all, Jamestown had always been Virginia's capital. It had a state house (except when it periodically burned) and a church, and it offered easy access to ships that came up the James River bringing goods from England and taking on tobacco bound for market.[48] However, Jamestown's status had been in some decline: its position as the mandatory port of entry for Virginia had ended in 1662.
The students argued that the change to the high ground at Middle Plantation would escape the dreaded malaria and mosquitoes that had always plagued the swampy, low-lying Jamestown site. The students pointed out that, while not located immediately upon a river, Middle Plantation offered nearby access to not one but two rivers, via two deepwater (6–7 foot depth) creeks: Queen's Creek, leading to the York River, and College Creek (formerly known as Archer's Hope), which led to the James River. Other advocates of the move included the Reverend Dr. James Blair and the Governor, Sir Francis Nicholson.
Several prominent individuals like John Page, Thomas Ludwell, Philip Ludwell, and Otho Thorpe had built fine brick homes and created a substantial town at Middle Plantation. And, there was of course, the new College of William and Mary with its fine new brick building.
The proposal to move the capital of Virginia to higher ground (about 12 miles (20 km) away) at Middle Plantation was received favorably by the House of Burgesses. In 1699, the capital of the Virginia Colony was officially relocated there. Soon, the town was renamed Williamsburg, in honor of King William III. Thus, the first phase of Jamestown's history ended.
By the 1750s the land was owned and heavily cultivated, primarily by the Travis and Ambler families. A military post was located on the island during the Revolutionary War, and American and British prisoners were exchanged there. During the U.S. Civil War, the island was occupied by Confederate soldiers, who built an earth fort near the church as part of the defense system intended to block the Union advance up the river to Richmond. Little further attention was paid to the island until preservation efforts were undertaken in the twentieth century.
Who was blamed for the uss maine explosion?
Spain🚨USS Maine (ACR-1) is an American naval ship that sank in Havana Harbor during the Cuban revolt against Spain, an event that became a major political issue in the United States.
Commissioned in 1895, this was the first United States Navy ship to be named after the state of Maine.[a][1] Originally classified as an armored cruiser, she was built in response to the Brazilian battleship Riachuelo and the increase of naval forces in Latin America. Maine and her near-sister ship Texas reflected the latest European naval developments, with the layout of her main armament resembling that of the British ironclad Inflexible and comparable Italian ships. Her two gun turrets were staggered en échelon, rather than on the centerline, with the fore gun sponsoned out on the starboard side of the ship and the aft gun on the port side,[2] with cutaways in the superstructure to allow both to fire ahead, astern or across her deck. She dispensed with full masts thanks to the increased reliability of steam engines by the time of her construction.
Despite these advances, Maine was out of date by the time she entered service, due to her protracted construction period and changes in the role of ships of her type, naval tactics and technology. She took nine years to complete, and nearly three years for the armor plating alone.[2] The general use of steel in warship construction precluded the use of ramming without danger to the attacking vessel. The potential for blast damage from firing end on or cross-deck discouraged en échelon gun placement. The changing role of the armored cruiser from a small, heavily armored substitute for the battleship to a fast, lightly armored commerce raider also hastened her obsolescence. Despite these disadvantages, Maine was seen as an advance in American warship design.
Maine is best known for her loss in Havana Harbor on the evening of 15 February 1898. Sent to protect U.S. interests during the Cuban revolt against Spain, she exploded suddenly, without warning, and sank quickly, killing nearly three quarters of her crew. The cause and responsibility for her sinking remained unclear after a board of inquiry investigated. Nevertheless, popular opinion in the U.S., fanned by inflammatory articles printed in the "yellow press" by William Randolph Hearst and Joseph Pulitzer, blamed Spain. The phrase, "Remember the Maine! To hell with Spain!", became a rallying cry for action, which came with the Spanish–American War later that year. While the sinking of Maine was not a direct cause for action, it served as a catalyst, accelerating the approach to a diplomatic impasse between the U.S. and Spain.
The cause of Maine's sinking remains a subject of speculation. In 1898, an investigation of the explosion was carried out by a naval board appointed under the McKinley Administration. The consensus of the board was that Maine was destroyed by an external explosion from a mine. However, the validity of this investigation has been challenged. George W. Melville, a chief engineer in the Navy, proposed that a more likely cause for the sinking was a magazine explosion within the vessel. The Navy's leading ordnance expert, Philip R. Alger, took this theory further by suggesting that the magazines were ignited by a spontaneous fire in a coal bunker.[3] The coal used in Maine was bituminous coal, which is known for releasing firedamp, a gas that is prone to spontaneous explosions. There is thus stronger evidence that the explosion was caused by an internal coal fire igniting the magazines than by the initially hypothesized mine. The ship lay at the bottom of the harbor until 1911, when a cofferdam was built around the wreck.[4] The hull was patched up until the ship was afloat, then towed to sea and sunk. Maine now lies on the sea-bed 3,600 feet (1,100 m) below the surface.
The delivery of the Brazilian battleship Riachuelo in 1883 and the acquisition of other modern armored warships from Europe by Brazil, Argentina and Chile shortly afterwards alarmed the United States government, as the Brazilian Navy was now the most powerful in the Americas.[5] The chairman of the House Naval Affairs Committee, Hilary A. Herbert, stated to Congress: "if all this old navy of ours were drawn up in battle array in mid-ocean and confronted by Riachuelo it is doubtful whether a single vessel bearing the American flag would get into port."[6] These developments helped bring to a head a series of discussions that had been taking place at the Naval Advisory Board since 1881. The board knew at that time that the U.S. Navy could not challenge any major European fleet; at best, it could wear down an opponent's merchant fleet and hope to make some progress through general attrition there. Moreover, projecting naval force abroad through the use of battleships ran counter to the government policy of isolationism. While some on the board supported a strict policy of commerce raiding, others argued it would be ineffective against the potential threat of enemy battleships stationed near the American coast. The two sides remained essentially deadlocked until the appearance of Riachuelo.
The board, now confronted with the concrete possibility of hostile warships operating off the American coast, began planning for ships to protect it in 1884. The ships had to fit within existing docks and had to have a shallow draft to enable them to use all the major American ports and bases. The maximum beam was similarly fixed, and the board concluded that at a length of about 300 feet (91 m), the maximum displacement would be about 7,000 tons. A year later the Bureau of Construction and Repair (C & R) presented two designs to Secretary of the Navy William Collins Whitney, one for a 7,500-ton battleship and one for a 5,000-ton armored cruiser. Whitney decided instead to ask Congress for two 6,000-ton warships, and they were authorized in August 1886. A design contest was held, asking naval architects to submit designs for the two ships: armored cruiser Maine and battleship Texas. It was specified that Maine had to have a speed of 17 knots (31 km/h; 20 mph), a ram bow, and a double bottom, and be able to carry two torpedo boats. Her armament was specified as: four 10-inch (254 mm) guns, six 6-inch (152 mm) guns, various light weapons, and four torpedo tubes. It was specifically stated that the main guns "must afford heavy bow and stern fire."[8] Armor thickness and many details were also defined. Specifications for Texas were similar, but demanded a main battery of two 12-inch (305 mm) guns and slightly thicker armor.[9]
The winning design for Maine was from Theodore D. Wilson, who served as chief constructor for C & R and was a member on the Naval Advisory Board in 1881. He had designed a number of other warships for the navy.[10] The winning design for Texas was from a British designer, William John, who was working for the Barrow Shipbuilding Company at that time. Both designs resembled the Brazilian battleship Riachuelo, having the main gun turrets sponsoned out over the sides of the ship and echeloned.[11] The winning design for Maine, though conservative and inferior to other contenders, may have received special consideration due to a requirement that one of the two new ships be American-designed.[12]
Congress authorized construction of Maine on 3 August 1886, and her keel was laid down on 17 October 1888, at the Brooklyn Navy Yard. She was the largest vessel built in a U.S. Navy yard up to that time.[13]
Maine's building time of nine years was unusually protracted, due to the limits of U.S. industry at the time. (The delivery of her armored plating took three years and a fire in the drafting room of the building yard, where Maine's working set of blueprints were stored, caused further delay.) In those nine years, naval tactics and technology changed radically and left Maine's actual role in the navy ill-defined. At the time she was laid down, armored cruisers such as Maine were intended to serve as small battleships on overseas service and were built with heavy belt armor. Great Britain, France and Russia had constructed such ships to serve this purpose and sold others of this type, including Riachuelo, to second-rate navies. Within a decade, this role had changed to commerce raiding, for which fast, long-range vessels, with only limited armor protection, were needed. The advent of lightweight armor, such as Harvey steel, made this transformation possible.[14]
As a result of these changing priorities, Maine was caught between two separate positions and could not perform either one adequately. She lacked both the armor and firepower to serve as a ship-of-the-line against enemy battleships and the speed to serve as a cruiser. Nevertheless, she was expected to fulfill more than one tactical function.[15] In addition, because of the potential of a warship sustaining blast damage to herself from cross-deck and end-on fire, Maine's main-gun arrangement was obsolete by the time she entered service.[11]
Maine was 324 feet 4 inches (98.9 m) long overall, with a beam of 57 feet (17.4 m), a maximum draft of 22 feet 6 inches (6.9 m) and a displacement of 6,682 long tons (6,789.2 t).[16] She was divided into 214 watertight compartments.[17] A centerline longitudinal watertight bulkhead separated the engines and a double bottom covered the hull only from the foremast to the aft end of the armored citadel, a distance of 196 feet (59.7 m). She had a metacentric height of 3.45 feet (1.1 m) as designed and was fitted with a ram bow.[18]
Maine's hull was long and narrow, more like a cruiser than that of Texas, which was wide-beamed. Normally, this would have made Maine the faster ship of the two. However, Maine's weight distribution was ill-balanced, which slowed her considerably. Her main turrets, awkwardly situated on a cut-away gundeck, were nearly awash in bad weather. Because they were mounted toward the ends of the ship, away from its center of gravity, Maine was also prone to greater motion in heavy seas. While she and Texas were both considered seaworthy, the latter's high hull and guns mounted on her main deck made her the drier ship.[19]
The two main gun turrets were sponsoned out over the sides of the ship and echeloned to allow both to fire fore and aft. The practice of en echelon mounting had begun with Italian battleships designed in the 1870s by Benedetto Brin and was followed by the British Navy with HMS Inflexible, which was laid down in 1874 but not commissioned until October 1881.[20] This gun arrangement met the design demand for heavy end-on fire in a ship-to-ship encounter, tactics which involved ramming the enemy vessel.[11] The wisdom of this tactic was purely theoretical at the time it was implemented. A drawback of the en echelon layout was that it limited a ship's ability to fire broadside, a key factor when employed in a line of battle. To allow for at least partial broadside fire, Maine's superstructure was separated into three structures. This technically allowed both turrets to fire across the ship's deck (cross-deck fire), between the sections. However, this ability was still significantly limited, as the superstructure restricted each turret's arc of fire.[8]
The plan and profile views show Maine with eight six-pounder guns (one is not visible on the port side of the bridge because the bridge is cut away in the drawing). Another early published plan shows the same. In both cases the plans show a single six-pounder mounted at the extreme bow. However, careful examination of photographs of Maine confirms that she did not carry that gun. Maine's bow armament was thus not identical to that of the stern, which had a single six-pounder mounted at the extreme aft of the vessel. Maine carried two six-pounders forward, two on the bridge and three on the stern section, all one level above the abbreviated gun deck that permitted the ten-inch guns to fire across the deck. The six-pounders in the bow were positioned farther forward than the pair mounted aft, which necessitated the far-aft single six-pounder.
Maine was the first U.S. capital ship to have its power plant given as high a priority as its fighting strength.[21] Her machinery, built by the N. F. Palmer Jr. & Company's Quintard Iron Works of New York,[22] was the first designed for a major ship under the direct supervision of Arctic explorer and soon-to-be commodore, George Wallace Melville.[23] She had two inverted vertical triple-expansion steam engines, mounted in watertight compartments and separated by a fore-to-aft bulkhead, with a total designed output of 9,293 indicated horsepower (6,930 kW). Cylinder diameters were 35.5 inches (900 mm) (high-pressure), 57 inches (1,400 mm) (intermediate-pressure) and 88 inches (2,200 mm) (low-pressure). Stroke for all three pistons was 36 inches (910 mm).[17]
Melville mounted Maine's engines with the cylinders in vertical mode, a departure from conventional practice. Previous ships had had their engines mounted in horizontal mode, so that they would be completely protected below the waterline. Melville believed a ship's engines needed ample room to operate and that any exposed parts could be protected by an armored deck. He therefore opted for the greater efficiency, lower maintenance costs and higher speeds offered by the vertical mode.[24][25] Also, the engines were constructed with the high-pressure cylinder aft and the low-pressure cylinder forward. This was done, according to the ship's chief engineer, A. W. Morley, so the low-pressure cylinder could be disconnected when the ship was under low power, allowing the high- and intermediate-pressure cylinders to be run together as a compound engine for economical cruising.
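As a rough back-of-the-envelope illustration of the triple-expansion principle (a check derived only from the figures quoted above, not a calculation given in the source), each cylinder's swept volume is V = (π/4)d²L; since all three cylinders share the 36-inch stroke, the swept volumes scale with the square of the bore:

\[
V_{\mathrm{HP}} : V_{\mathrm{IP}} : V_{\mathrm{LP}} = 35.5^{2} : 57^{2} : 88^{2} \approx 1 : 2.6 : 6.1
\]

The roughly six-fold growth in volume from the high- to the low-pressure cylinder is what allowed the steam to keep expanding and doing work in three stages, and it also suggests why disconnecting only the large low-pressure cylinder, as Morley described, still left a workable two-stage compound engine for economical running.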
Eight single-ended Scotch marine boilers provided steam to the engines at a working pressure of 135 pounds per square inch (930 kPa; 9.5 kgf/cm2) and a temperature of 364 °F (184 °C). On trials, she reached a speed of 16.45 knots (30.47 km/h; 18.93 mph), failing to meet her contract speed of 17 knots (31 km/h; 20 mph). She carried a maximum load of 896 long tons (910 t) of coal[26] in 20 bunkers, 10 on each side, which extended below the protective deck. Wing bunkers at each end of each fire room extended inboard to the front of the boilers.[17] This was a very low capacity for a ship of Maine's rating, which limited her time at sea and her ability to run at flank speed, when coal consumption increased dramatically. Maine's overhanging main turrets also prevented coaling at sea, except in the calmest of waters; otherwise, the potential for damage to a collier, to herself, or to both vessels was extremely great.
Maine also carried two small dynamos to power her searchlights and provide interior lighting.[27]
Maine was designed initially with a three-mast barque rig for auxiliary propulsion, in case of engine failure and to aid long-range cruising.[28] This arrangement was limited to "two-thirds" of full sail power, determined by the ship's tonnage and immersed cross-section.[29] The mizzen mast was removed in 1892, after the ship had been launched, but before her completion.[28] Maine was completed with a two-mast military rig and the ship never spread any canvas.[30]
Maine's main armament consisted of four 10-inch (254 mm)/30 caliber Mark II guns, which had a maximum elevation of 15° and could depress to −3°. Ninety rounds per gun were carried. The ten-inch guns fired a 510-pound (231 kg) shell at a muzzle velocity of 2,000 feet per second (610 m/s) to a range of 20,000 yards (18,000 m) at maximum elevation.[31] These guns were mounted in twin hydraulically powered Mark 3 turrets, the fore turret sponsoned to starboard and the aft turret sponsoned to port.[5]
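As an order-of-magnitude check on these ballistic figures (an idealized, drag-free calculation, not a value from the source), the vacuum-range formula lands close to the cited maximum range:

\[
R = \frac{v^{2}\sin 2\theta}{g} = \frac{(2{,}000\ \mathrm{ft/s})^{2}\,\sin 30^{\circ}}{32.2\ \mathrm{ft/s^{2}}} \approx 62{,}100\ \mathrm{ft} \approx 20{,}700\ \mathrm{yd}
\]

Since air resistance can only shorten this, the cited 20,000 yards at 15° elevation is of the right order of magnitude, sitting just under the drag-free ceiling for a heavy shell at this muzzle velocity.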
The 10" guns were initially to be mounted in open barbettes (the C & R proposal blueprint shows them as such). During Maine's extended construction, the development of rapid-fire intermediate-caliber guns, which could fire high-explosive shells, became a serious threat and the navy redesigned Maine with enclosed turrets. Because of the corresponding weight increase, the turrets were mounted one deck lower than planned originally.[30][32] Even with this modification, the main guns were high enough to fire unobstructed for 180 on one side and 64 on the other side.[17] They could also be loaded at any angle of train; initially the main guns of Texas, by comparison, with external rammers, could be loaded only when trained on the centerline or directly abeam, a common feature in battleships built before 1890.[11] However, by 1897, Texas' turrets had been modified with internal rammers to permit much faster reloading.
The en echelon arrangement proved problematic. Because Maine's turrets were not counterbalanced, she heeled over if both were pointed in the same direction, which reduced the range of the guns. Also, cross-deck firing damaged her deck and superstructure significantly due to the vacuum from passing shells.[33] Because of this, and the potential for undue hull stress if the main guns were fired end-on, the en echelon arrangement was not used in U.S. Navy designs after Maine and Texas.[11][33]
The six 6-inch (152 mm)/30 caliber Mark 3 guns were mounted in casemates in the hull, two each at the bow and stern and the last two amidships.[22] Data is lacking, but they could probably depress to −7° and elevate to +12°. They fired shells that weighed 105 pounds (48 kg) with a muzzle velocity of about 1,950 feet per second (590 m/s). They had a maximum range of 9,000 yards (8,200 m) at full elevation.[34]
The anti-torpedo boat armament consisted of seven 57-millimeter (2.2 in) Driggs-Schroeder six-pounder guns mounted on the superstructure deck.[22] They fired a shell weighing about 6 lb (2.7 kg) at a muzzle velocity of about 1,765 feet per second (538 m/s) at a rate of 20 rounds per minute to a maximum range of 8,700 yards (7,955 m).[35] The lighter armament comprised four each of 37-millimeter (1.5 in) Hotchkiss and Driggs-Schroeder one-pounder guns. Four of these were mounted on the superstructure deck, two were mounted in small casemates at the extreme stern and one was mounted in each fighting top.[22] They fired a shell weighing about 1.1 pounds (0.50 kg) at a muzzle velocity of about 2,000 feet per second (610 m/s) at a rate of 30 rounds per minute to a range of about 3,500 yards (3,200 m).[36]
Maine had four 18-inch (457 mm) above-water torpedo tubes, two on each broadside. In addition, she was designed to carry two 14.8-long-ton (15.0 t) steam-powered torpedo boats, each with a single 14-inch (356 mm) torpedo tube and a one-pounder gun. Only one was built, but it had a top speed of only a little over 12 knots (22 km/h; 14 mph), so it was transferred to the Naval Torpedo Station at Newport, Rhode Island, as a training craft.[b][37]
The main waterline belt, made of nickel steel, had a maximum thickness of 12 inches (305 mm) and tapered to 7 inches (178 mm) at its lower edge. It was 180 feet (54.9 m) long and covered the machinery spaces and the 10-inch magazines. It was 7 feet (2.1 m) high, of which 3 feet (0.9 m) was above the design waterline. It angled inwards for 17 feet (5.2 m) at each end, thinning to 8 inches (203 mm), to provide protection against raking fire. A 6-inch transverse bulkhead closed off the forward end of the armored citadel. The forward portion of the 2-inch-thick (51 mm) protective deck ran from the bulkhead all the way to the bow and served to stiffen the ram. The deck sloped downwards to the sides, but its thickness increased to 3 inches (76 mm). The rear portion of the protective deck sloped downwards towards the stern, going below the waterline, to protect the propeller shafts and steering gear. The sides of the circular turrets were 8 inches thick. The barbettes were 12 inches thick, with their lower portions reduced to 10 inches. The conning tower had 10-inch walls. The ship's voicepipes and electrical leads were protected by an armored tube 4.5 inches (114 mm) thick.[38]
Two flaws emerged in Maine's protection, both due to technological developments between her laying-down and her completion. The first was a lack of adequate topside armor to counter the effects of rapid-fire intermediate-caliber guns and high-explosive shells. This was a flaw she shared with Texas.[33] The second was the use of nickel-steel armor. Introduced in 1889, nickel steel was the first modern steel alloy armor and, with a figure of merit of 0.67, was an improvement over the 0.6 rating of mild steel used until then. Harvey steel and Krupp armors, both of which appeared in 1893, had merit figures of between 0.9 and 1.2, giving them roughly twice the tensile strength of nickel steel. Although all three armors shared the same density (about 40 pounds per square foot for a one-inch-thick plate), six inches of Krupp or Harvey steel gave the same protection as 10 inches of nickel. The weight thus saved could be applied either to additional hull structure and machinery or to achieving higher speed. The navy would incorporate Harvey armor in the Indiana-class battleships, designed after Maine, but commissioned at roughly the same time.[39][40]
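A quick arithmetic check of these armor figures (treating the figure of merit as a simple multiplier on protective equivalence, an illustrative assumption rather than a formula stated in the source) shows the cited comparison is internally consistent:

\[
0.67 \times 10\ \mathrm{in} \approx 6.7\ \mathrm{in}
\qquad\text{vs.}\qquad
(0.9\text{--}1.2) \times 6\ \mathrm{in} = 5.4\text{--}7.2\ \mathrm{in}
\]

The weight figure also checks out: at roughly 490 lb/ft³ for steel, a one-inch plate weighs about 490/12 ≈ 41 pounds per square foot, matching the "about 40 pounds" cited above. This is why equal protection at six inches instead of ten translated directly into substantial weight saved over a belt of this size.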
Maine was launched on 18 November 1889, sponsored by Alice Tracey Wilmerding, the granddaughter of Navy Secretary Benjamin F. Tracy. Not long afterwards, a reporter wrote for Marine Engineer and Naval Architect magazine, "it cannot be denied that the navy of the United States is making rapid strides towards taking a credible position among the navies of the world, and the launch of the new armoured battleship Maine from the Brooklyn Navy Yard ... has added a most powerful unit to the United States fleet of turret ships."[41] In his 1890 annual report to Congress, the Secretary of the Navy wrote, "the Maine ... stands in a class by herself" and expected the ship to be commissioned by July 1892.[13]
A three-year delay ensued while the shipyard waited for nickel steel plates for Maine's armor. Bethlehem Steel Company had promised the navy 300 tons per month by December 1889 and had ordered heavy castings and forging presses from the British firm of Armstrong Whitworth in 1886 to fulfill its contract. This equipment did not arrive until 1889, pushing back Bethlehem's timetable. In response, Navy Secretary Benjamin Tracy secured a second contractor, the newly expanded Homestead mill of Carnegie, Phipps & Company. In November 1890, Tracy and Andrew Carnegie signed a contract for Homestead to supply 6,000 tons of nickel steel.[42] However, Homestead was what author Paul Krause calls "the last union stronghold in the steel mills of the Pittsburgh district." The mill had already weathered one strike in 1882 and a lockout in 1889 intended to break the union there. Less than two years later came the Homestead Strike of 1892, one of the largest and most serious disputes in U.S. labor history.[43]
A photo of the christening shows Mrs. Wilmerding striking the bow near the draft numeral 13, which led to many comments (made much later, of course) that the ship was "unlucky" from her launching.
Maine was commissioned on 17 September 1895, under the command of Captain Arent S. Crowninshield.[44] On 5 November 1895, Maine steamed to Sandy Hook Bay, New Jersey. She anchored there two days, then proceeded to Newport, Rhode Island, for fitting out and test firing of her torpedoes. After a trip later that month to Portland, Maine, she reported to the North Atlantic Squadron for operations, training maneuvers and fleet exercises. Maine spent her active career with the North Atlantic Squadron, operating from Norfolk, Virginia, along the East Coast of the United States and the Caribbean. On 10 April 1897, Captain Charles Dwight Sigsbee relieved Captain Crowninshield as commander of Maine.[45]
The ship's crew consisted of 355 men: 26 officers, 290 sailors, and 39 marines. Of these, 261 died as a result of the explosion.
Of the 94 survivors, 16 were uninjured.[46]
In January 1898, Maine was sent from Key West, Florida, to Havana, Cuba, to protect U.S. interests during the Cuban War of Independence. Three weeks later, at 21:40 on 15 February, an explosion on board Maine occurred in Havana Harbor (coordinates: 23°08′07″N 82°20′38″W).[47] Later investigations revealed that more than 5 long tons (5.1 t) of powder charges for the vessel's six- and ten-inch guns had detonated, obliterating the forward third of the ship.[48] The remaining wreckage rapidly settled to the bottom of the harbor. Most of Maine's crew were sleeping or resting in the enlisted quarters, in the forward part of the ship, when the explosion occurred. In total, 260[49] men lost their lives as a result of the explosion or shortly thereafter, and six[49] more died later from injuries. Captain Sigsbee and most of the officers survived, because their quarters were in the aft portion of the ship. Altogether there were 89 survivors, 18 of whom were officers.[50] On 21 March, the U.S. Naval Court of Inquiry, in Key West, declared that a naval mine caused the explosion.[51]
The New York Journal and New York World, owned respectively by William Randolph Hearst and Joseph Pulitzer, gave Maine intense press coverage, but employed tactics that would later be labeled "yellow journalism." Both papers exaggerated and distorted any information they could attain, sometimes even fabricating news when none that fit their agenda was available. For a week following the sinking, the Journal devoted a daily average of eight and a half pages of news, editorials and pictures to the event. Its editors sent a full team of reporters and artists to Havana, including Frederic Remington,[52] and Hearst announced a reward of $50,000 "for the conviction of the criminals who sent 258 American sailors to their deaths."[53] The World, while overall not as lurid or shrill in tone as the Journal, nevertheless indulged in similar theatrics, insisting continually that Maine had been bombed or mined. Privately, Pulitzer believed that "nobody outside a lunatic asylum" really believed that Spain sanctioned Maine's destruction. Nevertheless, this did not stop the World from insisting that the only "atonement" Spain could offer the U.S. for the loss of ship and life, was the granting of complete Cuban independence. Nor did it stop the paper from accusing Spain of "treachery, willingness, or laxness" for failing to ensure the safety of Havana Harbor.[54] The American public, already agitated over reported Spanish atrocities in Cuba, was driven to increased hysteria.[55]
Maine's destruction did not result in an immediate declaration of war with Spain. However, the event created an atmosphere that virtually precluded a peaceful solution.[56] The Spanish–American War began in April 1898, two months after the sinking. Advocates of the war used the rallying cry, "Remember the Maine! To hell with Spain!"[57][58][59][60] The episode focused national attention on the crisis in Cuba, but was not cited by the William McKinley administration as a casus belli, though it was cited by some already inclined to go to war with Spain over perceived atrocities and loss of control in Cuba.[61][62]
In addition to the inquiry the Spanish government assigned to naval officers Del Peral and De Salas, two U.S. Naval Courts of Inquiry were ordered: the Sampson Board in 1898 and the Vreeland Board in 1911. In the 1970s, Admiral Hyman G. Rickover commissioned a private investigation into the explosion, and the National Geographic Society conducted an investigation in 1999, using computer simulations. All investigations agreed that an explosion of the forward magazines caused the destruction of the ship, but they reached different conclusions as to how the magazines could have exploded.[62][63]
The Spanish inquiry, conducted by Del Peral and De Salas, collected evidence from officers of naval artillery who had examined the remains of Maine. Del Peral and De Salas identified spontaneous combustion of the coal bunker located adjacent to the munition stores as the likely cause of the explosion, though the possibility of other combustibles, such as paint or drier products, causing the explosion was not discounted. Several additional supporting observations were also recorded.
The conclusions of the report were not reported by the American press at the time.[64]
In order to find the cause of the explosion, a naval inquiry was ordered by the United States shortly after the incident, headed by Captain William T. Sampson. Ramón Blanco y Erenas, Spanish governor of Cuba, had proposed instead a joint Spanish-American investigation of the sinking.[65] Captain Sigsbee had written that "many Spanish officers, including representatives of General Blanco, now with us to express sympathy."[66] In a cable, the Spanish minister of colonies, Segismundo Moret, had advised Blanco "to gather every fact you can, to prove the Maine catastrophe cannot be attributed to us."[67]
According to Dana Wegner, who worked with U.S. Admiral Hyman G. Rickover on his 1974 investigation of the sinking, the Secretary of the Navy had the option of selecting a board of inquiry personally. Instead, he fell back on protocol and assigned the commander-in-chief of the North Atlantic Squadron to do so. The commander produced a list of junior line officers for the board. The fact that the officer proposed as court president was junior to the captain of Maine, Wegner writes, "would indicate either ignorance of navy regulations or that, in the beginning, the board did not intend to examine the possibility that the ship was lost by accident and the negligence of her captain." Eventually, navy regulations prevailed in the leadership of the board, Captain Sampson being senior to Captain Sigsbee.[68]
The board arrived on 21 February and took testimony from survivors, witnesses and divers who were sent down to investigate the wreck. The Sampson Board produced its findings in two parts: the proceedings, which consisted mainly of testimony, and the findings, which were the facts as determined by the court. Between the proceedings and the findings there was what Wegner calls "a broad gap", where the court "left no record of the reasoning that carried it from the often-inconsistent witnesses to [its] conclusion." Another shortcoming, according to Wegner, was the board's reliance on a single technical witness, Commander George Converse of the Torpedo Station at Newport, Rhode Island. Captain Sampson read Commander Converse a hypothetical situation of a coal bunker fire igniting the reserve six-inch ammunition, with a resulting explosion sinking the ship. He then asked Commander Converse about the feasibility of such a scenario. Commander Converse "simply stated, without elaboration, that he could not realize such an event happening".[69]
The board concluded that Maine had been blown up by a mine, which, in turn, caused the explosion of her forward magazines. It reached this conclusion based on the facts that the majority of witnesses stated they had heard two explosions and that part of the keel was bent inwards.[62] The official report from the board, presented to the Navy Department in Washington, D.C. on 21 March, specifically stated the following:
"At frame 18 the vertical keel is broken in two and the flat keel is bent at an angle similar to the angle formed by the outside bottom plating. [...] In the opinion of the court, this effect could have been produced only by the explosion of a mine situated under the bottom of the ship at about frame 18, and somewhat on the port side of the ship." (part of the court's 5th finding)
"In the opinion of the court, the Maine was destroyed by the explosion of a submarine mine, which caused the partial explosion of two or more of her forward magazines." (the court's 7th finding) and
"The court has been unable to obtain evidence fixing the responsibility for the destruction of the Maine upon any person or persons." (the court's 8th finding).[51]
In 1910, the decision was made to convene a second court of inquiry. The reasons for this were the desire to recover the bodies of the victims so they could be buried in the United States, and a desire for a more thorough investigation. The fact that the Cuban government wanted the wreck removed from Havana Harbor might also have played a role: it at least offered the opportunity to examine the wreck in greater detail than had been possible in 1898, while simultaneously obliging the Cubans. Wegner suggests that because this inquiry could be held without the pending risk of war, which had been the case in 1898, it had the potential for greater objectivity than had been possible previously. Moreover, since several of the members of the 1910 board would be certified engineers, they would be better qualified to evaluate the findings than the line officers of the 1898 board had been.[70]
Beginning in December 1910, a cofferdam was built around the wreck and water was pumped out, exposing the wreck by late 1911. Between 20 November and 2 December 1911, a court of inquiry headed by Rear Admiral Charles E. Vreeland inspected the wreck. It concluded that an external explosion had triggered the explosion of the magazines, but that this explosion was farther aft and less powerful than the Sampson Board had concluded. The Vreeland Board also found that the bending of frame 18 was caused by the explosion of the magazines, not by the external explosion.[62] After the investigation, the newly recovered dead were buried in Arlington National Cemetery, and the hollow, intact portion of the hull of Maine was refloated and ceremoniously scuttled at sea on 16 March 1912.[71]
Admiral Hyman G. Rickover became intrigued with the disaster and began a private investigation in 1974. Using information from the two official inquiries, newspapers, personal papers and information on the construction and ammunition of Maine, his team concluded that the explosion was not caused by a mine. Instead, spontaneous combustion of coal in the bunker next to the magazine was identified as the most likely cause. Rickover published a book about the investigation, How the Battleship Maine Was Destroyed, in 1976.[72]
In the 2001 book Theodore Roosevelt, the U.S. Navy and the Spanish–American War, Wegner revisits the Rickover investigation and offers additional details. According to Wegner, Rickover asked naval historians at the Energy Research and Development Agency about Maine after reading an article in the Washington Star-News in which its author, John M. Taylor, claimed the U.S. Navy "made little use of its technically trained officers during its investigation of the tragedy." The historians, then working with the admiral on a study of the U.S. Navy's nuclear propulsion program, said they knew no details of Maine's sinking. When Rickover asked whether they could investigate the matter, the historians, now intrigued, agreed. Knowing of Rickover's "insistence on thoroughness," Wegner says, all relevant documents were obtained and studied. These included the ship's plans and the weekly reports on the unwatering of Maine in 1912 by the project's chief engineer, William Furgueson. These reports included numerous photos, annotated by Furgueson with frame and strake numbers on corresponding parts of the wreckage. Two experts on naval demolitions and ship explosions were brought in. Since the photos showed "no plausible evidence of penetration from the outside," they believed the explosion originated inside the ship.[73]
Wegner suggests that a combination of naval ship design and a change in the type of coal used to fuel naval ships might have facilitated the explosion postulated by the Rickover study. Up to the time of Maine's building, he explains, common bulkheads separated coal bunkers from ammunition lockers, and American naval ships burned primarily smokeless anthracite coal. With an increase in the number of steel ships, the U.S. Navy switched to bituminous coal, which burned at a hotter temperature than anthracite and allowed ships to steam faster. However, Wegner explains, while anthracite coal is not subject to spontaneous combustion, bituminous coal is considerably more volatile. Bituminous coal is known for releasing the largest amounts of firedamp, a dangerous and explosive mixture of gases (chiefly methane). Firedamp is explosive at concentrations between 4% and 16%, with the most violent explosions at around 10%.

There was another potential contributing factor in the bituminous coal: iron sulfide, also known as pyrite, was likely present. Pyrite presents two additional risk factors. The first involves oxidation. Pyrite oxidation is sufficiently exothermic that underground mines in high-sulfur coal seams have occasionally had serious problems with spontaneous combustion in mined-out areas; the process results when mining or other processing exposes the sulfides in the ore to air and water, and pyrite-bearing coal has long been recognized as prone to self-heating. The second risk factor is pyrite's ability to provide ignition under certain conditions. Pyrites, which derive their name from the Greek root pyr, meaning fire, can throw sparks when struck by steel or another sufficiently hard surface; before the use of flintlock guns, for example, pyrites were used to strike sparks to ignite gunpowder in an earlier type of gun, the wheellock. In the presence of combustible gases issuing from the bituminous coal, pyrite could therefore have provided the ignition needed to create an explosion. A number of bunker fires of this type had, in fact, been reported aboard U.S. warships before Maine's explosion, in several cases nearly sinking the ships. Wegner also cites a 1997 heat-transfer study which concluded that a coal bunker fire of the type suggested by Rickover could have taken place and ignited the ship's ammunition.[74]
In 1998, National Geographic magazine commissioned an analysis by Advanced Marine Enterprises (AME). This investigation, done to commemorate the centennial of the sinking of USS Maine, was based on computer modeling, a technique unavailable to previous investigations. The results were inconclusive. National Geographic reported that "a fire in the coal bunker could have generated sufficient heat to touch off an explosion in the adjacent magazine [but] on the other hand, computer analysis also shows that even a small, handmade mine could have penetrated the ship's hull and set off explosions within."[75] The AME investigation did, however, note that "the size and location of the soil depression beneath the Maine 'is more readily explained by a mine explosion than by magazine explosions alone'".[63] The team noted that this was not "definitive in proving that a mine was the cause of the sinking", although it did "strengthen the case".[63]
Some experts, including Admiral Rickover's team and several analysts at AME, did not agree with that conclusion.[63] Wegner claims that technical opinion within the Geographic team was divided between its younger members, who focused on the computer modeling results, and its older ones, who weighed their inspection of photos of the wreck against their own experience. He adds that the data AME used for its findings were flawed concerning Maine's design and ammunition storage. Wegner was also critical of the fact that participants in the Rickover study were not consulted until AME's analysis was essentially complete, far too late to confirm the veracity of the data being used or to engage in any other meaningful cooperation.[76]
In 2002, the Discovery Channel produced an episode of the documentary series Unsolved History titled "Death of the U.S.S. Maine", which used photographic evidence, naval experts, and archival information to determine the cause of the explosion. Its conclusion was that a coal bunker fire caused the explosion, and it identified a weakness or gap in the bulkhead separating the coal and powder bunkers that would have allowed the fire to spread from the coal bunker to the powder bunker.
Some have suggested that the sinking was a false flag operation conducted by the U.S.; this is the official view in Cuba. Cuban officials argue that the U.S. may have deliberately sunk the ship to create a pretext for military action against Spain. The wording on the Maine monument in Havana describes Maine's sailors as "victims sacrificed to the imperialist greed in its fervor to seize control of Cuba",[77] which alludes to the theory that U.S. agents deliberately blew up their own ship.[78]
Eliades Acosta, a prominent Cuban historian, head of the Cuban Communist Party's Committee on Culture and former director of the Jose Marti National Library in Havana, offered the standard Cuban interpretation of the sinking (that the United States itself probably did it) in an interview with The New York Times. But Acosta adds that "Americans died for the freedom of Cuba, and that should be recognized. But others wanted to annex Cuba, and that should be criticized. If relations with the United States improve, all these things can be re-examined more fairly".[79] The claim has also been made in Russia: Mikhail Khazin, a Russian economist who once ran the cultural section at Komsomolskaya Pravda, speaking in a 2008 Pravda interview about the need in troubled times to change the psychology of society and unite it, said that "the Americans blew up their own battleship Maine."[80]
Operation Northwoods was a series of proposals prepared by Pentagon officials for the Joint Chiefs of Staff in 1962, setting out a number of proposed false flag operations that could be blamed on the Cuban Communists in order to rally support against them.[81][82] One of these suggested that a U.S. Navy ship be blown up in Guantanamo Bay deliberately. In an echo of the yellow press headlines of the earlier period, the specific phrase "A 'Remember the Maine' incident" was used.[82][83]
For several years, Maine was left where she sank in Havana Harbor, although it was evident she would have to be removed sometime. Maine took up valuable space and the buildup of silt around her hull threatened to create a shoal. In addition, various patriotic groups wanted mementos of the ship. On 9 May 1910, Congress authorized funds for the removal of Maine, the proper interment in Arlington National Cemetery of the estimated 70 bodies still inside, and the removal and transport of the main mast to Arlington. Congress did not demand a new investigation into the sinking at that time.[84]
The Army Corps of Engineers built a cofferdam around Maine and pumped the water out from inside it.[4] By 30 June 1911, Maine's main deck was exposed. The ship forward of frame 41 was entirely destroyed; all that was left of the bow was a twisted mass of steel out of line with the rest of the hull, bearing no resemblance to a ship. The rest of the wreck was badly corroded. Army engineers dismantled the damaged superstructure and decks, which were then dumped at sea. About halfway between bow and stern, they built a concrete and wooden bulkhead to seal the after-section, then cut away what was left of the forward portion. Holes were cut in the bottom of the after-section, through which jets of water were pumped to break the mud seal holding the ship; the holes were then fitted with flood cocks, which would later be used to sink the ship.[85]
Maine had been outfitted with Worthington steam pumps. Although they had lain on the bottom of Havana Harbor for fourteen years, these pumps were found to be still operational and were subsequently used in raising the ship. (Worthington Pump History, 1840–1940)
On 13 February 1912, the engineers let water back into the interior of the cofferdam. Three days later, the interior of the cofferdam was full and Maine floated. Two days after that, Maine was towed out by the tug Osceola, and the bodies of her crew that had been recovered were removed to the armored cruiser North Carolina for repatriation. On 16 March, Maine was towed four miles from the Cuban coast by Osceola, escorted by North Carolina and the light cruiser Birmingham. Her sea cocks were opened and she sank in 600 fathoms (3,600 ft; 1,100 m) of water to the salutes of Birmingham and North Carolina.[86][87] During the salvage, the remains of 66 more men were found, of whom only one (an engineering officer) was identified and returned to his home town; the rest were reburied at Arlington National Cemetery, making a total of 229 buried there.[88]
In 2000, the wreck of Maine was rediscovered by Advanced Digital Communications, a Toronto-based expedition company, in about 3,770 feet (1,150 m) of water roughly 3 miles (4.8 km) northeast of Havana Harbor. The company had been working with Cuban scientists and oceanographers from the University of South Florida College of Marine Science on testing underwater exploration technology. The ship was discovered east of where it was believed to have been scuttled; according to the researchers, during the sinking ceremony and the time it took the wreck to founder, currents pushed Maine east until it came to rest at its present location. Before the team identified the site as Maine, they referred to the location as the "square" due to its unique shape, and at first they did not believe it was the ship, due to its unexpected location. The site was explored with an ROV. According to Dr. Frank Muller-Karger, the hull was not oxidized and the crew could "see all of its structural parts".[89] The expedition was able to identify the ship from the doors and hatches on the wreck, as well as the anchor chain, the shape of the propellers, and the holes where the bow was cut off. Due to the 1912 raising of the ship, the wreck was completely missing its bow; this tell-tale feature was instrumental in identifying the ship. The team also located a boiler nearby, and a debris field of coal.[89]
In February 1898, the recovered bodies of sailors who died on Maine were interred in the Colon Cemetery, Havana. Some injured sailors were sent to hospitals in Havana and Key West, Florida; those who died in hospitals were buried in Key West. In December 1899, the bodies in Havana were disinterred and brought back to the United States for burial at Arlington National Cemetery.[90] In 1915, President Woodrow Wilson dedicated the USS Maine Mast Memorial to those who died; the memorial includes the ship's main mast. Roughly 165 were buried at Arlington, although the remains of one sailor were exhumed and returned to his home town of Indianapolis, Indiana. Of the rest, only 62 were identified. Nine bodies were never recovered, and 19 crewmen, several unidentified, are buried in Key West Cemetery under a statue of a U.S. sailor holding an oar.[c]
Gallery captions: the memorial at Arlington National Cemetery, centered on the ship's main mast; the Cuban Friendship Urn on Ohio Drive, Southwest, in East Potomac Park, Washington, D.C.; the monument to the victims of Maine in Havana, Cuba, c. 1930; a gun from Maine at Fort Allen Park, Portland, Maine; and the U.S. Battleship Maine Monument in Key West Cemetery, Florida.
The explosion-bent foremast of Maine is located at the United States Naval Academy at Annapolis, Maryland.[91][92]
In 1926, the Cuban government erected a memorial to the victims of Maine on the Malecón, near the Hotel Nacional, to commemorate United States assistance in acquiring Cuban independence from Spain. The memorial was damaged by crowds following the Bay of Pigs Invasion in 1961, and the eagle on top was broken and removed.[93] The Communist government then added its own inscription blaming "imperialist voracity in its eagerness to seize the island of Cuba" for Maine's sinking.[93][94] The monument was cleaned and restored in 2013. However, the eagle's head was retained by the U.S. Interests Section in Havana, and the body by the city's museum.[95]
Gallery captions: the USS Maine Monument at Columbus Circle, New York City, including the memorial plaque by Charles Keck, the sculpture group by Attilio Piccirilli, and the Columbia Triumphant sculpture group atop the monument.
A 6-inch deck gun from Maine is on the north lawn of the South Carolina State House in Columbia, SC.
A bronze torpedo tube and armored hatch form part of a memorial in West Park, Pittsburgh, Pennsylvania, just south of West North Avenue (http://www.waymarking.com/waymarks/WM769B_Maine_Memorial_Pittsburgh_PA).
There is also a U.S.S. Maine Memorial plaque at the south door of the Jefferson County Courthouse in Steubenville, OH.
Coordinates: 23°11′53″N 82°21′18″W / 23.198°N 82.355°W / 23.198; −82.355 (USS Maine)[101]
This article incorporates public domain material from the United States Navy website https://web.archive.org/web/20010204074000/http://www.history.navy.mil/faqs/faq71-1.htm.
When did Stevie Wonder release his first album?
September 1962🚨Stevland Hardaway Morris (né Judkins; born May 13, 1950),[1] known by his stage name Stevie Wonder, is an American singer, songwriter, record producer, and multi-instrumentalist. A child prodigy, he is considered to be one of the most critically and commercially successful musical performers of the late 20th century.[2] Wonder signed with Motown's Tamla label at the age of 11,[2] and he continued performing and recording for Motown into the 2010s. He has been blind since shortly after birth.[3]
Among Wonder's works are singles such as "Signed, Sealed, Delivered I'm Yours", "Superstition", "Sir Duke", "You Are the Sunshine of My Life" and "I Just Called to Say I Love You"; and albums such as Talking Book, Innervisions and Songs in the Key of Life.[2] He has recorded more than 30 U.S. top ten hits and received 25 Grammy Awards, making him one of the most-awarded male solo artists, and has sold over 100 million records worldwide, making him one of the top 60 best-selling music artists.[4] Wonder is also noted for his work as an activist for political causes, including his 1980 campaign to make Martin Luther King Jr.'s birthday a holiday in the United States.[5] In 2009, Wonder was named a United Nations Messenger of Peace.[6] In 2013, Billboard magazine released a list of the Billboard Hot 100 All-Time Top Artists to celebrate the US singles chart's 55th anniversary, with Wonder at number six.[7]
Stevie Wonder was born in Saginaw, Michigan, in 1950, the third of six children of Calvin Judkins and Lula Mae Hardaway, a songwriter. He was born six weeks premature, which, along with the oxygen-rich atmosphere in the hospital incubator, resulted in retinopathy of prematurity (ROP), a condition in which the growth of the eyes is aborted and the retinas detach, leaving him blind.[3][8] When Wonder was four, his mother divorced his father and moved to Detroit with her children. She changed her name back to Lula Hardaway and later changed her son's surname to Morris, partly because of relatives. Wonder has retained Morris as his legal surname. He began playing instruments at an early age, including piano, harmonica and drums. He formed a singing partnership with a friend; calling themselves Stevie and John, they played on street corners, and occasionally at parties and dances.[9]
Wonder sang as a child in a choir at the Whitestone Baptist Church in Detroit, Michigan.[10]
In 1961, when aged 11, Wonder sang his own composition, "Lonely Boy", to Ronnie White of the Miracles;[11][12] White then took Wonder and his mother to an audition at Motown, where CEO Berry Gordy signed Wonder to Motown's Tamla label.[1] Before signing, producer Clarence Paul gave him the name Little Stevie Wonder.[3] Because of Wonder's age, the label drew up a rolling five-year contract in which royalties would be held in trust until Wonder was 21. He and his mother would be paid a weekly stipend to cover their expenses: Wonder received $2.50 (equivalent to $20.47 in 2017) per week, and a private tutor was provided when Wonder was on tour.[12]
Wonder was put in the care of producer and songwriter Clarence Paul, and for a year they worked together on two albums. Tribute to Uncle Ray was recorded first, when Wonder was still 11 years old. Consisting mainly of covers of Ray Charles's songs, it included a Wonder and Paul composition, "Sunset". The Jazz Soul of Little Stevie was recorded next, an instrumental album consisting mainly of Paul's compositions, two of which, "Wondering" and "Session Number 112", were co-written with Wonder.[13] Feeling Wonder was now ready, a song, "Mother Thank You", was recorded for release as a single, but then pulled and replaced by the Berry Gordy song "I Call It Pretty Music, But the Old People Call It the Blues" as his début single.[14] Released in the summer of 1962,[15] it almost broke into the Billboard Hot 100, spending one week of August at 101 before dropping out of sight.[16] Two follow-up singles, "Little Water Boy" and "Contract on Love", had no success either, and the two albums, released in reverse order of recording (The Jazz Soul of Little Stevie in September 1962 and Tribute to Uncle Ray in October 1962), also met with little success.[13][17]
At the end of 1962, when Wonder was 12 years old, he joined the Motortown Revue, touring the "chitlin' circuit" of theatres across America that accepted black artists. At the Regal Theater, Chicago, his 20-minute performance was recorded and released in May 1963 as the album Recorded Live: The 12 Year Old Genius.[13] A single, "Fingertips", from the album was also released in May, and became a major hit.[18] The song, featuring a confident and enthusiastic Wonder returning for a spontaneous encore that catches out the replacement bass player, who is heard to call out "What key? What key?",[18][19] was a No. 1 hit on the Billboard Hot 100 when Wonder was aged 13, making him the youngest artist ever to top the chart.[20] The single was simultaneously No. 1 on the R&B chart, the first time that had occurred.[21] His next few recordings, however, were not successful; his voice was changing as he got older, and some Motown executives were considering cancelling his recording contract.[21] During 1964, Wonder appeared in two films as himself, Muscle Beach Party and Bikini Beach, but these were not successful either.[22] Sylvia Moy persuaded label owner Berry Gordy to give Wonder another chance.[21] Dropping the "Little" from his name, Moy and Wonder worked together to create the hit "Uptight (Everything's Alright)",[21] and Wonder went on to have a number of other hits during the mid-1960s, including "With a Child's Heart", and "Blowin' in the Wind",[19] a Bob Dylan cover, co-sung by his mentor, producer Clarence Paul.[23] He also began to work in the Motown songwriting department, composing songs both for himself and his label mates, including "The Tears of a Clown", a No. 1 hit for Smokey Robinson and the Miracles (it was first released in 1967, mostly unnoticed as the last track of their Make It Happen LP, but eventually became a major success when re-released as a single in 1970, which prompted Robinson to reconsider his intention of leaving the group).[24]
In 1968 he recorded an album of instrumental soul/jazz tracks, mostly harmonica solos, under the title Eivets Rednow, which is "Stevie Wonder" spelled backwards.[25] The album failed to get much attention, and its only single, a cover of "Alfie", only reached number 66 on the U.S. Pop charts and number 11 on the US Adult Contemporary charts. Nonetheless, he managed to score several hits between 1968 and 1970 such as "I Was Made to Love Her",[23] "For Once in My Life" and "Signed, Sealed, Delivered I'm Yours". A number of Wonder's early hits, including "My Cherie Amour", "I Was Made to Love Her", and "Uptight (Everything's Alright)", were co-written with Henry Cosby.
In September 1970, at the age of 20, Wonder married Syreeta Wright, a songwriter and former Motown secretary. Wright and Wonder worked together on his next album, Where I'm Coming From, with Wonder writing the music and Wright helping with the lyrics.[26] They wanted to "touch on the social problems of the world", and for the lyrics "to mean something".[26] It was released at around the same time as Marvin Gaye's What's Going On. As both albums had similar ambitions and themes, they have been compared; in a contemporaneous review by Vince Aletti in Rolling Stone, Gaye's was seen as successful, while Wonder's was seen as failing due to "self-indulgent and cluttered" production, "undistinguished" and "pretentious" lyrics, and an overall lack of unity and flow.[27] Also in 1970, Wonder co-wrote, and played numerous instruments on, the hit "It's a Shame" for fellow Motown act the Spinners. His contribution was meant to be a showcase of his talent and thus a weapon in his ongoing negotiations with Gordy about creative autonomy.[28] Reaching his 21st birthday on May 13, 1971, he allowed his Motown contract to expire.[29]
During this period, Wonder independently recorded two albums and signed a new contract with Motown Records. The 120-page contract was a precedent at Motown and gave Wonder a much higher royalty rate.[30] Wonder returned to Motown in March 1972 with Music of My Mind. Unlike most previous albums on Motown, which usually consisted of a collection of singles, B-sides and covers, Music of My Mind was a full-length artistic statement with songs flowing together thematically.[30] Wonder's lyrics dealt with social, political, and mystical themes as well as standard romantic ones, while musically he began exploring overdubbing and recording most of the instrumental parts himself.[30] Music of My Mind marked the beginning of a long collaboration with Tonto's Expanding Head Band (Robert Margouleff and Malcolm Cecil).[31][32]
Released in late 1972, Talking Book featured the No. 1 hit "Superstition",[33] which is one of the most distinctive and famous examples of the sound of the Hohner Clavinet keyboard.[34] Talking Book also featured "You Are the Sunshine of My Life", which also peaked at No. 1. During the same time as the album's release, Wonder began touring with the Rolling Stones to alleviate the negative effects from pigeonholing as a result of being an R&B artist in America.[11] Wonder's touring with the Stones was also a factor behind the success of both "Superstition" and "You Are the Sunshine of My Life".[30][35] Between them, the two songs won three Grammy Awards.[36] On an episode of the children's television show Sesame Street that aired in April 1973,[37] Wonder and his band performed "Superstition", as well as an original called "Sesame Street Song", which demonstrated his abilities with television.
Innervisions, released in 1973, featured "Higher Ground" (No. 4 on the pop charts) as well as the trenchant "Living for the City" (No. 8).[33] Both songs reached No. 1 on the R&B charts. Popular ballads such as "Golden Lady" and "All in Love Is Fair" were also present, in a mixture of moods that nevertheless held together as a unified whole.[38] Innervisions generated three more Grammy Awards, including Album of the Year.[36] The album is ranked No. 23 on Rolling Stone's 500 Greatest Albums of All Time.[39] Wonder had become the most influential and acclaimed black musician of the early 1970s.[30]
On August 6, 1973, Wonder was in a serious automobile accident while on tour in North Carolina, when a car in which he was riding hit the back of a truck.[30][40] This left him in a coma for four days and resulted in a partial loss of his sense of smell and a temporary loss of his sense of taste.[41] Despite the setback, Wonder re-appeared for a European tour in early 1974, performing at the Midem convention in Cannes, at the Rainbow Theatre in London, and on the German television show Musikladen.[42] On his return from Europe, he played a sold-out concert at Madison Square Garden in March 1974, highlighting both up-tempo material and long, building improvisations on mid-tempo songs such as "Living for the City".[30] The album Fulfillingness' First Finale appeared in July 1974 and placed two hits high on the pop charts: the No. 1 "You Haven't Done Nothin'" and the Top Ten "Boogie on Reggae Woman". Album of the Year was again one of the three Grammys it won.[36]
The same year Wonder took part in a Los Angeles jam session that would become known as the bootleg album A Toot and a Snore in '74.[43][44] He also co-wrote and produced the Syreeta Wright album Stevie Wonder Presents: Syreeta.[45][46]
On October 4, 1975, Wonder performed at the historic "Wonder Dream Concert" in Kingston, Jamaica, a benefit for the Jamaican Institute for the Blind.[47]
By 1975, at age 25, Wonder had won two consecutive Grammy Awards for Album of the Year: in 1974 for Innervisions and in 1975 for Fulfillingness' First Finale.[48] In 1975, he played harmonica on two tracks on Billy Preston's album It's My Pleasure.
The double album-with-extra-EP Songs in the Key of Life was released in September 1976. Sprawling in style, unlimited in ambition, and sometimes lyrically difficult to fathom, the album was hard for some listeners to assimilate, yet it is regarded by many as Wonder's crowning achievement and one of the most recognizable and accomplished albums in pop music history.[30][33][49] The album became the first by an American artist to debut straight at No. 1 in the Billboard charts, where it remained for 14 non-consecutive weeks.[50] Two tracks became No. 1 Pop/R&B hits: "I Wish" and "Sir Duke". The baby-celebratory "Isn't She Lovely?" was written about his newborn daughter Aisha, while songs such as "Love's in Need of Love Today" and "Village Ghetto Land" reflected a far more pensive mood. Songs in the Key of Life won Album of the Year and two other Grammys.[36] The album ranks 57th on Rolling Stone's 500 Greatest Albums of All Time.[39]
Until 1979's Stevie Wonder's Journey Through "The Secret Life of Plants", his only release was the retrospective three-disc album Looking Back, an anthology of his early Motown period.
The 1980s saw Wonder achieve his biggest hits and highest level of fame; he had increased album sales, charity participation, high-profile collaborations, political impact, and television appearances. The 1979 mainly instrumental soundtrack album Stevie Wonder's Journey Through "The Secret Life of Plants" was composed using an early music sampler, the Computer Music Melodian.[51] Wonder toured briefly in support of the album, and used a Fairlight CMI sampler on stage.[52] Around this time Wonder also wrote and produced the dance hit "Let's Get Serious", performed by Jermaine Jackson and ranked by Billboard as the No. 1 R&B single of 1980.
Hotter than July (1980) became Wonder's first platinum-selling single album, and its single "Happy Birthday" was a successful vehicle for his campaign to establish Dr. Martin Luther King's birthday as a national holiday. The album also included "Master Blaster (Jammin')", "I Ain't Gonna Stand for It", and the sentimental ballad "Lately".
In 1982, Wonder released a retrospective of his 1970s work with Stevie Wonder's Original Musiquarium, which included four new songs: the ten-minute funk classic "Do I Do" (which featured Dizzy Gillespie), "That Girl" (one of the year's biggest singles to chart on the R&B side), "Front Line", a narrative about a soldier in the Vietnam War that Wonder wrote and sang in the first person, and "Ribbon in the Sky", one of his many classic compositions. He also gained a No. 1 hit that year in collaboration with Paul McCartney in their paean to racial harmony, "Ebony and Ivory".
In 1983, Wonder performed the song "Stay Gold", the theme to Francis Ford Coppola's film adaptation of S. E. Hinton's novel The Outsiders; Wonder wrote the lyrics. That year he also scheduled an album entitled People Work, Human Play, but it never surfaced; instead, 1984 saw the release of Wonder's soundtrack album for The Woman in Red. The lead single, "I Just Called to Say I Love You", was a No. 1 pop and R&B hit in both the United States and the United Kingdom, where it was placed 13th in the list of best-selling singles in the UK published in 2002, and it was a hit in many other countries as well. It went on to win an Academy Award for Best Original Song in 1985. Wonder accepted the award in the name of Nelson Mandela and was subsequently banned from all South African radio by the Government of South Africa.[53] That same year (1985), on the occasion of his 35th birthday, Wonder was honored by the United Nations Special Committee Against Apartheid for his stance against racism in South Africa.[54] The album also featured a guest appearance by Dionne Warwick, singing the duet "It's You" with Stevie and a few songs of her own. Following the success of the album and its lead single, Wonder made an appearance on The Cosby Show, in the episode "A Touch of Wonder", where he demonstrated his ability to sample. The following year's In Square Circle featured the No. 1 pop hit "Part-Time Lover" and a Top 10 hit, "Go Home". It also featured the ballad "Overjoyed", which had originally been written for Journey Through "The Secret Life of Plants" but did not make that album; he performed "Overjoyed" on Saturday Night Live when he was the host. He was also featured in Chaka Khan's cover of Prince's "I Feel for You", alongside Melle Mel, playing his signature harmonica, and in roughly the same period played harmonica on Eurythmics' single "There Must Be an Angel (Playing with My Heart)" and Elton John's "I Guess That's Why They Call It the Blues".
Wonder was in a featured duet with Bruce Springsteen on the all-star charity single for African Famine Relief, "We Are the World", and he was part of another charity single the following year (1986), the AIDS-inspired "That's What Friends Are For". He played harmonica on the album Dreamland Express by John Denver in the song "If Ever", a song Wonder co-wrote with Stephanie Andrews; wrote the track "I Do Love You" for the Beach Boys' 1985 self-titled album; and played harmonica on "Can't Help Lovin' That Man" on The Broadway Album by Barbra Streisand. In 1987, Wonder appeared on Michael Jackson's Bad album, on the duet "Just Good Friends". Michael Jackson also sang a duet with him entitled "Get It" on Wonder's 1987 album Characters. This was a minor hit single, as were "Skeletons" and "You Will Know".
After 1987's Characters album, Wonder continued to release new material, but at a slower pace. He recorded a soundtrack album for Spike Lee's film Jungle Fever in 1991. From this album, singles and videos were released for "Gotta Have You", "Fun Day" (remix only), "These Three Words" and "Jungle Fever". The B-side to the "Gotta Have You" single was "Feeding Off the Love of the Land", which was played during the end credits of the movie Jungle Fever but was not included on the soundtrack. A piano and vocal version of "Feeding Off the Love of the Land" was also released on the Nobody's Child: Romanian Angel Appeal compilation. Conversation Peace and the live album Natural Wonder were released in the 1990s.[55]
Among his other activities he played harmonica on one track for the 1994 tribute album Kiss My Ass: Classic Kiss Regrooved;[56] sang at the 1996 Summer Olympics closing ceremony;[57] collaborated in 1997 with Babyface on "How Come, How Long", a song about domestic violence that was nominated for a Grammy award;[58] and played harmonica on Sting's 1999 "Brand New Day".[59] In December 1999, Wonder announced that he was interested in pursuing an intraocular retinal prosthesis to partially restore his sight.[60]
Into the 21st century, Wonder has continued to record and perform. Though mainly making occasional appearances and guest performances, he undertook two tours and released one album of new material, 2005's A Time to Love. His key appearances include performing at the opening ceremony of the 2002 Winter Paralympics in Salt Lake City,[61] the 2005 Live 8 concert in Philadelphia,[62] the pre-game show for Super Bowl XL in 2006, the Obama Inaugural Celebration in 2009, and the opening ceremony of the 2011 Special Olympics World Summer Games in Athens, Greece.[63]
He sang at the Michael Jackson memorial service in 2009,[64] at Etta James's funeral in 2012,[65] and, a month later, at Whitney Houston's memorial service.[66]
Wonder's first new album in ten years, A Time to Love, was released in October 2005 to lower sales than previous albums and lukewarm reviews; most reviewers appeared frustrated that, after the long delay, the album mainly copied the style of Wonder's "classic period" without doing anything new.[67] The first single, "So What the Fuss", was released in April. A second single, "From the Bottom of My Heart", was a hit on adult-contemporary R&B radio. The album also featured a duet with India Arie on the title track, "A Time to Love". By June 2008, Wonder was working on two projects simultaneously: a new album called The Gospel Inspired By Lula, which would deal with the various spiritual and cultural crises facing the world, and Through The Eyes Of Wonder, an album he described as a performance piece that would reflect his experience as a blind man. Wonder was also keeping the door open for a collaboration with Tony Bennett and Quincy Jones concerning a rumored jazz album.[68] If Wonder were to join forces with Bennett, it would not be the first time; their rendition of "For Once in My Life" earned them a Grammy for best pop collaboration with vocals in 2006.[36] Wonder's harmonica playing can be heard on the 2009 Grammy-nominated "Never Give You Up", featuring CJ Hilton and Raphael Saadiq.[69]
In October 2013, Wonder revealed that he had been recording new material for two albums, When the World Began and Ten Billion Hearts, in collaboration with producer David Foster, with both albums slated for release in 2014.[70] He is also featured on two tracks on Mark Ronson's 2015 album Uptown Special.
Wonder did a 13-date tour of North America in 2007, starting in San Diego on August 23; this was his first U.S. tour in over ten years.[71] On September 8, 2008, Wonder started the European leg of his Wonder Summer's Night Tour, the first time he had toured Europe in over a decade, opening at the National Indoor Arena in Birmingham. During the tour, Wonder played eight UK gigs: four at the O2 Arena in London (filmed in HD and subsequently released on DVD and Blu-ray as Live at Last[72]), two in Birmingham and two at the M.E.N. Arena in Manchester. The tour's European leg also found him performing in the Netherlands (Rotterdam), Sweden (Stockholm), Germany (Cologne, Mannheim and Munich), Norway (Hamar), France (Paris), Italy (Milan) and Denmark (Aalborg). Wonder also toured Australia (Perth, Adelaide, Melbourne, Sydney and Brisbane) and New Zealand (Christchurch, Auckland and New Plymouth) in October and November.[73] His 2010 tour included a two-hour set at the Bonnaroo Music Festival in Manchester, Tennessee, a stop at London's Hard Rock Calling in Hyde Park, and appearances at England's Glastonbury Festival, Rotterdam's North Sea Jazz Festival, a concert in Bergen, Norway, and a concert at Dublin's O2 Arena on June 24.[73]
In 2000, Wonder contributed two new songs ("Misrepresented People" and "Some Years Ago") to the soundtrack album for Spike Lee's Bamboozled.[74] In June 2006, Wonder made a guest appearance on Busta Rhymes' album The Big Bang, on the track "Been through the Storm"; he sings the refrain and plays piano on the Dr. Dre and Sha Money XL-produced track. He appeared again on the last track of Snoop Dogg's album Tha Blue Carpet Treatment, "Conversations", a remake of "Have a Talk with God" from Songs in the Key of Life. In 2006, Wonder recorded a duet with Andrea Bocelli on the latter's album Amore, offering harmonica and additional vocals on "Canzoni Stonate". Wonder also performed at Washington, D.C.'s 2006 "A Capitol Fourth" celebration. He appeared on Celine Dion's studio album Loved Me Back to Life, released in October 2013, performing a cover of his 1985 song "Overjoyed".[75]
A prominent figure in popular music during the latter half of the 20th century, Wonder has recorded more than 30 U.S. top ten hits and won 25 Grammy Awards[36] (the most ever won by a solo artist) as well as a Lifetime Achievement Award. He has also won an Academy Award for Best Song,[76] and been inducted into both the Rock and Roll[77] and Songwriters[78] halls of fame. He has also been awarded the Polar Music Prize.[79] American music magazine Rolling Stone named him the ninth greatest singer of all time.[80][81] In June 2009 he became the fourth artist to receive the Montreal Jazz Festival Spirit Award.[82]
He has had ten U.S. number-one hits on the pop charts as well as 20 R&B number-one hits, and has sold over 100 million records, 19.5 million of which are albums;[83] he is one of the top 60 best-selling music artists by combined sales of singles and albums.[4] Wonder has recorded several critically acclaimed albums and hit singles, and has written and produced songs for many of his label mates and outside artists as well. Wonder plays the piano, synthesizer, harmonica, congas, drums, bongos, organ, melodica and Clavinet. In his childhood, he was best known for his harmonica work, but today he is better known for his keyboard skills and vocal ability. Wonder was the first Motown artist and second African-American musician to win an Academy Award for Best Original Song, which he won for his 1984 hit single "I Just Called to Say I Love You" from the movie The Woman in Red.
Wonder's "classic period" is generally agreed to be between 1972 and 1977.[84][85][86] Some observers see in 1971's Where I'm Coming From certain indications of the beginning of the classic period, such as its new funky keyboard style which Wonder used throughout the classic period.[86] Some determine Wonder's first "classic" album to be 1972's Music of My Mind, on which he attained personal control of production, and on which he programmed a series of songs integrated with one another to make a concept album.[86] Others skip over early 1972 and determine the beginning of the classic period to be Talking Book in late 1972,[87] the album in which Wonder "hit his stride".[86]
His classic 1970s albums were very influential on the music world: the 1983 Rolling Stone Record Guide said they "pioneered stylistic approaches that helped to determine the shape of pop music for the next decade";[33] Rolling Stone's 2003 list of the 500 Greatest Albums of All Time included four of the five albums, with three in the top 90;[39] and in 2005, Kanye West said of his own work, "I'm not trying to compete with what's out there now. I'm really trying to compete with Innervisions and Songs in the Key of Life. It sounds musically blasphemous to say something like that, but why not set that as your bar?"[88]
Wonder has been married three times. He was married to Motown singer-songwriter and frequent collaborator Syreeta Wright from 1970 until their amicable divorce in 1972. From 2001 until 2012 he was married to fashion designer Kai Millard.[89] In October 2009, Wonder and Millard separated; Wonder filed for divorce in August 2012.[90] In 2017 he married Tomeeka Bracy.[91]
Wonder has nine children by five different women.[92] The mother of Wonder's first child is Yolanda Simmons, whom Wonder met when she applied for a job as secretary for his publishing company.[93] Simmons gave birth to Wonder's daughter Aisha Morris on February 2, 1975.[94][95] After Aisha was born, Wonder said "she was the one thing that I needed in my life and in my music for a long time."[93] Aisha was the inspiration for Wonder's hit single "Isn't She Lovely". She is a singer who has toured with her father and accompanied him on recordings, including his 2005 album, A Time to Love. Wonder and Simmons had a son, Keita, in 1977.[96]
In 1983 Wonder had a son with Melody McCulley, Mumtaz Morris.[97] Wonder has a daughter, Sophia, and a son, Kwame, with a woman whose identity has not been publicly disclosed.[96]
Wonder has two sons with second wife Kai Millard Morris; the elder is named Kailand and he occasionally performs as a drummer on stage with his father. The younger son, Mandla Kadjay Carl Stevland Morris, was born on May 13, 2005, his father's 55th birthday.[89]
Wonder's ninth child, his second with Tomeeka Robyn Bracy, was born in December 2014, amid rumors that Wonder would be the father of triplets.[98] This turned out not to be the case, and the couple's new daughter was given the name Nia,[99] meaning "purpose", one of the seven principles of Kwanzaa.[98] The name of Wonder's first child with Bracy is unknown.
In May 2006, Wonder's mother Lula Mae Hardaway died in Los Angeles at the age of 76. During his September 8, 2008, UK concert in Birmingham, he spoke of his decision to begin touring again following his loss: "I want to take all the pain that I feel and celebrate and turn it around."[100]
Wonder was introduced to Transcendental Meditation through his marriage to Syreeta Wright.[101] Consistent with that spiritual vision, Wonder became vegetarian, and later a vegan, singing about it on The Late Late Show with James Corden during the show's "Carpool Karaoke" segment.[102][103][104]
Wonder joined Twitter on April 4, 2018; his first tweet was a five-minute video honoring Martin Luther King Jr. titled "The Dream Still Lives", featuring dozens of famous personalities, each sharing a dream in a callback to King's famous 1963 speech. The tweet was widely shared, and in it Wonder encouraged viewers to post their own videos about their dreams with the hashtag #DreamStillLives.[105]
Stevie Wonder has been a longtime Baptist affiliated with Black churches.[106][107][108]
Wonder has won 25 Grammy Awards,[36] as well as a Grammy Lifetime Achievement Award in 1996.[109] He and Frank Sinatra are the only artists to have won the Grammy for Album of the Year three times as the main credited artist.
Wonder has been given a range of awards for his music and for his civil rights work, including induction into the Songwriters and Rock and Roll halls of fame, a Lifetime Achievement Award from the National Civil Rights Museum, designation as one of the United Nations Messengers of Peace, and the Presidential Medal of Freedom, awarded by President Barack Obama in 2014.
In December 2016, the City of Detroit recognized Wonder's legacy by renaming a portion of his childhood street, Milwaukee Avenue West, between Woodward Avenue and Brush Street, as "Stevie Wonder Avenue". He was also awarded an honorary key to the city, presented by Mayor Mike Duggan.[111]
Stevie Wonder has received many honorary degrees in recognition of his music career. These include:
Who performed the first successful open heart surgery?
Daniel Hale Williams🚨Daniel Hale Williams (January 18, 1856[1] – August 4, 1931) was an African-American general surgeon who, in 1893, performed the second documented successful pericardium surgery in the United States to repair a wound.[2][3][4][5] He founded Chicago's Provident Hospital, the first non-segregated hospital in the United States, and also founded an associated nursing school for African Americans.
The heart surgery at Provident, which the patient survived for a further twenty years, is referred to as "the first successful heart surgery" by Encyclopedia Britannica.[6][7] In 1913, Williams was elected as the only African-American charter member of the American College of Surgeons.[6]
At the time that Williams graduated from medical school, black doctors were not allowed to work in Chicago hospitals. As a result, in 1891, Williams founded the Provident Hospital and training school for nurses in Chicago. This was established mostly for the benefit of African-American residents, to increase their accessibility to health care.[8]
In 1893, Williams became the first African American on record to have successfully performed pericardium surgery to repair a wound. He was not, however, the first surgeon to do so: on September 6, 1891,[3][4] Henry Dalton had successfully performed pericardium surgery to repair a wound, with the patient fully recovering.[9] Earlier successful surgeries to drain the pericardium, by performing a pericardiostomy, had been done by Francisco Romero in 1801[10] and Dominique Jean Larrey in 1810.[11]
On July 10, 1893, Williams repaired the torn pericardium of a knife wound patient, James Cornish.[3] Cornish, who was stabbed directly through the left fifth costal cartilage,[3] had been admitted the previous night. Williams decided to operate the next morning in response to continued bleeding, cough and "pronounced" symptoms of shock.[3] He performed this surgery, without the benefit of penicillin or blood transfusion, at Provident Hospital, Chicago.[12] It was not reported until 1897.[3] He undertook a second procedure to drain fluid. About fifty days after the initial procedure, Cornish left the hospital.[8]
In 1893, during the administration of President Grover Cleveland, Williams was appointed surgeon-in-chief of Freedman's Hospital in Washington, D.C., a post he held until 1898. That year he married Alice Johnson, who was born in the city and graduated from Howard University, and moved back to Chicago. In addition to organizing Provident Hospital, Williams also established a training school for African-American nurses at the facility.
Williams was a Professor of Clinical Surgery at Meharry Medical College in Nashville, Tennessee and was an attending surgeon at Cook County Hospital in Chicago. He worked to create more hospitals that admitted African Americans. In 1895 he co-founded the National Medical Association for African American doctors, and in 1913 he became a charter member and the only African-American doctor in the American College of Surgeons.
Daniel Hale Williams was born in 1856 and raised in Hollidaysburg, Pennsylvania. His father, Daniel Hale Williams Jr., was the son of a black barber and a Scots-Irish woman.[13] His mother was African American and likely also of mixed race.
The fifth of seven children, Williams lived with his parents, a brother and five sisters. The family eventually moved to Annapolis, Maryland. Shortly afterward, when Williams was nine, his father died of tuberculosis.[14] Williams' mother realized she could not manage the entire family and sent some of the children to live with relatives. Williams was apprenticed to a shoemaker in Baltimore, Maryland, but ran away to join his mother, who had moved to Rockford, Illinois. He later moved to Edgerton, Wisconsin, where he joined his sister and opened his own barber shop. After moving to nearby Janesville, Wisconsin, Williams became fascinated by the work of a local physician and decided to follow his path.
He began working as an apprentice to Dr. Henry W. Palmer, studying with him for two years. In 1880 Williams entered Chicago Medical College, now known as Northwestern University Medical School. After graduating from Northwestern in 1883, he opened his own medical office in Chicago, Illinois.[15]
Williams was married in 1898 to Alice Johnson, natural daughter of American sculptor Moses Jacob Ezekiel and a mixed-race maid.[16] Williams died of a stroke in Idlewild, Michigan on August 4, 1931. His wife, Alice Johnson, had died in 1924.[8]
In the 1890s several attempts were made to advance cardiac surgery. On September 6, 1891, the first successful pericardial sac repair operation in the United States was performed by Henry C. Dalton of Saint Louis, Missouri.[17] The first successful surgery on the heart itself was performed by Norwegian surgeon Axel Cappelen on September 4, 1895, at Rikshospitalet in Kristiania, now Oslo.[18][19] The first successful heart surgery performed without any complications was by Dr. Ludwig Rehn of Frankfurt, Germany, who repaired a stab wound to the right ventricle on September 7, 1896.[20][21] Despite these advances, heart surgery was not widely accepted in medical science until World War II, when surgeons were forced to improve their methods in order to repair severe war wounds.[22] Although they did not receive early recognition for their pioneering work, Dalton and Williams were later recognised for their roles in cardiac surgery.[22]
Williams received honorary degrees from Howard and Wilberforce Universities, was named a charter member of the American College of Surgeons, and was a member of the Chicago Surgical Society.
A Pennsylvania State Historical Marker was placed at U.S. Route 22 eastbound (Blair St., 300 block), Hollidaysburg, Pennsylvania, to commemorate his accomplishments and mark his boyhood home.[23]
What is the main purpose of mvc architecture?
to separate internal representations of information from the ways information is presented to and accepted from the user🚨Model–view–controller (MVC) is commonly used for developing software that divides an application into three interconnected parts. This is done to separate internal representations of information from the ways information is presented to and accepted from the user.[1][2] The MVC design pattern decouples these major components, allowing for efficient code reuse and parallel development.
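As a rough sketch of this separation (a minimal, framework-free example in Python; every class and method name here is invented for illustration, not taken from any particular MVC library):

```python
# Minimal MVC sketch: the model owns the data, the view presents it,
# and the controller translates user input into model updates.
# All names are hypothetical, for illustration only.

class CounterModel:
    """Model: holds application state; knows nothing about display."""
    def __init__(self):
        self.count = 0

    def increment(self):
        self.count += 1


class CounterView:
    """View: renders the model's state; contains no business logic."""
    def render(self, model):
        print(f"Count is now: {model.count}")


class CounterController:
    """Controller: accepts user input, updates the model, and asks
    the view to re-render the new state."""
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def handle_click(self):
        self.model.increment()
        self.view.render(self.model)


if __name__ == "__main__":
    controller = CounterController(CounterModel(), CounterView())
    controller.handle_click()  # prints: Count is now: 1
    controller.handle_click()  # prints: Count is now: 2
```

Because the view reads state only through the model and the model never touches presentation, either side can be swapped out without changing the other, which is the decoupling the pattern is after.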
Traditionally used for desktop graphical user interfaces (GUIs), this architecture has become popular for designing web applications and even mobile, desktop and other clients.[3] Popular programming languages such as Java, C#, Ruby, PHP and Python have MVC frameworks that are used in web application development straight out of the box.
As with other software patterns, MVC expresses the "core of the solution" to a problem while allowing it to be adapted for each system.[4] Particular MVC architectures can vary significantly from the traditional description here.[5]
In addition to dividing the application into three kinds of components, the model–view–controller design defines the interactions between them.[8]
One of the seminal insights in the early development of graphical user interfaces, MVC became one of the first approaches to describe and implement software constructs in terms of their responsibilities.[9]
Trygve Reenskaug introduced MVC into Smalltalk-76 while visiting the Xerox Palo Alto Research Center (PARC)[10][11] in the 1970s. In the 1980s, Jim Althoff and others implemented a version of MVC for the Smalltalk-80 class library. Only later did a 1988 article in The Journal of Object Technology (JOT) express MVC as a general concept.[12]
The MVC pattern has subsequently evolved,[13] giving rise to variants such as hierarchical model–view–controller (HMVC), model–view–adapter (MVA), model–view–presenter (MVP), model–view–viewmodel (MVVM), and others that adapted MVC to different contexts.
The use of the MVC pattern in web applications exploded in popularity after the introduction of NeXT's WebObjects in 1996, which was originally written in Objective-C (that borrowed heavily from Smalltalk) and helped enforce MVC principles. Later, the MVC pattern became popular with Java developers when WebObjects was ported to Java. Later frameworks for Java, such as Spring (released in October 2002), continued the strong bond between Java and MVC. The introduction of the frameworks Django (July 2005, for Python) and Rails (December 2005, for Ruby), both of which had a strong emphasis on rapid deployment, increased MVC's popularity outside the traditional enterprise environment in which it has long been popular. MVC web frameworks now hold large market-shares relative to non-MVC web toolkits.[14]
Although originally developed for desktop computing, MVC has been widely adopted as an architecture for World Wide Web applications in major programming languages. Several web frameworks have been created that enforce the pattern. These software frameworks vary in their interpretations, mainly in the way that the MVC responsibilities are divided between the client and server.[15]
Some web MVC frameworks take a thin client approach that places almost the entire model, view and controller logic on the server. This is reflected in frameworks such as Django, Rails and ASP.NET MVC. In this approach, the client sends either hyperlink requests or form submissions to the controller and then receives a complete and updated web page (or other document) from the view; the model exists entirely on the server.[15] Other frameworks such as AngularJS, EmberJS, JavaScriptMVC and Backbone allow the MVC components to execute partly on the client (also see Ajax).[citation needed]
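For the thin-client variant, a hedged sketch of a server-side controller in the style of Django follows; the Article model, the URL parameter and the template path are hypothetical, invented for this example rather than taken from any real project:

```python
# views.py -- a server-side "controller" in Django's MVC-style layout.
# The Article model and the template path are invented for this sketch.
from django.shortcuts import render, get_object_or_404

from .models import Article  # the model lives entirely on the server


def article_detail(request, article_id):
    # Controller: receive the client's request, consult the model,
    # and return a complete page rendered by the view (the template).
    article = get_object_or_404(Article, pk=article_id)
    return render(request, "articles/detail.html", {"article": article})
```

The client's only job in this arrangement is to follow links and submit forms; every round trip returns a fully rendered document, which is what distinguishes the thin-client approach from the partly client-side frameworks mentioned above.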
Because MVC decouples the various components of an application, developers are able to work in parallel on different components without impacting or blocking one another. For example, a team might divide their developers between the front-end and the back-end. The back-end developers can design the structure of the data and how the user interacts with it without requiring the user interface to be completed. Conversely, the front-end developers are able to design and test the layout of the application prior to the data structure being available.
By creating components that are independent of each other, developers are able to reuse components quickly and easily in other applications. The same (or a similar) view for one application can be refactored for another application with different data, because the view simply handles how the data is displayed to the user, as in the brief sketch below.
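As a minimal, hypothetical illustration of this kind of reuse (the class and the data here are invented for the example, continuing in Python):

```python
# A reusable view: it formats any iterable of items for display and
# knows nothing about where the data comes from or what it means.
# All names here are hypothetical, invented for illustration.
class ListView:
    def render(self, title, items):
        print(title)
        for item in items:
            print(f" - {item}")


view = ListView()
# The same view component serves two very different "applications":
view.render("Products", ["keyboard", "monitor"])   # a storefront's data
view.render("Patients", ["A. Smith", "B. Jones"])  # a clinic's data
```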
What species of plant can be found in the great thar desert?
ronj, palm trees, ber, dhok🚨Desert National Park, Rajasthan, India, is situated in the west Indian state of Rajasthan near the town of Jaisalmer. It is one of the largest national parks in India, covering an area of 3,162 km². The Desert National Park is an excellent example of the ecosystem of the Thar Desert. Sand dunes form around 20% of the park; the major landforms consist of craggy rocks, compact salt lake bottoms, intermedial areas and fixed dunes.
Despite a fragile ecosystem, there is an abundance of birdlife; the region is a haven for migratory and resident desert birds. Many eagles, harriers, falcons, buzzards, kestrels and vultures can be seen, with short-toed eagles, tawny eagles, spotted eagles, laggar falcons and kestrels the most common among these. Sandgrouse are spotted near small ponds or lakes. The endangered great Indian bustard, a magnificent bird found here in relatively fair numbers, migrates locally in different seasons. The most suitable time to visit the area is between November and January. The Desert National Park also holds a collection of animal and plant fossils around 180 million years old; some dinosaur fossils, said to be 6 million years old, have been found in the area.[1]
The park lies in the Jaisalmer and Barmer districts of the Indian state of Rajasthan.
The blackbuck is a common antelope of this region. The national park's other notable inhabitants are the desert fox, wolf and desert cat. Birdlife in this sandy habitat is vivid and spectacular. Birds such as sandgrouse, partridges, bee-eaters, larks, and shrikes are commonly seen. In the winter, the birdlife is augmented by species such as the demoiselle crane and MacQueen's bustard.
Perhaps the greatest attraction of the park is a bird called the great Indian bustard, an endangered species found only in India. Desert National Park is one of the last sites in which this species can be found in good numbers. As such, the species draws in thousands of birdwatchers from all over the world. In addition to the great Indian bustard, the park supports a variety of other birds of interest to birdwatchers and conservationists alike.
The Thar Desert, often called an 'ocean of sand', covers a large area of western Rajasthan. Its fragile ecosystem supports a unique and varied wildlife, and within this vast expanse of sand lies the famous Desert National Park, which showcases the desert's diverse wildlife.
The vegetation is sparse, and patches of sewan grass and aak shrub (Calotropis) can be seen. The landscape includes craggy rocks and compact salt lake bottoms, as well as intermediate areas and both fixed and shifting dunes. Around 20 percent of the vast expanse is covered with sand dunes.
Flora: ronj, palm trees, ber, dhok.
Mammals: desert fox, Bengal fox, desert cat, wolf, hedgehog, blackbuck and chinkara.
Reptiles: spiny-tailed lizard, monitor lizard, saw-scaled viper, Russell's viper, common krait.
Avifauna: sandgrouse, partridges, bee-eaters, larks and shrikes are year-round residents, while demoiselle crane and houbara bustard arrive in winter. Raptors include tawny and steppe eagles, long-legged and honey buzzards, and falcons.[2]
Great Indian bustard: The endangered great Indian bustard is the major attraction of Desert National Park. Brown and white in colour, it stands a metre tall, with long bare legs and a long neck. This tall, graceful ground-dwelling bird can be spotted near the Sudashri waterhole.
Sam Sand Dunes: These dunes lie within the Thar Desert, near Jaisalmer.
Gadsisar Sagar Tank: This tank is among the popular tourist spots in Jaisalmer, Rajasthan; thousands of migratory birds visit it every year.
Jeep safaris enable tourists to explore a wider area of the park in a relatively short span of time.