USA & CANADA
Latest News
President Joe Biden, 80, takes a tumble on the stage during Air Force Academy graduation (Video)
Sunday, 04 June 2023 01:27 Written by lindaikeji. US President Joe Biden, 80, took a tumble on the stage during the Air Force Academy graduation on Thursday, June 1. Afterwards, he sat back down and appeared to be fine. White House communications director Ben LaBolt later explained on Twitter that the mishap occurred because there was a sandbag on the stage.
The fall happened during the U.S. Air Force Academy graduation ceremony in Colorado Springs, where Biden delivered the commencement address. Biden was helped back up and pointed out that he had tripped over something.
As governments shirk their responsibilities, non-profits are more important than ever
Wednesday, 31 May 2023 08:08 Written by theconversation. Kevin Gosine, Brock University; Darlene Ciuffetelli Parker, Brock University; and Tiffany L. Gallagher, Brock University
You’ve likely walked past that non-profit youth centre or literacy program in your neighbourhood countless times. You’ve probably never needed to make use of it and never given it a second thought.
But on your next stroll, take a moment to consider the work that organization does, the challenges it faces and the vast benefits it brings to your community.
In an age of proliferating social troubles and government retreat, Canadians must be aware of the critical role played by the non-profit sector.
Recent decades have seen the welfare state withdraw in favour of free-market principles. In a neoliberal era, where profitability is prioritized over social duty, all orders of government in Canada have offloaded much of the responsibility for providing social services onto non-profits.
Importance of social connections
As non-profits have become saddled with more obligations, they are handcuffed by limited funding. Long-term funding arrangements between governments and non-profits have been replaced by provisional and competitive funding. While non-profits are expected to do significantly more, they are left to cope with far fewer resources.
This has serious implications for the long-term well-being of communities, especially those already marginalized and under-served.
Not only are non-profits now providing critical services and social supports for which the state previously took responsibility, they are also settings where vital forms of social capital are produced.
Social capital refers to networks of trust, belonging and support developed among people within a given community (bonding social capital), and between people who identify with different communities or social groups (bridging social capital). Social capital enables people to work together toward mutual well-being and goal attainment.
Social capital doesn’t just happen
Communities must find ways to create worthwhile forms of social capital. And that’s where non-profit organizations can fill a gap. However, constantly scrambling for money leaves these organizations little time, resources and capacity to provide programming that fosters social capital.
Our research on community literacy organizations illuminated the role of non-profit organizations in helping people cultivate social capital. We conducted interviews and focus groups with program leads, staff and service users at eight non-profit organizations in southern Ontario to learn how they support literacy in their communities.
We found that producing social capital enabled them to serve communities in ways that transcended their primary mandates.
It is unrealistic to expect people to build social capital on their own, devoid of enabling social infrastructure. The challenge of creating meaningful social connections is daunting, especially as society becomes increasingly individualistic. Religion — once a stalwart source of community — continues to decline and technology is rapidly displacing face-to-face human interaction. Urban planners and community stakeholders need to provide the settings and opportunities for people to come together, connect and collaborate.
We found that non-profit community programs serve as settings where people from marginalized backgrounds can build beneficial forms of social capital. Such local initiatives provided individuals with recurrent and predictable channels to interact, share lived experiences and work together.
For example, mothers of children with disabilities participated in self-help groups where they shared their experiences, exchanged information and generally supported one another. Civic projects, such as a community garden started at one organization, brought together residents, young and old.
Non-profit programs provide people with opportunities to interact with different community members and forge meaningful interactions with people outside their social group that mitigate prejudice and foster trust and understanding.
Over the course of our research, we saw what started as bridging social capital strengthen into bonding ties between program participants, and in many cases between program users, staff and volunteers. The significance of these bonds was powerfully conveyed by one participant who took part in our study:
… what I take away from this group [is] that there are good people still left in a world that’s so scary, and people that are there to support. And whether I’m here or not, they’re always willing to help somebody else that’s in need. And… knowing that the option of … being there and the people that come together for this group–it’s really incredible to know that you have somebody.
The programs we studied connected individuals to new people, organizations, supports and resources and provided ongoing opportunities to build bridging social capital.
While the primary purpose of the non-profit organizations was to improve literacy, these programs accomplished much more. By providing a judgement-free safe space where participants had opportunities to share and collaborate, these organizations fostered social capital within communities.
The community organizations we studied had recently lost their primary funding provided by a regional anti-poverty program. Program leads and staff remained committed to supporting service users but struggled to do so given the need to devote more time and resources to addressing funding insecurities.
Benefits of social capital
When social capital is actively fostered, social trust is elevated. Research has demonstrated that the more non-profit organizations there are in a community, the lower the crime rate. Non-profit organizations help to lessen crime by enhancing levels of social capital and trust and expanding opportunities and hope.
Strengthening people’s social and organizational ties broadens their horizons and improves their well-being. Non-profits play a crucial role in fostering and sustaining such social capital.
If governments expect communities to be viable and fend for themselves amid diminishing public support, local non-profits cannot be relegated to financial precarity. By starving the non-profit sector, governments are ironically undermining the capacity of communities to live up to the neoliberal ideals of self-reliance and local resourcefulness.
Kevin Gosine, Associate Professor, Department of Sociology, Brock University; Darlene Ciuffetelli Parker, Professor, Department of Educational Studies; Director, Teacher Education, Brock University, and Tiffany L. Gallagher, Professor, Department of Educational Studies and Director, Brock Learning Lab, Brock University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Dismay over King Charles’s coronation raises questions about Canada’s ties to the monarchy
Monday, 15 May 2023 04:45 Written by theconversation. Jeffrey B. Meyers, Kwantlen Polytechnic University
The coronation of King Charles was a cringe-inducing display of white European hereditary privilege and ostentation that angered many, both in the United Kingdom and the Commonwealth.
That anger, or in some cases simple apathy or collective eye-rolling, should not be ignored because the monarchy and the Crown are not merely symbols, they’re a massive expense.
The cost of the coronation to the British taxpayer has been estimated at £100 million (almost $170 million in Canadian dollars) — extremely costly in a post-Brexit period of economic uncertainty and decline for the U.K..
Some of the vast private wealth and land holdings of the Royal Family are also connected directly to England’s role in colonization and the slave trade.
Despite all this, the monarch remains the head of state for many Commonwealth countries, including Canada.
The U.S. style of republicanism
While the American experiment in republicanism isn’t looking especially good at the moment amid the shambles left by Donald Trump’s presidency, the country’s founders were correct in recognizing that democratic legitimacy and monarchical power cannot be easily reconciled.
In fact, their biggest mistake and that of subsequent generations may simply have been to permit the presidency to retain elements of absolute or unfettered power in the form of executive privilege.
From George W. Bush’s disastrous war on terror to the Trump administration’s outright repudiation of democratic norms, recent presidents have not hesitated to behave like kings.
In Canada, we can draw on the lessons of both the United States and the U.K.: avoid idealizing a republic with a powerful president, while acknowledging that a traditional monarchy, even a purely symbolic or constitutional one, is no alternative.
As I have argued before, each Commonwealth nation would have different legislative and constitutional processes to follow to sever ties with the British monarchy. Canada’s in particular would be complex and difficult, but not necessarily impossible.
It would require unanimous consent of all provincial legislatures and the federal Parliament. In practice, this would probably not be possible without referendums in each province. Because of this, some leading constitutional lawyers in Canada regard the question as a non-starter.
But if Canadians aren’t careful, they may one day find that events in the U.K. make the decision for us.
Here’s how.
Different political systems
Suppose current British demographic trends and polling data pan out and a decade or two from now a younger, more diverse British population loses patience with the monarchy.
Like Canada, the U.K. has a constitution, and the monarchy is essential to it. But unlike Canada’s, the U.K.’s constitution is largely unwritten. Changing the British Constitution can, at least theoretically, be done by an ordinary act of Parliament, without the complexity of co-ordinating a federal Parliament and 10 provincial legislatures.
Another difference? The U.K. is a unitary state, not a federal one. This means the British Parliament, unlike Canada’s, can unilaterally amend its constitution to address the status of the monarchy if it wishes.
Similarly in the U.K., any conventions around public consultation would also be arguably less complex and more straightforward than in Canada because of the British system of government. This could lead to a bizarre situation in which the British monarch ceases to be the British head of state but remains the Canadian one.
To my knowledge, this would be completely uncharted territory and a constitutional crisis of the highest magnitude.
Rather than continuing to sit nervously on the sidelines observing America’s presidential system lurch from crisis to crisis, or celebrating the coronation of Britain’s new king as our own, Canada should learn from the errors of both the republican model and monarchical model and do something different.
Looking ahead
We might start by recognizing forms of political association, governance and policymaking that are less European and owe more to Indigenous models.
Mary Simon, Canada’s governor general and the King’s representative in Canada — as well as the first Indigenous person to occupy that colonial office — is correct when she says many Indigenous people look to the treaty relationship with the Crown, which predates Confederation itself, as part of their strategy of decolonization.
But it’s tough to reconcile a European hereditary monarchy with a Canada in which Indigenous people are attempting to take control over their own destiny.
Similarly, for many Canadians who immigrated to Canada from parts of the former British Empire in the Caribbean, Africa and India, finding the old colonial monarchy waiting for them here is no sign of dynamism.
It will be up to the current generation of Canadians to decide if now is the time to begin taking this question more seriously or whether to leave it to the United Kingdom to decide for us.
Jeffrey B. Meyers, Instructor, Legal Studies and Criminology, Kwantlen Polytechnic University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
King Charles III’s coronation oath is a crucial part of the ceremony – experts explain
Saturday, 06 May 2023 08:14 Written by theconversation. The crowning is perhaps the most famous part of British coronations. Yet easily overlooked – and equally important – is the coronation oath, which has been a fundamental component of the ceremony since medieval times.
It is of such significance that the remainder of the rite cannot proceed unless it has been sworn. Why? The oath is the essential counterpart to the recognition and acclamation.
The recognition is the moment at the beginning of the ceremony when the monarch is presented to the people for approval. The acclamation is the moment in which the people accept the new monarch.
Together, these three acts establish a contractual relationship between sovereign and peoples. By public commitment to the promises and values enshrined in the oath, the monarch is forming a bond to the largely uncodified constitution (the UK’s constitution has never been assembled into a single written document).
There has been talk of a shorter service for Charles III’s coronation. But to omit these vital stages would amount to constitutional vandalism.
Before primogeniture (the law of firstborn inheritance), the British monarchy was elective. In Anglo-Saxon times, the witan (the leading nobles or council of the country) chose the new sovereign. And until at least the Middle Ages, a secular enthronement preceded the coronation service.
Today’s acclamation comes from that tradition of electing a sovereign. This is when the officiant (usually the Archbishop of Canterbury), standing with the candidate in Westminster Abbey, asks whether or not the congregation recognises the candidate’s entitlement to be crowned.
Those present, representing the country, acclaim the person as their monarch, crying “God save the King/Queen”. Some historians contest the acclamation’s importance on the grounds that the respondents are unrepresentative of the people and unlikely to reject the candidate: “as if there was any choice in the matter”.
Nevertheless, it is vital to what follows that they are asked at all. In Portugal, under the Avis dynasty, the acclamation assumed such supreme importance that the crowning withered away altogether. By contrast, Russian peasants were never asked for consent to recognise a new Tsar.
Following the acclamation, the candidate must commit to their side of the contract – the terms set out in the coronation oath. “Coronation oath” is a slight misnomer. It is actually a series of promises in question and answer form, sealed by an oath sworn “in God’s presence”. The text of the promises has evolved, but the core has remained.
In the version lasting (with tweaks) from 1308 to 1685, the candidate promised to confirm the laws of their predecessors, to maintain peace, to administer impartial justice with mercy and to preserve and enforce the laws that parliament would pass.
The makings of a modern oath
The Glorious Revolution of 1688-1689 ushered in a major change when a revised oath became statutory. The version used for Queen Elizabeth II’s coronation in 1953 had three essential parts and retained the oath’s medieval core.
Firstly, would she promise: “to govern the Peoples … according to their respective laws and customs?” Secondly, would she cause: “Law and Justice, in Mercy, to be executed in all … [her] judgements?” Lastly, would she: “maintain the Laws of God and the true profession of the Gospel” as well as “maintain in the United Kingdom the Protestant Reformed Religion established by law”?
Afterwards, came the oath proper. Elizabeth II swore:
The things which I have here before promised, I will perform and keep. So help me God.
As this shows, the monarch swears to uphold certain key values: law, justice and mercy (although Charles has said he sees himself as a “defender of faiths” and the service is likely to reference other denominations and religions).
That obligation speaks to the nation, but also broadcasts a global message of what the United Kingdom stands for.
The coronation oath marks the moment the monarch’s actions and words correspond to the trust placed in them by the people when they recognised and acclaimed them as their king or queen. Together, the two stages form a contract on the basis of which the service can proceed to the crowning.
There is a sense that uncrowned (and therefore unsworn) monarchs were, and are, not fully sovereign. Would Richard III have usurped the crown so easily had one of the princes in the tower been crowned? Might Lady Jane Grey (the nine-day queen) have seen off Mary Tudor if she had reigned long enough to undergo coronation? As it was, Mary I led the only successful coup d’état of the 16th century.
Writing of Edward VIII, who abdicated in 1936, the late Queen Mother commented that it was fortunate that “he was never crowned”. Clearly, an abdication post-coronation would have been more problematic because the coronation oath establishes accountability.
Breach of contract can, for a sworn monarch, have terrible consequences. The charge of flouting their coronation oaths was levelled against Edward II (deposed and murdered), Richard II (ditto), Charles I (deposed and executed) and James II (deposed and exiled). Those kings could not be trusted because they had broken their sacred word.
The oath is at once a conduit for tradition, a constitutional pillar, a source of legitimacy and authority and a marker of national values. The coronation oath binds monarch to peoples and to the governmental system of the state, but it also communicates some of the principles – law, justice and mercy – upon which that governance rests.
Oaths are still an important part of employment in the British armed services, police force, parliament, privy council and law courts. Each of those oaths mentions allegiance, service or loyalty to the sovereign. How incongruous it would be if the person at the summit of the constitutional pyramid had no oath to swear themselves.
David Crankshaw, Lecturer in the History of Early Modern Christianity, King's College London and George Gross, Visiting Research Fellow, King's College London
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Popular News
National Day of Mourning offers Canada a chance to rethink worker health and safety
Saturday, 06 May 2023 08:11 Written by theconversation. Julian Barling, Queen's University, Ontario and Alyssa Grocutt, Queen's University, Ontario
Canadians go to work each day expecting to return home safely, but for too many workers and their families, this expectation is unrealistic. According to the Association of Workers’ Compensation Boards of Canada, there were 1,081 workplace fatalities in 2021 alone.
Each year on April 28, Canadians remember and honour those who have been killed or suffered injuries or illness at work. This day, known as the National Day of Mourning, was established by the Canadian Labour Congress in 1984 and made official in 1991.
Four decades have passed since the National Day of Mourning’s first observance, and the annual toll from workplace fatalities in Canada continues to remain high. But just how deep and pervasive is the problem? And what can we do about it?
Widespread suffering
Those who consume news media can be forgiven for thinking the number of murders in Canada each year vastly exceeds the number of work-related fatalities. One reason for this is the excessive news coverage of murders relative to other causes of death like workplace fatalities.
The real numbers tell a different tale. About 700 people are murdered annually in Canada, while close to 1,000 people die at work each year. But one study from the Journal of Canadian Labour Studies argues the actual number could be 10 to 13 times greater.
The suffering goes well beyond the 1,000 workers who die each year. Within the workplace, colleagues who have witnessed horrendous tragedies are affected, as are leaders who have to break the awful news to family members and motivate surviving employees.
Outside the workplace, the emotional and financial burden on family members has been ignored for too long. What if the news media devoted as much attention to workplace safety incidents as it does to murders? Would the public demand that management, workers and government authorities work together to enhance workplace safety?
Myths about worker control
The National Day of Mourning presents us with an opportunity to reflect on workplace fatalities and the enormous toll they take on affected families, co-workers and organizational leaders, and commit to making a difference.
We can start by dispelling some major misconceptions that are inhibiting progress in workplace safety and health. One misconception among managers is that, because workplace safety is so important, every aspect of employees’ work requires control.
Yet, based on extensive interviews with senior managers and employees and an analysis of documentation from 49 manufacturing firms in the United Kingdom, researchers found the opposite is true.
Among the five key types of human resources approaches, only one was associated with fewer workplace injuries: higher levels of empowerment, which included autonomy and employee participation. Even managers who ceded small, incremental amounts of control to employees had a positive impact.
Myths about safety costs
A second common misconception is that government safety inspections are costly; yet again, research suggests otherwise.
According to a comparison of more than 400 workplaces that were not targeted for safety inspections in California, and an equal number that were randomly selected for inspections between 1996 and 2006, random safety inspections work.
Five years after random inspections, companies saw a 9.4 per cent reduction in injury rates, and a 26 per cent reduction in costs associated with the injuries.
These gains in safety were achieved without any cost to employment numbers, sales, credit rating or likelihood of firm survival, which are frequent concerns in the face of government safety inspections.
Given this, policymakers should feel reassured that increasing the number of safety inspectors is a wise investment in both injury reduction and cost reduction.
Myths about sick leave
The National Day of Mourning’s calls for reconsideration of workplace safety are particularly relevant in the era of COVID-19. The pandemic highlighted the misconception that paid sick leave hurts organizations.
Year after year, more people die at work from health-related issues, such as respiratory diseases and occupational cancers, than from safety incidents.
A 2020 study from Ontario’s Peel region revealed that 25 per cent of the employees surveyed went to work when they had COVID-19 symptoms; 88 workers even did so after being diagnosed with COVID-19.
Why? Because they could not afford to lose any pay. If we are to protect employee health and limit the spread of infection, we need to de-politicize perceptions around basic workplace programs such as paid sick leave.
Worker health programs and policies need to be implemented based on the best available evidence, rather than being a subject for negotiations between labour and management or the whims of government.
Paid sick leave policies and programs are primary tools in preventing the spread of infections, thereby benefiting employees and protecting organizations and their communities. Employees should be reassured that they will not lose pay when they protect themselves and others by staying home when ill.
A new approach is needed
We need to change the widespread perceptions that workplace safety requires the tight grip of management, that random safety inspections hurt organizations and detract from profitability, and that paid sick leave is an expensive luxury.
On the contrary, employee autonomy and engagement, random safety inspections, and paid sick leave are some of the practices that management should welcome to develop safe and healthy workplaces.
Another small action that could have wide-ranging benefits is to change the very language of occupational safety. For too long, “workplace accident” has been the term used for any workplace safety incident or injury.
Why is this problematic? By definition, “accident” implies an event that is unpredictable, unplanned and uncontrollable. If that is indeed the case, we should be forgiven for not taking any action.
Yet post-injury and inquest reports tell us that the opposite is true: these incidents are invariably predictable, preventable and controllable. The time has come to change how we think about occupational health and safety.
Julian Barling, Distinguished Professor and Borden Chair of Leadership, Smith School of Business, Queen's University, Ontario and Alyssa Grocutt, PhD Candidate in Organizational Behaviour, researching workplace safety, at Smith School of Business, Queen's University, Ontario
This article is republished from The Conversation under a Creative Commons license. Read the original article.
How schools and families can take climate action by learning about food systems
Saturday, 29 April 2023 13:15 Written by theconversation. Gabrielle Edwards, University of British Columbia
News about the climate crisis alerts us to the urgent need for drastic global changes. Given this, it’s not surprising that one study surveying thousands of young people found most respondents were worried about climate change, and over 45 per cent said worries about climate change affected them daily.
Young people are experiencing high levels of climate anxiety, which is characterized by feelings of fear, worry, despair and guilt and can negatively affect psychosocial health and well-being.
Taking climate action is one proposed way to reduce climate anxiety: it turns negative emotions about the reality of urgent challenges into positive action.
Engaging with food systems presents a major opportunity to act on the climate crisis, as they contribute 21 to 37 per cent of global greenhouse gas emissions. Both home-based discussions with parents or caregivers and school curriculums have a place in helping young people connect relationships with food to advocating for change to food systems or making more sustainable choices to benefit our shared planetary health.
What is a food system?
A food system includes everything that happens to food from farm to fork. The food system also includes all the people involved in each of those steps, including us.
Every time we eat, we participate in the food system. Yet, due in part to the increased number of steps from farm to fork, and the fact that in our dominant global economy food is positioned as a product to consume, there is a growing disconnect between people and the food system.
This disconnect has both contributed to current issues caused by food systems, and continues to perpetuate them. These issues include biodiversity loss, ecosystem degradation and global inequalities related to both labour practices and resource extraction.
Impact of daily choices
Many of us rarely consider the impact our daily food choices have on the environment. Those of us who do seldom see our own potential to engage with and transform the food system beyond eating on the basis of conscience.
Recognizing our role in the food system can be empowering, as it presents opportunities to act on the climate crisis.
Primary and secondary schools are a logical place to engage students in these issues as they are locations where young people spend most of their day and institutions that have goals of promoting an educated and engaged citizenry.
Despite the potential of educational institutions to engage young people in issues related to food systems, many school curriculums around the world, including throughout Canada, fail to do this.
Beyond nutrition, cooking
For example, research about primary school curriculums in 11 countries including Australia, England, Japan, Norway and Sweden finds that curriculums tend to focus on nutrition education or cooking skills with little to no mention of the ways current food systems are destroying our environment or perpetuating gross social injustices. Research about Canadian curriculums has similarly found curriculum policies tend to focus on eating in healthy ways as a matter of individual choice.
Although many curriculums do not take a holistic approach to food systems education, there are many third-party organizations that have created resources for educators examining food systems in a more comprehensive way.
Nutrition and cooking are important for individual health. But this limited focus can be disempowering for young people as it does not consider the positive impact people can have on transforming food systems to be more just and environmentally sustainable.
By showing the next generation ways to change our food systems for the better, we can not only reduce climate anxiety, but also ensure the next generation is equipped with the knowledge and skills to create a more just and sustainable future.
Taking action locally
So how do we support these important issues in our schools? If you are a concerned parent, you could join the parent advisory committee at your child’s school or write to your school district to find out if there are any positive local initiatives and to express concern.
You could also write to your provincial or territorial legislative representative to advocate for the inclusion of these issues in the curriculum.
Outside of school, parents or caregivers could find ways to engage children in discussions around food systems that go beyond nutrition. For school projects where a child has a choice about the topic, or as a home project, encourage your child to research different organizations in your area that are involved in sustainable food systems work. Together, visit a local farm or start a small indoor or outdoor garden.
How a meal arrives on a plate
Another activity to start thinking about the global impact of food systems is to explore how a meal comes to be on your plate. You could ask questions like:
- What are the ingredients?
- Where in the world did all those ingredients originate?
- Who was involved in growing the ingredients, in transporting them and in creating the food being consumed?
- Were all those people treated fairly?
- Was the environment harmed in the production of the food?
Analyzing even a simple meal can lead to complex thoughts and discussions around food systems and reveal stark social and environmental issues.
By looking beyond nutrition, food can become a powerful tool to empower young people to take climate action which, in turn, can lead to reduced climate anxiety and increased feelings of hope for the future.
Gabrielle Edwards, PhD Candidate in Curriculum Studies, University of British Columbia
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Cities must take immediate action against ‘renovictions’ to address housing crisis
Saturday, 22 April 2023 13:21 Written by theconversation. Brian Doucet, University of Waterloo and Laura Pin, Wilfrid Laurier University
Amid all the discussions about constructing new housing, existing affordable housing is being overlooked. A recent study found that 322,000 affordable homes were lost across Canada between 2011 and 2016 compared to the construction of only 60,000 new houses for those in greatest need of housing.
In cities like Hamilton, Ont., the situation is even more dire: for every new unit of affordable housing built, 29 are lost. While most of these homes still exist, they are now much more expensive. And that’s largely because of renovictions.
Renovictions occur when landlords evict tenants, renovate the vacated units, then lease the units at much higher rents. The lack of rent control on vacant units creates a financial incentive for landlords to evict long-term tenants, many of whom pay below market rates.
To be clear, when we speak about landlords in this context, we are primarily referring to large financialized landlords that own hundreds of buildings and thousands of units and whose business model is based on profit by dispossession — not just “mom and pop” landlords.
In Ontario, provincial rules around renovictions are weak. Doug Ford’s government recently introduced Bill 97, the Helping Homeowners, Protecting Tenants Act. Despite its name, the bill does not constitute a significant improvement for renters. The onus still falls on tenants to exercise their legal right to return to the residence and find temporary accommodation in the meantime.
Despite spending billions on housing, the federal government also isn’t making significant inroads to protect tenants or preserve existing affordable housing. The National Housing Strategy has produced little affordable housing for people in need.
This means it’s up to cities to use whatever powers they have to make a difference.
Anti-renoviction bylaws
On April 20, Hamilton’s Emergency and Community Services Committee will debate whether to pursue new bylaws to crack down on renovictions.
As housing researchers, we believe tough anti-renoviction bylaws are one of the best single measures a city can implement to make a dramatic and immediate impact on housing affordability.
There is precedent for this. New Westminster, B.C., passed an anti-renoviction bylaw in 2019 that heavily fined landlords who did not allow tenants to return after renovations were completed.
The result: New Westminster virtually eliminated renovictions.
The bylaw withstood two court challenges. It was only repealed after the British Columbia government enacted similar legislation province-wide, albeit a more watered-down version of New Westminster’s bylaw.
Hamilton has the opportunity to be a national leader in housing affordability by ending an unjust practice that destroys the lives of tenants and erodes the city’s affordable housing stock.
The census doesn’t track renovictions; for some planners, politicians and policymakers, this lack of official data means there isn’t a problem.
At the April 20 committee meeting, councillors are poised to hear many first-hand accounts of renovictions from tenants. Their lived experiences are the data. But our research shows that this is only the tip of the iceberg.
Hamilton consultant report
City staff have been looking into whether Hamilton can legally enact a New Westminster-style bylaw. Their consultant’s report concluded it was not within the city’s powers. However, the report missed two key components.
First, the consultant’s report states that, in light of the provincial protections from renovictions in Ontario, a New Westminster-style bylaw would be irrelevant. This is not accurate. There are significant differences between the New Westminster bylaw and current B.C. legislation and the Ontario guidelines under the Residential Tenancies Act (RTA).
While the RTA has some protections against renovictions, they are inadequate; very few tenants who leave their units due to renovations return, and even fewer return at the same rent.
What is different about New Westminster’s bylaw? Unlike Ontario, in New Westminster, the onus was shifted to the landlord to demonstrate that tenant occupancy could not continue during renovations, and, importantly, to provide alternate accommodations while the renovation work was taking place.
Second, Hamilton’s report argues that B.C. municipalities have more authority as a result of their community charters to enact this type of bylaw. This is contrary to the legal opinion of ACORN Hamilton, a tenant advocacy and organizing group, which suggests such a bylaw would be in the purview of an Ontario municipality.
Moreover, prior to the court challenges in B.C., which upheld the New Westminster bylaw, it was not clear that B.C. municipalities had this authority either.
Walking the talk
Hamilton’s city council understands the urgency of the housing crisis. The city’s own data demonstrates a dramatic increase in N13 applications (notice to end tenancy because a landlord wants to demolish, repair or convert a rental unit) and subsequent renovictions.
Earlier this month, Hamilton declared a state of emergency over homelessness.
Our question to Hamilton’s civic leaders is this: with more than 15,000 units rented at less than $750 a month lost over the past decade, where do you think many people who are renovicted end up?
Creating a tough anti-renoviction bylaw would be a big step to turn nice words into bold action.
Cities can’t just talk. They need to take immediate action. Neither the province nor the federal government has any meaningful legislation to help renters. Evidence from elsewhere suggests tough anti-renoviction bylaws have a dramatic impact on affordability. City councils must do everything they can to protect tenants and affordable housing.
Brian Doucet, Canada Research Chair in Urban Change and Social Inclusion, University of Waterloo and Laura Pin, Assistant Professor, Faculty of Arts, Wilfrid Laurier University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Foreign exchange: several African countries have a shortage of US dollars - why this happens and how to fix it
Monday, 10 April 2023 18:58 Written by theconversation. Christopher Adam, University of Oxford
A number of African countries, including Kenya, Egypt, Zimbabwe, Nigeria, Ghana and Zambia, are currently experiencing shortages of US dollars. The dollar is the dominant currency in international transactions. These countries rely on the US currency to pay for foreign debts, essential goods and industrial inputs. Development economist Christopher Adam explains to The Conversation Africa’s George Omondi what causes US dollar shortages and how they can be remedied.
What is a dollar shortage?
Global trade is conducted in the currencies of the world’s major economic powers, principally the US dollar, the European Union’s euro, the Japanese yen and, to a lesser extent, the Chinese renminbi and the UK’s pound sterling. Individuals, firms and governments elsewhere in the world need these currencies to import goods and services and make other payments overseas.
A dollar shortage is simply a situation where the demand for this foreign currency exceeds the available supply, at the current exchange rate.
Depending on how the exchange rate is determined, a dollar shortage will present itself in different ways.
In countries operating a fixed exchange rate regime – where the national currency is pegged to a hard currency – the shortage may be physical. The banks that normally supply their customers with dollars may simply have none to sell or are forced to ration their limited stock.
But most countries today operate some form of flexible exchange rate. Their central banks don’t intervene in support of a particular exchange rate. Here, there may be no actual shortage. Dollars may still be available but can only be purchased at a higher cost. There’s a shortage only in the sense that the same amount of domestic currency buys fewer imports.
Even with a fixed exchange rate, dollars can usually be obtained on the parallel or black market, though at a less favourable exchange rate. It takes more of the domestic currency to buy the hard currency.
The domestic currency’s loss of value against the US dollar is often taken as an indicator of the severity of the dollar shortage.
What causes a dollar shortage and what are the impacts?
The immediate cause of a dollar shortage is a deterioration in the country’s balance of payments, meaning a country’s financial transactions with the rest of the world. This might be due to some unexpected event like a natural disaster that destroys a country’s dollar-earning tourism sector. It could also be due to increased demand for essential imports such as food and medicines. Other causes include an increase in debt service payments falling due and a fall in remittances from workers abroad. The worsening balance of payments may also reflect a deterioration in the country’s terms of trade, meaning the value of what a country exports relative to what it imports.
World prices are determined by the actions of the large economies of the world. Small economies – including most developing countries – are price-takers: they have little or no capacity to alter their terms of trade.
What’s behind the current dollar shortage in Africa?
Many African countries now face a combination of disrupted exports and worsening terms of trade. Exports grew substantially in the later 2010s because of high and rising world prices for primary products. Then the 2020s opened with a series of shocks that have contributed to the dollar shortage.
COVID-related lockdowns and the associated global recession drove down prices for many of Africa’s key exports. Tourism – an important source of dollar earnings – came to a halt. The resurgence of global inflation and the resulting tight monetary policy (higher interest rates) have driven up prices for key imports and the cost of foreign borrowing.
On top of this, prices for oil, food and fertiliser spiked when Russia invaded Ukraine. Rising oil prices ease dollar shortages for oil producing countries such as Angola and Nigeria but have an adverse impact on other countries.
The effect is stark. When imports are fewer and more expensive, prices rise and spending falls. When the squeeze on imports reduces investment, there is lower growth and less economic progress.
Can the dollar shortage be avoided?
The only sure-fire way to avoid a dollar shortage is self-sufficiency – referred to in economics as autarky. But this is not a realistic option and certainly not for countries at early stages of development. Low-income developing countries need not just essential imports like food, fuel and medicines. They also need imported capital goods and intermediate inputs to develop their own productive capacity.
Over the medium term, as countries become able to produce more of the goods and services people want and need, they will depend less on imports. And they will be able to export more. Their vulnerability to periodic dollar shortages will ease. But this will take time.
Dollar inflows from trade, supported by remittances and aid inflows, may be temporarily augmented by foreign direct investment and dollar borrowing from official and private lenders. But capital inflows must eventually reverse as debts are repaid and foreign investors seek dividends and repatriation of their capital. If well used, though, capital inflows can support the export-led growth strategies that the most successful developing countries have pursued.
What should be done?
The moderation of global inflation and the recovery of global growth are likely to bring an improvement in the terms of trade and the recovery in export demand. There is little domestic policymakers can do about this but wait.
While they do so, they can take policy measures to address the immediate reality of the dollar shortage. Few of these measures are easy.
The usual advice is to cut public spending. That will reduce the demand for imports. It’s politically difficult. Governments are also usually advised to encourage the production of exports and import substitutes. That is challenging and takes time.
Effective adjustment therefore will also rely on external support. This means new and additional balance of payments support from the international financial institutions and multilateral development banks. And it means debt-restructuring initiatives such as the G20’s Common Framework mechanism.
Periodic dollar shortages are an enduring fact of life for many low-income countries, even as growth and development mean they are likely to become less frequent and less severe over time. The current pressures experienced by some countries in Africa are certainly severe, but these can be managed if countries maintain the high-quality macroeconomic management that they have developed over the last decade, especially if this domestic economic discipline is accompanied by decisive support from the international financial institutions and external development partners.
Christopher Adam, Professor of Development Economics, University of Oxford
This article is republished from The Conversation under a Creative Commons license. Read the original article.