Policy Paper
Accountability
November 2014
Prepared by:
Drew Ursacki
Vice President External Affairs
Brock University Students’ Union
Christopher Yendt
Chair, Board of Directors
Brock University Students’ Union
Alex Hobbs
Research and Policy Coordinator
Brock University Students’ Union
Zachary Rose
Research Analyst
Ontario Undergraduate Student Alliance
Sean Madden
Executive Director
Ontario Undergraduate Student Alliance
With Files From:
Roland Erman
President
Brock University Students’ Union
Andrew Kemble
Undergraduate Student
Brock University
Executive Summary
The question of how to hold Ontario’s publicly supported universities accountable to the
needs of students and the Province is a relatively complex one. On the one hand,
government control of universities raises serious concerns about the institutional
autonomy that safeguards the ideals of academic freedom and innovation. On the other,
given that public investment in higher education exceeds $8,000 per student, the public
has a right to know where, for what and how tax dollars are being spent. Over time, the
Ontario university system has developed a series of accountability mechanisms that have
attempted to acknowledge both realities. While these initiatives have been worthwhile, they have not been entirely effective. Moreover, they do not adequately provide an avenue for quality improvement. To address these shortcomings,
this paper makes recommendations for changes to current accountability mechanisms
that will allow a greater degree of transparency in the system, but will also move
universities towards certain public goals.
University Governance
More than any other body, university boards of governors bear legal and fiduciary responsibility for university operations. Unfortunately, boards are primarily self-selecting, with the power to appoint the vast majority of their own members. Student
and faculty representation on boards of governors is low across Ontario. To change this,
the government should exercise its under-utilized ability to appoint board members who
are knowledgeable in student and public priorities in higher education. Additionally, the
share of boards comprised of students should increase, acknowledging the substantial
funding contribution students have made to universities over the past two decades.
Accountability Mechanisms
Currently, Multi-Year Accountability Agreements (MYAAs) are little more than glorified
data reporting mechanisms. Meanwhile, Strategic Mandate Agreements (SMAs) are new
and untested tools with little funding or policy aligned to their success. Though the goals
of these systems are ostensibly to incentivize institutional behaviour towards certain
government priorities, the agreements set vague targets and offer few rewards for
positive behaviour. Likewise, they do not provide any disincentive for lack of progress
towards public goals. This paper proposes adjustments to these systems, whereby greater
clarity and firmer expectations accompany target setting. Once these are established,
institutions should be rewarded financially for progress towards goals. Data reporting
should continue, but should not be expanded, and it should not be the sole focus and
requirement surrounding MYAA funding.
Targeted Funding
Targeted funding is an important tool available to the government to guarantee that
monies invested in higher education achieve a defined purpose. Unfortunately, the current state of reporting for funding envelopes is far from accountable. While institutions complain that report-backs are numerous and burdensome, students and the public do not have access to them. This paper recommends that all funding report-backs be rolled into one annualized reporting strategy, and that they be made publicly
available. Furthermore, a small percentage of government funding of higher education
takes the form of performance funding for employment and graduation rates. Due to the
inability of institutions to control student performance in-class or in the labour market,
this paper recommends that performance funding be eliminated and turned into an
envelope to support quality improvement.
Ombudsperson Offices
As publicly supported institutions, universities should be held accountable for their
actions, and students should have a venue for appeals if they disagree with these actions.
Existing ombudsperson infrastructure on campuses needs to be strengthened where it
exists and created where it does not. Campus ombudsperson offices are able to work
within the individual culture of an institution, providing students with more tangible
advice to resolve conflicts or inequities. However, as system-wide insight and
coordination may also be needed from time to time, the Provincial Ombudsman should
be given jurisdiction over universities, as well. Where possible, cooperation may emerge
between these two levels.
Quality Assurance
Currently, quality assurance in Ontario is conducted by the universities themselves, using a framework created in 2010. This framework requires that each institution create a
quality assurance protocol that conforms to an agreed upon set of standards. Periodic
reviews of academic programs are measured through this protocol. Students are
concerned that the learning outcomes reflected in this framework are too broad to be
objectively judged in a review process. This paper recommends re-visiting the
undergraduate degree level expectations, and emphasizing teaching processes and
pedagogies in the program review process. The Province and universities must also strive
to better publicize the data they already collect on employment outcomes, earnings outcomes, and the extent to which post-graduation employment corresponds with individual fields of study.
Glossary
Terms are listed in the order in which they appear.
Boards of governors: This paper refers to the governance bodies outlined in the
legislative acts of each Ontario university as “boards of governors.” These boards have
final legal and fiduciary responsibility for university operations and serve as the final
decision-making body on issues related to student fees, expenditures, and budgeting. At
two universities, these boards are called Boards of Trustees, but serve the same purpose.
Senates: University senates are the governance bodies most often responsible for
educational matters at the institution. This can include recommendations to the boards
of governors on funding, program reviews and approval, and long term academic
planning.
Multi-Year Accountability Agreements: Refers to a set of agreements made
between universities and the provincial government in 2005. Institutions indicate
strategies, programs and performance targets in regard to various goals set by the
Ministry of Training, Colleges and Universities, reporting back annually on progress
made.
Report-back: A submission on behalf of a university to the government, which details
progress made towards provincially mandated goals with provincially controlled funds.
Currently, institutions complete a report-back for the Multi-Year Accountability
Agreements, as well as all targeted funding envelopes.
Strategic Mandate Agreement: A three-year strategic plan negotiated between an
institution and the provincial government. These plans seek to guide institutions’
development along several metrics, as well as contribute to the Province’s overall goals
for the university sector.
Funding Envelope: A portion of money distributed to institutions, the use of which is
earmarked for a specific purpose or objective. Funding envelopes have been used by the
government as a way of influencing university priorities.
National Survey on Student Engagement (NSSE): A survey, conducted by all universities, of student engagement in the classroom as well as in the broader community.
NSSE scores are reported on annual MYAA report-backs.
Ancillary Fees: Fees administered by universities or student unions, in addition to
tuition, for activities that are not directly related to teaching and learning, including
student support services, athletic facilities, health facilities, and student clubs.
Performance Funding: A funding model wherein institutions are funded based on
their performance according to a pre-defined set of objectives.
Key Performance Indicators (KPIs): Three measurements on which institutions
are required to report in order to receive performance-based funding from the Ontario
government. This funding changes based on changes to the KPI numbers themselves.
Current KPIs include graduation rates, employment rates, and OSAP default rates.
Ombudsperson Office: Ombudsperson offices are services that offer impartial, non-binding advice towards the resolution of disputes between members of a campus community. They are bound to offer independent advice regardless of who funds them. They can also be referred to as ombudsman or ombuds offices.
Quality Assurance Framework: An agreement made between universities in 2010 to
adopt and enact a set of quality assurance criteria. This involves the creation of an
Institutional Quality Assurance Process (IQAP) at each university, based on the
standards set out in the Quality Assurance Framework. Each institution uses its IQAP as the basis for periodic reviews of academic programs.
Ontario Education Number: A randomized, unique number assigned to primary and
secondary education students in Ontario. It is used for the purposes of performance,
mobility, and demographic tracking of students within and among publicly subsidized
institutions in Ontario.
Ontario Post-Graduate Survey: A survey conducted on behalf of the Ministry of Training, Colleges and Universities to determine employment outcomes, earnings, and the degree-relatedness of graduates' jobs. The survey is conducted six months and two years after graduation.
Introduction
In principle, accountability mechanisms in the university sector exist in order to ensure
that publicly subsidized institutions are adhering to public goals and priorities. The
question of whether or not institutions should be held to such ends is relatively
uncontroversial; virtually every stakeholder agrees that there should be some sort of
meaningful accountability mechanism in place.1 How those mechanisms take shape, however, has been and continues to be the subject of vigorous debate within the sector.
There are two questions around which the debate seems to focus. First, for what should
universities be held accountable? Universities fill a variety of roles that differ between and within institutions. Moreover, these roles are valued differently across the
sector: government, for example, is often interested in economic return on its
investment, ensuring that students can obtain employment after graduation, whereas
faculty associations tend to place more emphasis on the development of students as
critical thinkers and engaged citizens. Ultimately, the matrix of measures used to hold
universities accountable will need to account for the priorities of a variety of different
stakeholders in order to accurately reflect public priorities.
The first question is theoretical in nature, requiring measured thought on the mandate of
public universities and their place in a democratic society. The second question is more
technical: Once the areas for which universities are to be held accountable are decided,
how are universities to be held accountable? In other words, how can a given
institution’s progress toward achieving public goals be measured? This question requires
one to conceptualize accountability mechanisms such that they might be meaningfully
and significantly measured. This is no small task, in large part because the various
mandates of our public universities are enormously complex and not easily quantifiable.
For example, how can critical thinking be accurately measured and reported? There are
some promising practices in place in comparable peer jurisdictions, but no one tool can
accurately capture critical thinking in all its complexity. Nor are labour market outcomes as easily quantified as might first appear. As part of the current accountability
framework, the provincial government requires institutions to report employment rates
both six months and two years after graduation. But what kind of work have the
graduates found? Are they working a poorly compensated job completely unrelated to
their education, or have they gone on to apply their degree in the “knowledge economy”?
Merely measuring employment rates gives no indication of whether or not students are
graduating from university only to return to poorly paid, unskilled labour, saddled with
thousands of dollars in debt. Again, more sophisticated tools are required to ensure that
universities are meeting this public priority.
Thus far the debate has been limited to the above questions. There is a third question,
however, which has been largely glossed over: To whom should universities be held
accountable? The answer has so far been assumed: since the public funds universities,
universities must be held accountable to the public. The government serves as the
public’s representative in this formulation of accountability. But if all those who fund
post-secondary institutions are those to whom institutions must be held accountable –
you get what you pay for, so to speak – then there is no reason that institutions should
1 Kisner and Hill, Higher Education in Tumultuous Times: A Transatlantic Dialogue on Facing Market Forces and Promoting the Common Good (Washington, DC: American Council on Education, 2010).
not also be held accountable to their students. Tuition and ancillary fees have risen dramatically in the last two decades, comprising more than 50 per cent of all university
operating funding in 2013-14. And yet there has been no serious discussion about how
universities can be held accountable to both the public at large and to their students.
To sum up: there are three questions that should guide the post-secondary education
sector’s discussions of accountability. They are:
1. To whom should universities be held accountable?
2. For what should universities be held accountable?
3. How should universities be held accountable?
University Governance
Much of the contemporary literature on university accountability assumes that those
who contribute financially should be the ones to whom universities are held accountable.
Since the public invests so much in education, they – through government – are entitled
to keep track of the money spent, and implement whatever measures of accountability
they feel are sufficient to ensure its good use. This view has intuitive appeal; those who
make the investment are entitled to hold the entities in which they invest to account for
their expenditures. If the entities do not want to be held to account, or feel that they are being micro-managed by their investors, then they are free, hypothetically, to refuse the money and seek their funding elsewhere. For the purpose of this paper, this view will
be referred to as the “shareholder” approach.
Under this view, there is another group that could ostensibly lay claim to shareholder status: students. It is widely known that the burden of funding university education in
Ontario has increasingly been passed on to students over the past two decades. As of the
2012-13 school year, students contributed as much, or more, to the operating budget of
universities as the provincial government at nearly every institution.2 While shareholders are not always granted decision-making influence proportional to their contributions (taxpayers, for example), students have no other recourse to influence university policy or priorities directly. While the public (through government)
can set targets in accordance with their priorities, students are often relegated to token
representation on governing bodies, despite their financial contributions.
However, the shareholder view is not the only way to approach this question. One could
instead look to those who are integral to the proper functioning of the university:
students, staff, faculty, administration, and the public at large. Without the support of
each of these groups, universities could not exist. Their stake in the university thus gives
them the right to hold the university accountable. For the purposes of this paper, this will
be called the “stakeholder” approach.
Principle One: Those who are integral to the functioning of a university
should be responsible for ensuring its accountability.
In many ways, university governance structures already follow the stakeholder, rather than shareholder, model for accountability. University senates and boards of governors
2 Calculated from Canadian Association of University Business Officers (CAUBO) 2012-2013 data.
often set seats aside for those who have no financial investment. For instance, Nipissing
University has a seat for an appointment by its Aboriginal Council on Education. A
number of universities reserve seats for the Mayor of the town in which they exist. And
all Ontario universities have seats reserved for faculty and students. It makes little sense
to describe the presence of these groups on boards except in terms of the stakeholder
approach to accountability. Students believe that this is as it should be and that a strict
shareholder view could be detrimental to effective university governance.
While it is a responsibility of the stakeholders to hold an institution’s governing body
accountable for the decisions that it makes, the bodies themselves have a duty to be
forthcoming with information and context. Reporting structures may already exist to inform other stakeholder groups about the work undertaken by the board, but these mechanisms are not normally used to share this information with student groups and associations.
A shareholder view whereby governance seats are distributed to stakeholders in
proportion to their funding contributions may seem enticing to students who feel little
ability to control tuition increases, but could do the university community as a whole a
disservice. Universities are environments that must balance multiple objectives:
teaching, research, and service to the surrounding community. To leave this balance
completely up to the whims of government funding would inevitably tip the scales,
especially in light of the uneven composition of university governance boards. Further, a
direct shareholder environment would be sensitive to shifts in government policy: in years where funding changed due to different program priorities, shareholder groups might demand changes to governance, which over the long term could encourage instability and unpredictability in governance.
While student associations and groups do bear some responsibility for holding
institutional governance accountable, this responsibility does not exist in a vacuum and
is certainly not one sided. There is a responsibility by all parties and all levels of
governance within institutions to develop policies and strategies to combat apathy and
create an inclusive way to address faults in accountability.
It is understood that various aspects of a university's governance must be held in camera or in confidence. Matters involving human resources, personnel, and similar subjects are subject to legal restrictions that require privacy. However, many aspects have routinely been kept confidential without any necessity to do so.
Such needless confidentiality increases the difficulty for other groups and parties to view
binding actions with any legitimacy, as it clouds the process and conceals the individuals
involved with decision-making. The transparency and accountability of these governing
bodies can only be addressed in a positive way if institutions look at the degree to which
confidentiality is really required in many of these closed door sessions.
Concern One: Students are not adequately represented on university boards
of governors and senates.
Student representation on high-level university governance bodies is inconsistent. As Table 1 shows, student representation on boards of governors in particular is as high as 16.6% and as low as 2.7%. This is an unacceptable situation for a number of reasons.
First, it is difficult - if not impossible - to have student representation across all board
committees. Since boards typically include only two or three students, there are not enough students to cover the numerous committees that are part of each governance structure. Second, this low level of student representation means that students do not have a strong voice. For example, on a board with 37 total seats of which only two are held by students, their perspective is clearly outweighed by all other representatives.
This is particularly important in terms of formal votes. While student representation
may still be heard at the board level in some respect, its influence is eroded by this low
ratio. Considering that the average student contribution to university operating budgets exceeds 50%, the student voice should certainly be better represented in board membership.
While student representation is often higher on senates (see Table 2), students are still ultimately underrepresented. For the sake of innovation and oversight in teaching pedagogies and educational policy, students should have a stronger representation in this governing body in particular; it deals with academic affairs, and students are perfectly positioned to provide insight.
Board composition varies by university, but each follows a more or less standard format.
A handful of seats are reserved for faculty, staff and students, typically two per group.
The following chart demonstrates student representation on university boards across
Ontario:
Table 1: Student representation on university governing boards in Ontario
University                                       Undergrad Seats   Grad Seats   Total Seats   % Students   % Undergrad
Algoma*                                                 2               0            30           6.7%         6.7%
Brock                                                   2               1            32           9.4%         6.3%
Carleton                                                2               2            32          12.5%         6.3%
Guelph                                                  2               1            24          12.5%         8.3%
Lakehead                                                1               1            30           6.7%         3.3%
Laurentian*                                             2               0            25           8.0%         8.0%
Laurier                                                 2               1            34           8.8%         5.9%
McMaster                                                1               1            35           5.7%         2.9%
Nipissing*                                              2               0            26           7.7%         7.7%
OCAD                                                    2               1            27          11.1%         7.4%
Ottawa                                                  2               1            32           9.4%         6.3%
Queen's**                                               2               1            25          12.0%         8.0%
Ryerson*                                                3               0            24          12.5%        12.5%
Toronto                                                 6               2            48          16.7%        12.5%
Trent (unclear if 1 or 2)                               2               0            26           7.7%         7.7%
UOIT*                                                   2               2            25          16.0%         8.0%
Waterloo                                                3               2            36          13.9%         8.3%
Western                                                 2               1            30          10.0%         6.7%
Windsor (board bylaw does not specify students)         0               0            32           0.0%         0.0%
York*                                                   2               0            32           6.3%         6.3%
*Student seats not specified as graduate or undergraduate **Includes Rector ***One seat set aside for part-time students.
While there is no strict uniformity in board composition across the university sector, boards tend to follow a fairly standard practice of reserving a few seats for faculty, staff, and students. The above table provides context for this discussion, namely the size of boards across the province and the percentage of positions held by students on these boards.
In addition to these seats, the executive head or heads (President, Chancellor, etc.) generally hold an ex-officio seat (which may be voting or non-voting depending on the institution). As universities are generally bicameral in their governance structure, the other body (the senate) is typically permitted to appoint a representative or two to provide a communication bridge between the two bodies, beyond the ex-officio members from the administration.
Table 2: Student representation on university senates in Ontario
University      Undergraduate   Graduate   Total voting members   % of voting seats held by students
Algoma               4*            N/A             43                         9%
Brock                6              2              71                        11%
Carleton            10              3              74                        18%
Guelph              26              7             169                        14%
Lakehead             8              1              96                         9%
Laurentian           8              1              75                        12%
Laurier              7              1              77                        10%
McMaster             6              6              66                        18%
Nipissing            3             N/A             56                         5%
OCAD                 3              1              63                         6%
Ottawa               6              2              78                        10%
Queen's             14              1              68                        22%
Ryerson             15              1              72                        22%
Toronto             N/A            N/A            N/A                        N/A
Trent               10              1              49                        22%
UOIT                 2              1              38                         8%
Waterloo             8              4              91                        13%
Western             14              4             102                        18%
Windsor             10              2              84                        14%
York                27              1             167                        17%
* One of which is dedicated for a First Nations student
Further still, some boards allow for the appointment of other stakeholders, including alumni representatives, members of Aboriginal Education Councils, mayors, or members of the local city council or public school board. However, all of these additional representatives have one thing in common: they collectively constitute less than half of the membership of the board. The remaining positions are filled either by provincial government appointment on the recommendation of the board itself, or through the board's own internal selection processes. Ultimately, the board is responsible for filling its own vacancies.
The issue with this kind of operating procedure is a significant lack of accountability to the members and internal stakeholders of the institution. When a
board has the unchecked ability to appoint its own members, there is no impetus to ensure that these new members will be accountable to stakeholders. Students have the opportunity to contribute to university governance in a variety of ways, in some cases by voicing concerns through committees that have significant decision-making power. However, all of these opportunities for input fail to reflect in a meaningful way the level of student contribution at the institution. Committees in particular are challenging for relatively small groups of students, as there are only so many committees a student can sit on, and there are generally more committees than student representatives.
While students act as the major shareholders in their institutions, contributing nearly a majority of annual funding through tuition dollars, their representation comprises a mere ten percent of board seats on average, a figure that falls distinctly lower once subdivided between undergraduate and graduate students. This is certainly not enough to change the course of a decision should students seriously disagree with or object to it. Given that there will often be disagreement over annual decisions that affect students, such as tuition increases, there needs to be a way to adjust the system to meaningfully address the lack of accountability of university boards. Allowing boards to continue to appoint officials who may have little to no knowledge of or connection with post-secondary education merely perpetuates a lack of stakeholder accountability.
Concern Two: University Boards of Governance have been increasing their
total membership without considering the student voice.
Since 2011, the total number of members on university boards has increased. While these increases have not been rampant or unchecked, they have shifted the pre-existing proportions of board members. Whereas in 2011 the average percentage of student representation on boards in the province was 10.1%, it has since fallen to 9.6%, a net reduction of 0.5 percentage points. When only undergraduate student positions are factored into this equation, the percentage remains the same. It is worth noting that several university boards did in fact increase the overall percentage of student members (usually through an overall reduction in total board members rather than through the addition of more dedicated student seats), though these few outliers have not been enough to offset the provincial average reduction.
The concern stems from the way this reduction in student voice has taken place: rather than an overt reduction of student seats, an increase in other members of the board has diluted the student share of the board vote. While the percentage change is small, it is indicative of a potential trend of watering down student representation by means other than removing dedicated seats.
Concern Three: The distinction between undergraduate and graduate
student representation on university Boards of Governance is not always
clearly defined.
Many university boards of governance do not clearly define the constituency groups that members must represent in order to serve on the board. This lack of definition tends
apply to student groups more than any other. While academic representatives may be
mandated to come from various faculties or departments across the institution, student
categories are often unacknowledged.
In Table 1, all universities noted with a single asterisk do not differentiate between undergraduate and graduate student representation, simply allocating those membership spaces to 'students' (Algoma,3 Laurentian,4 Nipissing,5 Ryerson,6 York,7 Trent8). This lack of distinction between undergraduate and graduate student representatives has the potential to create confusion for stakeholder groups. Additionally, there is concern that without clear differentiation, one group could be excluded from the process simply because the board would still have 'student' representation from the other, meaning one student group could go underrepresented or entirely unrepresented.
Concern Four: Some University Boards of Governance do not allocate
dedicated seats for student representatives.
In addition to the concern noted above, two universities do not have dedicated spaces allocated for 'student' representation in any capacity. Instead, students at these institutions must go through the same internal processes as other stakeholder groups in order to gain a place on the board. Neither the University of Windsor nor UOIT allocates specific membership spaces for students (graduate or undergraduate). The University of Windsor makes no mention of student representation on the Board of Governors at all, while UOIT sets aside space for four representatives that may be filled by students but could also be filled by university staff members.9, 10
Concern Five: Students do not have adequate representation on university
boards and senate committees.
Looking further into this issue, student representation is very low, if not nonexistent, on some senate and board committees. Although senate and board meetings bring recommendations and policy changes before the entire body, the preliminary decision-making is done at the committee level. Furthermore, senates and boards usually act as 'rubber stamps' for approval of motions brought to the floor. In other words, many of these bodies trust the discussions and decisions made at the committee level, and the subsequent motions are not discussed as deeply, if at all. For example, at Brock University, there are seven committees of the Board of Trustees but only two of those
“Bylaws 6.9,” Algoma University, June 2014,
http://www.algomau.ca/media/style_assets/pdf/Algoma_University_By-law's_6.9.pdf
4 “Bylaws,” Laurentian University, accessed November 2014, http://laurentian.ca/bylaws
5 “By-Laws – Board of Governors,” Nipissing University, May 2012,
http://www.nipissingu.ca/about-us/governance/board-of-governors/Pages/By-Laws.aspx#II
6 “By-Law No.1,” Ryerson University,” April 25, 2011,
http://www.ryerson.ca/content/dam/about/governors/documents/governance/General_ByLaws_of_Ryerson_Universit
y%202010-11.pdf
7 “Protocol For Composition of the Board of Governors,” York University, 2013,
http://www.yorku.ca/secretariat/board/documents/ProtocolForCompositionOfTheBoardOfGovernors.pdf
8 “Board of Governors,” Trent University, April 28, 2006, http://www.trentu.ca/secretariat/boardofgovernors_bylaw.php
9 “Board of Governors Bylaw,” University of Windsor, February 22, 2011,
http://www.uwindsor.ca/secretariat/sites/uwindsor.ca.secretariat/files/board_bylaw_1__general_bylaw_approved_feb_22_2011_last_amended_bg140624_new_logo_1.pdf
10 “UOIT Bylaws,” University of Ontario Institute of Technology, June 11, 2003,
http://www.uoit.ca/footer/about/governance/board-of-governors/uoit-by-laws.php#article3
committees have a student seat. If a motion originates in a committee where no student sits, students have no opportunity for input until the motion is brought to the floor of the board. Often, this is too late for student input to be properly considered.
Although many senates do have student representation on all of their committees, students are not always able to attend due to conflicting class schedules (these committees are often scheduled around the availability of their chairs). An additional student seat on each committee would better ensure student representation across all of them.
Recommendation One: The provincial government should utilize its ability to appoint members knowledgeable in institutional and MTCU priorities to boards of governors.
Many stakeholders currently understand university boards to be controlled by representatives from the institution and by appointees of Ontario’s Lieutenant Governor. In theory, this system represents a balance of power between the province and institutions. As explored previously, however, the province has tended to appoint members to university boards on the recommendation of the boards themselves.
One way the provincial government could take a greater role in helping universities accomplish their missions would be to actually exercise its power to appoint members to university boards. These members could be officials knowledgeable in university operations and the province’s higher education objectives, serving to ensure that these priorities are reflected in university decision-making. To preserve institutional autonomy, these members would not hold a plurality of seats, but would simply serve as true representatives of the public interest at the university governance table. This is not to say that these members would disregard the needs and reputation of the institution; rather, they would approach their roles with the mission of helping institutions operate efficiently alongside government priorities.
The government’s many new plans for higher education (Putting Students First, the Long-Term Capital and Infrastructure Plan, the Mental Health Strategy) have yet to be widely adopted or understood within the university community. Students believe that representatives from the provincial government should be in place at institutions to help ensure successful implementation.
Principle Two: Students should be represented to a greater degree on
governing bodies than is currently the case.
Boards should embody a partnership between the various stakeholders in the university.
Boards of governors and senates should contain meaningful and effective representation
from different constituency groups. Student organizations, faculty associations,
administrators, government representatives, and community groups should all be represented.
However, among these groups, students stand out as the only partner group that has
significantly increased its contribution to university finances on a per-capita basis. Since
1979, students have increased their contribution to operating budgets from one fifth to
more than one half through tuition and ancillary fees. Greater representation would
allow student voices to reflect this increased contribution.
Recommendation Two: The selection process for student representatives on
their institution’s boards of governance must reflect student government’s
selection processes.
The boards of governance of universities across the province should not apply additional
restrictions to membership and student participation beyond the rules and regulations
that students have created in their own selection processes. While these boards may
seek to have nomination processes for external, faculty, or other such members, student
membership should remain entirely vetted and selected by the mechanisms that their
representative associations have agreed upon.
Recommendation Three: Strategic plans, such as the Strategic Mandate
Agreements, should be reviewed by formal governing bodies on which
students are appropriately represented.
In order to ensure good governance and that strategic plans are taken as legitimate,
binding, strategic documents, they must be passed by a formal motion at the Board of
Governors or its institutional equivalent. Boards of governors, when properly
representative of the university community, can hold the institution genuinely
accountable for the goals laid out in documents such as the SMAs. Adequate stakeholder
consultation is essential for reasons of both accountability and efficiency.
In this vein, for an SMA to be legitimate, students at a given institution must indicate
that they have been engaged and consulted. This could be confirmed on the agreement
with the signature of an authorized representative of each university’s student
government or through an equivalent process. However, students are the largest constituency on campus; it is not enough for institutions to fill in a box indicating that a town hall was held on a certain date to discuss the MYAAs. The government should set a threshold for student consultation centrally, with student associations indicating whether the threshold has been met.
Recommendation Four: In order to avoid board inflation without
appropriate student representation, the Province should mandate that a
minimum of 13% of seats on university Boards of governors be reserved for
undergraduate student representatives, and a minimum of 25% of seats on
university senates be reserved for undergraduate student representatives.
As mentioned above, student membership as a percentage of total board populations has decreased at universities across the province. This ‘dilution’ effect means that while no student seats were specifically targeted for removal, they have instead fallen victim to membership growth elsewhere. Student seats have lost their weight through this process; in order to stop this trend, the Government of Ontario should establish a mandatory minimum percentage of student participation on institutional boards of governance.
Some university senates, such as the University of Windsor Senate, already include provisions for ratios of student representatives. Section 5.1 of Bylaw 1 of the University of Windsor Senate stipulates that
The elected student representation shall equal one-quarter of the faculty representation on
Senate. The number of student representatives shall be determined for each academic year
by dividing the number of faculty members with voting privileges by four and rounding to
the nearest whole number. 11
While this is certainly an excellent step towards a mandatory minimum for student
representation, the final number of representatives should be established as a percent of
the total membership, not as a percent of a subset group.
A minimum percentage of membership held by students would ensure that, even with an increase in board membership, the student voice would grow proportionately.
Some universities in various provinces across Canada set specific numbers for their
governing bodies within their governing documents.12,13 Two provinces have gone even
further by establishing minimum proportions for student seats. Alberta has mandated
that its technical institutions must reserve two of their 16 seats for students, or approximately 13% of the total membership.14 Similarly, British Columbia requires two students per every 15 seats, or approximately 13%, and three students per every 21 members for the University of British Columbia, which is approximately 14% of total membership.15
The Ontario government should mandate that universities reserve 13% of seats for
undergraduate students on Boards of governors and Trustees. Similarly, University
Senates should be required to adopt a minimum proportion as well. At Queen’s, Trent,
and Ryerson, undergraduate student representatives comprise approximately 22% of
total senate membership. 16 The province should require university senates to amend
their bylaws and establish that at least 25% of their total membership will be comprised
of undergraduate students.
It is important that boards seek to create partnerships with various stakeholder groups at
the university, and representation on the board should form a significant part of these
partnerships. Boards should include meaningful and effective representation from these different constituency groups, not merely token representation. While external representatives are integral, they should not come at the expense of student organizations, faculty associations, alumni, and community groups. All of these constituencies deserve representation on institutional boards.
11 “Senate Bylaw,” University of Windsor, accessed October 2014,
http://www.uwindsor.ca/secretariat/sites/uwindsor.ca.secretariat/files/bylaw_1_senate_membership_and_election_procedures_vpsa_-_amended_141010.pdf
12 “University Secretariat,” University of New Brunswick, accessed November 2014,
http://www.unb.ca/secretariat/governors/compositionandmembership.html
13 “University Governance,” University of Manitoba, accessed November 2014,
http://umanitoba.ca/admin/governance/governing_documents/governance/1-2-1.html
14 Postsecondary Learning Act, Statutes of Alberta 2003, P-19.5. www.qp.alberta.ca/documents/acts/p19p5.pdf
15 University Act, British Columbia Laws, C.468.
http://www.bclaws.ca/Recon/document/ID/freeside/00_96468_01#section2
16 See Table 2: Student representation on university senates in Ontario
With this in mind, students remain the only partner among the aforementioned groups whose per-capita contribution to university finances has increased even as their share of board membership has declined. Since 1979, students have increased their contribution to operating budgets from one fifth to more than one half through tuition and ancillary fees. Greater representation would allow student voices to reflect this increased contribution.
Recommendation Five: There should be an opportunity for a student
presence on every committee of university Boards of governors and Senates.
Much of the more detailed work of a board of governors or senate is delegated to committees and subcommittees, which often carry considerable influence and decision-making ability. In the spirit of including the critical student voice in governance, universities should not only guarantee a greater student presence in their board composition in general, but should also ensure space exists for students on all committees.
Recommendation Six: Universities should make their board membership
and student representation totals transparent, visible, and easily available.
While many universities have clear outlines for the total membership and composition of
their boards, this is not a universal trend. Rather, many boards have ambiguous
membership rules that define the number of students but not the total number of
members. Institutions should make a concerted effort to remedy this lack of transparency, not only for student stakeholders but also for members of the public seeking this information. Detailed breakdowns should be posted plainly on institutional websites.
With greater awareness of the proportion of seats that student representatives hold, student stakeholders will be able to use their positions more effectively to affect the direction and decision-making processes of their institutions.
Recommendation Seven: University boards of governance should amend
their bylaws to specify dedicated places for student membership.
Student populations are the major stakeholders of their respective institutions; it is
therefore imperative that these populations have guaranteed spaces on their boards of
governors and senates to represent themselves. There is no reason that other
stakeholders (such as faculty representatives) should have membership guaranteed while
student populations should not be identified at all. University Boards must have their
membership compositions modified in order to clearly identify student spaces.
Furthermore, these spaces should be allocated separately for undergraduate and graduate students, rather than as a catch-all ‘student’ representative space. University board compositions often, if not always, identify the roles that other academic and administrative persons will play on the board; student spaces should be treated with similar value and distinction, as undergraduate and graduate students are groups with unique needs and perspectives.
Universities must make sure that the additional spaces allocated for student
representatives include the same rights and privileges as other Board and Senate
positions, without additional restrictions. In some cases, student members have been subject to limitations on their ability to chair meetings, attend in-camera sessions, and access all pertinent materials and documents.
Recommendation Eight: University Boards and Senates should provide
more flexibility and alternative methods of attendance to work around
student schedules.
Student schedules are rarely consulted when it comes to organizing meeting times; this
often makes it very difficult to get student participation on the board or committee.
Students should also be allowed to call in by teleconference or, as a worst-case scenario, leave their remarks with another member to be read out at the appropriate time.
Recommendation Nine: All in-camera sessions of boards, senates, and their
committees must include the presence or input of at least one student from
the respective board, senate, or committee.
On a governance level, all senate and board committees should remain open for public
access and attendance. Only during matters that are deemed confidential should these
groups move in-camera. In some cases, board committees operate as private sessions even though their discussions and motions do not need to be concealed; it is precisely by keeping these select committees private that conflict between outside stakeholders and their boards can emerge. Boards and senates should act under the assumption that they have nothing to hide, and should use in-camera and private sessions only when absolutely necessary. For example, contract negotiations would require an in-camera session.
In cases where a governing body or any of its committees does go in-camera, a student member of that body should be present; if an attendance barrier prevents this, the student should be fully briefed and afforded a vote by proxy.
Recommendation Ten: If a student is required to come before a governing
body’s disciplinary panel, that student should have the ability to request a
replacement for any member of the body with reasonable cause. The final
decision on the participation of the member in question shall rest with the
committee.
Depending on the issue or conflict at hand, students may have to physically come before
a panel or committee to present their case or refute a verdict. Given the potential for
discomfort, the student should have the ability to request a replacement for any member
of the body whose presence they feel is inappropriate and may interfere with objective
and clear-minded proceedings. For example, a student may wish for a certain member to
be removed from a disciplinary panel if dealing with personal information that they do
not want the individual to know or access. A student’s comfort and privacy should be
taken into account at all times when appearing before a governing body.
Furthermore, considering the sensitive nature of student discipline and academic
appeals, the university should work to ensure that such proceedings are anonymous
whenever possible.
Accountability Mechanisms
Strategic Planning Processes
Having addressed to whom and for what universities should be held accountable, one turns to the final – and most difficult – question: how should universities be held accountable? Currently, there are many mechanisms by which Ontario universities report progress on a number of different indicators and targets to government. The recent introduction of the Strategic Mandate Agreements (SMAs) has put much of the system in flux, and rendered many previous initiatives potentially redundant. This paper will seek to improve upon the current framework, recommending adjustments in accordance with recent developments, in order to ensure that the right people are holding universities accountable for the right priorities.
Principle Three: The setting of strategic long-term goals must be done as
part of a cogent plan that sufficiently addresses institutions as well as
Ontario's post-secondary landscape as a whole.
Long-term goals at universities will only serve a purpose insofar as they are tied to a
broad, system-wide vision. Previous attempts at strategic planning in the sector have
been characterized by insufficient planning around provincial targets. Moving forward, it
is essential that all mandated planning surrounding access for underrepresented groups, learning quality, program diversity, and other priorities be coordinated both at the institutional level and across the system. Success can only be declared when institutions are meeting their targets and the aggregate of those successes results in improvements across Ontario.
The Strategic Mandate Agreements (SMAs) of 2014 represent a potential improvement
in this regard. These documents outline how universities plan to contribute to Ministry goals in terms of economic development, teaching and learning, student populations, research, academic programming, and mobility. In the SMAs, universities describe their areas of strength, growth, and specialization, as well as commit to concrete steps towards MTCU's goals for Ontario. However, the particulars of the goals are not fully defined: priorities are identified, as are measures of success, but not the conditions of success. As of the time of writing, the SMAs are relatively new, so it remains to be seen how well they will perform in practice.
Principle Four: The setting of strategic long-term goals must be reflective of
the needs of both the university stakeholders and government.
Universities do not operate in isolation. Boards and administrative officials may have
significant autonomy in the running of an institution’s daily affairs, but their long-term
viability is contingent on funding from the provincial government. As significant beneficiaries of Ontario tax dollars, institutions have an obligation to act in the public interest.
It might be argued that universities fulfill this goal simply by carrying out their mission;
first and foremost, universities are institutions of collaborative learning. Whether that
learning is by students in the classroom assisted by a professor, or by a professor in a lab
with a team of research assistants, all members of the university community are either
committed to learning themselves or else to assisting others in that project. This is in
itself a public good, and should not be underestimated when assessing the public utility
of Ontario universities.
But the principle of institutional autonomy is not enough to justify handing universities
free rein with public funds. The elected provincial government often receives a mandate from its constituents to advance particular principles in the university system.
Therefore, universities must accept that, in return for sufficient and stable funding from
the provincial government, they must commit wholeheartedly to strategic planning
which aligns with a system-wide plan. Likewise, the provincial government must oversee
a strategic planning regime that is effective.
Concern Six: Current Multi-Year Accountability Agreements have not been successful as strategic planning tools or as mechanisms to hold universities accountable for their performance.
Until the recent introduction of the SMAs, the Multi-Year Accountability Agreements
(MYAAs) were the primary method of strategic planning and reporting. Introduced by
the 2005 Rae Review, which led to the Reaching Higher Plan for post-secondary
education, these tools were designed to set strategic targets for the improvement of
access, quality, and accountability. Through the MYAAs, government and institutional
targets would align, and through MYAA reports, institutions would demonstrate their
progress and receive a small portion of their operating funding in return. Former
Premier Bob Rae made clear in the review that their purpose should be two-fold: first, to
ensure that institutions can count on stable funding over a multi-year period; second, to
hold institutions accountable to targets negotiated between them and the government.
As planning tools, the MYAAs are lacking. Though the government set general targets such
as “quality,” it did not determine how quality would be enacted or measured, nor was it
understood how individual institutions would contribute to any system-wide targets.
Instead, universities were made aware of general areas of focus, and asked to report
annually on these topics by highlighting any initiatives at their institution that they
deemed relevant. A strategic tool must be more prescriptive, and influence behaviour
from the top down.
Moreover, they are lacking as an accountability measure. Only a very small portion of
university operating funding is withheld until the report back for the previous year is
submitted. This funding has never been denied. There have been no consequences for
failing to meet targets set out in MYAAs, or for providing unsatisfactory responses. The
effect on institutional behaviour has been virtually nonexistent.
While the MYAAs and their report-backs persist to this day, they have not met their potential as strategic planning tools. Rather, they have become something of a bare-minimum check, with schools highlighting broad, nonspecific strengths and receiving funds almost automatically, simply for submitting the form. They have proven to be somewhat toothless.
Recommendation Eleven: The government should align the MYAA report-back mechanism and funding with the new SMAs to create a strong financial incentive.
The Strategic Mandate Agreements represent a step forward in provincial accountability
planning. They set out provincial goals and establish how universities will pursue those
goals, offering specific metrics to measure success.
These SMAs should be treated as the centerpiece of university accountability, and should accordingly be tied to a robust system of incentives. MYAA report-backs should be rolled into SMA reporting, and the funding that is granted or withheld depending on performance should be tied to the metrics set out in the SMAs. This policy advice has also been recommended by HEQCO:
The evidence suggests that strategic funding targeted to specific desired outcomes is a
forceful and dramatic incentive that steers the system and influences the behaviour of
institutions. Some proportion of institutional funding should be tied to specific outcomes
that are aligned with government objectives.17
Additionally, the government should consider increasing the amount of a university’s
operating budget that is contingent upon strategic performance. The current amount is
too low to be cause for much consideration on the part of universities.
Recommendation Twelve: The government should utilize funding levers to
assist or encourage universities to meet their strategic goals.
The government should tie substantial funding levers to the fulfillment of SMA priorities.
In doing so, however, the government should take care not to allow these levers to go the way of the MYAAs, where the granting of funds became a matter of routine.
There are countless reasons why universities may not be performing to targets, many of which are through no fault of their own. While universities should not be punished for failing to achieve targets where clear mitigating factors exist, the current reality of no consequences cannot continue under a new direction of system-wide targets. If an institution commits to a metric and a plan to achieve it, failure to achieve the target should be cause for question. While funding should not be withheld in a knee-jerk manner, and institutions should be able to re-orient unmet plans without penalty (in fact, more resources may in some scenarios be necessary and worthwhile), a clear lack of progress cannot continue to be rewarded. The government should withhold funding from universities until such time as they can demonstrate that they have implemented, or will soon implement, the conditions laid out in their SMAs or other strategic planning documents, or can otherwise demonstrate the need for extra funds to do so. Currently, approximately 4 per cent of operating funding is withheld until MYAA report-backs are filled out. This recommendation would transform this policy lever from an incentive to report MYAA data into an incentive to perform according to the strategic plan.
It should be made clear that this is not a recommendation to withhold enough funding to
severely affect the viability of post-secondary institutions. However, attaching a
percentage of funding to success and withholding it where necessary could effectively
motivate institutional behaviour.
17 Higher Education Quality Council of Ontario (2013). “Quality: Shifting the Focus. A Report from the Expert Panel to Assess the Strategic Mandate Agreement Submissions.” Toronto: Higher Education Quality Council of Ontario. p. 7.
Recommendation Thirteen: For the purposes of applying the funding levers associated with SMAs, strategic progress should be evaluated based on outcomes as well as methodology and approach.
In order to take a reasonable and holistic view of progress, universities should not face a system of incentives based entirely on the fulfillment of strict end goals. Although goals and targets are essential, some financial incentive should be tied to making substantive implementation changes towards those goals.
The government should adjust its funding levers to allow a small portion of SMA-aligned
funding to encourage the implementation of systems that will be conducive to the
sustainable and lasting prioritization of system goals, in addition to funding for meeting
the goals themselves.
Principle Five: Students should be included as major stakeholders in goal
setting as well as drafting and revising strategic plans.
Students are major stakeholders in post-secondary education who are often absent from
the table, despite the fact that their dollars pay for over 50% of the costs of education –
more than government and private contributions combined. While they are sometimes
present on institutional governing bodies, their representation is quite small, as was
discussed above. The relatively small presence of the student voice in planning for the
future of the system is inappropriate, given the impacts they will feel as a result of
changes. Indeed, now that students contribute more to Ontario universities' operating
funds than the government does, they deserve a much stronger voice.
When drafting or re-examining strategic plans and long-term goals, universities and the government should seek broad consultation with students beyond those who sit on institutional boards or senates, for example through the recognized student associations on their campuses.
Concern Seven: Students were not sufficiently consulted in the setting of
strategic goals and initiatives during the SMA process.
Though some student organizations in Ontario universities were consulted during the
drafting of institutional SMAs, which included plans for system-wide differentiation and
specialization between the schools, the decision whether to take this step was left entirely to the discretion of individual university administrations.
It should be a fundamental part of the process for universities to consult with all key
stakeholders, especially students. That there is nothing compelling institutions to do so
during the drafting of long-term arrangements with the government is troubling.
Recommendation Fourteen: All long-term strategic planning or renewals of
goals should require wide-ranging, formal input from student associations.
The establishment of provincial post-secondary strategic frameworks and plans should require significant input from students before they can be settled. In addition to a
sufficient student presence on the governing bodies of universities, recognized student
associations at each campus should be given the opportunity to review, critique, and give
input on the plans under discussion. This should be done early in the process, so that
student feedback and student priorities can be reflected in the bulk of the planning.
The government should make it a policy that any long-term planning initiatives be
undertaken only after this manner of consultation with students.
Strategic Priorities
Principle Six: Long-term strategic planning should include specific, system-wide, and government-mandated targets.
Successful planning depends on clear and logical goal setting. With targets in mind at the system level, universities will have a more complete understanding of how they can contribute to the government’s vision, and the public will be able to follow more clearly both the successes in the system and the areas that are underperforming. The current SMA framework includes system-wide metrics for a number of priorities, including, for example, the enrolment of Aboriginal, first-generation, and francophone students, as well as students with disabilities. Institutions, likewise, align their metrics with these. This effort is laudable, and should continue to be measured and pursued accordingly.
A number of priorities should be pursued through the SMAs and continuing strategic initiatives, with specific, system-wide targets set, reported on, and tied to funding incentives.
Concern Eight: The SMAs, though they refer to institutional and system-wide metrics, do not set sufficiently specific or numerous system-wide targets.
The SMAs establish metrics of progress; for example, the enrolment of certain underrepresented groups will be a measure by which each university can demonstrate that it is successfully improving and broadening access. However, MTCU has not established a specific enrolment number or percentage as a goal. A more effective system would, in addition to establishing priorities and metrics, tie them to particular targets.
Measures are most useful when compared with a benchmark. Under the current
agreements, metrics have been established in order to track university performance
along certain government priorities, but they do not include systemic targets. Without
these targets, it will still be possible to observe progress or change, but not to determine
whether a university has succeeded or fallen short. The SMAs focus the priorities, but do
not define success.
Recommendation Fifteen: The government should collaborate with
individual institutions to set specific long-term enrolment targets for
undergraduate and graduate students.
Currently, the SMAs include many of these as metrics, that is, as measures of performance; these are generally listed under the “student population” aspect of the SMAs. However, consequences for underperforming cannot exist without particular targets as a frame of reference.
Arising from conversations with universities, the government should set specific sector-wide enrolment goals and allow institutions to claim different portions of the goal. Thus, each institution would have a local target based on its desire or commitment to contribute to the province’s goal. Institutions should report on the programs in place to achieve those targets, and should provide a substantive evaluation of their progress in any annual reporting.
Recommendation Sixteen: The MTCU should collaborate with individual
institutions to set targets for the implementation of outreach and other
barrier-removing initiatives to ensure growing access for underrepresented
and mature students.
While the SMAs themselves establish enrolment of underrepresented and mature
students as performance metrics, as they do with undergraduate and graduate students
in general, there is no clear indication of what those enrolment numbers should be.
Despite concentrated efforts from various parties, accessibility barriers still exist for
these students. In light of this, simply setting enrolment targets does not address the
greater issues surrounding access for these students. Rather than viewing these students
as quotas, the MTCU should work with individual institutions to ensure that they are
actively engaging in outreach initiatives that work to increase enrolment of these groups.
For example, one outreach initiative could be educational campaigns for remote areas
that address the benefits of a university education, and the ability for those individuals to
enroll. On an institutional level, an example of a barrier-removing initiative could be the
formation of a committee or sub-committee whose mandate is to specifically identify and
address any accessibility barriers that may prevent underrepresented and mature
students from attending university. This methodology would be a much better reflection
of an institution’s ability to foster accessibility for these groups, and would serve as a
methodological or approach-based goal which could be reflected in the SMAs, as per
Recommendation Twelve.
Furthermore, these initiatives should not end once students from these cohorts are
enrolled in an institution. Ongoing supports are critical to the success of students, and
the MTCU should continue to work with individual institutions to ensure that the
barrier-removing initiatives are also focused on ensuring student success once at the
university. Combined, these efforts would continue to reduce accessibility and
graduation barriers to ensure continued growth in enrolment and success from
underrepresented and mature students.
Recommendation Seventeen: The government should set specific long-term
targets for the percentage of small classes made available to students,
and require institutions to report these figures, broken down by department.
One particular system-wide target that should be set is class sizes. Class size reporting
requirements currently rely on a flawed data set provided by universities. The data
dramatically overestimate the number of small class experiences students actually have.
It reports each section of a class as if it were an independent class; as a result, the
2009-2010 MYAA report backs tell us that 40 per cent of first-year classes in Ontario
contain fewer than 29 students. In reality, most of these students will be sitting in a very
crowded lecture theatre, because current class-size reporting allows institutions to count
one large lecture taught in multiple sections as several small classes.18 As a result,
current estimates drastically over-report the number of small classes.
Universities should develop a mechanism that accurately reflects average class size. This
task is best left to the administrative units responsible for data collection and tracking
registration at the university, but the government should monitor this process and
ensure it is being conducted appropriately. The data thereby collected should be broken
down by program, and publicly reported in annual reports. The government
should set a system-wide target for the number of small classes offered at Ontario
universities and hold institutions accountable for offering these.
This recommendation should not be taken as an endorsement of class size as a holistic
measure of educational quality. However, it is certainly a metric that will indicate to
students the kind of classroom experience to expect when attending a particular
institution. Furthermore, if an institution selects a target for class size, it should be
held accountable for maintaining it.
Recommendation Eighteen: The government should set specific, long-term,
comprehensive faculty hiring plans to meet enrolment demands, as well as
report the number and average teaching load of faculty.
The 2006-2007 MYAAs asked institutions to report on annual net new hires for full-time
tenured and full-time limited-term and part-time faculty. MYAA report backs dropped
this reporting requirement in 2009-2010. The Ministry set no targets for faculty hiring,
and does not provide any detailed information about the type of faculty hired.
For instance, institutions currently do not report the average teaching responsibilities of
full-time faculty, which by most reports have been dropping over time as research
expectations rise and teaching is offloaded to contract academic staff. The number of
professors per student is a meaningless measurement if those professors are not
operating at full teaching capacity.
Just as the government should plan for enrolment growth, so should it plan to meet the
demands of an increasing number of university students by hiring full-time faculty to
teach them. Trent University’s SMA approaches this by submitting - as a metric of
success for teaching and learning - the proportion of first-years taught by at least one
full-time, tenured professor. Future SMAs should include long-term hiring plans tailored
to expected areas of program growth. Annual report backs should indicate progress
toward achieving the goals of the plan, with explanations for any unforeseen variance.
Furthermore, universities should provide more detailed information about the number
and type of faculty they currently employ and plan to hire, broken down by department.
Part-time faculty and contract faculty are not reported at many institutions, despite
carrying significant portions of the institution’s teaching responsibility. If an institution
chooses to download its teaching to part-time staff, it should at least be known to the
public.
18 Michael A. Adams, Tim Bryant, Yolande E. Chan, Kim Richard Nossal, Jill Scott and John P. Smol, Imagining the
Future: Towards an Academic Plan for Queen’s University (Kingston: 2010), page 23.
Recommendation Nineteen: University accountability reports should
include a complete breakdown of support services offered to students by
university administrations, as well as the degree to which they are
supported by compulsory ancillary fees and university operating budgets.
The first MYAAs asked institutions to describe “strategies and programs that will
support increased participation of Aboriginal, first generation and students with
disabilities.” Later report backs did not include anything further until 2010-2011, when
institutions were asked to provide highlights of an activity contributing to the
improvement of the learning environment in a box labelled “support.” The SMAs do not
point to any metrics or priorities along student support lines, and that is unacceptable.
Given the significant concerns students have over the quality of academic and
non-academic student support services, institutions should be required to detail all such
services offered to their students, and their plans moving forward.
Students are also concerned about how these support services are funded. As teaching
costs have increased, less and less institutional operating funding has been available for
support services. Students have experienced a marked increase in ancillary fees as a
result, but have no access to information about where student support services derive
their funding. The SMAs should address decreasing reliance on student funds for
services, and should enact metrics along these lines. Reports would therefore
differentiate between operating funding and funding levied through ancillary fees. At the
very least, institutions should be required to guarantee a certain level – and minimum
type – of student support services.
Recommendation Twenty: University accountability reporting should
include a detailed breakdown of all ancillary fees levied against students.
As provincial funding for universities has declined in recent years, students have paid an
increasing share of university operating budgets. This cost has largely been levied
through tuition fees, regulated by the government. But some costs are also downloaded
to students through the raising of ancillary fees, specific fees other than tuition that must
be approved by student governments. These fees have risen steadily in recent years.
These fees are often explained to students at the point of payment, but are rarely
reported transparently to the public in the same way that tuition fees are. Future
university reports should include a thorough breakdown of all ancillary fees, including
compulsory fees, non-compulsory user fees and program fees. Reporting this data in
such a way will allow the public to have a better conception of the real cost of education,
which is often assumed to be tuition alone.
Recommendation Twenty-One: University accountability reporting should
ensure that all results of the National Survey of Student Engagement are
published on institutional websites in survey years.
In the absence of a simple indicator for determining quality, many institutions rely on
the National Survey of Student Engagement (NSSE). First implemented in 2000, by
2006 31 Canadian institutions – including all Ontario universities – were participating in
the survey. Today, all Ontario universities administer the NSSE to first- and fourth-year
students in first-entry undergraduate programs.19
The NSSE asks 105 questions, seeking to gauge student satisfaction across five variables:
level of academic challenge, active and collaborative learning, student-faculty
interaction, enriching educational experiences, and supportive campus environment.
Many institutions publish full reports of NSSE data on the accountability sections of
their websites, though some do not provide the full range of reports available. For
example, on its website, Queen’s provides a breakdown of the data by means, by
benchmark, and by individual response to questions, resulting in a very robust data set
that the general public can access. By contrast, Brock University has traditionally
published only the benchmark comparison and an executive summary, though they have
recently added ‘select comparisons’.20
All institutions should be required to provide a complete breakdown of NSSE results for
public consumption. Though the validity of NSSE as a measure of quality has
legitimately been called into question by many stakeholders, it provides students a
perspective through which to compare the student experience across institutions. Since
all Ontario institutions are conducting the NSSE utilizing student and government
dollars, it is only fair that this data be made available in its complete form.
Targeted Funding
The majority of a university’s operating budget is funded through student fees and
through a basic operating grant from the government. These funds can be spent on
anything the university deems necessary, and can be referred to as “unrestricted
funding.” Government funding not included in a university’s basic operating grant – and
thus contingent on certain outcomes or directed toward certain programs – is called
“envelope funding.” Unlike unrestricted funding, envelopes are tied to and must be spent
on the program for which they were created. Universities cannot use those funds for
some other purpose. For most envelopes, money is disbursed to institutions at the
beginning of the fiscal year, after the government has received a report on how the funds
were spent in the previous year.
Funding envelopes are not new in Ontario. According to former HEQCO president
James Downey, universities in the pre-World War II era were essentially funded entirely
through envelopes.21 University presidents would send a budget to the Premier outlining
expenses and expected revenues, and would ask for a grant for the difference. The
Premier would decide whether to fund a university’s operations. As more universities
were founded and the concept of university autonomy began to take hold, the
government began to play a less direct role in university operations. Still, the
government has retained the right to direct funding toward specific programs, and thus
maintains envelope funding to this day.
19 H. Zhao (2011). Student Engagement as a Quality Measure in the Ontario Postsecondary Education System: What We
Have Learned About Measures of Student Engagement, Toronto: Higher Education Quality Council of Ontario.
20 Both institutional websites accessed October 8, 2014.
21 James Downey, “Accountability versus Autonomy.” Presented at Meeting of Vice-Presidents, Conference Board of
Canada Quality Network for Universities, Nov 13 2008.
Principle Seven: Funding envelopes are important tools for ensuring the
accountability of government investment in post-secondary.
Funding envelopes are an effective way for the public to hold universities accountable.
While there are some examples of poorly designed or insufficiently funded envelopes –
some of which are outlined below – in principle, funding envelopes can be used to
provide effective direction to the system and to improve performance. For example, the
Access to Opportunities Program (ATOP) was established in 1998 to increase enrolment
in computer science and high-demand engineering programs. The government set a
target at 23,000 extra spaces by 2004-05; this target was reached in 2002-2003 through
institutions creating spaces with the enveloped funds.22 The program clearly met its
goals, and demonstrates that envelope funding can effectively influence institutional
behaviour toward the achievement of public goals.
There are some who claim that envelope funding is inherently flawed, and that it is one
of the reasons funding has not kept pace with costs. The argument goes that the
importance of increasing the basic operating grants to meet inflating costs has been
ignored, with new funds being partitioned into targeted envelopes. Proponents of this
view claim that by tying ever-increasing amounts of funding to specific goals, the
government forces institutions to pull money away from the core function of the
university, such as instruction, infrastructure, library holdings and equipment. As a
result, quality suffers.
However, evidence from the government’s Final University Operating Transfer Payment
Totals indicates that, while funding envelopes had increased as a percentage of total
funding over the history of their use, 2007 saw a sharp increase in per-student
unrestricted funding and a gradual decline in envelope funding that has persisted to
date. As the graph below indicates, over the past 10 years, the vast majority of funding
growth has gone toward basic operating grants, not envelope funding. It is also worth
noting that tuition fees, an increasing proportion of overall university budgets, go almost
entirely to the unrestricted operating budgets – with the exception of some that is set
aside for financial assistance. While envelopes are not as flexible as unrestricted funding,
they do not encroach on the vast majority of university budgets, and thus remain
important and useful.
Graph 1: Percentage of university operating funding enveloped, 2004-13
[Stacked bar chart comparing enveloped funds and unrestricted funds as shares of total
operating funding for each year from 2004/2005 to 2012/2013.]
22 Ontario Ministry of Training, Colleges and Universities Operating Funds Distribution Manual, 2010, page 24.
Moreover, should areas in urgent need of funding be identified, far from preventing
universities from addressing them, envelopes can ensure that financial resources are
allocated directly to these issues.
Principle Eight: The government must closely monitor funding envelopes in
order to ensure the envelopes are being disbursed in accordance with their
mandate.
The government and universities should take proactive steps to ensure that the
accountability mechanisms for all funding envelopes are posted transparently and
accessibly. Stakeholders have the right to be kept updated about the status of programs
funded through envelopes. Should the government provide funds to support a certain
program, the public should be able to go to a post-secondary institution’s website and see
what the funds have been used to support. For instance, if the government were to launch a
new envelope supporting mental health counselling, the report-backs detailing how this
funding has been spent should be accessible to the community.
Stakeholders should not have to submit freedom of information requests to access this
data, nor should institutions have any reason not to provide it, barring reasonable
confidentiality constraints for legal and human resources reasons.
Concern Nine: Report backs for individual funding envelopes are not made
available to the general public, and the government does not publish a
system-wide analysis on the progress of funding envelopes.
The government publishes data from only one of its funding envelopes: performance
funding. It publishes sector-wide statistics, broken down by program, as well as
requiring institutions to publish their own statistics on their institutional websites. While
this is a step in the right direction, both mechanisms fall short of ensuring adequate
transparency of university accountability mechanisms.
First, the website publishes data from only three indicators: graduation rates,
employment rates after graduation and OSAP default rates.23 If a member of the public
were looking for another tool on the website – for instance, information about how
Women’s Campus Safety funds are being used, or an institutional Multi-Year Aboriginal
Plan for Post-Secondary Education – they would find nothing.
Institutional websites go a step further by additionally including their MYAAs and MYAA
report backs, but they still lack much vital information. Like MTCU’s website, they lack
information about projects undertaken through funding envelopes provided by the
government. Instead, MYAA report backs focus on output numbers – the number of
students with disabilities, Aboriginal students, or students in cooperative education
programs. While these measures are certainly useful, they do not address the use of
programs or policies in influencing those numbers, nor are there real implications
should those numbers decrease. There is no way a member of the general public can
learn about whether or not individual envelopes are successful without issuing a freedom
of information request.
23 “Graduation, Job and Canada-Ontario Integrated Student Loan Default Rates,” Ministry of Training, Colleges and
Universities, accessed October 22, 2014, http://www.tcu.gov.on.ca/pepg/programs/osaprates/
Neither of these mechanisms succeeds in enabling stakeholders to make their own,
independent evaluation of programs funded through government envelopes. Nor do they
let the students who are intended to benefit from these investments know how the funds
are being used. They must instead rely on government or university press releases. This
is neither transparent nor accountable.
Concern Ten: There is great inconsistency between reporting requirements
for different funding envelopes, and some reporting requirements are far
too weak while others are redundant.
While universities are required to report on the status of all programs funded through
envelopes, reporting requirements vary significantly between envelopes. For example,
the Women’s Campus Safety Grant – a grant of up to $50,000 to each university for
education, advocacy and infrastructure aimed at preventing sexual assault – holds each
year’s funding until the report for the previous year is submitted. Little in the way of
substantive evaluation of the report back is conducted before funds are released. This is
corroborated by evidence suggesting that the fund is sometimes used for purposes other
than the prevention of sexual assault. For example, Queen’s University used $17,000 of
their funding to expand the availability of mental health first aid courses on campus in
2011-2012.24 While certainly a laudable initiative, it is difficult to reconcile this decision
with the goal of the fund.
There is also anecdotal evidence to suggest that institutions face difficulty with the
consistency and level of reporting they are required to do to access even small pools of
funds. Such inefficiency can be a drain on institutional resources that could otherwise be
directed to student priorities.
Concern Eleven: The current performance-funding formula neither
provides enough money to improve performance nor uses the right
performance indicators to assess improvement.
Another funding envelope, called the performance fund, provides money to institutions
based on their performance on three indicators: graduation rates, employment rates 6
months after graduation and employment rates 2 years after graduation. The fund was
implemented by the Ontario government in 2001, and initially split universities into
three tiers, based on their performance. The top tier received two-thirds of the fund, the
middle tier received one-third of the fund, and the bottom tier received no funding at
all.25 This system arguably did not fairly distribute funds; often, the difference between
being in the top and bottom third was within the statistical margin of error. The formula
was quickly changed to more fairly assess university performance. The Ministry set a
benchmark for each indicator at 10% below the system average. In order to be
eligible for funding, an institution must meet the benchmark. The amount of
funding for eligible institutions was then calculated using a formula that takes both size
24 “AMS, Safety Fund build capacity for mental health training”, Queen’s University, August 11, 2011,
http://www.queensu.ca/news/articles/ams-safety-fund-build-capacity-mental-health-training
25 Ontario Confederation of University Faculty Associations, The Measured Academic: Quality controls in Ontario
Universities (Toronto: 2006), page 19.
of the institution and its success in achieving high rates.26 This system remains in
place today.
There remain problems with the current performance framework, however. First, Key
Performance Indicators (KPIs) do not provide especially useful information. Measuring
aggregate employment rates shortly after graduation gives no indication of whether the
graduate is working in a field related to their studies, or even whether they are working
full-time. More importantly, the data do not tell us whether the graduate is satisfied with
the relationship between their field of study and their employment after graduation.
Moreover, as indicated by the latest data, there is very little variation between
employment KPIs:
Table 3: Variation in KPIs, 2012 Cohort27

            Graduation Rates                       Employment – 6 months   Employment – 2 years
Highest     88.3% (Nipissing)                      100.0% (Hearst)         100.0% (Hearst)
Median      78.6% (Guelph) and 76.8% (Waterloo)    86.5% (Algoma)          92.8% (Brock)
Lowest      63.9% (OCAD)                           82.0% (OCAD)            86.0% (Algoma)
Average     77.1%                                  86.5%                   92.2%
Finally, there is simply not enough money in the performance fund to influence
institutional behaviour. KPI funding has declined as a percentage of government funding
to universities over time to just 0.66% of provincial government funding. And this figure
does not include tuition, another massive component of university revenue. KPI funding
comprises a minuscule proportion of a university operating budget. Combine this fact
with the difficulty of influencing these indicators, and there is little possibility that
institutions are changing their behaviour in any significant way due to the performance
fund.
Table 4: Key Performance Indicators as a percentage of provincial funding
over time

Year        2004/05  2005/06  2006/07  2007/08  2008/09  2009/10  2010/11  2011/12  2012/13  2013/14
KPIs as %
of funding  0.94     0.86     0.82     0.77     0.75     0.72     0.70     0.68     0.67     0.66
Recommendation Twenty-Two: The government should require universities
to include envelope funding report backs in their annual reports.
Most universities already publish annual reports to promote transparency and
accountability. These include audited financial statements in addition to highlighting
accomplishments of students, staff and faculty for the year. In their current form, they
essentially serve as advertising material for each institution. But they could be changed
to become extremely effective tools for communicating progress toward strategic goals to
stakeholders.
26 Ontario Ministry of Training, Colleges and Universities Operating Funds Distribution Manual, 2010, page 9.
27 Ontario Graduate Survey, 2013.
Each university should explicitly report on the progress of programs funded through
government envelopes. They should include goals set for the programs, descriptions of
how funding was used to achieve those goals, and an evaluation of whether those goals
were met. All annual reports should be accessible from institutional websites.
If the government took this step, it would not be the first jurisdiction to do so. The
Alberta Post-secondary Learning Act grants the Minister of Advanced Education and
Technology the power to mandate the inclusion of data in university annual reports.28
Each school must publish a report detailing its progress toward goals articulated by the
government, in exchange for funding.
Recommendation Twenty-Three: The government should publish an annual
analysis of sector progress toward the goals of all funding envelopes
currently in place.
The government should compile and synthesize all institutional annual reports,
publishing a sector-wide analysis of programs funded through envelopes. The Alberta
government does just this, publishing a document which tracks the progress of various
programs throughout the sector. The indicators are straightforward, their targets clear
and their progress honestly reported.
Ontario’s Ministry of Training, Colleges and Universities should publish such a
document. It would be of great use for all stakeholder groups, who would now have ready
access to data concerning funding envelopes. What would result is an informed public
debate about post-secondary policy, rather than simple ideological statements.
Moreover, members of the general public could easily educate themselves about the
relative success of funding envelopes, and could better participate in the debate.
Recommendation Twenty-Four: The government should eliminate the
performance funding envelope and redirect the funding toward quality
improvement and SMA adherence.
The performance-funding envelope neither collects useful information nor provides
enough funding to noticeably affect institutional behaviour. The fund should be
eliminated and either rolled into the basic operating grant or an envelope towards
quality improvement. While performance funding may seem like an attractive notion at
first, it is beleaguered with huge conceptual flaws. In theory, outcome-based funding
serves as an incentive for institutions to increase performance along whatever metrics
performance is measured. Institutions that measurably improve are granted a certain
portion of the performance-funding envelope, while institutions that do not are granted
nothing. In practice, institutions are measured according to metrics over which they
have little to no control, making any measure of performance dubious at best. Even if a
better set of metrics were to be developed, performance funding would simply reward
institutions that perform well and do nothing for areas of the system in need of
improvement.
We are not alone in this recommendation. The Council of Ontario Universities and the
Ontario Confederation of University Faculty Associations – representing two of the
largest stakeholders in the province, faculty and administration – agree with this
28 Post-Secondary Learning Act, Statutes of Alberta, 2003, http://www.qp.alberta.ca/documents/Acts/p19p5.pdf
position.29 The consensus of sector stakeholders is that the Ontario government should
take steps to immediately eliminate this funding envelope.
Ombudspersons Offices and the Provincial Ombudsman
Universities have a duty to be accountable to their students. As students are major
stakeholders and primary contributors to operating budgets, it is imperative that this
element of accountability not be forgotten. All members of the university community
have a right to be treated fairly by their peers and mentors. Students have a right to be
treated fairly by professors and professors have a right to be treated respectfully by
students. Unfortunately, a variety of circumstances can lead to unfair treatment in the
university context.
The question has been put to the Ontario legislature as to whether the investigative
power of the Ontario Ombudsman should be extended to a broader swath of the public
sector, including universities. As of the time of writing, legislation that would empower
the Ombudsman to investigate universities has passed first reading.
OUSA supports measures to increase the accountability of the relationship between
students and universities. This paper offers the following comments and
recommendations to that end.
Principle Nine: Those whose complaints are not being answered through the
standard channels should have access to an independent, objective
ombudsperson to help settle their grievance.
While the public can trust that the vast majority of university interactions are performed
in good faith, mistakes and personal biases sometimes enter otherwise fair and balanced
processes. This is the case for all large organizations, and universities are no different.
Because universities must be held accountable to their stakeholders, they must subject
their practices – such as grading and financial transactions – to independent review.
The best form this review could take is that of a dedicated ombudsperson’s office,
empowered to take complaints, make investigations, and issue public reports to
university governing bodies. Ideally, each university should have its own, allowing those
with grievances access to a local professional with extensive knowledge of the practices of
their institution. The establishment of independent ombudspersons’ offices at each
Ontario university would represent a great stride forward in operational accountability.
Concern Twelve: Many universities do not have an ombudsperson’s office,
and of those that do, few have the necessary jurisdiction and authority to be
effective agents of change.
Only 12 of 20 universities currently have an independent ombudsperson’s office, and
their powers differ greatly. Table 5 details the funding and jurisdiction of each of the 12
ombudspersons’ offices. Most ombudspersons’ offices are joint ventures between student
29 The Measured Academic, page 21.
associations and university administrations, funded bilaterally to ensure distance from
the administrative machinery of either organization. Every ombudsperson’s office
remains independent of university structure in order to provide objective and impartial
advice to those with a grievance. Some ombudspersons’ offices report to advisory
committees, some to academic vice-presidents or provosts, some to presidents and a
handful to governing bodies.
Table 5: Funding Arrangements of Ombudsperson Offices at Ontario
Universities**

Funded By                      Institution
Students and Administration    Algoma, Brock, Carleton, Lakehead, McMaster,
                               Ottawa, Ryerson, Western
Administration                 Guelph*, Queen’s*, Toronto, Wilfrid Laurier, York

* Ombudsperson has oversight only of campus hospitality and residence
** All institutions in Ontario not listed do not have ombuds offices.
While it is a testament to student association and university initiative that so many have
joined together to create ombudspersons’ offices, they should not be forced to do so.
Neither student associations nor administrations have an abundance of funds with which
to finance the operations of ombudspersons’ offices. This means that a sudden shortfall
of funds could result in the closure of this essential service – such was the case at the
University of Windsor, where the ombuds office was folded into the human rights office.
Recommendation Twenty-Five: The government should provide complete
funding for every university to establish its own independent
ombudsperson’s office, managed at arms-length from the university.
The funding for ombudspersons’ offices must come from a reliable, external source. Only
the government can provide such a guarantee, through dedicated funding envelopes. The
government should provide each university with the necessary funding to create its own
ombudsperson’s office. An ombudsperson typically has the power to investigate and
report on various kinds of infractions or errors in the operation of an organization. This
will allow students, staff and faculty to have recourse to solve any grievances they might
have through an objective, independent body with in-depth knowledge of their institution.
This funding should stipulate that the office will have authority over academic and
financial matters and may make non-binding recommendations to the appropriate
institutional authorities.
Moreover, these offices should be required to publish annual reports to the Board of
Governors or its equivalent, which may contain recommendations for solving systemic
institutional challenges. Boards should be responsible for making these reports available
to the community at large.
There are some who suggest that universities should come under the purview of the
Office of the Ontario Ombudsman in the same way that Ontario colleges already are.
They argue that it is equipped with the necessary infrastructure to immediately begin
taking complaints and resolving grievances.30 While they are right to critique the current
lack of consistent recourse for those whose complaints to universities are not being
answered, their solution is arguably problematic. The Ontario Ombudsman is a
governmental office designed to hold the provincial government to account. Ontario
universities are not governmental organizations or part of the Ontario Budget, despite
their public nature. Moreover, each university is significantly different in terms of
operations and grievance systems. To subject them all to one office in Toronto would be
to risk glossing over their important differences and missing key information in the
grievance process. However,
while it may be more costly to fund an ombudspersons’ office at each university, such a
move would ensure that the needs of all stakeholders are adequately addressed.
Recommendation Twenty-Six: The government of Ontario should swiftly
adopt legislation to expand the powers of the Ontario Ombudsman to
oversee the “MUSH” sectors.
In the summer of 2014, the Ontario government proposed Bill 8, the Public Sector and
MPP Accountability and Transparency Act. This act would allow the Ontario
Ombudsman to investigate a broad area of the public sector not currently under the
authority of that office: municipalities, universities, school boards, and hospitals – the
so-called “MUSH” sector.
Students believe that the bodies that govern these critical sectors ought to be subject to
the expert, arms-length scrutiny that has been demonstrated by the Ontario
Ombudsman’s office. The governments of British Columbia and Newfoundland and
Labrador have taken similar steps, expanding the mandates of their ombudsperson
offices to the university sector.31 André Marin, the present Ontario Ombudsman, has
signalled that the office is
eager “to offer more constructive input on how MUSH bodies can be held to account”
and that if the bill is successful, “thousands of Ontarians who have problems with these
organizations will finally benefit from our help.”32
Despite the good intentions and best efforts of university faculty and administrators,
issues do emerge. Universities are so large and so multifaceted that it is unreasonable to
believe that there will be no wrinkles in the system. Inequities and inequalities exist
throughout, and there will undoubtedly be controversies surrounding issues of
compliance with regulations such as the ancillary fee protocol. On topics such as these,
an impartial, expert “watch-dog” is ideal to provide advice and recommendations.
30 “Ontario Ombudsman, Case Update Annual Report,” Ombudsman Ontario, accessed August 11, 2011,
http://www.ombudsman.on.ca/About-Us/Who-We-Oversee/MUSH-Sector/Case-update---Annual-Report-20102011.aspx
31 “The Push for MUSH: How Does Ontario Measure Up?” Ombudsman Ontario, accessed October 2014,
https://ombudsman.on.ca/About-Us/Who-We-Oversee/MUSH-Sector.aspx
32 André Marin, “Annual Report 2013-2014: Ombudsman's Remarks,” Ombudsman Ontario.
https://ombudsman.on.ca/Files/sitemedia/Documents/AR14-Os-remarks-EN.pdf
Expansion of the Ontario Ombudsman’s office into the MUSH sector – universities in
particular – need not create redundancies with individual ombudsperson offices
established at each institution. Rather than overlap jurisdictions, they could
collaborate to ensure that appropriate advice is given from the appropriate level. The
Ontario office could be responsible for large scale, systemic issues that affect multiple
institutions, while the local offices could serve as impartial investigators for more
particular or internal institutional issues.
Quality Assurance
Ensuring that the services provided by Ontario universities are of sufficient quality to
guarantee student success is central to the post-secondary system. Students in Ontario
rely on universities to give them the skills necessary to succeed; if the methods used to
report on the quality of education are inadequate, it becomes extremely difficult to
determine how effective university education is. Often such problems can be solved
through standardization; however, such an approach would be inappropriate in this case
– a 2011 study found that all stakeholders are heavily opposed to standardized curricula,
for example.33 While standardization simplifies cross-comparisons and ensures
consistency, it would remove the freedom that university educators enjoy.
It is obvious to professors and students alike that having curricula set by the Ontario
government is in no one’s best interest. However, due to the variety in the post-secondary
landscape, it is difficult to find reliable, widely applicable indicators of educator
effectiveness. The responsibility for overseeing quality assurance falls on the Ontario
Universities Council on Quality Assurance.34
The Ontario Universities Council on Quality Assurance, known as the Quality Council, is
the principal body responsible for assuring the quality of both undergraduate and
graduate university programs. The Quality Council is an arm’s-length organization of the
Council of Ontario Universities, and its mandate is to “[oversee] quality assurance
processes for all levels of programs in Ontario’s publicly assisted universities, and [help]
institutions to improve and enhance their programs.”35 The Quality Assessment
Framework used by the Council revolves around the Institutional Quality Assurance
Process (IQAP).
IQAPs are quality assurance processes that are unique to every university but follow the
same framework, which is outlined by the Quality Council. All IQAPs must be approved
by the Council, and are a key means for the government to evaluate whether it will
support and fund universities. Ontario universities must, based on the Quality Assurance
Framework, write four protocols into their IQAP:
1. Protocol for New Program Approvals
2. Protocol for Expedited Program Approvals
3. Protocol for Cyclical Review of Existing Programs
4. The Audit Process36

33 Ontario Confederation of University Faculty Associations and the Canadian Federation of Students, The 2011
OCUFA/CFS Study on Post-Secondary Education: Ontario Results (Toronto: 2011).
34 Quality Assurance Framework, Ontario Universities Council on Quality Assurance, 2012.
35 Ibid., p. 2.
Both the protocol for new program approvals and the protocol for expedited program
approvals rely heavily on an internal evaluation process. The internal evaluation process
of the Quality Assurance Framework begins with evaluation criteria that are to be filled
out by the university. The process is as follows:

Figure 1: The internal evaluation process of the Quality Assurance Framework37
Principle Ten: The quality assurance structure of all university programs
should be subject to a framework that defines what a student should have
learned after each level of instruction.
Students should expect the same level of knowledge from their degree irrespective of
which Ontario university they attend. In order to deal with the potential problem of
variable learning outcomes across different institutions, the Ontario Council of Academic
Vice-Presidents put in place University Undergraduate Degree Level Expectations
(UUDLEs). UUDLEs indicate what students should expect out of their degree. The
Quality Council uses UUDLEs as a central benchmark when evaluating new programs
and determining if they are acceptable.38
Concern Thirteen: University Undergraduate Degree Level Expectations are
too vague and subjective.
Under the current framework, UUDLEs are largely left to the interpretation of the
university. This is the result of inherent ambiguity in the language and scope of the
UUDLEs, an example of which can be seen in section 1(b) of the Undergraduate and
Graduate Degree Level Expectations:
36 Ibid.
37 Ibid., p. 9.
38 Ibid., p. 1.
Table 639

Baccalaureate/bachelor’s degree
This degree is awarded to students who have demonstrated the following:
1. Depth and breadth of knowledge
b) Broad understanding of some of the major fields in a discipline, including, where
appropriate, from an interdisciplinary perspective, and how the fields may intersect
with fields in related disciplines

Baccalaureate/bachelor’s degree: honours
This degree is awarded to students who have demonstrated the following:
b) Developed understanding of many of the major fields in a discipline, including,
where appropriate, from an interdisciplinary perspective, and how the fields may
intersect with fields in related disciplines
Clearly, there is little distinction between the bachelor’s level and honours level.
Additionally, there is very little embedded in the language of the expectation that
intuitively suggests how it might be measured or observed, making it both vague and
difficult to assess.
Recommendation Twenty-Seven: The Quality Council should adopt the
Lumina Foundation’s Degree Qualifications Profile learning outcomes when
evaluating academic programs.
Universities are diverse, which may be why UUDLEs are interpreted so broadly.
However, a model by the Lumina Foundation (though developed with the American
education system in mind) could take into account the diverse characters of Ontario
universities while still offering more cogent and measurable learning outcomes.40 The
Lumina Degree Qualifications Profile is similar to the UUDLEs; however, it takes into
account and builds on different levels of degrees, five different areas of learning, and
many different configurations of university foci.
Moreover, Lumina’s qualifications profile is especially useful because it frames its
learning outcomes as demonstrable skills that are far easier to understand and assess.
For example, under the broad category of ‘knowledge,’ a student at a bachelor level
should be able to “define and explain the boundaries and major sub-fields, styles, and/or
practices in the field.”41 Though similar to something one might find as a UUDLE, it is
more specific and far clearer regarding how it could be demonstrated. As such, the
Lumina Model is able to accommodate the various needs of different universities while
also being able to outline specific expectations of learning at those universities.42
39 Ibid.
40 M.C. Lennon, B. Frank, J. Humphreys, R. Lenton, K. Madsen, A. Omri, and R. Turner, Tuning: Identifying and
Measuring Sector-Based Learning Outcomes in Postsecondary Education (Toronto: Higher Education Quality Council of
Ontario, 2014).
41 The Degree Qualifications Profile, Lumina Foundation for Education (2011), p. 12.
42 Lumina, The Degree Qualifications Profile, 2013.
Principle Eleven: Universities should work to ensure that their
undergraduate programs are of a quality acceptable to students.
Students are at the centre of university education; as such, it is imperative that the
programs offered by universities provide an excellent learning experience. The Quality
Council recognizes the importance of having university programs that cater to students
in its Quality Assurance Framework, by employing internal evaluation criteria that
ensure new programs, as well as changes to existing programs, are of a quality acceptable
to students.43
Concern Fourteen: Internal Evaluation Criteria are too vague and
subjective.
The internal evaluation criteria employed within IQAPs are highly subjective and easily
biased. Although there is a check-and-balance mechanism in the form of an external
review, the body that performs the external review is chosen at the discretion of the
university. After the internal evaluation, the university gives its program assessment to
the Quality Council, which determines whether or not the proposed program or program
change is acceptable through a council vote. As well, whether or not something is
considered a ‘program change’ is up to the IQAP formulated by the university.44 A
university can change a program considerably while avoiding labelling the changes
“significant” by its own standards, thereby circumventing evaluation by the Quality
Council.
Recommendation Twenty-Eight: The Quality Council should develop a
mandated internal check system that a university must satisfy in order for a
new program or program change to be approved.
With an internal check system that mandates certain aspects of education be
incorporated, the Quality Council will be able to assess programs objectively. Within the
current Quality Assurance Framework there are clauses that must be satisfied; however,
whether or not they are satisfied is up to the interpretation of both the university and the
Quality Council. Section 2.1.4 of the Quality Assurance Framework is a good example of
how internal checks are fairly subjective.
2.1.4 Program content
a) Ways in which the curriculum addresses the current state of the
discipline or area of study.
b) Identification of any unique curriculum or program innovations or
creative components.
c) For research-focused graduate programs, clear indication of the
nature and suitability of the major research requirements for degree
completion.
d) Evidence that each graduate student in the program is required to
take a minimum of two-thirds of the course requirements from among
graduate level courses.45
43 The Ontario Quality Council, Quality Assurance Framework, 10.
44 Ibid., 14.
45 Ibid.
Rather than rely on this kind of subjective judgement, which offers only the loosest
guidelines, it is recommended that the Quality Council refine its process by better
defining exactly what is considered acceptable on these terms.
Principle Twelve: Universities should work to ensure that program quality is
adequate.
Universities should operate in a manner that ensures proactive quality control over
programs and instruction. The Quality Council has a protocol of cyclical program review,
which allows a university a maximum of eight years of program implementation before it
is reviewed. The review mechanism is substantial and is meant to recognize
shortcomings within the university that may not have been apparent at first. An audit
process is also enacted once every eight years to ensure that the university has fulfilled
its cyclical review mandate. In between review periods, it is assumed that the IQAP
formulated by the university will be enough to keep a university accountable to its
students.46
Concern Fifteen: IQAPs are not structured to sufficiently ensure university
accountability to their students.
When the Quality Council accepts a new program or a program change, it is assumed
that the IQAP formulated by the university will provide effective and sufficient quality
control. Unfortunately, the university has no internal check mechanism for that IQAP,
meaning that a university does not necessarily have to fulfill that IQAP in order to
receive funding.
Concern Sixteen: Program review periods of eight years are too long upon
first ratification of a program.
Although the cyclical program review framework developed by the Quality Council is also
meant to ensure that there are additional accountability and quality control mechanisms
inherent in undergraduate and graduate programs, the enforcement ultimately comes
down to the cyclical program review itself.47 A university could be running an inadequate
program for eight years from the date of first ratification, and have given degrees to
several cohorts of students, only to later determine that the program was inadequate.
Recommendation Twenty-Nine: The Quality Council should mandate a
review of new programs at the end of their third year of implementation
before moving to eight-year cyclical program reviews.
This paper recommends that the Quality Council perform a program review three years
after it approves a new program or program change. Eight years is too long a time to
leave a newly designed program or change unexamined. A three-year program
implementation period would address the issue of having a long wait time between a
program’s beginning and its first review, and would also ensure that a university properly
46 Ibid., 18.
47 Ibid.
follows its IQAP with respect to that program. If an institution passes the three-year
review, it may continue with its eight-year cyclical review; if it does not, suggestions will
be made and the institution will have another two years to take corrective action. Having
a three-year implementation period for all new or altered programs will ensure that
programs are always of a quality acceptable to students.
Principle Thirteen: Ontario universities should have in place an adequate
measure of teaching quality.
Teaching quality is central to the academic process; it ensures that the material of a
program is engaging and understood by the student for whom the program was
designed. The Quality Council recognizes the importance of teaching quality within its
accountability framework. It mandates that all programs have in place a teaching
assessment mechanism that allows universities to make an assessment as defined under
the UUDLEs, which, as previously noted, are quite broad.48
Principle Fourteen: Courses vary between programs, institutions, and even
instructors. As such, they need to be evaluated on an individual level.
One of the main challenges when seeking to evaluate quality is the inherent variability of
courses. While consistency and reliability should be striven for system-wide, it behoves us
to remember that courses are all different. As such, it is appropriate to stress evaluation
methods that take this factor into account, and can not only provide comparable
indications of performance and success, but can also be fine-tuned and customized to
provide valuable information particular to a given course or instructor.
Principle Fifteen: Student evaluation of teachers (SETs) and course
evaluations can and should offer valuable feedback for professors.
SETs can be invaluable in fine-tuning the teaching process. By incorporating meaningful
student feedback, courses and instructors can become more effective and more engaging
for students. Furthermore, a well thought-out SET can force a student to think critically
about his or her own learning experience and learning style.
Students are an excellent source of quality control information: exposed to academic
content for the first time, they are uniquely sensitive to failures of instruction which
internal faculty or administration reviewers – being already familiar with the content –
may fail to detect.49
Concern Seventeen: Ontario universities do not have a standard method of
teaching assessment.
The ambiguity of the UUDLEs is also present in the Quality Council’s mandate for
teaching assessment. A mandate that gives universities too much leeway in determining
what an acceptable form of teaching assessment is may lead to similar programs having
48 Ibid., 1.
49 Ibid.
vastly different methods of teaching assessment. Neither SETs nor any other methods
are administered consistently system-wide.
Concern Eighteen: Currently, SETs are not always implemented in such a
way as to yield valuable or constructive information.
Unfortunately, SETs are not presently being administered appropriately or effectively.
The content and structure of the instruments used to collect student feedback vary
widely between universities, programs, or classes, and are variable in terms of depth or
breadth of information sought. Moreover, owing in part to collective bargaining
obligations, the content of student feedback and the extent to which it is reviewed and
received is often hidden. The result is disillusionment on all sides: students do not
recognize the value of SETs and will give unhelpful or trivial responses (if they respond
at all), and faculty become convinced that students are incapable of giving fair or high
quality assessments, and so resist SETs for fear of being judged according to unreliable
or petty feedback.
Recommendation Thirty: The Quality Council should mandate a qualitative
assessment mechanism that is both formative and summative in nature,
that focuses on what is best for current as well as future students, and that
contributes to a professor’s case for promotion or tenure.
Universities in Ontario would benefit from a system that mandates programs have a
specific form of teaching assessment. Of interest is the Start, Stop, Continue framework,
whereby students are guided to voice their opinions on how the teaching methods of the
instructor should be changed from a critical thinking standpoint. The framework
comprises three main portions, as indicated in its title:
1. Start: what the instructor should start doing
2. Stop: what the instructor should stop doing
3. Continue: what the instructor should continue doing
A 2014 study found this qualitative approach to teaching assessment was more
successful, and preferred by students, as a teaching quality assessment mechanism.50 It
frames students’ thinking in such a way that they offer deeper and more
constructive feedback. Questions that are purely open ended more frequently lead to
one-word answers or shallow responses, whereas the introduction of this method
markedly increased the generation of high quality feedback.51
Because instruction is a central aspect of a professor’s job, SETs should be viewed as a
critical consideration for tenure and promotion. The introduction of the
Start Stop Continue model should not discourage instructors from adding their own
specific qualitative assessment onto their SETs regarding teaching aspects that those
instructors deem valuable to their personal development. Any specific evaluation criteria
decided upon by an instructor should not affect tenure and promotion, and should be
50 Alice Hoon, Emily Oliver, Kasia Szpakowska & Philip Newton, “Use of the ‘Stop, Start, Continue’ method is associated
with the production of constructive qualitative feedback by students in higher education,” Assessment & Evaluation in
Higher Education, 2014.
51 Ibid.
held separate from a standardized model mandated by the Quality Council. The SETs
employed by all Ontario universities should be formative and summative in nature,
allowing students to evaluate an instructor in the middle and end of a unit of instruction
in order to ensure that subsequent instruction is of a quality acceptable to current
students, as well as future students.
Recommendation Thirty-One: Universities should be given appropriate and
dedicated funding to develop teaching and learning centres, through which
supports should be offered to instructors in order to allow them to adapt to
teaching evaluations.
Universities should have teaching and learning centres in place with a mandate to
process SETs and assist instructors in their development. The information gathered by
SETs should be sent to teaching and learning centres within the university in
anonymized format, where it will be reformatted into digital text. Teaching and learning
centres should work to filter and disqualify responses that are vulgar, hateful, or
deliberately unconstructive. In addition, teaching and learning centres should utilize
their expertise to advise professors in effective teaching strategies based on the feedback
collected. Along with positive evaluation responses, effective incorporation of these
strategies and responsiveness to change should reflect well on an instructor when
evaluated for promotion, tenure, and contract retention. Furthermore, teaching and
learning centres should make resources available to assist instructors in their
development.
Principle Sixteen: Course evaluation and SET results should be made public.
The current state of SETs is such that, even when effective methods are used, students
remain unclear on what happens to their input or if it is ever even considered. Students
believe that the results of well-administered SETs should be made available to the public.
Current and prospective students will find it valuable to check the feedback that has been
given to instructors or courses by those who came before them.
Concern Nineteen: The lack of transparency regarding SET results leaves
students and the public uninformed.
The content of student feedback and the extent to which it is reviewed and acted upon is
unclear, leading to widespread disillusionment in the process. Additionally, the potential
value that prospective students would find in reviewing this feedback from their peers is
kept from them. Furthermore, the public has no way of knowing if universities are
effectively using this valuable information to positively impact the learning experience.
Recommendation Thirty-Two: The government should strive towards
publicizing SETs, as well as the steps that are taken resulting from the
feedback therein.
Though it may be no small matter to publicize SET information, the government should
consider this a priority moving forward. Making the information available in some form
would benefit current and prospective students. Moreover, allowing students to see their
input being taken seriously, as opposed to vanishing quietly, will improve student
attitudes towards the process and lead to higher quality and quantity of participation.
Principle Seventeen: The needs of underrepresented groups should be
considered a matter of quality assurance, and should not be overshadowed
by the majority.
It is in accordance with Canadian values and constitutional law that no minority group
be ignored or disadvantaged because it is underrepresented.52 Underrepresentation
should not lead to being overlooked in any process where minority groups have a stake.
Universities in Ontario are no exception. HEQCO addresses this issue by discussing how
an aspect of educational quality is ensuring that minority groups are not overlooked, as
they currently are.53 Programming that supports these students and their success, and
the resulting positive outputs, should then be a key component of all universities’
performance.
Concern Twenty: Program and teaching assessment may leave out
recognition of minority groups and their specific needs.
As of 2014, the Quality Assurance Framework used by the Quality Council does not
make significant efforts to account for minority groups. Instead of highlighting minority
groups’ concerns regarding quality, they are aggregated with the general student
population, and their particular concerns become drowned out or relegated outside of
the quality framework.
A study by HEQCO, for example, finds that those who have disabilities are less likely to
complete an undergraduate degree.54 It should therefore be viewed as important by the
Quality Council to ensure that undergraduate university programs are acceptable to
those students in an attempt to bridge the gap in degree attainment between disabled
and non-disabled students.
Recommendation Thirty-Three: The Quality Council should develop a
framework that adequately represents recognized underrepresented groups.
If the Quality Council were to implement a framework that directly takes the needs of
underrepresented students into account, it would go a long way to ensuring that
academia is truly accessible to all. It is recommended that the Quality Council adopt a
framework that properly represents all students equally when applying quality
assurance practices, in order to allow programs to maintain a quality that is acceptable
to all students.
Data Collection
Principle Eighteen: Students and their families should have information
informing their choice of post-secondary pathway and program of study.
52 Canadian Constitution Act, 1982; Canadian Charter of Rights and Freedoms, s. 15(1).
53 R. Finnie, S. Childs and A. Wismer, Under-Represented Groups in Postsecondary Education in Ontario:
Evidence from the Youth in Transition Survey (Toronto: Higher Education Quality Council of Ontario, 2011).
54 U. McCloy and L. DeClou, Disability in Ontario: Postsecondary education participation rates, student
experience and labour market outcomes (Toronto: Higher Education Quality Council of Ontario, 2013).
Post-secondary education is becoming increasingly complicated, as are expectations of
what it should be doing and what personal and social benefits can reasonably be
expected from it. Arguably, the stakes of making the right choices are higher than ever as
well – both for individuals and for those who direct funding and policy for universities –
with youth unemployment remaining a persistent concern and resources remaining
limited throughout the entire government.
Good information is crucial in all of these areas. Students who are in the midst of
planning their future deserve to know what they can reasonably expect in terms of
general educational experience and possible outcomes from their program of study. The
province should be committed to collecting, using, and distributing such data for their
own purposes and the use of students and their families.
Concern Twenty-One: The Province is lacking multi-dimensional and
longitudinal data on student mobility and performance.
The province does not generally have the means to track individual student performance,
the choices made by certain students, broad services usage, or other quantifiable aspects
of a student’s time in university. Universities do tally some demographic data, but these
are typically isolated from the aggregate. For example, a university may know that a
student with a disability has registered at their institution or with accessibility services,
but may not know how they are performing, which support services they may be
accessing, or if they are changing programs.
This may mean that we are missing data that might support important public policy
interventions on behalf of students. If a particular population should prove more mobile
than others, or have a longer time to completion, then we can explore why. Current
data collection processes at universities lack these multiple dimensions. We might have a
good idea of a student’s time to completion, but cannot compare that with other
meaningful data. The data that is collected tends to be isolated in time as well, providing
sporadic data points over the course of an academic career and not a consistent
longitudinal study for any one student or group of students.
Concern Twenty-Two: Proposed legislation surrounding university data
collection may create privacy concerns.
The Strengthening and Improving Government Act 2014 amends the Acts governing
both the Ministry of Education and the Ministry of Training, Colleges and Universities to
grant their respective Ministers broader powers in mandating and accessing data
collected within Ontario’s publicly funded learning institutes. Specifically, the bill allows
the Minister of Training, Colleges and Universities to collect personal information about
students for the purposes of performance measurement (of both universities and students), funding,
program refinement, and accountability. While enhanced data collection is an important
step in improving on those things, it is important that individual students remain
protected from identification – particularly where it relates to services and processes
where a degree of confidentiality is expected, such as health services, academic
counselling, and learning support services. Data should not be used in anything less than
a meaningful aggregate; otherwise it has limited public policy use. The only reason for
individualized personal information to be used by the Ministry would be to intervene
on behalf of individual students – a role for which the Ministry may not be best suited.
The Bill does present limits on the use of personalized information: the Minister is not to
collect personal information where other information will suffice, and the Minister is not
to use more personal information than is required for the purposes of a particular study
or decision.55 While these limits are welcome, there is little reason for explicit identifiers
such as a student’s name to be included in data that is turned over to the Ministry.
Concern Twenty-Three: The public is lacking comprehensive data on
employment outcomes for university and college graduates.
The Ministry of Training, Colleges and Universities works with institutions to survey
graduates at six months and two years after graduation about their employment
outcomes. They also survey graduates about their earnings, the relevance of their
employment to their field of study, whether or not they undertook internships, and if
they are undertaking additional post-graduation studies.
However, the data are reported in a very generalized way. While data are collected along
dimensions of full- and part-time employment, as well as graduates working in multiple
jobs, employment is almost always reported as a binary: employed or not. Another
example is how earnings are presented as an average across all respondents while
employment is broken down by area of study. OUSA is concerned that not all possible
cross-tabulations of the survey data are made available to the public. Students are
sympathetic that, depending on response rates, the data might be less meaningful at a
fine-grained level, but expect that more detailed conclusions are possible with existing data.
Recommendation Thirty-Four: The government use the Ontario Education
Number to collect longitudinal and aggregated data on multiple dimensions
of university and student performance, use of services and access. This data
should be made publicly available in an anonymized format.
Admittedly, there is little reason to expect the Minister to want to collect individualized
information. However, the Bill also calls for the issuance of Ontario Education Numbers
to all post-secondary education students in publicly funded institutions in Ontario. The
Ontario Education Number is an anonymized and randomized number that already
provides data tagging for all students who have registered in primary and secondary
school in Ontario. These numbers would be sufficient to provide aggregate data for
policy purposes. They also already provide longitudinal data that can allow us to
understand which primary and secondary students choose which post-secondary
pathways.
Most importantly, the OEN would also provide an anonymizing step between
an individual’s name and the other information that is important to collect. With the
legislation mandating the issuance of the OEN to students who do not already have one
(largely out-of-province students who have not attended primary or secondary school in
Ontario), there is no reason for a student’s name to be made available to the Minister.
Continuing existing uses for financial assistance, simple verification of enrolment, and
the like are acceptable. OUSA therefore recommends that the government expand data
collection and disclosure in the spirit of Bill 151, but that it use the Ontario Education
Number as the main identifier, in place of a student’s name or other explicit identifier,
for data consistency.
55 Government of Ontario. Strengthening and Improving Government Act, 2014. s. 15.2.
Recommendation Thirty-Five: The government make available all data from
the Post-Graduate Survey that satisfies statistical quality standards. This
data should be formatted for public consumption and its significance
properly communicated.
As addressed above, there are many dimensions of university outcomes that can be
gleaned from the existing survey but are not broadly available. Information on
whether a particular course of study is more likely to result in employment, an
internship, relative certainty of earnings, or the need for further education – while not
necessarily the most important consideration in every case – should be available to
inform a student’s decision-making process should they want it.
In order to allow students to make the most informed choices possible, the government
should make available as much data as can be considered statistically sound from the
Post-Graduate Survey. This data should be packaged in a manner that is accessible to the
public and should provide context as to what the data means, including margins of error
where appropriate.
Policy Statement
WHEREAS Those who are integral to the functioning of a university should be
responsible for ensuring accountability.
WHEREAS Students should be represented to a greater degree on governing bodies than
is currently the case.
WHEREAS The setting of strategic long-term goals must be done as part of a cogent
plan that sufficiently addresses institutions as well as Ontario's post-secondary landscape
as a whole.
WHEREAS The setting of strategic long-term goals must be reflective of the needs of
both the university stakeholders and government.
WHEREAS Students should be included as major stakeholders in goal setting as well as
drafting and revising strategic plans.
WHEREAS Long-term strategic planning should include specific, system-wide and
government-mandated targets.
WHEREAS Funding envelopes are important tools for ensuring the accountability of
government investment in post-secondary.
WHEREAS The government must closely monitor funding envelopes in order to ensure
the envelopes are being disbursed in accordance with their mandates.
WHEREAS Those whose complaints are not being answered through the standard
channels should have access to an independent, objective ombudsperson to help settle
their grievance.
WHEREAS The quality assurance structure of all university programs should be subject
to a framework that defines what a student should have learned after each level of
instruction.
WHEREAS Universities should work to ensure that their undergraduate programs are of
a quality acceptable to students.
WHEREAS Universities should work to ensure that program quality is adequate.
WHEREAS Ontario universities should have in place an adequate measure of teaching
quality.
WHEREAS Courses vary between programs, institutions, and even instructors, and as
such need to be evaluated on an individual level.
WHEREAS Student evaluations of teaching (SETs) and course evaluations can and
should offer valuable feedback for professors.
WHEREAS Course evaluation and SET results should be made public.
WHEREAS The needs of underrepresented groups should be considered a matter of
quality assurance, and should not be overshadowed by the majority.
WHEREAS Students and their families should have access to information informing
their choice of post-secondary pathway and program of study.
BIRT The provincial government should utilize its ability to appoint members
knowledgeable in institutional and MTCU priorities to boards of governors.
BIFRT The selection process for student representatives on their institution’s board of
governors must reflect the student government’s selection processes.
BIFRT Strategic plans, such as the Strategic Mandate Agreements, should be reviewed
by formal governing bodies on which students are appropriately represented.
BIFRT In order to avoid board inflation without appropriate student representation,
The Province should mandate that a minimum of 13% of seats on university boards of
governors be reserved for undergraduate student representatives, and a minimum of
25% of seats on university senate be reserved for undergraduate student representatives.
BIFRT There should be an opportunity for a student presence on every committee of
university boards of governors and Senates.
BIFRT Universities should make their board membership totals and student
representation on these boards transparent, visible, and easily available.
BIFRT University boards of governors should amend their bylaws to specify dedicated
places for student membership.
BIFRT University Boards and Senates should provide more flexibility and alternative
methods of attendance to work around student schedules.
BIFRT All in-camera sessions of boards, senates and their committees must include the
presence or input of at least one student from the respective board, senate, or committee.
BIFRT If a student is required to come before a governing body’s disciplinary panel,
that student should have the ability to request a replacement for any member of the body
they believe may act with impropriety.
BIFRT The government should align the MYAA report-back mechanism and funding
with the new SMAs to create a strong financial incentive.
BIFRT The government should utilize funding levers to assist or encourage universities
to meet their strategic goals.
BIFRT For the purposes of applying the funding levers associated with SMAs, strategic
progress should be evaluated based on outcomes as well as methodology and approach.
BIFRT All long-term strategic planning or renewals of goals should require
wide-ranging, formal input from student associations.
BIFRT The government should collaborate with individual institutions to set specific
long-term enrolment targets for undergraduate and graduate students.
BIFRT The MTCU should collaborate with individual institutions to set targets for the
implementation of outreach and other barrier-removing initiatives to ensure growing
access for underrepresented and mature students.
BIFRT The government should set specific long-term targets for the percentage of
small class sizes made available to students, and require institutions to report these
figures, broken down by department.
BIFRT The government should set specific, long-term, comprehensive faculty hiring
plans to meet enrolment demands, as well as report the number and average teaching
load of faculty.
BIFRT University accountability reports should include a complete breakdown of
support services offered to students by university administrations, as well as the degree
to which they are supported by compulsory ancillary fees and university operating
budgets.
BIFRT University accountability reporting should include a detailed breakdown of all
ancillary fees levied against students.
BIFRT University accountability reporting should ensure that all results of the National
Survey of Student Engagement are published on institutional websites in survey years.
BIFRT The government should require universities to include envelope funding
report-backs in their annual reports.
BIFRT The government should publish an annual analysis of sector progress toward
the goals of all funding envelopes currently in place.
BIFRT The government should eliminate the performance funding envelope and
redirect the funding toward quality improvement and SMA adherence.
BIFRT The government should provide complete funding for every university to
establish its own independent ombudsperson’s office, managed at arms-length from the
university.
BIFRT The government of Ontario should swiftly adopt legislation to expand the
powers of the Ontario Ombudsman to oversee the “MUSH” sectors.
BIFRT The Quality Council should adopt the Lumina Foundation’s Degree
Qualifications Profile learning outcomes when evaluating academic programs.
BIFRT The Quality Council should develop a mandated internal check system that a
university must satisfy in order for a new program or program change to be approved.
BIFRT The Quality Council mandate a review after three years of program
implementation before moving to eight-year cyclical program reviews.
BIFRT The Quality Council mandate a qualitative assessment mechanism that is both
formative and summative in nature, that focuses on what is best for current as well as
future students, and that contributes to a professor’s case for promotion or tenure.
BIFRT Universities should be given appropriate and dedicated funding to develop
teaching and learning centres, through which supports should be offered to instructors
in order to allow them to adapt to teaching evaluations.
BIFRT The government should strive towards publicizing SETs, as well as the steps
that are taken resulting from the feedback therein.
BIFRT The Quality Council develop a framework that adequately represents recognized
underrepresented groups.
BIFRT The government use the Ontario Education Number to collect longitudinal and
aggregated data on multiple dimensions of university and student performance, use of
services and access. This data should be made publicly available in an anonymized
format.
BIFRT The government make available all data from the Post-Graduate Survey that
satisfies statistical quality standards. This data should be formatted for public
consumption and its significance properly communicated.