Project AWARE-C Evaluation Plan Guidance


(Joanne Oshel)
I will go ahead and pass it on to our two presenters, Renay Bradley of the Now Is The Time TA Center, and Michelle Bechard of SAMHSA. Also helping out Michelle is Ingrid Donato from SAMHSA, and she will be answering questions in the chat box as we go along. So, I will pass it on so we can get started. (Michelle Bechard)
So, I’m glad to see that everyone in the chat box
could put in that they are with AWARE-C
because this really is intended for AWARE-C. If you’re an SEA grant
or LEA and you really want
to listen to this, you’re perfectly welcome
to do it, but you don’t have to develop
an Evaluation Plan that’s related to this guidance. As Joanne said, this is the
second webinar that we’ve had. We did the other one yesterday. We had a lot of questions, and so
we’ve revised it just a little bit, and we’ve actually put
some slides in here where we’ll stop
and answer questions, so rather than waiting
till the very end. And I think there’s some
information that we can give particularly related to
the tracked measures. We had a lot of questions
about that, when the data entry
is going to start, how do I collect the data. So, there’s probably some
additional information based on the questions
that we got yesterday that we’ll be able to provide
you today. Alright, I hope that all of you
are
aware of what AWARE stands for, which is Advancing Wellness and Resiliency in Education. The goal of this program is to
improve mental health literacy among adults who serve youth,
as well as to build process and capacity for comprehensive
mental health approaches in both states and communities. If you have not–I don’t think
any of you have actually met any of us in person,
and so we include our picture. I’m at the top. I’m a team lead for
the Project AWARE program. Ingrid Donato
is in the middle. She is the Branch Chief
for the Mental Health Promotion Branch
here at SAMHSA, under which the Project AWARE
staff are housed. And then Renay Bradley, who’s
the Technical Assistance Liaison with the NITT-TA Center. And she will be conducting
the majority of this webinar. Here’s our agenda for today. We’re going to do an overview
of the objectives and the purpose
of the evaluation. We’re going to introduce you
to the Guidance document that should have been sent
out to all of the AWARE-C Project Directors on March 2nd. We’re hoping that if there are other people who also need to see that development Guidance, such as evaluators and other program staff, you will forward it to them. If not, we can make sure
that you get it. The other thing I wanted to
go back and say is we sent out copies of the PowerPoint slides
last week, and so those were also sent
to the Project Directors, so hopefully you got that. If somebody on the line did
not get the PowerPoint slides, then they’ll be posted on
the NITT-TA Center website. You can pull it from there. Renay is going to walk you
through how to use the template that’s included
in the Guidance to complete your
Evaluation Plan. I’ll talk a little bit about
resources that are available to help you complete
your Evaluation Plan. And as I said,
there were several times during this webinar that you’ll
be able to ask questions, and we hopefully will be
able to answer them. So, if you are familiar with
the request for applications, you’ll know that these are
the program objectives for AWARE-C. Again, increasing mental
health literacy of adults. Increasing the capacity of
these same people to respond to behavioral health issues
of either adolescents or transition-aged youth,
depending on what population of focus
you selected. Conducting outreach
and engagement strategies with these youth
and their families to increase awareness of and promote
positive behavioral health. To link these youth
to mental, emotional, behavioral health assistance
and services. And increase the number of
collaborative partnerships that you work with that serve
youth in your community. And the purpose, again,
this also was included in the Request for Applications, is the purpose of
the evaluation is to assess what you are doing
with your program for providing timely information
through the collection of data for measuring your progress,
achieving outcomes, and making
data-driven decisions. And your evaluation should be
designed to assess the extent to which you have achieved
the overall AWARE-C program objectives,
as well as any other objectives that you’ve identified yourself. At this point, I’m going
to pass it on to Renay, who will walk you through
the Evaluation Plan Development Guidance document
that we developed. (Renay Bradley)
Thank you so much, Michelle. Hi, everyone. So, we’re going to go ahead
and just basically go through, and I’ll provide you with
an overview of the different sections of this
Guidance document. So, we developed this really to
support you as you move forward to complete an Evaluation Plan and submit it to SAMHSA
next month. On the page in front of you,
you’ll see it’s just a little bit of a screenshot
that has the cover page for the Guidance document. And it notes again that the
document’s intended to serve as a guide for all of
the AWARE-C Grantees as you develop your plan. This is just an overview of each
of the individual sections, so Michelle has
already covered what is included in section one,
the AWARE-C Grant Objectives and the overarching purpose
of the Evaluation. I’m going to very briefly
highlight what can be found in all of the other sections. But as Michelle said earlier, we
will spend the bulk of the time today focusing
on section five, the Evaluation Plan
Template Worksheet. All right, so this begins just
a quick overview of section two on evaluation basics. This is page five of
the Guidance document. So, this section of the document
includes information basically on what is an evaluation
and why we do this, the definition of program
objectives and outcomes, and different information with regard to measuring objectives, including tips for creating
what we call SMART objectives. And we’ll definitely get
into that a little bit more when we start to go
through examples in the context
of the worksheet. It gives examples
and definitions for qualitative and quantitative data, and then ideas for how
you can use data. We decided to include some
of this information because we recognize
that the AWARE-C Grantees have varying levels of knowledge
and comfort with evaluation, so we wanted to just provide
a little bit of information for those that
might not have a professional evaluator
on their team. All of the examples provided in the evaluation basics section, and basically everything else in this guide, are focused on and relevant
to the AWARE-C Grants and the Youth Mental Health
First Aid
training program. So, this is just section 3, which starts on page 11
of the guide. It gives an overview
of the specific AWARE-C evaluation requirements. So, as you may know
at this point, all of the AWARE-C Grantees
are required to participate in three different levels
of evaluation. The focus of this guide and the
focus of your evaluation reports should just be on two
of those three levels. So, the first one is collection
and reporting of the required GPRA performance measures. We’ll talk about that today. The second is developing a local
Evaluation Plan to assist with assessing progress,
and meeting the goals and objectives of the grant. So, that’s the more specific piece that you get to basically design
within some parameters that SAMHSA provides. The third required evaluation
level is participation in the national evaluation
of the AWARE-C program. So, information about that
national evaluation (item three on this slide) will be provided
to all of the AWARE-C Grantees by your SAMHSA Project Officer
and the Now Is The Time Project AWARE
National Evaluation Team. So, there’s another TA
center that focuses on that. So, you will get information
on the national evaluation via other mechanisms. Again, this webinar and your
Evaluation Plan should only focus on the GPRA
performance measures, as well as your local
Evaluation Plan. So, I’m going to turn this back
over to Michelle to just do an overview
of the data collection and performance measurements,
otherwise referred to as the GPRA data. (Michelle)
And I’m actually going to turn it over to Ingrid,
who was able to make it. (Ingrid Donato)
Hello, everyone. Hello, you wonderful
AWARE-C Grantees. Thank you for joining us
on this webinar. So, we received a lot of
questions about this yesterday, wonderful questions, so I wanted
to walk you through this slide personally to help answer
some of those questions. Before I do, I just wanted
to add a little bit more about the national evaluation
that you heard about. This is an evaluation that’s
currently being developed now. It will need to go
through a clearance and approval process,
which takes quite some time, so it’s something you’ll be
getting information about in the future. And so now
you can focus on developing your
Evaluation Plans, and we’ll be filling you in
on information about the national evaluation
later on. But I want to talk to you about
this slide, your GPRA slide. For you AWARE-C Grantees,
we have a wonderful shortcut that I want to kind
of remind you of. When you think of GPRA,
you can think of TRAC. All of your indicators fall
under the TRAC data collection. This isn’t the case for all
of our AWARE Grantees who have other GPRA measures
that they’re collecting. But for you, you will
be collecting these three TRAC indicators
for your GPRA requirements. These are collected
by you all the time. These aren’t new measures. You should have processes in
place to be collecting them, or working with your Project
Officer and the TA Center on how to do this. For the sake of your Evaluation
Plan development, you are going to be
answering questions about how you’ll be
collecting this information, how you’ll be utilizing it,
and how you’ll be reporting it. The actual numbers you
will be reporting quarterly into the TRAC data system. And we are going to give you
a whole training specifically on how to enter
that information. You need to collect
three indicators for this. The first one, the number
of individuals who receive training
in prevention or mental health promotion. In TRAC,
it’s shortcutted as PR1. We’ve made it
very simple for you. You’re going to be reporting
the number of individuals trained as Mental Health
First Aid– Mental Health First Aiders
during the reporting period, which is quarterly. You should be keeping
track already about who’s taking your courses, so we’re hoping this
isn’t going to be hard. The second one is
the easiest of them all, the number of people
credentialed or certified to provide mental health
related practices consistent with the grant. In TRAC, it’s shortcutted
as WD3. And, again, it is simply
the number of individuals who are certified Mental Health
First Aid instructors. And then the final one,
which is another one that you’ll be working on–
I know my grantees have been working
very hard on this, and I know all of you are–
is the number of individuals referred to mental health
or related services. In the TRAC system,
it’s shortcutted as R1. And this is the data being collected on the number of youth that your First Aiders
are referring to services. So, you should be having systems
in place to be collecting or figuring out how you’re going
to be tracking what your First Aiders are doing when
they’re interacting with youth. And you’ll be reporting
them there. Again, for the purpose
of your Evaluation Plan, you’re going to be
answering some questions. We are going to start entering
these data starting in June. So, again, you have some time. And we will be–we will be
conducting separate trainings all on how to orient you
to the TRAC data system. All right, and I will pass
it back to you, Renay. (Renay)
Thank you so much, Ingrid. So, at this point, I think we
wanted to just pause and just if there
are any questions with regards to
the GPRA measures, the TRAC system,
or anything else that we’ve covered thus far,
we can just take a moment to address those questions. (Michelle)
Renay, this is Michelle. I see a question
from Allison Gareth, who asked for the number
of individuals referred, do we literally only need
the number? (Renay)
You literally only need a number. How easy are we
making that for you? So, it’s a number that you’re
submitting quarterly, mm-hmm. (Michelle)
Okay, the second question comes from Karen. And I’m sorry, I’m not going to
try to pronounce your name– (Renay)
Solaria. (Karen)
Solaria. (Michelle) The data entry starts in–
the data entry starts in June, but should we be
collecting it now? (Ingrid)
Karen, what a wonderful– sorry, Michelle. What a wonderful question. And, yes, you should already be
collecting these data now. So, if you’re conducting
those trainings, you should be figuring out,
you should be counting those numbers of people who
are participating in trainings. You should have a number
of people who are being trained as instructors. And you should be getting those
systems in place so that you’re reaching out to your Mental Health First Aiders on a regular basis. We are recommending, at a minimum, monthly, to find out if they are engaging youth and if they are providing any referral information. And, again, we use that
broad definition of referral. It doesn’t mean that you are
referring them to therapy. It is a broader one of,
are you doing something? Are you referring
them to a hotline? Are you referring them to
a wide variety of options? (Michelle)
Yeah, I think the guidance– if you go back to the Grantee
Guidance Manual provided to all of you,
we’ve used the word “link” in defining referral. So, you’re really “linking” a
youth
to a service, to a support, or a resource. It’s not the typical referral
that particularly mental health clinicians
think of. I have to go back.
Hold on, Carl asked a question. What if we have not done
any trainings yet? If you have not done
any trainings yet, then you don’t have any numbers to report yet. And so the hope is that by
the time June comes around, you will have done some
First Aider training. It’s okay when you enter data
into TRAC if you put a zero. So, for example,
for the measure that’s looking at the number of instructors,
in the first quarter, you might have trained
three people to be instructors. But in the next quarter,
you might have trained no one. So, it’s perfectly okay
to put a zero. But if you haven’t conducted
any trainings yet, that’s perfectly okay, though
I would definitely work with your Project Officer on
when you will be doing it. (Ingrid)
And I think there was another question from Rachel asking for more specifics about the number of individuals who are referred: if someone is trained as a Youth Mental Health First Aider right now, do we have to collect that data on an ongoing basis throughout the duration of the grant? The answer is yes, yes. For the 3 years that
your grant is ongoing, you will be collecting
that referral information from your Mental Health
First Aiders. It would be in your best
interest to be working with your Project Officer
and the TA Center to develop systems in place
for that collection. People have done very simple
things through SurveyMonkey. They’ve done app development. There’s a lot that’s already
been done by our AWARE family to do this requirement,
but it is a requirement only through the duration
of the grant. Once those 3 years are done, you do not have to collect
that information. And I just want to emphasize
that the TA Center has been an amazing resource on
this particular issue, how to be tracking
those trainers. If you haven’t done
your training yet and you haven’t had
this system in place, this is the perfect time
for you to get that in place. (Renay)
Thank you so much, Michelle and Ingrid. (Michelle)
Okay, thank you. (Renay)
So, this slide just includes information about
the three additional performance outcome measures
that all of the AWARE-C Grantees are required to first identify
and choose, basically, and then eventually submit
and share with SAMHSA. So, the three are–
you need to first have one– at least one adolescent
or transition age youth outcome performance measure. The second is at least
one community or population level outcome
performance measure. The third is one related
to the provision of behavioral health
services in your selected geographic catchment area. And then any other process
and outcome measures that you as an individual
grantee have established as part of your program. We also would love
information on all of that. You will then be required to report all of this annually within your
Annual Performance Report. And we’re going to talk
about these a little bit more, because within
the Evaluation Plan, you’ll be required to provide us
with a definition, what will you actually use
to focus on each of these, including some of these
specific details. So, we talked earlier
about having SMART objectives. So, SMART is an acronym
for specific, measurable, attainable, relevant,
and time-bound. So, the Guidance document
that we’ve provided you with that we’re going to get
into really sets things up. It’s structured in a manner
that will help you to provide all of the details about
your measurements so that we can actually ensure that they’re SMART. So, this means
providing information including these
types of details. So, what will change as a result
of the program implementation? How will the change be measured
or what tools will you use? What do things look like
before program implementation? How much of a change
is expected? And then for whom
will the change occur? And in terms of the timeline,
when is it expected that the change will occur? And, again, the template and the
worksheet that we’ll focus on is structured to help
you generate and include all of these details
whenever they’re required. So, we have already generated
what’s called the Outcomes Performance Measures
handout, thank you. So, this was used in the context
of another AWARE-C webinar that was held in the past,
and we will provide a link to that webinar. But we created this handout,
and we’ve included it at the end of the Guidance
document. So, it includes examples
of each of those three required Outcome Performance Measures
that you could potentially include within your grant. And we’ve structured those
examples to be relevant for different types of grantees. So, examples:
Local Education Agencies, community-based organizations,
and health service providers. Within that handout,
examples are all stated as SMART objectives with
the required level of detail. And, again, they’re relevant
to AWARE-C Grantees. And in some instances,
they’ve even provided examples of potential data sources
that could be used in terms of capturing those
specific measures. So, the Plan
due date. The AWARE-C Evaluation Plans
are due to your Project Officer by April 15th, so exactly
1 month from today. Again, the Guidance document
is designed to help ensure that you’ve included
all of the details, and adequately addressed
all of the requirements that SAMHSA provides
for your evaluation of your AWARE-C project. And of course, if you need
any additional help with completing
that Evaluation Plan, definitely contact
your Project Officer, or come to the Now Is The Time
Technical Assistance Center for help. Again, if you’re able to sort
of use the template, which I’ll demonstrate in a bit,
and you’re able to answer all of the questions within it,
that is all you need to successfully complete
the Evaluation Plan. So, we’re really trying
our best to provide you with the support that you
need to prepare that within the next month. We again included
a brief section within this Guidance document
that includes frequently asked questions
with regards to Human Subjects Protection. So, again, we understand
that AWARE-C Grantees have sort of varying levels
of knowledge and comfort with regards
to evaluation. Not all of you will have
professional evaluators on your teams,
so we’ve included some background information
regarding the protection of human subjects,
really just to guide you so you can successfully complete
the two questions that are asked within
the Evaluation Plan about this. So, this section gives you
information that would help you be able to determine
whether or not IRB review is needed for your
Evaluation Plans. And then if you are required
to submit to an IRB, you know, how you could actually do that so that you can receive
IRB approval. It also provides some examples
of best practices for the protection
of human subjects. And one other thing
I wanted to mention is that when all of you submitted
applications for this RFA, all grantees were required
to provide a little bit of information regarding
their plans for actually doing just this, providing protection
of the human subjects within the context
of your evaluations. So, I would definitely suggest
that you go back to your original
grant submission and see what you actually provided
to SAMHSA at that point, because it could just be
that you could copy and paste that information
right into your Evaluation Plan, and you will have addressed
all of the issues. So, we’ve got a polling question
coming up next, so you can go ahead and take a couple
minutes to just read through. So, the TA Center would just
like to know at this point, is there any other additional
support that you would like after this webinar? So, would you like more
support with evaluation basics? Do you have more questions about
AWARE-C Evaluation Requirements? Or would you potentially
like more support regarding Human Subjects Protection? So, go ahead and take a minute
to just tell us your answer. And if it’s none of the above,
go ahead and check none of the above. And then after this, we’re going
to take one more break to have another, you know,
spot for questions and answers. So, take a moment to fill out
the polling question. And then if you have
specific questions, feel free to start typing them
into the chat box, and we’ll take a moment
to just pause and answer those questions
after the polling question. All right, so it looks like
none of the above is the most popular answer here,
which is perhaps good to hear. So, it seems like just a few
folks have some questions about evaluation basics, the AWARE-C
Evaluation Requirements and Human Subjects Protections. And, again, if you do
want further support in any one of those areas, definitely reach out
to your GPO, or reach out to the Now Is The Time
Technical Assistance Center, or ask your questions
now in the chat box because we’re going
to go ahead and just pause and take a moment to try to
answer some of these questions. So, I’m going to go ahead
and see what I can find here. So, I’ve got Rachel asking,
“I’d like to hear more specifics “about the number of
individuals who are referred. “If someone is trained as
a First Aider right now, do we have to collect that data
throughout the duration?” I think actually Ingrid
already answered that as a
“yes”. We want that definitely
to be collected throughout all three years. Next, Rachel asks, “Do you have
any data on the response rate for the First Aiders responding
to those monthly surveys?” So, I think this will probably
vary based on grantee. You know, so from my perspective
at the TA Center, we have certainly worked
with folks to help them set up different strategies
to encourage the instructors and the First Aiders
to actually respond to your monthly requests
for the referral numbers. So, some of the strategies that
I’ve heard that have worked are, for example, when you actually
train the First Aiders during the training, definitely
tell them and let them know at that time that this is
the expectation, that they will all be hearing
from you on a regular basis. And so they should start to use
some type of system to keep track of the numbers
of folks that they’re referring to different services
and supports so that they can be prepared
and start to do that. You definitely want to collect
all the different types of contact information that you’ll
use to communicate with them, so I would suggest
getting emails, getting cell phone numbers. Just make sure you know
how folks want to be connected to you. And there are definitely
many different resources that we can also sort of
put you in touch with. We’ve done a newsletter
and a webinar in the past that talks about development
of referral systems. And I think Ingrid or Michelle
mentioned that previously, but there is something called– it’s this app,
and it’s called Appy Pie. And I’m not sure, some of our
grantees have tried to use this. And I think it is
sort of a phone, a smartphone based
system that you can implement that not only gives
your First Aiders– if they have a smartphone– the ability to keep track
of the numbers, but it also has the ability
to then sort of zap those numbers to you,
submit the numbers to you so that you can actually
be keeping track of them, and then being
prepared to submit them into the TRAC system
each quarter. Okay, so let’s see. “Can you give us some
examples of these measures?” So, I think you’re referring
to the different performance measures that you have
flexibility to choose. So, in the handout at the end
of the Guidance document is a list of different,
you know, examples. So, I would say refer
to that specific handout. It’s the last section
within the Guidance document. And it goes over some examples
that might be applicable to different agencies, such as
Local Education Agencies, community-based organizations,
health services providers for how you could actually
structure a performance outcome to meet each of those
different requirements. And we’ll also go over some
examples as I start to get into the context of how to fill out
the template or the worksheet in that Guidance document. Okay, let’s see. Oh, thank you, Kari DeCelle, for
putting
the link to the webinar on the performance measures. That’s very helpful.
More stuff on Appy Pie. So, what else here?
I think that’s it. So, does anybody–I’m just going
to pause another second and see, any other questions about
what we’ve covered thus far? If not, we can start to go on
and get into the details of actually using the template
and the worksheet. I think we’re good. So, what you’ll see is I’ve just
taken some screenshots of the Evaluation Plan
Template/Worksheet, and I’ve gone ahead
and just, you know, sort of pretended
I was a grantee, and inserted sample responses
to the different questions that are required. So, any text that is just white,
regular text, it’s literally just
copied and pasted right out of
the Guidance document. Anything that’s highlighted in
yellow are example answers that I’ve provided just
to give you an idea of what the expectations
are regarding, you know, how we would
want you to fill out and complete this plan. And if you use
this Guidance document, if you use the template,
the worksheet, and you just fill out
all of the questions, that’s all that would be
required of you to successfully complete
your Evaluation Plan and turn it in to your
GPO in a month. So, these are just some ways for
you to consider filling it out. So, this is the beginning; it’s Section One. So, we have plans
for the Protection of Human Subjects. And as I mentioned earlier,
there are just two questions that we’ve asked here. So, it asks you to describe
how you protect individuals regarding their participation
in the evaluation. And there are two questions. So, one is, are you required
to receive IRB approval? And if so, when do you
anticipate completing it? And then the second is,
what specific processes and protocols will you put
in place to protect your human subjects? And, again, I’ve just given
some sample responses. So, for question one, you could
say “no, your organization is not required to
receive IRB approval.” And the second response
is, you know, in some instances you may be. So, you need to make
that determination. And, again, you can refer to the previous section of the Guidance document, which has some support to help you make that determination of whether or not you have to actually receive
that on the side, on the very right-hand side
of the template, you’ll always see a little spot
for any additional narrative that you might want
to share with us. Totally optional. You do not have to include
any more text in there if you don’t want to,
but if there’s anything else that you just want
to sort of call our attention to or share with your GPO,
please feel free to just make those notes
in that narrative section, which you’ll see
throughout the Worksheet. And so similarly
for question number two, we’ve just given some sample
answers about providing a consent form, what will be
in the consent form, and different protocols
that will be put in place to maintain privacy
and confidentiality. And you can sort of read
these at your own pace, but these are intended
just to be examples of what would be acceptable
responses to these questions. And, again, I would
refer you to– or I would suggest
that you go back to your original submission
to this grant application and see what you put in that,
because it could be something that you could just
copy and paste right into the template. So, the next slide is going
to be focusing on plans for collection and reporting
of the GPRA data. So, I have not gone and–
gone through and give you example answers for all three
of the GPRA measures, the required GPRA measures, but we will cover
I think two of them. So, the very first one here
is required GPRA measure number one, the number
of individuals who were trained as First Aiders during
each quarter of the grant. So, just like with
the prior section, this will give you a list of–
in this case, it’s got five different
questions to answer. You see questions one
through three on the screen now. Again, you’ll see at
the far right an optional box if you have any narrative,
anything that you want to share with your GPO
with regards to any of this. You’ll also note, too,
for every one of the different requirements,
you’ll see a box at the top where it
gives you the information about where and when to report
the data and the information. That’s just meant
to help you understand and keep track of when
and where you should be sharing your data with SAMHSA. There are usually questions
regarding that as well, so that’s just meant to be
sort of a reminder. So, the questions here about
GPRA measure number one, number of individuals trained. It asks you firstly how
will you collect the data, what are the specific
processes you’ll use. It asks who will you
collect the data from and at what time points. And then, finally,
how will you analyze the data? What statistics or tabulations
will you run? And so I just, again,
filled out, you know, an example of how you could
potentially share information about whatever your
specific processes are. So, you might share
that you’re keeping logs, and you’ll have sign-in sheets
for the trainings so you’ll be able
to keep track of the individuals who attend, et cetera. Again, you can
sort of read through and see the different answers. But this is how
we’ll expect you to fill out your
Evaluation Plan. And just let us know
what your plans are so that you can
collect information on that first GPRA measure. Please note also
that within each of these different sections,
we want to know who you will be assigning
to be responsible for these specific tasks. So, whether you have an external
evaluator or, in this instance, the Project Coordinator
and the YMHFA instructors will be helping with this. So, these are just
the last two questions that are asked regarding
that first GPRA measure, number of First Aiders trained. So, the fourth question is,
how will you use the data internally? What data and processes
will you use to identify whether and how program
adjustments should be made? And here’s, again,
just an example answer. You’ll be comparing the total number of First Aiders trained to the number we expected to have gone through the training. And if those numbers don’t align with your expectations, that’s something you need to look into. If you aren’t meeting the numbers that you expected to get, you can consider making some adjustments. So, for example, adjusting your outreach and recruitment methods, or identifying other strategies so that you can increase your numbers. For the fifth question, and this is a question you’re going to see repeated throughout, what are your plans
for submitting data or findings into
the TRAC system, into the Annual
Performance Report, and into the annual
Evaluation Report? So, as hopefully
you all know by now, quarterly we would definitely
like you to submit all of the TRAC data
for the three GPRA measures into the TRAC system. And then annually
we’d like you to submit the sort of yearly
compiled totals into the Annual
Performance Report and the Annual
Evaluation Report. So, the next slides, we’ll go on
to GPRA measure number three. And, again, I’ve just
given you some sample text to see how we’re
expecting you to share with us your details for how you
will plan to actually collect the GPRA measures. And that next GPRA measure will
be the number of adolescents and transition age youth referred by an instructor. So, as Michelle made note of earlier, you’ve got the definition of referral up here. It’s sort of a very loose
definition regarding linking youth to appropriate
services and supports. You’ve got, again,
the reminder of when and where this information
should be shared with SAMHSA in the TRAC system
and the two annual reports. And then you’ve got those same
five questions that we’d like you
to answer that tell us how you will plan to actually
keep track of and share with us
the number of referrals, as well as, again,
the same boxes for the optional narrative
and who will be responsible for this task. So, for question number one,
how will you collect the data? It just gives an overview of how
this specific sample grantee is planning to collect
the information. I point out here, too,
that they make reference to using SurveyMonkey. And, again, if you are in
the process of actually trying to develop a method
to keep track of your referrals, I would definitely suggest
getting in touch with your GPO or the Now Is The Time
Technical Assistance Center, because we have lots
of materials that can help you figure out what type
of an approach would be preferable for you,
whether it’s using something like SurveyMonkey
or something like Google Docs or trying to use
the Appy Pie function if you’re a little bit more
technologically savvy. There are lots
of different methods that we can help you implement to support your tracking
of referral numbers. The next slide was just going
to be the last two questions, questions four and five,
that are again just, you know, relevant to this specific
GPRA measure. So, question four is how will
you use the data internally? What processes will you use
to identify whether or not you need to make
any adjustments? And then question five is,
what are your plans for submitting the data? So, again, your plans
for submitting the data may be very, very similar
for this GPRA measure as they were for the other
GPRA measure. If that’s the case, totally
fine to just copy and paste your responses within
the Evaluation Plan. And again for question four,
how will you use the data? Again, this is just an example of plans so that they can gain
an understanding of how many of the instructors
and First Aiders are actually referring folks, you know,
referring youth to the appropriate
services and supports. So this is another time
we’re going to stop and just give time
for comments and questions. If anyone has any questions
about anything that’s been discussed thus far,
please go ahead and type those questions into your
comment box, the chat box, and we’ll do our best
to answer them. Oh, I see one here from Sandra. So, “Where will this document
with the examples be posted after the session?” Great question. So, I believe that all of
the GPOs have sent the document, the Guidance document,
out to all of the grantees. There was also
an email communication from the Now Is The Time Center
in which it was shared. But if you
don’t have it, please feel free to reach
out to your GPO or the Now Is The Time
TA Center, and we’d be happy
to share it with you. It’s a Word document,
again, designed to just take the sections directly
out of the Guidance document so that you could insert text
right into the boxes, just as you see the yellow
highlight on our sample slides. All right, I don’t–I’m not
seeing any other questions, so why don’t we go ahead
and just jump back in, and we can show some examples
of how section three could be completed? Okay, so these are now–we’re
moving on from the GPRA measures and we’re going into an example
or two with regards to your Local Evaluation Plans. So remember, you’ve got
sort of three additional required Outcome
Performance Measures where you need
to come up with something, an outcome that’s within each
of these sort of thematic areas. And, again, you’ll have
for each of those three specific outcome measures, you just have a series
of questions. In this instance
there are ten questions that we would
like you to answer that let us know exactly what
your plans are for completing and then actually collecting
information on those outcome and performance measures. And once again, you’ll see
information about where and when to report
the information, and you’ll see columns
for any additional optional narrative that you
might want to share with us, as well as information
about who’s responsible for these data collection
and evaluation-related tasks. So, for this one, the focus
here is on an outcome related to adolescents
or transition age youth. And the first question is,
what specific outcome will you focus on? And so the sample here
is the number of youth who follow through
with a referral after being referred
to a service, resource, or support by the instructor
or First Aider. So, as part of the GPRA side
of things, we want you to get the number of youth referred. This grantee decided that they wanted to take the next step and say not only how many First Aiders
are referring youth, or how many youth were referred,
but they want to know whether or not youth are
actually following through and actually acting
upon those referrals, and going to see
or engaging in the service, resource,
or support. So, I do want to sort of read
through some of these because I think
the examples here are worth calling attention to. So, the second question is
“what tool will you use to measure or identify
a baseline?” So, the grantee here says,
“This is the first time that our organization has offered the trainings. We’re trying to evaluate the specific impact of the trainings. And as such, we do not have any instructors or First Aiders who have already made referrals that could be followed up on in the past. We know this based on our organization’s historical documentation of what programs we’ve implemented in the past.” So, it’s going to be
challenging for them because they are really starting
from zero, you know, so they have no previous
follow-through because they’ve
had no referrals made in the past within the context
of their organization. So that’s why
for question three they say, “What is your
baseline?” It’s zero since they have
not begun the trainings yet. The next slide, you’ll see
the response there to item number four. So, item four is
“what tool will you use to actually measure the impact?” And at the top
of the slide here, you’ll see: “We plan to keep track of the number of referrals made by instructors and First Aiders and will request that our instructors and First Aiders check in with the youth that they made referrals to in order to see if the referral was followed through on.” And then they’ll request
the numbers of youths that followed through
on the referrals in the same way that they got the requested
number of referrals, so using SurveyMonkey
in this example. So, that’s one way to do it,
and that’s probably a simpler way to track
whether or not your– the youth who were referred
actually followed through with the referral because
you’ll rely on the instructors and the First Aiders
to gather that data. I will say, though, some
grantees have talked about going the length of actually
partnering with the resources, the services, and supports that
the youth are being referred to. And you could certainly also
engage with those folks to get the number
of youth referred. So, that would sort of be
an alternative strategy to collect data on this
specific sample outcome. So, the last questions here,
we’ve got five, six, and seven. So, it’s, again, just more details being asked of you
with regards to who will collect
the outcome data, who will you collect
the outcome data from, and at what time points. What kind of change
do you expect to see? And at what time
do you expect to see the expected program change? So, I want to also call out
the sample answer to number six. So, in this instance, you know,
it was a little challenging for them to come up with
something because, you know, they have not done these
trainings in the past. So, when you’ve not done
the trainings, you want to try to come up with an expected
level of change. This is the strategy
that they proposed using. So, they’re expecting to train about 100 new First Aiders
each quarter. So, they have been in touch
with a local sister organization that they’ve partnered with, which has done these same trainings in the past and has kept track of both
the number of referrals and the number of youths
who followed through with the referrals. So, the sister organization
has said that “we should expect that our 100 new First Aiders will make approximately 100 referrals each quarter,” so one person making one referral each quarter. “And of those, about a quarter of them, 25, will result in a youth who will follow through with the referral.” So, based on what’s happening–
or what has happened in the past at this other
sister organization that’s done similar work, they’re projecting that the number of youth who follow through
with the referrals will go from their baseline of zero
to 25 each quarter. So, the next slide just has
the last three questions that are asked within
the Evaluation Worksheet or template about
that additional performance outcome measure. So the questions are
how will you analyze the data so that you can tell
whether you accomplished your expected amount of change? How will you use
the data internally? And what are your plans
for submitting the data into the Annual
Performance Report and Annual Evaluation? And, again, I’ve just provided
some sample ways that you can answer these questions
based on your specific plans. So, the first one, you know,
it’s pretty simple. They’re just going to tally
and sum together the number of youth that follow
through with the referral. They’ll get those totals monthly, sum them into a quarterly total, and then the total number tallied for the quarter will be compared to zero. And then the answer
to number nine, again, it just gives some examples
of how they’re planning to use the data internally. One thing to call out here is it
sort of mentions that, you know, they want to be able to learn about the extent to which the youth are actually reaching out and whether the trainings are having an impact on the youth. They know that if they
are not actually meeting their expected numbers,
they’ll consider developing strategies to make
the follow-through as easy as possible
for the youth. So, again, just ideas on how
you can use that data to actually make adjustments
to your programming so that you’re more likely
to achieve the outcomes that we’re all hoping
for you to achieve. So next, we’ve got just another
example of those– the required Performance
Outcome Measure. This one is the one focused
on a community or population
level outcome. So, I want to go through it. It’s the same ten questions again that we would like you to answer, giving us some details about this specific performance measure, just as we went over for the other one. I want to read through
some of these examples a little bit more closely
because I purposefully wanted to give an example here
that’s maybe something that’s a little bit more
sophisticated, I would say. So, the first thing I want
to call attention to is in the column that says “Who will be responsible
for these tasks?” it notes, “We have hired
a professional evaluator who will lead and carry out all aspects of this work. And that person will work with the team, et cetera.” So, this is a grantee who has
a professional evaluator who’s largely going to be
doing all of the Evaluations for the AWARE-C Grant. So, in this instance, they chose
to focus on the mental health literacy among school
affiliated community members who participate
in the trainings. So, mental health literacy
is a big, hot topic. And so here are the methods
that they’re planning to use. So, question number two,
what tool will you use to identify and measure a baseline? So, they note that “we will use the pre/post-test survey measure that was created by Bruno Anthony at Georgetown for SAMHSA to identify the baseline.” So, I want to let
everyone know– and certainly Michelle
and Ingrid can add more detail about this or if folks have
questions about this, we can certainly take time to
talk about this specific tool. But this is a real tool
that has been developed. And the national evaluation
is actually planning or in the process of planning
to use Bruno Anthony’s measure of mental health literacy. So, for folks out there who are
considering using or focusing on mental health literacy,
certainly the Bruno Anthony tool that was developed specifically
for the Mental Health First Aid Trainings would be
a viable option for you to use. And I believe that there are
plans to provide a webinar that’s going to be administered
by Dr. Bruno Anthony, and he will be able
to give you details about this validated tool. And he will be able
to share it with you. And you can decide for
yourselves whether or not it’s something that you
would like to use for your individual local
evaluation as well. So, for the baseline,
question three, it says, “What is your baseline?” And so in this instance,
the tool is sort of a pre/post-test measure. And so you would need
to start your trainings. You’d give it to your
participants right beforehand. You’d administer it again
right after completion of the trainings. And the Bruno Anthony tool
also has a 3-month follow-up and a 6-month follow-up. So, when you’re filling out
this Evaluation Template, you won’t know exactly
what the baseline is because you haven’t necessarily
used the measurement tool to establish that yet. So, the answer here to number
three is “No, they don’t have a baseline yet. But a baseline will be established once they begin to hold the trainings and administer the pre-test survey.” So, the next slide starts with
the same questions regarding “What’s the amount of change that you expect to see when comparing baseline to outcome levels of your Evaluation tool?” And, again, I want to sort of
highlight these examples as sort of more sophisticated
answers that you might provide if you have an evaluator
on your team who is really doing all this work. So, the sample, he writes, “We
have reviewed published research that focused on evaluating the impact of the trainings. The population was comparable. And that research suggests that most of those who completed the training showed immediate gains in mental health literacy directly following the training, but the gains tapered off over time. The immediate gains in mental health literacy were statistically significant changes from baseline. Therefore, we expect to see statistically significant improvements in mental health literacy immediately following completion of the training.” So, if you are out there
and you’re not an evaluator, and you are saying,
“What is she talking about?” this might not be an approach
that would be, you know, something that I would
recommend for you. Again, I share it just so folks
who have more experience or comfort with doing more
sophisticated evaluations can have an idea of a different type of response that we would accept. So, you’re not actually
specifying the numerical level of change, but you’re specifying
that we would expect it to be a statistically
significant difference between the baseline
and the post-test. And then even in terms of
the answer to number eight, “How will you analyze the data?” In this response, it’s “They’ll
enter the data in a statistical software package, run t-tests or other appropriate tests of mean differences that test for those statistically significant differences in mental health literacy across the time points.” Just the last two questions
here. “How will you use
the data internally? And what are your plans
for submission to SAMHSA?” So, you know, the one thing I just want to point out here is the response to number ten. So, the response to number ten
here, so in this instance, we want to be able to have
you share information about your data and findings within the Annual
Performance Report and the Annual
Evaluation Report. This is not the GPRA data,
so you certainly don’t have to enter any of this
into the TRAC system. So, in this instance, it says if
we obtain the expected results, improvements
in mental health literacy, we will highlight these findings
in our Annual Evaluation Report. So, that Annual Evaluation
Report really is intended to be something to highlight things that are worth highlighting; you know, look at the guidelines for that. In contrast, the Annual
Performance Report, with that, we just want you to share
all of the findings for all of the different measures
that you collected over the course of the year. So, I just want to highlight
the difference between the Annual
Evaluation Report and the Annual
Performance Report. The next slide will bring us
to the very last section of the Worksheet
or the Template. So, section four, it has
an Evaluation Timeline. So for this, we’d like you
to complete the timeline for all 3 years of your grant. And really just highlight
the things that you’ll be doing each month, each quarter that are relevant
to your evaluation. I’ve given, you know,
sample text here of sort of the level–
potential level of detail that you could provide,
but I would recommend that you think about, you know,
how could you structure this timeline in a manner
that will be meaningful for you, in a manner that will help you
and your team to keep track of all of the different
evaluation things that you need to do,
all of those different tasks so that you’re actually doing
those tasks each month. Whether it’s, you know,
following up with your First Aiders
and instructors to make sure that you ask them
for the numbers of referrals, to actually submitting
the data into the TRAC system, and doing the Annual Performance
and Evaluation Reports. All those things I would
suggest to incorporate into the timeline. Do that for all 3 years
of your funding, and just do it in a way
that will be helpful for you to ensure that you know what
you should be doing each time. So, we’re coming to a point
where we’ve got another polling question designed
to ask you again for where you might want further
support and help. And right after this
polling question, we’re going to take a moment
to answer some questions. So, if you have
burning questions, this is another time to sort of
start putting those questions into the chat box. And we’ll take a look
at the chat box and answer those questions. But first, if you could go ahead
and respond to this poll. So, with regards to what further
support you would like, would you like more
support for completing the GPRA section
of the Evaluation Plan? Would you like support
for completing the local evaluation section? More support with completing
the timeline? Or none of the above? Go ahead, and we’ll just take
a moment to see folks– to see where folks would like
some additional support. And, again, if you have
additional questions that you’d like to ask of us right now,
after this polling question, we’ll stop and have
a little Q&A section and so we can go ahead
and address those. I’m curious where you would
all like further support. I was a little surprised
before to see none of the above. Oh my goodness, none of
the above again, 51%. Okay, well, I’m happy to hear
that you’re all feeling prepared to complete your
Evaluation Plans. That’s definitely good news. So, it’s so wonderful. So, for those of you who did say
that you wanted more support, whether it be for the GPRA,
or the local eval, or for the timeline,
again, please feel free to reach out to your
Project Officer, or to reach out to the Now Is The Time
Technical Assistance Center. And we’re going to be happy
to provide you with support so that you can
successfully complete your Evaluation Plan
over the next month. I did see I think one or two
questions in the chat box, and I wanted to take
just a moment to respond to those questions. So, the first one
is from Pamela. So, Pamela, you ask,
“Is it a requirement that AWARE-C Grantees have to collect data on whether or not the First Aiders followed through with the ones they referred to? Or is this just a suggested
measure?” This is just an example measure. So, any of the yellow
highlighted text that I shared in
the slides today, those are just potential ways
that AWARE-C Grantees could, you know, do an evaluation that
would fulfill the requirements that SAMHSA has. So, that was just an example
of how you could come up with a measure for
the requirement related to an adolescent or
transition aged youth. So, certainly if that is not
something that you would like to do, if it’s not something
that you think would be beneficial for your
specific program, then by all means
find something else. You know, that’s the point here. We want you to be able to both
meet the SAMHSA requirements and really develop an evaluation
that’s going to be applicable and helpful and meaningful for
your specific AWARE-C program. So, that was just–it’s just
an idea, just a potential way to meet that requirement. So next, Melissa asked
if we can send a link to the pre-test measure. So, I believe that SAMHSA
is in the process of developing a webinar with
Dr. Bruno Anthony, and that the measure would likely be
distributed at that point. I’ve only seen it in draft form. But if you would like
more details, more details will definitely
be provided in the future. But otherwise, I would suggest
perhaps getting in touch with your Program Officer to see if
there’s any additional details that they can provide
in the meantime. All right, Margie, let’s see. She says–hi, Margie. “Clarification on evaluation
frequency for Project AWARE-C. So one, GPRA reported quarterly in TRAC. Additional outcome performance measure at year-end evaluation. And then three, Project AWARE-C information to be given later on this. Is this accurate on
frequency of evaluation?” So, the first two are accurate. So, the GPRA stuff is reported
quarterly into the TRAC system. All of the other data, including the GPRA measures, is reported in
the Annual Performance Report. Highlights from your evaluation
are provided in the Annual Evaluation Report. Your item number three, I’m not
quite sure what you’re asking. So, “Project AWARE information
to be given on this?” I’m not sure if you’re referring
to the national evaluation or something more specific? So, Margie, if you could
clarify, I can try to answer, but otherwise the big things are
quarterly into TRAC, annually into the Performance
Report, and annually into the Evaluation Report. So, then Karen asks,
“What are the due dates for the SAMHSA
Evaluation Reports?” I want to–that’s a question
maybe I would refer to Michelle or Ingrid. I know for the TRAC stuff,
it’s going to be– the TRAC measures–
the GPRA measures are due, I believe it’s 1 month after
the end of the reporting period. And so I’m not sure if there
are similar parameters for those two annual reports. And Michelle and Ingrid, if you
could clarify or if you have any further insight,
that would be appreciated. (Michelle)
Sure, this is Michelle. The Grantee Guidance Manual that
we sent to everyone identifies that both the Performance Report
and the Evaluation Report are submitted annually
at the same time. And that covers the
previous project year. And it’s due within 30 days
after the end of that project period. So, for AWARE-C,
your first project year will end at the end of September in 2016. And you should be
submitting a report by the end of October 2016. (Ingrid)
There was a question from Clarita. “Does GPO have to approve
the Evaluation Plan?” Yeah, we ask that you
submit the Evaluation Plan to your Project Officer
by April 15th. They will review it and let you know if it’s okay. And my guess is,
in most cases, it will be fine. If not, they’ll ask for some clarification or ask you to make some revisions.
(Renay)
Thank you. Okay, so Margie has
also just responded. So, for her question,
that third issue was regarding
the national evaluation. So, she just asked about
the frequency of the evaluation. And so all of the details
on the national evaluation that you may be requested to
participate in are forthcoming. So, you’ll be hearing from
the national evaluator with all of those details. So, again, everything
that we’re focusing on today, your specific Evaluation Plan
does not need to go into anything,
any detail at all regarding your participation
in the national evaluation. We know you have
all agreed to do that. When the time comes,
you’ll be given information. So, your Evaluation Plan,
and the efforts and the details that we’re talking about now,
are only focusing on the GPRA data
and your local evaluations. All right, any
other questions, feel free to stick
them in there. I think we have one
more Q&A period coming up. But at this point,
I’m going to turn things back over to Michelle, who’s
going to give some information about resources
available to you. (Michelle)
Okay, we’re nearing the end. This slide lists the resources that are currently
available to you. Certainly the RFP,
the Request for Proposal, which was the document
that you reviewed and responded to when you
submitted your application, you should definitely
be looking at what you entered into your application
as to what you said you were going to do. I’ve referenced
several times in here the Grantee Guidance Manual. We think it’s pretty complete. There’s a lot of information
in there that we think should be helpful to you,
so please take a look at that. The last several months,
the NITT-TA Center has developed a number
of resources and newsletters focused on certain areas
that you have been asking questions about,
or things that we discovered needed some additional
guidance developed because of the AWARE LEA
and SEA grants programs that were launched last year. So, this gives you
the links to those, the TA Center newsletter
that focused on the tracking of those referrals
or those linkages. And this is the URL for that. There was a TA Center
Office Hour on development of Outcome Performance Measures. And, again, examples
that they developed of Outcome Performance Measures are
included in the Evaluation Plan development Guidance
that was sent to you. They also did an office hour
on tracking referrals made by your First Aiders. The link to that is here. And then as far as TRAC
information and the TRAC system, this last is a link
to get to the TRAC. I would not worry too much
about that right now, but we will be doing–
I’m trying to schedule trainings now for the end of April,
beginning of May on how to enter data
into TRAC. But this is the link
into the TRAC system. So, how can we help?
We’re always available for help. Certainly reach out
to your Project Officer. If you want to request
Technical Assistance, here is the link on how
you can submit a request form via SurveyMonkey. It goes to the TA Center. You can contact them directly
at this toll-free number, or you can email them
at [email protected] You can also visit
the TA Center website. So, again, your GPO
or the TA Center, we’re available
to provide assistance if you should need it. So, last chance for questions. Renay, I’ll let you
take the lead on it. (Renay)
Let’s see what we’ve got here. Okay, so Karen. So, “Can we get data–
can we get a data dictionary for the TRAC outcome measures?” Data dictionary for the TRAC
outcome measures. So, I’m trying to envision what
would be in a data dictionary beyond just listing the three
things that we’ve provided to you in this specific
webinar already. So, it would just–it’s just
literally those numbers: the number of referrals
made by the First Aiders and instructors,
the number of First Aiders, and the number of trained,
certified instructors. So, those are the three numbers
that we’d like you to submit into the TRAC system
each quarter. So Karen, if you were looking
for something more specific, please go ahead and let us know. Or Michelle or Ingrid,
if you have any other details, please feel free to share. (Michelle)
I’m going to– I’m going to refer Karen back
to the Grantee Guidance Manual. There’s a whole section in there
on the TRAC measures and the customization, and where we have
felt that we’ve had to further define a certain
term such as referral. It should be in there. I would say if you
haven’t looked at that, pull up that section within
the document and look at it. If there’s still something
that still needs to be more clearly defined,
then let us know. But I would take a look
at that document first. (Renay)
Great, thank you. Okay, so Cindy,
I’m going to come back to your question in a second. But just to follow up
with Karen, she says it would have the size of
the fields and response options. (Michelle)
Yeah, one thing that we’re developing,
Karen, and that will be part of the TRAC training,
is what we’re calling a cheat sheet. And so the cheat sheet
will include the information that’s included in
the Grantee Guidance Manual, but then also give you
screenshots of what the TRAC system looks like and how possibly you
could enter that data. So, I would say don’t lose
any sleep over it right now. You’re going to be
seeing that at the time of the TRAC training. (Renay)
Great, thank you. So, definitely, you know,
hang tight for the TRAC training and TRAC materials
because all of those details will be included in that. So, Cindy, just to get back
to your question, she says we did examples
of the first two. “Could you provide an example
of outcome statement “related to the Outcome
Performance Measure “on the provision
of behavioral health services in the selected
geographic catchment area?” So, I’m going to refer you all to the Outcomes Performance
Measures handout. It’s the very last section
in the Guidance document. And I’ll just share some
of the examples that are provided in there. So, one example is
by the end of the grant, the number of partner agencies
to whom facilitated referrals are made for mental health services, as measured by Memoranda of Understanding established with those providers. So, how many partnerships
have been created that you can actually be
referring your youth to? So, that would be one example. (Michelle)
I was just going to say, too, I just want to make
a reminder to everyone that if you had
not clearly identified the three
Outcome Performance Measures that you wanted
for your program,
you should have been working with your Project Officer
and the TA Center on this, and have submitted them
at the end of December. So, if you’re proposing to make
changes from what you had originally proposed
and agreed on, then please work with your Project Officer
and the TA Center. But we had established
a deadline of getting the Outcome
Performance Measures revised if needed
and clarified. That should have been
done by the end of December. (Renay)
So, yes, if you’ve got plans that you want to further refine, by all means, you know,
talk to your Project Officer or to the TA Center. And if you would like
further examples, again, just look at the handout
in the back of the manual because there are other
examples that are provided for all three of those. So, do we have
any other questions? We’re sort of wrapping up our
time here with the webinar, so we’ve got another 10 minutes. If folks have more questions
or want us to comment on anything further,
we could definitely do that. So, I’m not seeing
anything else. You know, I guess one other
thing I thought maybe we would bring up that came up
on yesterday’s webinar is the difference between
the Bruno Anthony tool to measure mental health
literacy and the quiz that’s part of the Mental Health
First Aid training. Some folks from yesterday’s
webinar asked about whether or not it would be
appropriate to use that quiz, the one given
at the completion of the Youth or the Mental Health
First Aid training, as their tool to measure
mental health literacy. And we just wanted
to share that, you know, from our perspective,
the tool is just meant to allow the instructors
to know whether or not
a First Aider is
ready to complete the training. It’s a quiz to test
their knowledge. And it’s not necessarily
intended to be used as a measure to get
at mental health literacy. The Bruno Anthony tool,
in contrast, is a reliable
and validated measure, and it was specifically
developed to capture
mental health literacy related to the Mental Health
First Aid trainings and the Youth Mental Health
First Aid trainings. So, I would probably not
recommend using the quiz that comes at the end
of the trainings that’s used to basically graduate
a First Aider from the program since that’s not
a very robust measure of mental health literacy. And certainly we’re not saying
that you have to use the Dr. Bruno Anthony tool. It’s just that it’s an option
for you to consider. If you have something else
that you’re already planning to use that’s a good tool,
that’s totally fine as well. All right, I’m not seeing any
other comments or questions. I’m not sure if Michelle
or Ingrid, if you have anything else
that you wanted to add. (Michelle)
The only other thing that I wanted to add,
it feeds off of what you were just saying, Renay,
is that the question yesterday came up, there’s
actually two quizzes, per se, that instructors can use with
their First Aider participants. One is the quiz
that is required at the end of the training to
assess the level of knowledge. And participants need
to answer six of the ten questions
correctly in order to be, quote, certified
as a First Aider. There’s also–as I understand,
there’s an optional quiz, administered pre and post,
which is a youth mental health
opinion survey. And it’s a series of,
I don’t know, 15 questions or so. That also
does not measure mental health literacy,
and it is an optional quiz. Instructors don’t have to use
that quiz pre and post with their First Aiders. I just wanted to add that
additional information. (Renay)
Thank you for the clarification,
Michelle. I’m not seeing any other
comments or questions. I think we can go ahead
and wrap this up. Thank you all so much.
